Description
The huggingface/transformers library is vulnerable to arbitrary code execution via deserialization of untrusted data in the `load_repo_checkpoint()` method of the `TFPreTrainedModel` class. Because the method calls `pickle.load()` on data from potentially untrusted sources, an attacker can craft a malicious serialized payload that executes arbitrary code and commands when it is loaded. This enables remote code execution (RCE): a victim who is deceived into loading a seemingly harmless checkpoint during a normal training workflow unknowingly runs attacker-controlled code on their machine.
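The root cause is Python's pickle reduce protocol: any object can declare, via `__reduce__`, a callable that `pickle.load()` will invoke during deserialization. The following is a minimal, self-contained sketch of that mechanism (not the linked PoC, and it does not touch the transformers API); the `MaliciousPayload` class name and the benign `list` callable are illustrative stand-ins.

```python
import pickle

class MaliciousPayload:
    """Illustrative CWE-502 payload: __reduce__ tells pickle what to call on load."""
    def __reduce__(self):
        # Benign stand-in: pickle will execute list("pwned") during loading.
        # A real exploit would instead return something like
        # (os.system, ("<attacker shell command>",)).
        return (list, ("pwned",))

# Attacker side: serialize the payload and ship it as a "checkpoint".
blob = pickle.dumps(MaliciousPayload())

# Victim side: merely loading the untrusted bytes runs the attacker's callable.
result = pickle.loads(blob)
# result is list("pwned"), not a MaliciousPayload instance -- proof that
# deserialization executed code chosen by the attacker.
```

This is why a checkpoint file loaded with `pickle.load()` must be treated as executable code, not inert data: no attribute access or method call on the loaded object is needed for the payload to fire.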
Exploits (1)
Working PoC by rooobeam (indexed via nomi-sec):
https://github.com/rooobeam/Pickle-Deserialization-Exploit-in-Transformers
References (2)
Core References
Exploit, Third Party Advisory
https://huntr.com/bounties/b3c36992-5264-4d7f-9906-a996efafba8f
Scores
CVSS v3: 9.6
CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:C/C:H/I:H/A:H
Attack Vector: NETWORK
EPSS: 0.2443
EPSS Percentile: 96.1%
CISA SSVC
Source: Vulnrichment
Exploitation: none
Automatable: no
Technical Impact: partial
Details
CWE: CWE-502 (Deserialization of Untrusted Data)
Status: published
Products (2)
huggingface/transformers: < 4.38.0
pypi/transformers: 0 - 4.38.0 (PyPI)
Published: Apr 10, 2024
Tracked Since: Feb 18, 2026