CVE-2025-32444

CRITICAL

vLLM < 0.8.5 - Insecure Deserialization


Description

vLLM is a high-throughput and memory-efficient inference and serving engine for LLMs. Versions starting from 0.6.5 and prior to 0.8.5, having vLLM integration with mooncake, are vulnerable to remote code execution due to using pickle based serialization over unsecured ZeroMQ sockets. The vulnerable sockets were set to listen on all network interfaces, increasing the likelihood that an attacker is able to reach the vulnerable ZeroMQ sockets to carry out an attack. vLLM instances that do not make use of the mooncake integration are not vulnerable. This issue has been patched in version 0.8.5.
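The root cause (CWE-502) is that `pickle.loads` on attacker-supplied bytes executes arbitrary callables during deserialization. The snippet below is a minimal, self-contained illustration of that mechanism, not the actual vLLM/mooncake code; the `Malicious` class and the benign `str.upper` payload are assumptions for demonstration. In the real attack, the bytes would arrive over the exposed ZeroMQ socket and the callable would be something like `os.system`.

```python
import pickle

# Hypothetical illustration (not vLLM code): any object whose
# __reduce__ returns a (callable, args) pair has that callable
# invoked during pickle.loads.
class Malicious:
    def __reduce__(self):
        # An attacker would return (os.system, ("...",)) here;
        # we use a harmless callable to show the mechanism.
        return (str.upper, ("pwned",))

# Bytes an attacker could send to a service that deserializes
# untrusted input with pickle (e.g. over an open ZeroMQ socket).
payload = pickle.dumps(Malicious())

# Deserializing runs the attacker-chosen callable immediately.
result = pickle.loads(payload)
print(result)  # PWNED
```

This is why the fix in 0.8.5 matters: pickle gives the sender of the bytes code execution on the receiver, so it must never be used on data from an unauthenticated network socket.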

Scores

CVSS v3 10.0
EPSS 2.48%
EPSS Percentile 85.1%
Attack Vector NETWORK
CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:H/A:H

Classification

CWE
CWE-502
Status published

Affected Products (2)

vllm/vllm < 0.8.5
pypi/vllm < 0.8.5 (PyPI)

Timeline

Published Apr 30, 2025
Tracked Since Feb 18, 2026