CVE-2024-12704

HIGH

Llamaindex < 0.12.6 - Infinite Loop

Title source: rule

Description

A vulnerability in the LangChainLLM class of the run-llama/llama_index repository, version v0.12.5, allows for a Denial of Service (DoS) attack. The stream_complete method runs the LLM in a separate thread and retrieves the result via the get_response_gen method of the StreamingGeneratorCallbackHandler class. If the thread terminates abnormally before _llm.predict completes, no exception handling covers this case, so get_response_gen waits forever for output that will never arrive, resulting in an infinite loop. An attacker can trigger this by supplying an input of an incorrect type, causing the worker thread to die while the consuming process continues running indefinitely.
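The failure mode described above can be sketched as a producer/consumer pair over a queue. The names below (producer, response_gen, the _DONE sentinel) are illustrative, not the actual llama_index implementation: the producer stands in for the thread running _llm.predict and crashes on bad input before it ever enqueues a completion sentinel, and the consumer shows a defensive variant of get_response_gen that exits once the worker thread is dead instead of blocking forever.

```python
import queue
import threading

_DONE = object()  # sentinel the producer is supposed to enqueue when finished

def producer(q: queue.Queue, payload) -> None:
    """Stands in for the thread running _llm.predict: it crashes on
    input of the wrong type before enqueuing the sentinel."""
    try:
        for tok in payload.split():   # raises if payload is not a str
            q.put(tok)
        q.put(_DONE)
    except Exception:
        # CVE-2024-12704 pattern: the exception is swallowed and no
        # sentinel is enqueued, so a consumer doing a blocking q.get()
        # would spin forever.
        pass

def response_gen(q: queue.Queue, worker: threading.Thread):
    """Defensive consumer: bail out once the producer thread is dead
    and the queue is drained, instead of blocking indefinitely."""
    while True:
        try:
            item = q.get(timeout=0.1)
        except queue.Empty:
            if not worker.is_alive():
                return  # producer died without sending the sentinel
            continue
        if item is _DONE:
            return
        yield item

q = queue.Queue()
t = threading.Thread(target=producer, args=(q, 12345))  # wrong type on purpose
t.start()
tokens = list(response_gen(q, t))  # terminates instead of hanging
t.join()
```

With a well-typed payload the same consumer yields the streamed tokens and stops at the sentinel; the liveness check only matters on the abnormal-termination path.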

Scores

CVSS v3 7.5
EPSS 0.0035
EPSS Percentile 57.5%
Attack Vector NETWORK
CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H

CISA SSVC

Vulnrichment
Exploitation poc
Automatable yes
Technical Impact partial

Details

CWE
CWE-835
Status published
Products (3)
llamaindex/llamaindex 0.12.5
pypi/llama-index-core 0 - 0.12.6 (PyPI)
pypi/llama_index 0 - 0.12.6 (PyPI)
Published Mar 20, 2025
Tracked Since Feb 18, 2026
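The affected ranges above (versions from 0 up to, but not including, 0.12.6) can be checked against an installed version with a minimal sketch. This assumes simple dotted numeric version strings with no pre-release tags; a real check should use a proper version parser such as the one in the packaging library.

```python
# Hedged sketch: does an installed llama-index-core version fall in the
# vulnerable range [0, 0.12.6) listed above? Pure-stdlib comparison.

def parse(v: str) -> tuple:
    """Split a dotted numeric version into a comparable tuple."""
    return tuple(int(p) for p in v.split("."))

FIXED = parse("0.12.6")  # first fixed release per the advisory

def is_vulnerable(installed: str) -> bool:
    return parse(installed) < FIXED

print(is_vulnerable("0.12.5"))  # True: inside the affected range
print(is_vulnerable("0.12.6"))  # False: first fixed release
```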