CVE-2026-27940

HIGH

llama.cpp <b8146 - Memory Corruption

Title source: LLM-generated

Description

llama.cpp is an LLM inference library in C/C++. Prior to release b8146, gguf_init_from_file_impl() in gguf.cpp is vulnerable to an integer overflow that leads to an undersized heap allocation; a subsequent fread() then writes 528+ bytes of attacker-controlled data past the buffer boundary. This bypasses the fix for a similar bug in the same file, CVE-2025-53630, which overlooked some code paths. The vulnerability is fixed in b8146.

Exploits (1)

nomisec STUB
by ngtuonghung · poc
https://github.com/ngtuonghung/CVE-2026-27940

Scores

CVSS v3 7.8
EPSS 0.0001
EPSS Percentile 3.4%
Attack Vector LOCAL
CVSS:3.1/AV:L/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H

Details

CWE
CWE-122 CWE-190
Status published
Published Mar 12, 2026
Tracked Since Mar 13, 2026