CVE-2025-48944

Name: CVE-2025-48944
Description: vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, the vLLM backend used with the OpenAI-compatible /v1/chat/completions endpoint fails to validate unexpected or malformed input in the "pattern" and "type" fields when the tools functionality is invoked. These inputs are not validated before being compiled or parsed, so a single request can crash the inference worker, and the worker remains down until it is restarted. Version 0.9.0 fixes the issue (a validation sketch follows below).
Source: CVE (at NVD; CERT, LWN, oss-sec, fulldisc, Red Hat, Ubuntu, Gentoo, SUSE bugzilla/CVE, GitHub advisories/code/issues, web search, more)
Debian Bugs: 1095237
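
The crash class described above arises when values such as an invalid regular expression in a tool's JSON Schema are handed straight to the grammar/guided-decoding backend. The following Python sketch illustrates the kind of pre-validation the fix implies; it is not vLLM's actual patch, and the function name and structure are hypothetical:

    import re
    from typing import Any

    ALLOWED_TYPES = {"object", "array", "string", "number",
                     "integer", "boolean", "null"}

    def validate_tool_schema(schema: dict[str, Any]) -> None:
        """Recursively reject malformed 'pattern' and 'type' values in a
        tool-parameter JSON Schema before they reach the grammar compiler."""
        for key, value in schema.items():
            if key == "pattern":
                if not isinstance(value, str):
                    raise ValueError("'pattern' must be a string")
                try:
                    # Fail fast in the API layer instead of crashing the worker.
                    re.compile(value)
                except re.error as exc:
                    raise ValueError(f"invalid regex in 'pattern': {exc}") from exc
            elif key == "type":
                types = value if isinstance(value, list) else [value]
                if not all(isinstance(t, str) and t in ALLOWED_TYPES for t in types):
                    raise ValueError(f"unsupported JSON Schema 'type': {value!r}")
            # Walk nested schemas (properties, items, anyOf, ...).
            if isinstance(value, dict):
                validate_tool_schema(value)
            elif isinstance(value, list):
                for item in value:
                    if isinstance(item, dict):
                        validate_tool_schema(item)

Rejecting the malformed schema in the request-handling layer turns the single-request denial of service into an ordinary 4xx error for the offending client.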

The information below is based on the following data on fixed versions.

Package  Type  Release  Fixed Version  Urgency  Origin  Debian Bugs
vllm     ITP   -        -              -        -       1095237
