CVE-2026-25960
vLLM vulnerability analysis and mitigation

vLLM is an inference and serving engine for large language models (LLMs). The SSRF protection fix for CVE-2026-24779, added in version 0.15.1, can be bypassed in the load_from_url_async method due to inconsistent URL parsing behavior between the validation layer and the actual HTTP client. The SSRF fix uses urllib3.util.parse_url() to validate and extract the hostname from user-provided URLs. However, load_from_url_async uses aiohttp to make the actual HTTP requests, and aiohttp internally relies on the yarl library for URL parsing. This vulnerability is fixed in version 0.17.0.
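The sketch below illustrates the general class of parser-differential SSRF described above: the hostname the validation layer extracts with urllib3.util.parse_url() can differ from the hostname yarl (and therefore aiohttp) resolves from the same string, so a denylist check and the real request may target different machines. This is not vLLM's actual code; the function names, the BLOCKED_HOSTS set, and the example payloads are assumptions for illustration, not the exploit string from the advisory.

```python
# Minimal sketch of a parser differential between the URL parser used for
# validation (urllib3) and the one used by the HTTP client (yarl/aiohttp).
from urllib3.util import parse_url
from yarl import URL

# Hypothetical denylist of internal targets an SSRF filter might protect.
BLOCKED_HOSTS = {"169.254.169.254", "127.0.0.1", "localhost"}


def host_seen_by_validator(raw_url: str) -> str | None:
    """Host as the SSRF check sees it (urllib3's parser)."""
    try:
        return parse_url(raw_url).host
    except Exception:
        return None


def host_seen_by_client(raw_url: str) -> str | None:
    """Host as aiohttp would actually connect to (yarl's parser)."""
    try:
        return URL(raw_url).host
    except Exception:
        return None


def is_url_safe(raw_url: str) -> bool:
    """Defensive check: require both parsers to agree on a non-blocked host."""
    validator_host = host_seen_by_validator(raw_url)
    client_host = host_seen_by_client(raw_url)
    if validator_host is None or client_host is None:
        return False
    if validator_host != client_host:
        # Parser differential: the request would not go where validation thinks.
        return False
    return client_host not in BLOCKED_HOSTS


if __name__ == "__main__":
    for candidate in (
        "http://models.example.com/weights.safetensors",
        # Ambiguous forms (userinfo tricks, stray backslashes) are typical
        # carriers of parser differentials; this payload is an assumption.
        "http://models.example.com\\@169.254.169.254/latest/meta-data/",
    ):
        print(candidate, "->", "allowed" if is_url_safe(candidate) else "rejected")
```

The key design point, consistent with the analysis above, is to avoid validating a URL with one parser and then fetching it with another: either perform the hostname check with the same parser the client uses, or reject any URL on which the two parsers disagree.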


Source: NVD

Related vLLM vulnerabilities:

| CVE ID | Severity | Score | Technologies | Component name | CISA KEV exploit | Has fix | Published date |
|---|---|---|---|---|---|---|---|
| CVE-2026-25960 | CRITICAL | 9.8 | vLLM | vllm | No | Yes | Mar 09, 2026 |
| CVE-2026-27893 | HIGH | 8.8 | Chainguard | vllm-openai-cuda-12.9 | No | Yes | Mar 27, 2026 |
| CVE-2026-34756 | MEDIUM | 6.5 | Chainguard | py3-vllm-cuda-12.4 | No | Yes | Apr 06, 2026 |
| CVE-2026-34755 | MEDIUM | 6.5 | Chainguard | py3-vllm-cuda-12.4 | No | Yes | Apr 06, 2026 |
| CVE-2026-34753 | MEDIUM | 5.4 | Chainguard | vllm-openai-cuda-12.9 | No | Yes | Apr 06, 2026 |
