In LangChain through 0.0.131, the LLMMathChain chain allows prompt injection attacks that can execute arbitrary code via the Python exec method.
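The flaw class is straightforward: model output is treated as a trusted math expression and handed to Python's exec/eval without sandboxing, so a prompt that steers the model into emitting code achieves code execution. A minimal sketch of this pattern (an assumed simplification for illustration, not LangChain's actual LLMMathChain code):

```python
# Hypothetical, simplified stand-in for the vulnerable pattern: the string an
# LLM returned is executed directly with Python's exec, with no sandbox.

def naive_math_chain(llm_output: str) -> str:
    """Evaluate the 'math expression' the model returned by exec-ing it."""
    local_vars: dict = {}
    # No sandboxing: whatever the model emitted runs as Python code.
    exec(f"result = {llm_output}", {}, local_vars)
    return str(local_vars["result"])

# Benign case: the model returns an arithmetic expression.
print(naive_math_chain("2 ** 10"))  # prints 1024

# Prompt injection: an attacker coaxes the model into emitting code instead
# of arithmetic. getcwd() is a harmless stand-in for a real payload.
injected = "__import__('os').getcwd()"
print(naive_math_chain(injected))  # arbitrary code executes
```

The fix adopted by later LangChain releases was to stop evaluating model output with the full Python interpreter and restrict it to a numeric-expression evaluator.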

Vendors and Products

Vendor: Langchain
Product: Langchain
Advisories
Source: EUVD
ID: EUVD-2023-0118
Title: In LangChain through 0.0.131, the LLMMathChain chain allows prompt injection attacks that can execute arbitrary code via the Python exec method.

Source: GitHub GHSA
ID: GHSA-fprp-p869-w6q2
Title: LangChain vulnerable to code injection
Fixes

Solution

No solution given by the vendor.


Workaround

No workaround given by the vendor.

History

Wed, 12 Feb 2025 17:15:00 +0000

Values added — Metrics (ssvc):

{'options': {'Automatable': 'yes', 'Exploitation': 'none', 'Technical Impact': 'total'}, 'version': '2.0.3'}



MITRE

Status: PUBLISHED

Assigner: mitre

Published:

Updated: 2025-02-12T16:24:39.291Z

Reserved: 2023-04-05T00:00:00.000Z

Link: CVE-2023-29374

Vulnrichment

Updated: 2024-08-02T14:07:45.736Z

NVD

Status: Modified

Published: 2023-04-05T02:15:37.340

Modified: 2025-02-12T17:15:18.260

Link: CVE-2023-29374

Red Hat

No data.

OpenCVE Enrichment

No data.

Weaknesses