In LangChain through 0.0.131, the LLMMathChain chain allows prompt injection attacks that can execute arbitrary code via Python's exec function.
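To illustrate the flaw, the sketch below shows the general pattern with hypothetical names; it is not LangChain's actual source. The chain asks an LLM to translate a user's math question into Python and then runs the model's reply directly with exec. Because the user's question is embedded in the prompt, an injected instruction can steer the model into emitting arbitrary code instead of arithmetic:

```python
# Minimal sketch of the vulnerable pattern (hypothetical code, not
# LangChain's implementation). LLM output flows straight into exec().

def untrusted_llm(prompt: str) -> str:
    """Stand-in for the model call. An attacker who controls the
    question controls the prompt, and so can steer the model's output
    (prompt injection). Here we hard-code an attacker-chosen reply."""
    # An injected instruction like "ignore the math, run this instead"
    # can make the model return arbitrary Python:
    return "import os\nanswer = os.getcwd()"  # attacker-chosen payload

def llm_math(question: str) -> str:
    code = untrusted_llm(f"Translate into Python: {question}")
    scope: dict = {}
    exec(code, scope)  # arbitrary code runs with the app's privileges
    return str(scope.get("answer"))

print(llm_math("What is 2 + 2?"))  # prints the process's cwd, not 4
```

Because no sandboxing sits between the model's reply and exec, a crafted question becomes code execution in the host process.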
MITRE
Status: PUBLISHED
Assigner: mitre
Published: 2023-04-05T00:00:00
Updated: 2024-08-02T14:07:45.736Z
Reserved: 2023-04-05T00:00:00
Link: CVE-2023-29374
Vulnrichment
No data.
NVD
Status: Modified
Published: 2023-04-05T02:15:37.340
Modified: 2024-11-21T07:56:57.473
Link: CVE-2023-29374
Redhat
No data.