AI Score Confidence: High
EPSS Percentile: 56.5%
In LangChain through 0.0.131, the LLMMathChain chain allows prompt injection attacks that can execute arbitrary code via the Python exec method.
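The issue pattern can be sketched as follows. This is a hypothetical illustration, not LangChain's actual source: `vulnerable_eval` stands in for the chain's behavior of running LLM-generated Python with `exec`, and `safe_eval` shows the general mitigation direction of evaluating only arithmetic expression nodes (the upstream fix in the linked PR reportedly moved away from raw `exec`; the AST whitelist here is an independent sketch).

```python
import ast
import operator

def vulnerable_eval(llm_output: str) -> str:
    """Hypothetical stand-in for the flawed pattern: run LLM output with exec()."""
    local_vars = {}
    exec(llm_output, {}, local_vars)  # arbitrary code runs here
    return str(local_vars.get("answer"))

# A prompt-injected "math question" can make the LLM emit code, not arithmetic:
malicious = "import os\nanswer = os.listdir('.')"  # or os.system(...), etc.

# Mitigation sketch: parse the expression and permit only arithmetic nodes.
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def safe_eval(expr: str) -> float:
    """Evaluate a pure arithmetic expression; reject anything else."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.operand))
        raise ValueError("disallowed expression")  # imports, calls, names, etc.
    return walk(ast.parse(expr, mode="eval"))
```

With this whitelist, `safe_eval("2 + 3 * 4")` returns 14, while injected payloads such as `__import__('os')` raise `ValueError` instead of executing.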
github.com/hwchase17/langchain/issues/1026
github.com/hwchase17/langchain/issues/814
github.com/hwchase17/langchain/pull/1119
twitter.com/rharang/status/1641899743608463365/photo/1