Description
The flash-attention project through commit e724e2588cbe754beb97cf7c011b5e7e34119e62 (2025-13-04) contains a code injection vulnerability (CWE-94) in its training script. The script registers the Python eval() function as a Hydra configuration resolver under the name eval. This allows configuration files to execute arbitrary Python code via the ${eval:...} syntax. An attacker can exploit this by providing a malicious configuration file, leading to arbitrary code execution when the training script is run with that configuration.
Published: 2026-05-11
Score: 7.3 High
EPSS: < 1% Very Low
KEV: No
Impact: n/a
Action: n/a
AI Analysis

Impact

The vulnerability is a code injection flaw (CWE-94) in the training script of the flash‑attention project. The script exposes the Python eval() function as a Hydra configuration resolver named eval, enabling the ${eval:…} syntax in configuration files. When such a file is processed, the eval resolver executes the embedded Python expression with the privileges of the training process. This lets an attacker run arbitrary Python code during training, potentially compromising the system on which the training script runs. The analysis is based strictly on the published description and assumes no impact beyond the injection described.
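The mechanism can be seen in a minimal, self-contained model of the vulnerable pattern. This sketch is not Hydra's real implementation: in the actual training script the registration is roughly `OmegaConf.register_new_resolver("eval", eval)`, and the dict-based registry below stands in for OmegaConf's interpolation machinery so the risk is visible in isolation.

```python
# Minimal model of the vulnerable resolver pattern (NOT Hydra/OmegaConf
# internals). A resolver is a function that interpolations like
# ${name:arg} invoke with the raw argument string.
import re

RESOLVERS = {}

def register_resolver(name, fn):
    """Register a function that ${name:arg} interpolations will call."""
    RESOLVERS[name] = fn

def resolve(value):
    """Expand every ${name:arg} interpolation in a config string."""
    def _sub(match):
        name, arg = match.group(1), match.group(2)
        return str(RESOLVERS[name](arg))
    return re.sub(r"\$\{(\w+):([^}]*)\}", _sub, value)

# The vulnerable registration: Python's eval() exposed as a resolver.
register_resolver("eval", eval)

# A benign-looking use a maintainer might intend:
print(resolve("lr=${eval:2 ** -10}"))  # lr=0.0009765625

# The same mechanism handed a hostile config value: the expression is
# evaluated as Python during config parsing, before training even starts.
malicious = "name=${eval:__import__('os').getcwd()}"
print(resolve(malicious))
```

The point is that the resolver receives attacker-controlled text and passes it directly to eval(); nothing in the interpolation layer distinguishes arithmetic from an `__import__` call.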

Affected Systems

Affected components include the flash‑attention project and its training scripts that register the eval resolver. Any environment where the training script is executed with user‑supplied configuration files is at risk. Specific product names and versions are not listed, but the commit reference e724e2588cbe754beb97cf7c011b5e7e34119e62 (2025‑13‑04) identifies the state of the code at the time of disclosure. Users running older, unpatched copies of the project for training or experimentation should consider themselves potentially vulnerable.

Risk and Exploitability

The EPSS score is below 1%, and the vulnerability is not cataloged in CISA KEV, indicating very little publicly known exploitation activity so far. The description nevertheless implies full arbitrary code execution: any attacker who can supply or modify a configuration file that the training script subsequently processes can exercise the flaw. The CVSS score of 7.3 rates the severity as high. No additional public exploitation information is available to date.

Generated by OpenCVE AI on May 12, 2026 at 23:32 UTC.

Remediation

No vendor fix or workaround currently provided.

OpenCVE Recommended Actions

  • Update to a newer commit of flash‑attention that removes the eval resolver or disables it in the Hydra configuration handler.
  • Edit the training script to delete or rename the eval resolver so that configuration files cannot trigger eval execution.
  • Implement validation or sanitization for configuration files before they are parsed, rejecting any entries that include the eval syntax or limiting the resolver set to safe, non-executable functions.
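The last recommendation, limiting the resolver set to safe, non-executable functions, can be sketched as follows. This is a hypothetical hardening, not the project's actual fix: it keeps `${eval:...}`-style arithmetic working by backing the resolver with an AST walker that accepts only numeric literals and arithmetic operators.

```python
# Hypothetical hardening sketch (not a vendor patch): a drop-in
# replacement for the eval() resolver that evaluates constant
# arithmetic and rejects everything else.
import ast
import operator

_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def safe_eval(expr):
    """Evaluate arithmetic on numeric literals only; raise otherwise."""
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError(f"disallowed expression: {ast.dump(node)}")
    return _eval(ast.parse(expr, mode="eval"))

# Registering this instead of eval() keeps ${eval:2 ** -10} working:
#   OmegaConf.register_new_resolver("eval", safe_eval)   # hypothetical
print(safe_eval("2 ** -10"))  # 0.0009765625

# ...while an injection attempt now raises instead of executing:
try:
    safe_eval("__import__('os').system('id')")
except ValueError as exc:
    print("rejected:", exc)
```

Because function calls, attribute access, and names are never whitelisted, expressions such as `__import__('os')` fail at parse-walk time rather than executing.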



Advisories

No advisories yet.

History

Wed, 13 May 2026 00:00:00 +0000

Type Values Removed Values Added
Title flash‑attention Training Script Allows Arbitrary Code Execution via eval Resolver

Tue, 12 May 2026 22:45:00 +0000

Type Values Removed Values Added
Title Code Injection via eval Resolver in flash‑attention Training Script
Weaknesses CWE-94

Tue, 12 May 2026 20:15:00 +0000

Type Values Removed Values Added
Weaknesses CWE-95
Metrics cvssV3_1

{'score': 7.3, 'vector': 'CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:L/I:L/A:L'}

ssvc

{'options': {'Automatable': 'yes', 'Exploitation': 'none', 'Technical Impact': 'partial'}, 'version': '2.0.3'}


Tue, 12 May 2026 10:45:00 +0000

Type Values Removed Values Added
First Time appeared: Dao-ailab, Dao-ailab flash-attention
Vendors & Products: Dao-ailab, Dao-ailab flash-attention

Mon, 11 May 2026 18:15:00 +0000

Type Values Removed Values Added
Title Code Injection via eval Resolver in flash‑attention Training Script
Weaknesses CWE-94

Mon, 11 May 2026 16:30:00 +0000

Type Values Removed Values Added
Description The flash-attention project thru commit e724e2588cbe754beb97cf7c011b5e7e34119e62 (2025-13-04) contains a code injection vulnerability (CWE-94) in its training script. The script registers the Python eval() function as a Hydra configuration resolver under the name eval. This allows configuration files to execute arbitrary Python code via the ${eval:...} syntax. An attacker can exploit this by providing a malicious configuration file, leading to arbitrary code execution when the training script is run with that configuration.
References

MITRE

Status: PUBLISHED

Assigner: mitre

Published:

Updated: 2026-05-12T18:59:01.836Z

Reserved: 2026-03-09T00:00:00.000Z

Link: CVE-2026-31254

Vulnrichment

Updated: 2026-05-12T18:58:55.609Z

NVD

Status: Deferred

Published: 2026-05-11T17:16:20.423

Modified: 2026-05-12T20:16:34.317

Link: CVE-2026-31254

Redhat

No data.

OpenCVE Enrichment

Updated: 2026-05-12T23:45:25Z

Weaknesses