OpenAI Rotates macOS Certificates Following Axios Supply Chain Breach
Source: Mastodon
OpenAI announced on Tuesday that it has rotated all of its macOS code-signing certificates after a malicious version of the open-source library Axios slipped into its continuous-integration pipeline. The compromised package was pulled in during a routine build, a software-supply-chain attack that could have allowed a forged binary to run on users' machines. OpenAI's security team revoked the affected certificates and issued new ones, urging developers and end users to update any OpenAI-branded macOS applications before the old certificates are blocked in May 2026.
The incident matters because macOS code‑signing certificates are the trust anchor that lets the operating system verify an app’s authenticity. If an attacker can sign a malicious binary with a valid certificate, the app can bypass Gatekeeper and execute with the same privileges as a legitimate program. Although OpenAI says no user data or internal systems were accessed, the breach exposed a critical weakness in the company’s dependency management and highlighted the growing risk of third‑party libraries being weaponised in CI environments.
OpenAI's swift rotation mirrors industry best practice after earlier supply-chain compromises, such as the 2020 SolarWinds breach and the 2021 Log4j (Log4Shell) vulnerability, and underscores the need for tighter verification of build-time dependencies. The company has also pledged to audit its CI workflows and to work with the Axios maintainers to address the malicious release.
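One concrete form of build-time dependency verification is to refuse to install any artifact whose digest does not match a pinned value. The sketch below is illustrative only: the file name is a placeholder, and the expected digest is computed inline purely to keep the example self-contained, whereas a real pipeline would read it from a lockfile or pinned manifest.

```shell
#!/bin/sh
set -eu

# Placeholder artifact standing in for a downloaded dependency tarball.
ARTIFACT="axios-demo.tgz"
printf 'demo package contents' > "$ARTIFACT"

# In a real pipeline this value comes from a lockfile or pinned manifest;
# it is computed here only so the sketch runs on its own.
# (On macOS, use `shasum -a 256` in place of `sha256sum`.)
EXPECTED="$(sha256sum "$ARTIFACT" | cut -d' ' -f1)"

# Gate the install step on the digest matching the pinned value.
ACTUAL="$(sha256sum "$ARTIFACT" | cut -d' ' -f1)"
if [ "$ACTUAL" != "$EXPECTED" ]; then
    echo "digest mismatch: refusing to install $ARTIFACT" >&2
    exit 1
fi
echo "ok: $ARTIFACT matches pinned digest"
```

Lockfile ecosystems already provide an equivalent check: npm's `package-lock.json` records an `integrity` hash per package, which `npm ci` verifies before installing.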
What to watch next: Apple’s security response, including any additional notarisation checks for affected apps, will be closely monitored. OpenAI is expected to publish a detailed post‑mortem in the coming weeks, and regulators may scrutinise the incident under emerging EU and US software‑supply‑chain guidelines. Developers using OpenAI’s macOS SDK should verify they are running the latest signed binaries and review their own dependency‑checking processes to avoid similar exposure.
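The verification step recommended above can be done with Apple's own tooling. A minimal sketch, assuming a hypothetical bundle path (substitute the OpenAI app actually installed); both `codesign` and `spctl` ship only with macOS:

```shell
#!/bin/sh

# Hypothetical bundle path; substitute the OpenAI app you actually have.
APP="/Applications/ChatGPT.app"

if command -v codesign >/dev/null 2>&1; then
    # Verify the signature chain and code seal: this fails if the signing
    # certificate has been revoked or the binary was altered after signing.
    codesign --verify --deep --strict --verbose=2 "$APP"

    # Ask Gatekeeper whether it would currently allow the app to execute.
    spctl --assess --type exec --verbose "$APP"
    RESULT="checked"
else
    # codesign and spctl are macOS-only tools.
    RESULT="skipped: codesign not available on this host"
    echo "$RESULT"
fi
```

A clean `codesign --verify` after updating confirms the app is sealed with one of the newly issued certificates rather than a revoked one.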