Unauthorized group has gained access to Anthropic’s exclusive cyber tool Mythos, report claims

TechCrunch / 4/22/2026

Key Points

  • An unauthorized group reportedly obtained access to Anthropic’s cybersecurity tool “Mythos,” which had been announced as an enterprise security product.
  • Bloomberg claims the group accessed Mythos via a third-party vendor and used multiple strategies, including leveraging access held by someone interviewed by Bloomberg.
  • Anthropic says it is investigating the claim and has found no evidence so far that any unauthorized activity impacted Anthropic’s own systems.
  • The group allegedly used Mythos regularly after gaining access, and it reportedly provided Bloomberg with screenshots and a live demonstration.
  • The reporting suggests the access may have occurred immediately after Mythos’ public announcement, with the group attempting to determine Mythos’ online location based on Anthropic’s prior model formats.

A group of unauthorized users has reportedly gained access to Mythos, the cybersecurity tool recently announced by Anthropic.

Much has been made of Mythos and its purported power — an AI product designed for enterprise security that, in the wrong hands, could become a potent hacking tool, according to the company. Now Bloomberg has reported that a “private online forum,” the members of which have not been publicly identified, has managed to gain access to the tool through a third-party vendor.

“We’re investigating a report claiming unauthorized access to Claude Mythos Preview through one of our third-party vendor environments,” an Anthropic spokesperson told TechCrunch. The company said that, so far, it has found no evidence that the supposedly unauthorized activity has impacted Anthropic’s systems in any way.

The unauthorized group tried a number of different strategies to gain access to the model, including using the "access" held by a person Bloomberg interviewed. That person is currently employed at a third-party contractor that works for Anthropic, the outlet reported.

Members of the group are part of a Discord channel that seeks out information about unreleased AI models, the outlet reported. The group has been using Mythos regularly since gaining access to it, and provided evidence to Bloomberg in the form of screenshots and a live demonstration of the software.

Bloomberg reports that the group, which supposedly gained access to the tool on the same day it was publicly announced, “made an educated guess about the model’s online location based on knowledge about the format Anthropic has used for other models.” The group in question is “interested in playing around with new models, not wreaking havoc with them,” the source told the outlet.

Mythos was released to a select number of vendors, including big names like Apple, as part of an initiative called Project Glasswing. The limited release of the model was designed to prevent its use by bad actors. The tool could be weaponized against corporate security instead of bolstering it, Anthropic said.


If true, unauthorized use of Mythos could spell trouble for Anthropic, which restricted the tool's release precisely to address concerns about enterprise security.