Mythos AI threat prompts Bessent and Powell to summon bank CEOs for urgent talks

Fear of Mythos AI is real enough for US regulators to call an urgent meeting to evaluate what Anthropic’s advanced AI model could mean for banks.

The meeting took place on Tuesday, and Treasury Secretary Scott Bessent and Federal Reserve Chairman Jerome Powell sat down with CEOs of Wall Street banks to discuss potential cybersecurity risks related to Mythos, people familiar with the matter told Bloomberg.

Participants included the CEOs of Citigroup Inc., Morgan Stanley, Bank of America Corp., Wells Fargo & Co., and Goldman Sachs Group Inc. All of these banks are designated as systemically important, meaning disruptions to their operations could have global repercussions.

Mythos, an advanced artificial intelligence model developed by Anthropic, is designed to identify and exploit vulnerabilities in software systems when prompted. Unlike typical consumer-facing AI tools, Mythos is geared toward software engineering and cybersecurity tasks. Its specialty is identifying critical software vulnerabilities and bugs, but it can also create sophisticated exploits.

The episode highlights a fundamental shift in how regulators frame AI risk: not simply as a technological challenge, but as a potential catalyst for systemic events.

This has already raised red flags in the crypto sector, where experts are concerned that Mythos’ ability to discover and exploit zero-day vulnerabilities in real time at a low cost poses a risk to DeFi infrastructure.

Anthropic has therefore taken a cautious approach, launching the product only to a small group of large technology and financial companies under “Project Glasswing.”

Anthropic has previously revealed that it consulted with US officials prior to the launch of Mythos regarding both its defensive and offensive cyber capabilities. The company is also separately involved in a legal dispute with the Pentagon, which has designated it as a supply chain risk, a classification that Anthropic is challenging in court.
