
Anthropic just made AI scarier

22.04.2026



Why the company’s new AI model is a cybersecurity nightmare.

How powerful is AI? Enough that Anthropic, a leading AI company, announced earlier this month that its latest AI model, Claude Mythos Preview, would be available only to a limited number of businesses due to security concerns — at least for now.

Claude Mythos Preview was designed for general use, Anthropic says, but during testing, the company found it extremely effective at identifying vulnerabilities in the security systems of all types of software, creating potentially massive security concerns.

So far, Anthropic is sharing the Mythos Preview model with a handful of major tech companies and banks through a program called Project Glasswing, intended to give them an opportunity to shore up any existing security vulnerabilities and get ahead of potential hacking attempts that the model could identify.

To get a better sense of what Claude Mythos Preview represents and the potential threat it poses to online security, Today, Explained co-host Sean Rameswaram spoke with Hayden Field, senior AI reporter at The Verge.

Below is an excerpt of their conversation, edited for length and clarity. You can hear the full episode wherever you get podcasts — including Apple Podcasts, Pandora, and Spotify.

What is Claude Mythos?

Mythos is [Anthropic’s] newest AI model that they designed to be a general-purpose AI model like any other. But what they realized when they were working on it was that it had these special skills that they…

© Vox