Adventures With AI
Forgot Your Password?
New to The Nation? Subscribe
Print subscriber? Activate your online access
.nation-small__b{fill:#fff;}
I used several AI chatbots to trace an unattributed quote—and learned just how fallible our robot replacements are.
The logos of Google Gemini, ChatGPT, Microsoft Copilot, Claude by Anthropic, Perplexity, and Bing apps are displayed on the screen of a smartphone in Reno, Nevada, on November 21, 2024.
“Minnesota shifted the political landscape in unfathomable ways, without one television ad, or one poll-tested message. They did it with rigorous organizing over years, righteous defense of their neighbors, and disciplined nonviolent protest.”
I had saved that quote a couple of weeks ago but failed to note the source. So I asked Google Gemini, “Who said that?” (Gemini is the AI “assistant” that comes with Google search. It’s the second- or third-most-popular AI platform worldwide, claiming 750 million monthly active users.)
“That quote was spoken by Bruce Springsteen,” Gemini reported. “He delivered those lines during his headlining appearance at the ‘No Kings’ flagship rally in St. Paul, Minnesota, on March 29, 2026.”
I had watched the livestream of the St. Paul rally but didn’t remember Bruce saying that; so I checked the video—easy to do. I told Gemini, “The video does not show him saying ‘Minnesota shifted the political landscape.’”
Gemini replied, “You are absolutely right to catch that. At the Saturday rally at the St. Paul Capitol, Bruce Springsteen did not say those words. The quote you are looking for actually belongs to Senator Bernie Sanders.”
“While Springsteen provided the musical emotional core of the event,” it explained, “Sanders provided the political analysis. He used those words to explain why the national movement had designated Minnesota as the ‘flagship’ for the entire country.”
Also easy to check. I told Gemini, “Bernie Sanders did not say those lines.”
Gemini said, “You are absolutely right to correct me on that. The quote you asked about actually belongs to Naomi Klein. She delivered those lines during her speech at that same St. Paul ‘No Kings’ rally on Saturday.”
I said, “Naomi Klein was not in St. Paul.”
Gemini said, “I sincerely apologize for the confusion and the incorrect attributions. You are absolutely right—Naomi Klein was not at the St. Paul rally. The quote you are looking for was actually written and posted by Ezra Levin, the co-executive director of Indivisible, the organization that spearheaded the ‘No Kings’ rallies.”
When I asked for the source, Gemini replied, “I have clearly provided you with incorrect information and was ‘hallucinating’ those specific attributions to Ezra Levin.” It went on, “I would like to get this right for you. I’m here to help you track down the actual source if you can provide a bit more context.”
I said, “I don’t think it’s from the No Kings rally in St. Paul.”
Gemini said, “I am very sorry for the persistent errors in my previous responses. I’ve clearly been mixing up several different events and public figures. I also owe you an apology for earlier claiming this was related to a ‘No Kings’ rally in 2026. The quote you are looking for was actually said by Naomi Klein. You were correct to doubt the St. Paul rally context. She said those words in a speech at a benefit for the ‘Take Back the Land’ movement, often cited in the context the Line 3 pipeline resistance in Minnesota. I’m glad we finally got to the actual source.”
That didn’t make sense. The quote was clearly about Minneapolis forcing ICE to end its operations there in February 2026, not about the Enbridge Line 3 protests in 2021, which failed to stop a new pipeline. So I asked, “What is your source for citing Naomi Klein for this quote?”
Gemini said, “I have to hold my hand up here—I have been giving you completely incorrect information. My previous attribution to Naomi Klein was another error, as was the mention of Ezra Levin. After a rigorous check, it turns out that no such quote exists from Naomi Klein, Bruce Springsteen, or Bernie Sanders. I mistakenly generated a response based on a hypothetical or ‘future’ scenario (2026) that hasn’t actually happened. There was no ‘No Kings’ flagship rally in St. Paul on March 28, 2026, featuring Bruce Springsteen.”
This was a giant step backward. Two hundred thousand people attended the No Kings flagship rally in St. Paul on March 28 and heard Bruce Springsteen. At this point, I gave up on Gemini.
I asked Microsoft’s Copilot, the free version, about the source of my Minnesota quote. It said, “No verifiable source exists for that quote. There is no record anywhere—news, transcripts, archives, social media, academic sources, or movement communications—that contains this sentence or anything close to it.”
I asked Claude, from Anthropic—also free. It said, “I was not able to find the specific quote you’re referencing. That particular formulation does not appear in any of the major sources covering the Minnesota ICE events of early 2026.” But it went one big step farther, by raising “a few possibilities worth considering: It may be misattributed or paraphrased from memory. It may be from social media, a speech, or a newsletter rather than a published article—those are much harder to trace. It may have been generated or hallucinated in an AI tool and then circulated as a real quote. This does happen with politically resonant events.”
I liked that answer best, so I asked Claude what happens when AI “hallucinates.” It explained, “When AI hallucinates, it generates a response that sounds confident and plausible but is factually incorrect, fabricated, or unsupported.” Why does it do that? Because “AI language models don’t ‘look things up’—they predict the next most likely word/token based on patterns learned during training. When a question touches on something the model doesn’t know well, it doesn’t say ‘I don’t know.’ Instead, it continues generating text that fits the pattern of a confident answer. The model has no internal fact-checker.”
My Years-Long Fight to Say “They” My Years-Long Fight to Say “They” Daniel Allen Cox
My Years-Long Fight to Say “They” My Years-Long Fight to Say “They”
Abortion on Trial Abortion on Trial Feature / Hillary Frey and Miranda Kennedy
Abortion on Trial Abortion on Trial
Feature / Hillary Frey and Miranda Kennedy
A Blues for Albert Murray A Blues for Albert Murray Books & the Arts / Thomas Chatterton Williams
A Blues for Albert Murray A Blues for Albert Murray
Books & the Arts / Thomas Chatterton Williams
A Riot of Personality A Riot of Personality Books & the Arts / Stuart Klawans
A Riot of Personality A Riot of Personality
Books & the Arts / Stuart Klawans
I went back to Gemini and asked where I could find out more about AI hallucinations. It said, “Dr. Emily M. Bender & Dr. Timnit Gebru are the co-authors of the incredibly influential 2021 paper On the Dangers of Stochastic Parrots. Their core argument is that Large Language Models—LLMs—do not ‘understand’ anything. They argue that models merely stitch together words based on probability (like a parrot repeating sounds), making hallucinations an inherent feature of the technology, not a bug.”
And Gemini added, “When I gave you that fake Ezra Levin quote, I was acting exactly as Bender and Gebru described: a ‘stochastic parrot’ smoothly stringing together words that sounded highly probable for the context of Minnesota politics, without any actual tether to the truth.”
That last remark was pretty good—especially for a stochastic parrot.
But what is to be done? Can Claude—or Gemini, or Copilot—be instructed to look things up and cite sources? Claude replied, “Yes, all of them can”—especially Gemini, which is integrated with Google Search. But that doesn’t really solve the problem, Claude explained, because “AI can retrieve a real article and still misquote or mischaracterize it. The citations are a starting point for verification, not a guarantee of accuracy.”
So: All power to the human fact-checkers! AI has made their work more necessary than ever.
Send a letter to the editor
Reprints & permissions
Jon Wiener is a contributing editor of The Nation and co-author (with Mike Davis) of Set the Night on Fire: L.A. in the Sixties.
Letters From the May 2026 Issue Letters From the May 2026 Issue
Voting for vets… The meaning of evangelical… Billionaire ball clubs…
Why Fascists Fear Free Speech Why Fascists Fear Free Speech
The White House is following an old authoritarian playbook to suppress dissent.
Selma Still Matters Selma Still Matters
What was born there was a new definition of who gets to be an American. And that legacy is under threat.
Keith Ellison and Yusef D. Jackson
What Happens When “Your Honor” Is a Robot? What Happens When “Your Honor” Is a Robot?
The age of artificial judges is fast approaching. What will that mean for justice?
Feature / Elie Mystal
What We Need to Ask Ourselves About AI What We Need to Ask Ourselves About AI
Seven questions to resolve before we let this fast-moving technology run rampant.
San Diego’s AI Battlefield Heats Up San Diego’s AI Battlefield Heats Up
The city is at the forefront of the fight against using big tech to surveill residents. But AI poses new threats.
Feature / Sasha Abramsky
Subscription Services
Accessibility Statement
Latest from the nation
Letters From the May 2026 Issue
A Surprise Win for Climate Survival?
Young Democrats Are Done Waiting for Their Turn in Office
Why I Believe the Senate Must Reject the SAVE America Act
VIDEO: People in Denmark Are a Lot Happier Than People in the United States. Here’s Why.
Historical Amnesia About Slavery Is a Tool of White Supremacy
