More stories won’t save us
In an AI-driven media landscape, the stories that stand out will be the ones only humans can truly produce.
It feels like everyone switched overnight from talking about the AI bubble to talking about the AI inflection point. Claude Code and the current agent push have everyone both excited and nervous, but while AI enthusiasts are more productive than ever, journalists have good reason to be suspicious of the pressure to produce more.
More on what this moment means for the news media in today’s column, but first a reminder that the RevvedUp conference for media leaders is coming up fast, and The Media Copilot is thrilled to be a media partner. If you haven’t signed up yet, you only need to scroll down to find out why you should 👇
A MESSAGE FROM THE MEDIA COPILOT
The Media Copilot is partnering with H2K Labs for RevvedUp 2026, and it’s shaping up to be one of the more substantive gatherings for people focused on the future of marketing, media, and technology.
I’ll be on site covering the event and hosting a live, on-stage conversation with Stephanie Cohen, chief strategy officer at Cloudflare, about trust, performance, and what the open web looks like in an AI-saturated environment.
If you’re planning to attend, use my tracking link to register and use the code RUPMEDIACOPILOT for 10% off. Strong lineup, thoughtful discussions, and a room full of operators building what comes next.
Is journalism about to experience its ‘AI inflection point’?

At the best of times, it’s tough to separate AI news from AI hype. But the latest rush around agents, triggered when a plethora of developers went on holiday benders with Claude Code, feels like a real shift. Between the viral freakout over Moltbook, the agent social network, and the Super Bowl ad slap fight between OpenAI and Anthropic, AI has jumped to a new tier of mainstream attention.
Talk of the “AI bubble” has basically evaporated, replaced by the industry’s favorite new term: the AI “inflection point.” That’s said to be the moment when AI in general, and agents in particular, start swallowing big chunks of knowledge work—with consequences that spill into the economy, hiring, and how entire companies function. If you want a tell for how seriously this is being taken, look no further than the recent SaaS sell-off.
For journalists, this kind of noise has a familiar side effect. Mix relentless AI coverage with the steady drumbeat of layoffs in media, and you get the same old pressure wearing a new outfit: do more. When newsrooms shrink and AI tools get pitched as productivity machines, it’s easy to conclude the “right” response is higher output.
But AI isn’t only changing how stories get produced; it’s changing how stories get discovered. So the urge to use AI to do “more with less”—which, in practice, often means publishing the same kinds of pieces faster and more frequently—aims straight at the wrong target.
That’s because of a contradiction in how AI systems surface information.
Read the rest at mediacopilot.ai
Fake news at machine speed: Alex Mahadevan on AI’s impact on media trust
AI is already embedded in how people discover and consume news, from search to chat interfaces to automated summaries. So the question is no longer whether journalism will be shaped by AI. It’s how newsrooms maintain trust while experimenting responsibly.
In this episode of The Media Copilot podcast, Pete Pachal sits down with Alex Mahadevan, Director of MediaWise and a faculty member at Poynter, to unpack what media literacy looks like now that anyone can generate convincing content at scale. Alex shares how his background in data and local journalism shaped his approach to tools, why public-facing AI ethics policies matter, and what it will take for news organizations to bring audiences along for the next phase of the information ecosystem.
The Chatbox
All the AI news that matters to media*
Cleveland.com tests journalism’s third rail

Cleveland.com editor Chris Quinn is defending a workflow that has AI writing news stories while reporters focus on gathering information. Quinn says removing writing from reporters’ workloads frees up an extra day per week for them to meet sources and conduct interviews. The system uses an “AI rewrite specialist” to turn reporters’ raw material into drafts, which human editors then fact-check before publication. Journalism educators pushed back hard, with Missouri professor emeritus Stacey Woelfel arguing that writing is inseparable from reporting because the shape of a story informs how it’s gathered. Quinn blamed journalism schools for teaching students to fear AI, but the criticism raises a real question about what gets lost when you treat writing as a task to automate rather than a core part of how journalists think. As newsrooms get squeezed, expect more to test this boundary. (AI-assisted)
When the AI expert falls for AI
Ars Technica pulled a story after readers spotted fabricated quotes generated by AI tools, in a case that would be funny if it weren’t so instructive. Senior AI reporter Benj Edwards used a Claude Code-based tool and ChatGPT to extract quotes from a short blog post while sick with COVID, and the tools hallucinated paraphrased versions of things the source never actually said. The source material was two pages of plain English, making the decision to use AI for such a basic task all the more puzzling. Editor-in-chief Ken Fisher called it a serious failure of Ars’s standards, and the outlet pulled the entire piece rather than issuing corrections. The takeaway is sobering: even reporters who cover AI’s shortcomings every day aren’t immune to them. (AI-assisted)
*AI-assisted news items are drafted with AI and then carefully edited by Media Copilot editors.