Using artificial intelligence in place of real journalism is an issue hitting a fever pitch, and frankly, it feels ick.
If you know a thing about AI, you know it’s capable of making up stories and presenting them as fact. The problem is that some news outlets are replacing educated, seasoned journalists with AI tools and their fabricated output.
It’s one thing to put out garbage. It becomes dangerous when swaths of people don’t know any better and take what they read from AI as fact, no questions asked.
Cleveland Plain Dealer editor Chris Quinn defended his paper’s use of AI in place of journalists in some roles, suggesting that journalism professors aren’t preparing students for Journalism 2026. I don’t dispute that AI can save time culling stories, but to say it in any way replaces learning to judge a story’s value is off-base. I also agree with him when he suggests that aspiring journalists learn to understand government, business and nonprofits. But skip journalism classes? No.
I learned more about journalism and PR in the field than in school, but class time adds priceless thought-training for the human database. If anything, I wish I’d gone to law school before communications classes, because law studies encompass everything Quinn recommends and more. When people ask what to study to succeed in a communications role, I tell them to be curious and study whatever strikes them, regardless of industry; ask questions about everything to become a well-rounded journalist or PR specialist. Learn to sniff out a story and produce it in ways your audience understands and appreciates.
That last skill is a muscle to work from Day One. AI can stunt abilities that would otherwise strengthen with experience. It’s like sitting all day and never using your legs. Eventually, they lose muscle.
The AI-above-all model raises too many questions, especially for consumers. If you have doubts, consider this a wake-up call: “Ars Technica published an article containing fabricated quotations generated by an AI tool and attributed to a source who did not say them. That is a serious failure of our standards…”
Hey, at least Ars Technica admitted the error.
Artificial intelligence is not journalism
AI is a digital machine that spits out stuff. Some of what you read from AI might be fact; other content, as Merriam-Webster aptly highlighted, is slop. It’s made up. It’s lazy. It’s lazy in any industry, but it’s particularly irresponsible if you call yourself a journalist.
AI can fill space. Real, fact-based storytelling, the foundation of journalism, holds sources and creators accountable. Real journalism makes you react a certain way — to feel. It will educate you and make you think. The other could well leave you dumber and ill-informed.
Human-generated journalism is a gift
AI can’t strike up a conversation on a bus and uncover a story the way the LA Times’ Sam Farmer did at the Winter Olympic Games in Milan Cortina. AI doesn’t know powerful prose from digital slop. Sam’s piece about an Olympic hopeful who didn’t compete for reasons that had nothing to do with performance proves that.
••••••••
Some trusted outlets that don’t produce pieces with AI: https://bsky.app/starter-pack-short/8cn1XfT
©2026 Gail Sideman; gpublicity.com
###