AI aids journalists, but truth still needs human judgment, rigor and conscience
What is the unique value proposition of journalism? For me, the answer was always obvious: Journalists are supposed to be a force for good. They are the people who leave the comfort of newsrooms to stand in war zones, listen to eyewitnesses, sift through documents and, after hours, days, sometimes weeks, arrive at a conclusion shaped by judgment, experience and integrity. They are the human filter between events and truth. We trust them not because they are perfect, but because they bother to show up.
Years ago in Paris, I was invited, very politely, to attend the morning editorial meeting of a major newspaper. Twenty editors sat around a long wooden table, exchanging dry jokes and raised eyebrows. Then the editor-in-chief appeared wearing a bowtie, and it felt like walking into a caricature of journalism. The good kind, let me add. For 30 minutes, I watched a newspaper assemble itself: ideas sharpened, claims interrogated, weak angles mercilessly dismissed. That room was a reminder that journalism, at its best, is a craft built on discipline, friction and human scrutiny. Not something you can replace with a prompt.
Today, that craft is under strain. Artificial intelligence is everywhere. When I travel in a Francophone country and face a handwritten menu full of mystery dishes, I snap a photo and ask ChatGPT what I’m about to ingest. And yes, I did laugh at the Instagram post that said: “If you ever feel stupid, remember you finished college before ChatGPT.” As long as AI remains an assistant, it is delightful.
The problem begins when AI is invited to perform journalism’s vital tasks itself.
Take the recent Der Spiegel incident. One of Europe’s most prestigious publications accidentally published an article containing a raw ChatGPT prompt: “If you like, I can adjust tone and detail.” There it sat, right in the middle of the story, blinking like a neon sign that read: “Nobody read this before publishing.” It passed every gatekeeper (reporters, editors, copy editors, production staff) and not a single person noticed. This wasn’t a typo. This was a newsroom forgetting what its job is.
But here’s the uncomfortable truth: The real problem is not AI. The real problem is what financial pressure has turned journalism into. Foreign bureaus have closed. Travel budgets have evaporated. Senior editors have been replaced with “leaner” and “faster” (read: cheaper) structures. Reporters are expected to monitor social media, rewrite agency copy, churn out multiple stories a day and sprint until burnout. Under those conditions, AI stops being an assistant and becomes a crutch, one that slowly replaces the muscle it was meant to support.
Tools don’t replace people. People replace themselves when they stop doing the work. And this is where journalism faces an existential crisis. If journalists quietly outsource core functions (fact-checking, editing, even parts of reporting) to an AI model, without disclosure or oversight, the profession hollows out from within. Trust erodes. Accountability dissolves. The very thing journalism is meant to defend – reality – becomes optional.
AI can absolutely support journalism. It can fix grammar, find old articles, help organize notes and translate interviews at midnight. It can make good journalists better. What it cannot do is decide what is true. It cannot detect manipulation, smell a lie, or recognize when silence in a war zone signals that something terrible is unfolding. It cannot replace human judgment or human responsibility.
Journalism’s value lies not in speed or efficiency but in conscience, skepticism and moral clarity. If the industry forgets that, if it trades rigor for convenience, then it won’t be AI that destroys journalism. It will be human negligence, one unchecked paragraph at a time.