You can read this story in German here.

Editor’s note: “Roadrunner,” a documentary about the celebrity chef and author Anthony Bourdain, uses AI to have the famous chef narrate things he never said. 

This raises the question of how we should deal with digital documents. It also presents a new challenge for journalists: How should journalistic research evolve so that we can distinguish between what is real and what is fake – or, better, between a fake past and real history? 

The following article is the first in a three-part series on how journalistic research and fact-checking tools need to adapt to a new reality.


  • Part 1 – The problem: Why today’s methods in journalism are no longer sufficient.
  • Part 2 – The skills: What skills will editors and journalists need in the future?
  • Part 3 – The structures: What should processes and the division of labour look like in editorial offices and newsrooms?

Ulm is a town in Southwest Germany with a population of around 120,000. It’s also the home of Zerforschung, a “friendly collective of people who enjoy taking apart technology to understand how it works.”

“We look at technology intensively until we understand it,” says Thomas Sänger, a self-described “software archaeologist”. The group is made up of half a dozen people in their 20s, whose analytical minds have recently focused on something quite far from archaeology – namely an 18-month-old food delivery start-up called Gorillas. 

Gorillas promises to bring your food in 10 minutes. The Berlin-based firm is growing fast – a recent funding round raised $35 million and valued it at $1 billion. It already operates in 21 cities across Germany and seven other European countries, including Spain, Italy, France and the UK.

Ulm is not one of the places where Gorillas is active. But this didn’t stop the members of Zerforschung, who installed the app and started taking it apart. They soon raised some concerning questions over Gorillas’ privacy policies.

“In total, we were able to retrieve the data of over 1,000,000 orders, the associated 200,000 customers as well as drivers”, Zerforschung explains on its website. The group further adds: “with every new app we install, we take a quick look at the data traffic with mitmproxy”. 

Mitmproxy is a tool for intercepting HTTP and HTTPS requests; its name comes from “man in the middle”. Among other things, it lets you see which servers an app is communicating with online and what content is being exchanged. 
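As an illustration only (this is not Zerforschung’s actual script), a mitmproxy addon is just a Python file with hook functions; the hypothetical addon below records every server an app contacts. It would be run with `mitmdump -s log_hosts.py` while the phone’s proxy settings point at the machine running mitmproxy.

```python
# log_hosts.py – a minimal, hypothetical mitmproxy addon.
# mitmproxy discovers the request() hook by its name, so no import is needed.
seen_hosts = set()

def request(flow):
    # flow.request.pretty_host is the server the intercepted request targets.
    host = flow.request.pretty_host
    if host not in seen_hosts:
        seen_hosts.add(host)
        print(f"app talks to: {host}")
```

Watching this list fill up while tapping through an app is often the first hint of which cloud services and third parties sit behind it.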

Zerforschung noticed that the app loads and returns data from two different Google Cloud directories. Normally, these directories can’t easily be accessed. In the case of the Gorillas app, however, they could. 
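To illustrate what “could be accessed” means here: a Cloud Storage bucket whose policy grants all users list permission answers Google’s public JSON API endpoint without any credentials. The sketch below probes for that; the bucket name is a placeholder, not Gorillas’ real bucket ID.

```python
# Sketch: probe whether a Cloud Storage bucket allows unauthenticated listing.
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def listing_url(bucket: str) -> str:
    # Publicly listable buckets answer this endpoint with no credentials.
    return f"https://storage.googleapis.com/storage/v1/b/{bucket}/o"

def is_publicly_listable(bucket: str) -> bool:
    try:
        with urlopen(listing_url(bucket), timeout=10) as resp:
            return resp.status == 200
    except (HTTPError, URLError):
        # 401/403 (or no network) means the listing is not openly readable.
        return False
```

Calling `is_publicly_listable("some-public-bucket")` with a real bucket name would return `True` only if its object listing is world-readable – which is exactly the misconfiguration described above.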

Follow the data

One directory, the “gorillas-public” folder, contained more than just the typical product photos you would expect to find there. There were also pictures of front doors and doorbell signs. “The photos come from drivers who are apparently supposed to take photos after they have delivered an order. There is nothing about this in the privacy policy,” says Zerforschung.

The other directory is called “Eddress”. “After simple research, it turned out that Eddress is a company based in Pakistan and Lebanon that offers white-label courier software, i.e. software that is deliberately kept neutral, which the individual delivery services can then design to suit their own company,” Zerforschung reported. 

German media widely covered the problems uncovered by Zerforschung. Much of the coverage was typical gloating about start-ups and their lax standards. Others noted how internationally connected the economy and start-ups already are – and how data tools can uncover such connections. But perhaps more important than the coverage itself was the fact that a volunteer group like Zerforschung made it possible.


Suspended in the analogue world

The case shows how important it is for IT-savvy people and journalists to work together – and how relevant technological know-how is for journalists. But is that enough? As more of these issues – data protection, IT infrastructure, the digital economy – come to the fore, it becomes ever more urgent for journalists and editorial offices to expand their own research spaces.

As former IBM-Germany CEO Gunter Dueck explains in his lectures, the internet has long since become “the operating system of our society”. In addition to the traditional, familiar analogue social space, a second, digital social space has emerged – and the two are becoming increasingly difficult to separate. The internet and its communication structures are not a channel like television, which can simply be switched off.   

In the future, journalists must therefore be qualified to work in this space. We cannot simply leave it to dedicated volunteers like the members of Zerforschung, or to civil society, NGOs, consumer protection organisations or universities. But to do that, we need new skills.

These new skills are not only needed in the context of data or consumer protection. We also need this new journalistic expertise in all other aspects of society – including questions of cultural history and identity. This is clearly shown by a documentary released at the end of June. 

More From The Fix: Artificial Intelligence in media: Automated content opportunities and risks

AI-created historical facts?     

“Roadrunner” traces the life of Anthony Bourdain. The celebrity chef, who came into his own at the New York restaurant Les Halles, was famous for his storytelling in books like “Kitchen Confidential: Adventures in the Culinary Underbelly” and for his travel and food shows. On 8 June 2018, he died by suicide while on a production trip in France.

“Roadrunner” mixes exclusive authentic footage and sound with an artificial rendering of Bourdain’s voice made with new AI-based technology. Filmmaker Morgan Neville used this method to allow Bourdain to voice in the movie what he may never have explicitly said aloud during his life – sentences, for instance, from emails Bourdain had written to his friend David Choe.

Neville said there were two other such uses of the technology in the film, but refused to specify them. “If you watch the film […], you probably don’t know what the other lines are that were spoken by the A.I., and you’re not going to know”, Neville told The New Yorker magazine in July.  “We can have a documentary-ethics panel about it later”. 

One might argue that this method is just as legitimate as, say, the Gonzo Journalism or New Journalism of Hunter S. Thompson, who always described his own impressions, experiences, associations, and fantasies. 

But if you read Thompson’s texts, it quickly becomes clear that he is at the interface between journalism, essay, and literature. Documentaries carry a different promise – the promise to state something real about the past.

Neville’s decision sparked a heated debate. Filmmaker Alan Barker, who lectures on the ethics of documentary filmmaking, took an extremely critical view: “There is an unwritten social contract that documentaries are made up of facts, not fabrications. The Bourdain case is particularly bad because so much of the information in a voice recording is not verbal. That Morgan apparently concealed the forgery until confronted by a reporter makes matters much worse,” he said.

This is not just a case of deepfakes of a few celebrities. Documentaries construct our image of the past – and thus our image of the present and the future. That shapes our behaviour, our values and our fears. This is what George Orwell described in his classic novel “1984”.

More on Wikipedia: The Wikipedia page about the film documents part of the discussion about the use of artificial intelligence

New methods and new tools for journalists

What would happen if we could no longer rely on our school books or textbooks, which may be all-digital in the future? If we couldn’t be sure, for example, who had been removed from or added to a photo, who said what or didn’t say what, who laughed, mocked or cried? That day could come tomorrow – or perhaps it is already here.

But what if our media, editors and journalists lack the knowledge, methods and tools (or don’t master them) to recognise such fakes, expose them and contrast the fake with a true past or a true present? 

Simple fact-checking is no longer sufficient here. And who can use analogue means to verify what Anthony Bourdain really said during his lifetime? So, once again, journalism must evolve and acquire the skills to carry out its watchdog mission in the digital world as well. 

How this can happen will be the focus of the next two parts of this series.

More From The Fix: AI and journalism ethics: a conversation with Mirabelle Jones

Photo by Frankie on Unsplash