It was a blunder that left ABC News red-faced and stirred a debate on the trustworthiness of the media. In October 2019 the US channel’s ‘World News Tonight Sunday’ programme aired explosive footage “appearing to show Turkey’s military bombing Kurdish civilians”. A day later ABC issued an apology when it turned out the clip was actually shot at a Kentucky shooting range.
The error underlined just how damaging missteps can be at a time when the mainstream media is battling increased cynicism fuelled by identity politics. Even though ABC quickly fessed up to the mistake, it didn’t stop partisan opportunists from accusing the broadcaster of biased reporting to fit an agenda.
Blunders – and the accusations that follow – chip away at the trust the public place in the media. In January, Edelman’s annual Trust Barometer revealed 56 per cent of Americans agree “reporters are purposely trying to mislead people by saying things they know are false or gross exaggerations.” It’s a febrile environment that puts extra pressure on newsroom fact-checkers – pressure some believe could be alleviated greatly by new technology.
“Without technology we will not fully regain trust in information,” says Kelly Withers, CEO and co-founder of Affatar, a New Zealand-based company that has developed an app that establishes the provenance of videos and photos. “Unless you have good data it doesn’t matter what you build on top of it. It will be rubbish in, rubbish out. If you have a reliable source of the base information then automated fact-checking is possible, and is possible to do in a scalable and reliable way.”
Keeping it real
When someone takes a photo or records footage using the Affatar app, the file is embedded with a QR code and logged on a blockchain. Scanning the code with your phone (whether you see the photo or footage in print, on TV or on a website) takes you through to the Affatar portal, where you can find the original, undoctored version of the material alongside the date and place it was taken and the username of the source.
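In outline, a provenance system of this kind can be sketched in a few lines. The sketch below is illustrative only, assuming a SHA-256 content fingerprint, an append-only list standing in for the blockchain, and a made-up portal URL; none of these names reflect Affatar’s actual implementation.

```python
import hashlib
import json
from datetime import datetime, timezone

# Stand-in for the blockchain: an append-only list of provenance records.
LEDGER = []

def register_media(media_bytes, username, place):
    """Fingerprint the original file and append a provenance record."""
    content_hash = hashlib.sha256(media_bytes).hexdigest()
    record = {
        "hash": content_hash,
        "user": username,
        "place": place,
        "taken_at": datetime.now(timezone.utc).isoformat(),
        # What the QR code would encode: a link to the portal entry
        # (hypothetical URL scheme, for illustration only).
        "qr_payload": f"https://portal.example/media/{content_hash}",
    }
    LEDGER.append(record)
    return record

def lookup(content_hash):
    """What scanning the QR code would do: fetch the original record."""
    return next((r for r in LEDGER if r["hash"] == content_hash), None)

# A witness submits a photo; a newsroom later resolves the QR code.
photo = b"\x89PNG...raw camera bytes..."
rec = register_media(photo, username="witness42", place="Auckland, NZ")
print(json.dumps(lookup(rec["hash"]), indent=2))
```

The key design point is that the ledger stores only a fingerprint and metadata, not the media itself, so anyone holding a copy of the file can check it against the record without trusting the person who passed it on.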
Having quick access to the unadulterated source material benefits the media in several ways. Newsrooms will be able to check the authenticity of crowd-sourced videos, which have increased as Covid stops many journalists from travelling. Thanks to a mechanism in the app, media groups can then approach the person behind the footage to secure the rights to use it on their own platforms.
For news organisations, using footage or photos with an Affatar QR code could help restore confidence among members of the public who have lost faith in journalists, and perhaps instil some scepticism in those who believe everything they read on social media.
“What we’re hoping to do is allow people to fact-check on their own,” says Withers. “So if you read an article that you don’t necessarily agree with, instead of saying it’s fake news, you can check the source material. If it’s an interview, you can check what has been taken out of context and what else was said.
“Once the material is stored using the app, you will always have access to the original version because it has been designed in a way that it cannot be corrupted or modified. By empowering consumers to self fact-check, they can regain trust in the media they are consuming and will be more likely to support it financially and politically.”
The next step for Affatar is developing a way to tag news articles, videos or podcasts with a QR code that, when scanned, brings up a list of references and allows users a deep dive into the background of stories being published.
The company is also looking into “white labelling” its products and getting news organisations to embed Affatar technology into their own, allowing, say, New York Times readers to submit material that will be tagged with a QR code using the NYT app.
Testing things in the laboratory
How to use automated fact-checking is a question the industry and academics have been wrestling with for a while. In 2017 the Reporters Lab at Duke University in North Carolina launched a $1.2 million, two-year initiative bringing together journalists, developers and academics to build apps that help fact-checkers do their work.
The Duke Tech & Check Cooperative’s new tools included Squash, a groundbreaking experimental video app that displays relevant fact-checks during a speech or a debate when an elected official repeats a claim that has been checked before.
Despite some real success, the team announced in 2019 that it had come to a “sobering realisation”.
“For all that progress, we’ve realised that human help is still vital,” Bill Adair and Mark Stencel, co-directors at the Duke University Reporters’ Lab, wrote on the NiemanLab website. “It’s an important discovery not just for automation in fact-checking, but for similar efforts in other journalistic genres.
“We’ve found that artificial intelligence is smart, but it’s not yet smart enough to make final decisions or avoid the robotic repetition that is an unfortunate trait of, um, robots.”
Something worth investigating
Out in the field, newspapers have flirted with the idea of robot subbing. “We experimented with automated fact-checking a few years ago but the technology is not yet where it needs to be to do things in real time,” says Cameron Barr, a managing editor at The Washington Post. “The goal is something that can analyse a politician’s speech in real time and then offer readers information that shows the falsity of what they are saying. That’s a worthy goal and people should continue to push toward it, but we are not there yet.”
With video and audio playing an important role as evidence in investigative journalism, The Post has boosted its fact-checking resources to counter the amount of misinformation that’s out there.
“Fact-checking is one of the most challenging roles because there is an ever-expanding variety of ways of deceiving others. We have to keep up with that. It’s part of our mandate,” says Barr.
“So we have invested a lot in our visual forensics team, which pairs people from our video department with investigative reporters. We seek to accumulate all the available video from an event and then pull that together in one place from a variety of perspectives to show people what the facts are. Readers respond with a great deal of enthusiasm to the fact that they are seeing things with their own eyes.”
Over at ProPublica, the New York-based non-profit that produces investigative journalism, the number of fake videos and stories circulating has put everyone on high alert.
“Misinformation and deepfakes mean we have to be more skeptical than ever in sorting truth from fiction,” says copy editor Colleen Barry. “And the fact that so much news breaks on social media means we’re constantly asking ourselves if someone is who they say they are, if they can be trusted, and how we should make our doubts or uncertainty clear to readers so they can judge for themselves.”
Barry points out that ever-evolving technology is a help as well as a hindrance.
“We have access to more information than ever before. It’s so much easier than it was even 10 or 20 years ago to dig up old newspaper articles, find court records, download vital data sets or track down other people who witnessed an event, which can help journalists find additional sources or put unverified information in context.”
While Barry has not used any automated fact-checking, she can see it playing a role at ProPublica one day. “Assuming it’s reliable, then I think it will likely get added to the fact-checking arsenal,” she says.
Not so fast
Some news outlets will need more convincing when it comes to automated fact-checking. At German publishing house Axel Springer, home to multimedia news brands like Bild and Die Welt, human fact-checkers remain in the driving seat.
“Fact-checking, scrutinising sources and evaluating contradicting information has always been and will remain the job of trained and experienced journalists,” a spokesperson says. “We at Axel Springer are firmly convinced that there is no better means to fight fake news and disinformation than letting journalists do their job.
“Technology can play an important role in supporting journalists to verify information, but we are convinced that it won’t replace the role of journalists as the ultimate instance of fact-checking within their organisation.”
Companies like Affatar will keep making their case, however, consulting with journalists to improve their software and reaching out to big media groups.
“Adoption by media agencies is at the early stages of our journey,” Withers admits, but warns that time is of the essence in the fight to instill more trust in the mainstream media.
“Because we are addressing this problem so late in the game, trust has been lost and people are more than willing to consume coverage from social media where things aren’t fact checked,” he says. “Media companies are trying to combat that by trying to move at the same speed and you are never going to be able to do that in a reliable way. So mistakes happen that erode trust even more.
“Once trust is lost, it’s a lot harder to regain than it was to gain it in the first place. You can’t just return to the norm and think people will trust you again. You have to go above and beyond to show that you’re doing more and that you won’t make the same mistakes again.
“The fact that there is so little trust in the media is terrifying. If the experts don’t have any authority then the crackpots on Facebook have the same level of authority and that’s why you have this incredible surge, especially in America, of misinformation and conspiracy theories.
“We are trying to restore objective truth. If you go back to the original source material you can ensure that things like misinformation and deepfake videos aren’t leading us down dark rabbit holes.”