Media tech regulation in a post-Trump world

In this Week’s Kick-off, it’s clear that big tech regulation is starting to dominate the media psyche in 2021. Since the attack on the US Capitol on 6 January, and the subsequent removal of the former US President from social media sites, debates around freedom of speech, privacy concerns, and content accountability have heated up.

And it’s not just social. Google now finds its relationship with the traditional press increasingly fraught, to the point of reportedly threatening to withdraw from Australia altogether.

So as the post-Trump era begins to unfold before our eyes, just where does the land lie when it comes to media tech regulation? Broadly speaking, there are currently four main areas of contention that will need to be resolved:

  1. Twitter vs. The People
  2. Facebook vs. Everybody
  3. Google vs. Publishers
  4. Media vs. Politics

And before we take a brief look at each of them, it’s worth revisiting a timeline of some of the more recent events that have brought us to where we are today.

All of which has created a 2021 patchwork of criss-crossing media tech issues that will need to be resolved. So, returning to our initial list of four key areas above, let’s take a brief introductory look at each of them.

Twitter vs. The People

It’s useful to conceptualise the current Twitter debate within a ‘freedom of speech’ context, because it’s not actually that big a channel within the wider social media landscape. Twitter is currently estimated to have around 330 million monthly (not daily) users worldwide (compare that with Facebook’s 2.7 billion), and the platform is still considered by some to be a micro-blogging service, since a relatively small proportion of its users accounts for the vast majority of its content.

So in some ways the debate surrounding Twitter is more of a symbolic one, weighing up the balance between the protection of free speech vs. the policing of hate speech. Angela Merkel’s recent position in response to former President Trump’s ban from the platform tells you everything you need to know about this aspect of the debate: “The right to freedom of opinion is of fundamental importance,” a spokesperson said on behalf of the German Chancellor on 11 January.

But of course while you are entitled to your own opinion, you are not entitled to your own facts, and so the question becomes how we can (and must) apply theoretical discourse about freedom of speech in the real (digital) world. Donald Trump did repeatedly lie on Twitter about winning the 2020 US Election. People did die in the subsequent attack on the Capitol building. And real-life actions (and words) must have consequences, even if they are spoken on digital channels. Going forward, therefore, regulation surrounding Twitter will likely focus on how best to draw the distinction between opinion and fact, and where the line should be drawn more generally between being able to speak freely and shouting ‘Fire!’ in a crowded theatre.

Facebook vs. Everybody

While freedom of speech is an important part of tech regulation, it is, as discussed, not always the most tangible or immediate issue that can be tackled on the ground. In response to the aforementioned Capitol attack, FIPP CEO James Hewes emphasised that for him, the real consideration at play here is less to do with free speech and more about the obvious need for digital platforms to be regulated as publishers.

Accountability leads to responsibility, and it’s from there that definitive rules, processes, consequences, and ultimately regulatory structure can be drawn. If, for example, anti-vax information is being published and shared across Facebook’s pages – pages from which it generates advertising revenue – then the platform must take responsibility for this dissemination of fake news, and lead the efforts to remove and counteract it.

But Facebook is, let’s not forget, a digital realm with a population of 2.7 billion people, and the sheer size and scope of the platform, along with accompanying branches like Instagram and WhatsApp, mean that regulation must go much further than simply stopping the spread of hate speech and fake news. Issues surrounding privacy, security, political influence, bias, competition, advertising standards, copyright, royalties, and others are not going away anytime soon. So here the question becomes: How do we police a digital realm as large as this, in the same way that we have traditionally policed physical territories?

Google vs. Publishers

In the case of Google, the argument about who owns the content is actually being made in the opposite direction, because here we are talking primarily about professional publishers. Relations between Google and the traditional press have long been complicated. On the one hand, it is of course accepted that Google provides one of the main sources of traffic to publisher websites. But increasingly, there is also an acknowledgement that the tech giant attracts eyeballs in the first place partly because people want to access the news, and that it therefore has its own part to play in helping to fund a free and fair press.

The situation recently reached boiling point when the BBC reported that Google “has threatened to remove its search engine from Australia over the nation’s attempt to make the tech giant share royalties with news publishers.” While Australia is getting the stick, Europe appears to be getting the carrot, with an association of French publishers reportedly having already struck a deal with Google over content rights.

What all of this really highlights is that content and distribution continue to have a symbiotic relationship. Without content, there is nothing to distribute, and without distribution channels, quality content would never get seen. So once again an agreement must be reached, and that will be very difficult without proper checks and balances on the operational integrity of both sides of the industry.

Media vs. Politics

Whether it’s the discussions around free speech on Twitter, the wider question of how to police digital realms in a physical world presented by Facebook, or the need to foster a healthy symbiotic relationship between Google and a free and fair press, all of these issues represent the emergence of a whole new dimension in our socio-political structures. And unfortunately on 6 January, that dimension burst right through the doors of the Capitol building, and five people lost their lives.

Furthermore, government jurisdictions are limited by national borders. Digital media is not. If Twitter locks out China’s embassy for posting a dehumanising tweet, or the Australian government contests intellectual property practices that Google applies happily in other countries around the world, such issues carry real geo-political repercussions. More than ever before, we are at a point where digital media is influencing national politics, and those politics are in turn helping to shape our shared global media.

While it’s tempting to think that a media unbound by government jurisdictions can only have a liberating effect on national politics, we have seen in more recent years that the opposite can also be true. Like Google’s relationship with the traditional press, the wider media tech industry’s relationship with politics is now a symbiotic one. One will influence the other. And vice versa. It’s therefore crucial that we allow media tech to operate alongside national and international politics in a way that distinguishes the bad impacts from the good.

Ultimately, what the issues in all of these areas show is that in 2021 media tech has reached an age of maturity. It must now be given a seat at the grown-ups’ table, to officially help shape the social, political, and economic discourse that it is already influencing so significantly. But with great power comes great responsibility. The platforms that came to prominence following the widespread adoption of the internet at the turn of the century helped to bring greater information and communications equality to communities around the world; they too must now play their part in ensuring that this power is used for good, and not hijacked by evil.
