Should the Tech Giants be Liable for Content?
Tech platforms are not neutral. Turning them into censors does not solve that problem
Google marked its 20th birthday last week. It celebrated in fitting style: being lambasted by politicians in Washington. Its failure to send a senior executive to a congressional hearing over Russian use of tech platforms to meddle in the presidential election of 2016 was tone-deaf. Like Facebook and Twitter, whose top brass did not show up, Google wields too much influence to avoid public scrutiny. A vital debate is under way over whether, and how, tech platforms should be held responsible for the content they carry. Angering legislators increases the chance of a bad outcome.
Back when Google, Facebook, Twitter and others were babies, the answer that politicians gave to the question of content liability was clear. Laws such as America's Communications Decency Act (CDA), passed in 1996, largely shielded online firms from responsibility for their users' actions. Lawmakers reasoned that the fledgling online industry needed to be protected from costly lawsuits. Such firms were to be thought of more as telecoms providers: neutral venues on which customers could communicate with each other.
That position is hard to maintain today. The online giants no longer need protection: they are among the world's most successful and influential firms. Nearly half of American adults get some of their news on Facebook; YouTube, Google's video-streaming service, has 1.9bn monthly logged-in users, who watch around 1bn hours of video every day. Faced with complaints about trolling, fake news and extremist videos, the old defence of neutrality rings hollow. The platforms' algorithms curate the flow of content; they help decide what users see.
The pendulum is thus swinging the other way. Lawmakers are eroding the idea that the platforms bear no responsibility for content. Earlier this year America passed SESTA, a law with the worthy aim of cracking down on sex trafficking; this week the Department of Justice said it would look into the platforms' impact on free speech. In Germany platforms face strict deadlines to take down hate speech. The tech giants themselves increasingly accept responsibility for what appears on their pages, hiring armies of moderators to remove offending content.
This new interventionism carries two big dangers. One is that it will entrench the dominance of the giants, because startups will not be able to afford the burden of policing their platforms or to shoulder the risk of lawsuits. The other is that the tech titans become “ministries of truth”, acting as arbiters of what billions of people around the world see, and what they do not. This is no idle worry. Facebook and YouTube have banned Alex Jones, a notorious peddler of conspiracy theories. Loathsome as Mr Jones's ideas are, defenders of free speech ought to squirm at the notion that a small set of like-minded executives in Silicon Valley is deciding what is seen by an audience of billions.
The weight given to free speech and the responsibilities of the platforms vary between countries. But three principles ought to guide the actions of legislators and of the platforms themselves. The first is that free speech comes in many flavours. The debate over the platforms is a mélange of concerns, from online bullying to political misinformation. These worries demand different responses. The case for holding tech firms directly responsible for what they carry is clear for illegal content. Content that may be deemed political is far harder to deal with: the risk is both that platforms will host material that is beyond the pale and that they will take down material that should be aired.
The second is that it is wrong to try to engineer a particular outcome for content. Trying to achieve a balanced news feed, say, is not simply antithetical to the giants' business model, which promises personalised content. It is also a definitional quagmire, in which honest differences of political view must be categorised. Tech firms would be forced to act as censors. It would be better to make platforms accountable for their procedures: clarify the criteria applied to restrict content; recruit advisory bodies and user representatives to help ensure that these criteria are applied consistently; give users scope to appeal against decisions. They also need to open their algorithms and data to independent scrutiny, under controlled conditions. Only then can society evaluate whether a platform is discriminating against content or whether material causes harm.
Arbiters and arbitrariness
The third principle is that small firms should be treated differently from large ones. The original rationale of the CDA made sense, but the firms that need protection now are those that seek to challenge the big tech platforms. If rules are drawn up to impose liability on online firms, they ought to contain exemptions for those below a certain size and reach. Google and its confrères have achieved extraordinary things in their short lives. But their bosses would be getting a lot less heat from Capitol Hill if they had more competition.