The U.K. government emphasized press freedom this month when it published the draft online safety bill for social media companies, pledging that the bill would protect both “citizen journalism” and “recognized news publishers” from censorship. Vocal segments of the media not only welcomed the legislation, but actively campaigned for it. When Oliver Dowden, secretary of state for the Department for Digital, Culture, Media and Sport (DCMS), previewed the bill in the Telegraph newspaper, he credited the publication’s Duty of Care campaign for leading the charge on online safety.
Others in the media, along with free speech groups, have taken the opposite approach, saying the worst content, like child sexual abuse material, is already prohibited under other U.K. laws. By incentivizing platforms to manage other kinds of information, they say, the new bill consolidates a troubling global trend of states requiring platforms motivated by commercial interests – not the public’s – to decide what users share. London-based digital rights advocates at Global Partners Digital called it “the most comprehensive and demanding piece of online content regulation in the world.”
The draft is yet to undergo pre-legislative parliamentary scrutiny and its impact may not be evident for some time. Still, some experts told CPJ the bill’s exemptions for journalism would not be enough to mitigate the overall impact on digital speech. Worse, privacy groups say it could undermine end-to-end encryption, a safety feature for online communications that CPJ and others recommend for journalists and their sources.
“Once a presumption takes root that online speech is dangerous, that will inevitably spread beyond individual users’ posts to online communication generally, including the press,” Graham Smith, an internet law expert who has critiqued the concept of an online duty of care, told CPJ by email this month.
The bill seeks to protect people who use social media by introducing a “duty of care” for large platforms, meaning they should control harmful content or face penalties exacted by the government-appointed regulator, Ofcom. In addition to removing illegal content, the bill tasks the biggest platforms with vaguer responsibilities, requiring them to demonstrate that they are capable of protecting posts “of democratic importance” while managing others that are lawful but harmful, as the Guardian has noted. The draft defines “harmful” as something that risks having “a significant adverse physical or psychological impact” on someone with “ordinary sensibilities,” according to CPJ’s review. The bill’s scope appears to include direct messages via social media, and even video calling platforms like Zoom, according to news reports.
In 2018, the U.N. special rapporteur on freedom of expression said that tying heavy penalties to content regulation would chill freedom of expression, and the bill’s detractors object to the state requiring websites to manage information under threat of heavy penalties. Reuters reports that large platforms could face fines of up to £18 million (US$25 million) or be blocked in the U.K. if they fail to comply, while senior managers could face criminal action. CPJ has found that a similar framework in Turkey — modeled on a German law — is used to prevent news from circulating online, while stringent social media regulations passed secretly in Pakistan last year appear to have lifted the term “online harm” directly from the U.K. government’s 2019 white paper on the topic.
“A free press is at the heart of British democracy,” a DCMS spokesperson said in a statement provided to CPJ by email. “We have included strong and targeted protections in the Online Safety Bill to protect journalism and enhance free speech online.”
The spokesperson also indicated that the bill would prevent platforms from moderating journalistic or legal content arbitrarily by requiring them to enforce consistent public policies.
“We will not allow tech firms to censor content on a whim, the Bill’s focus is making sure they are accountable for removing illegal content, protecting children and keeping their promises to users,” the statement said. “Adults will not be prevented from accessing or posting legal content, and social media companies will have specific duties to safeguard people’s access to news publishers’ articles when shared on their sites.”
The bill exempts news publishers’ own websites, including comments, the industry-focused Press Gazette reported. Section 14 of the draft obliges platforms to protect journalism by offering an expedited appeal for takedowns if the person involved considers that the material was journalistic. Ofcom will issue codes of practice, which DCMS said by email would require the companies to publish assessments covering how they upheld free expression – a move towards transparency that observers like Global Partners Digital welcomed.
Yet questions remain about how these protections will work in practice, according to CPJ interviews.
“They’ve provided a definition of news-related material, but all that the act requires is that the platform consider the freedom of journalistic content when they make their [own] decision,” Lexie Kirkconnell-Kawana, head of regulation at IMPRESS, the U.K.’s independent self-regulatory body for smaller print titles, told CPJ in a video call. How platforms will interpret that isn’t clear yet, she said. “It might be years before we see what’s going on behind the curtain.”
“No one should be making a pre-judgement about whether journalistic content should be read by the public,” Peter Wright, the editor emeritus of DMG Media, which publishes the Daily Mail and other prominent U.K. newspapers, told CPJ. “It certainly shouldn’t be commercial organizations overseen by a state regulator.”
News publishers would have to be registered in the U.K. and operating under a code of standards to qualify for the protections, according to Richard Wingfield, head of legal at Global Partners Digital, who spoke with CPJ by phone.
“If I write my own code, will I be exempt?” Kirkconnell-Kawana wondered. “We don’t want to see bad faith actors meeting low thresholds under the guise of being a news provider.”
Wingfield, on the other hand, queried the extent of protections for citizen journalists. One of the bill’s requirements for protected content is that it be “generated for the purposes of journalism,” he said, but how that should be understood isn’t clear.
In response to CPJ’s emailed questions about Ofcom’s role in implementing the legislation and its independence from the government, Ofcom referred CPJ to an online statement citing its “extensive experience of tackling harmful content and supporting freedom of expression, through our role regulating TV and radio programs.”
“We won’t censor the web or social media,” or “be responsible for regulating or moderating individual pieces of online content,” the statement said.
Aside from the content moderation questions, the fact that the bill encompasses private messages means it could undermine digital privacy and anonymity, according to Heather Burns, policy manager at the U.K. campaigning organization Open Rights Group. The bill does not ban end-to-end encryption – which keeps anyone but the sender and recipient from reading a private message – or anonymous internet use, which Dowden has acknowledged in parliament “is very important for some people.” Yet it’s not clear how companies that can’t read direct messages will comply with requirements to control their contents.
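To illustrate the principle Burns describes – and not anything specified in the bill or used by a particular platform – the following minimal Python sketch uses the PyNaCl library to show why end-to-end encryption keeps a message readable only by its sender and recipient: decryption requires the recipient’s private key, so a service relaying the ciphertext cannot inspect its contents. Real messaging apps use more elaborate protocols, such as the Signal protocol, but the underlying property is the same.

```python
# Conceptual sketch of end-to-end encryption with PyNaCl (libsodium bindings).
# Illustrative only; production messengers add forward secrecy and key verification.
from nacl.public import PrivateKey, Box

# Each party generates a keypair; private keys never leave their own devices.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their private key and the recipient's public key.
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"message for the recipient only")

# The platform carrying `ciphertext` cannot read it: decryption needs the
# recipient's private key, which only the recipient holds.
receiving_box = Box(recipient_key, sender_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"message for the recipient only"
```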
“The government has adopted the tack that end-to-end encryption is an enabler of child abuse,” Burns said in a phone call shortly before the draft bill was published. Home Secretary Priti Patel was widely reported in April saying Facebook’s plans to extend end-to-end encryption from WhatsApp to all of its messaging products were jeopardizing efforts to combat child abuse. In a March report on child safety, the Centre for Social Justice, a think tank headed by former Home Secretary Sajid Javid, advised Ofcom to retroactively sanction companies that introduced “high risk design features” such as end-to-end encryption before the online safety bill became law.
“Right now, we don’t know how to build scanning systems that work with [end-to-end encrypted] messengers, let alone build them in ways that aren’t subject to abuse,” Matthew Green, who teaches cryptography as an associate professor at Johns Hopkins University, told CPJ by email.
“The Home Secretary has been clear that industry must step-up to meet the evolving threat,” a Home Office spokesperson said in a statement emailed to CPJ. “End-to-end encryption poses an unacceptable risk to user safety and society. It would prevent any access to messaging content and severely erode tech companies’ ability to tackle the most serious illegal content on their own platforms, including child abuse and terrorism.”
DCMS confirmed to CPJ by email that companies using end-to-end encryption were not exempt from the duty of care, and would have to demonstrate to Ofcom how they are managing risk to their users or face action.
“If journalists cannot send secure messages because the U.K. government assumes they’re trading child abuse images, that will put people’s lives at risk,” Burns, of Open Rights Group, told CPJ.
This content originally appeared on Committee to Protect Journalists and was authored by Madeline Earp/CPJ Consultant Technology Editor.