Case Studies

Online Harms White Paper

by Mark Rowe

Companies with the largest online presences and high-risk features, likely to include Facebook, TikTok, Instagram and Twitter, will need to assess the risk of legal content or activity on their services with “a reasonably foreseeable risk of causing significant physical or psychological harm to adults”. Those harms notably include terrorism and child sexual exploitation (CSE), hate crime, and the sale of illegal drugs and weapons.

Those ‘category one’ tech firms will then need to make clear in their terms and conditions what type of “legal but harmful” content is acceptable on their platforms, and enforce this transparently and consistently. Examples of category two firms will be platforms which host dating services or pornography, and private messaging apps. All companies will need mechanisms so people can report harmful content or activity, while also being able to appeal the takedown of content. The companies will be required to publish transparency reports about the steps they are taking to tackle online harms. This is all part of a UK Government Online Harms White Paper, ahead of a proposed law, the Online Safety Bill, expected in 2021.

The Government has brought out voluntary and non-binding interim codes about terrorism and CSE content, so that companies can begin to make changes until the media regulator Ofcom issues its statutory codes of practice. A duty of care will apply to disinformation and misinformation that could cause harm to individuals, such as anti-vaccination content. The legislation will introduce additional provisions targeted at building understanding and driving action to tackle disinformation and misinformation.

In a foreword to the white paper, Oliver Dowden, Secretary of State for Digital, Culture, Media and Sport (DCMS) and Home Secretary Priti Patel said: “Our criminal law must also be fit for the digital age and provide the protections that victims deserve. The Law Commission is currently reviewing whether new offences are necessary to deal with emerging issues such as cyber-flashing and ‘pile-on’ harassment.”

You can read the white paper online at gov.uk.

Dame Melanie Dawes, Ofcom Chief Executive, said: “Being online brings huge benefits, but four in five people have concerns about it. That shows the need for sensible, balanced rules that protect users from serious harm, but also recognise the great things about online, including free expression. We’re gearing up for the task by acquiring new technology and data skills, and we’ll work with Parliament as it finalises the plans.”

Comments

Robin Wilton, Director, Internet Trust at the Internet Society, called the proposal fatally flawed. “It threatens to put citizens, businesses and national security at risk while failing to meet its stated aims.

“The problem at the core of the proposal is that it still leaves doubt about what is allowed and what is not. It even sets out to penalise behaviour that is “legal but harmful”, while failing to define what a “harm” is. As a result, everyone this proposal claims to serve will be less secure. Law-abiding citizens won’t know if their online activities are being inspected, or by whom. This undermines vital trust in the Internet, when it is more important than ever to our daily lives. Businesses won’t know if their services are legal or not, which will discourage investment, innovation, and access to the full benefits of the Internet.

“National security will also be compromised, because the proposals (without admitting it) imply that all encrypted data can be unlocked at will. There is no way “to require companies to use technology to monitor, identify and remove” content on end-to-end encrypted services without compromising the security of everyone using that service. The proposals require companies to find encryption technology that serves both goals, even though such technology simply does not – and cannot – exist.

“What’s more, communications like encrypted email are exempt from these rules – leaving malicious actors and criminals with an easy alternative if they want to communicate in secret. The Online Harms bill won’t prevent online harms, it will just move them around.

“As UK businesses and citizens rely on the Internet more than ever in the midst of a global pandemic, weakening end-to-end encryption will do far more harm than good. It would also stifle innovation and further weaken the economy, at a time when the government is claiming Brexit will make British businesses – both domestic and international – more free and able to innovate.”

Ian Stevenson, CEO at Cyan and Chair of the Online Safety Tech Industry Association (OSTIA), welcomed what he called very tangible action by the UK Government. “Stringent financial penalties are necessary and we wholeheartedly welcome this for tech companies who do not act to prevent and swiftly remove illegal content relating to child abuse and terrorism. Action must be taken now, particularly as organisations like the IWF reported another increase in the numbers of online child sexual exploitation imagery this year, which is directly related to COVID-19 and the increase in online use.

“Emerging technologies are out there already that can make a massive difference to stopping and blocking this imagery before it even gets out into the public domain. We urge innovative tech companies, government bodies and industry associations to continue to work together, and collaborate closely to share ideas to combat this significant problem – with associations such as OSTIA paving the way. We support the government setting this standard for online safety and leading the charge.”


© 2024 Professional Security Magazine. All rights reserved.
