DCMS has published the first draft of its ‘Online Safety Bill’, which aims to establish a framework to tackle harmful content online and provide safer digital environments for all audiences.
New laws tackling online abuse and harmful content were first proposed in 2019, following the publication of the government’s white paper on online harms – a commitment carried into the 2019 General Election as a pledge by PM Boris Johnson to ‘make Britain the safest place to be online’.
Publishing its draft proposal, DCMS branded the bill as a ‘landmark law’ that “will help protect young people and clamp down on racist abuse online, while safeguarding freedom of expression.”
Central to the legislation, the bill proposes that duty-of-care requirements be imposed on ‘digital service providers’ to protect users from exposure to harmful content and abusive interactions – ushering in a “new age of accountability for tech” and bringing “fairness and accountability to the online world”.
The bill focuses on upholding accountability in five critical areas of responsibility for digital platforms: i) a duty of care for users, ii) protecting freedom of expression, iii) promoting democratic content, iv) ensuring user access to news and journalistic content and v) protecting users from online fraud.
As previously disclosed by DCMS, Ofcom will become the lead agency monitoring digital standards and will be “given the power to fine companies failing in a new duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher, and have the power to block access to sites”.
Ofcom CEO, Dame Melanie Dawes, backed the proposals, stating that accountability of online environments was required to benefit the UK as all sections of society engage daily with the internet.
“We’ll support Parliament’s scrutiny of the draft Bill, and soon say more about how we think this new regime could work in practice – including the approach we’ll take to secure greater accountability from tech platforms,” Dawes remarked.
Following feedback, DCMS has introduced measures to tackle ‘online fraud’, in which digital platforms must safeguard audiences from being targeted by ‘fraudulent user-generated content’.
The department stated that digital platforms such as Facebook, YouTube and Snapchat held a responsibility to remove user-generated content promoting fake investment opportunities and romance scams.
However, the proposals noted that fraud via direct advertising, emails or cloned websites will not be in scope, as the bill’s focus remains on user interactions within digital environments.
With regards to online fraud protections, DCMS stated that it would continue to work with industry bodies, regulators and consumer groups to strengthen laws and consider non-legislative solutions.
DCMS continues to work with the Home Office on the development of a ‘Fraud Action Plan’, which will propose the duties of advertisers and publishers in minimising the promotion of online fraud.
Home Secretary Priti Patel said: “Ruthless criminals who defraud millions of people and sick individuals who exploit the most vulnerable in our society cannot be allowed to operate unimpeded, and we are unapologetic in going after them.”
“It’s time for tech companies to be held to account and to protect the British people from harm. If they fail to do so, they will face penalties.”
The DCMS bill has garnered a mixed reaction over how Ofcom will monitor and judge the complexities of online abuse, harmful content and user behaviour while maintaining the tricky balance with free public expression.