Monday, 26 October, 2020

Britain will allow telecoms watchdog to regulate internet content

Ofcom to become UK internet regulator as government looks to police social media
Cecil Davis | 14 February, 2020, 19:44

Ofcom, which currently regulates TV and radio broadcasters such as the BBC, will be able to hold tech and social media companies to account if they fail to adequately protect users from harmful content, including material associated with terrorism and child abuse.

Among the ideas mooted in the white paper was giving the regulator the power to levy "substantial fines" on companies that fail to heed "clear standards" or cannot show they are meeting their duty of care to users.

Britain currently has no official internet or social media regulator and is looking for ways to stop harmful online material from reaching children.

The government launched its first Online Harms consultation in April 2019, which received almost 2,500 replies.

The government has suggested it expects companies to use a range of tools including age assurance and age verification technologies to protect younger people from accessing inappropriate and harmful content.

The UK government has long wanted to police the internet, and now it has expressed a desire for Ofcom to regulate social media.


"These are complex topics, and it is essential that we find a solution that both enhances digital safety and fosters a thriving internet economy".

To that end, Ofcom would make social media platforms including Facebook, Snapchat, YouTube and Twitter responsible for harmful content, the BBC reports.

The response does not shed any light on Ofcom's future penalties regime in its "Online Harms" role; in particular, there is no indication of whether Ofcom will be given fining powers similar to the ICO's (up to the higher of 4% of annual global turnover or €20 million). Instead, it puts more onus on the companies themselves, requiring them to state explicitly what content is "acceptable" on their platforms and to enforce those conditions effectively and with full transparency.

The response also states that the rules will cover "only a very small proportion of United Kingdom businesses (estimated to account to less than 5%)", while implicitly acknowledging that scope is not always a straightforward question: "guidance will be provided to give clarity on whether or not the services they provide or functionality on their website would fall into the scope".

Tech companies under Ofcom's remit will also be given clarity in any regulations or codes subsequently drawn up, with the expectations placed on them clearly stated. It will be up to Ofcom to monitor new and emerging online dangers and take appropriate enforcement action. The government said it would publish a full response to the consultation in the spring, but did not give a timeline for legislation.

A statutory duty of care for internet companies, with an independent regulator enforcing new guidelines against so-called online harms, was first proposed in a government white paper a year ago. "We look forward to working in partnership with the Government and Ofcom to ensure a free, open and safer internet that works for everyone".

According to Julian Knight, chair-elect of the Digital, Culture, Media and Sport Committee, a "muscular approach" to regulating the social media platforms is needed.

Sky News reports that there has been "no confirmation of what punishments or fines the bolstered regulator will be able to hand out".