Wednesday, February 12, 2020

Aw, look. The UK is still trying really hard to be the 'safest place to be online in the world' | The Register

UK comms watchdog Ofcom is to be handed new powers to police social media's handling of harmful content.

Sanctions for non-compliance with the new duty-of-care laws aren't set in stone, but powers to have ISPs block sites entirely remain on the table, as does the ability to issue fines of up to 4 per cent of a company's turnover.

Up until now, the likes of Facebook, YouTube, Snapchat and Twitter have been burdened with the onerous task of regulating themselves. Content featuring violence, terrorism, cyber-bullying and child abuse will fall under the new laws. The aim is to make companies remove it as quickly as possible and to "minimise the risks" of this kind of content being uploaded in the first place.

In a statement, Secretary of State for Digital, Culture, Media and Sport Nicky Morgan said: "With Ofcom at the helm of a proportionate and strong regulatory regime, we have an incredible opportunity to lead the world in building a thriving digital economy, driven by groundbreaking technology, that is trusted by and protects everyone in the UK," while also pledging to keep the internet "vibrant and open".

It's the latest move in the government's apparently dogged commitment to making the UK "the safest place in the world to be online".

This is the government's first action following the Online Harms white paper (PDF) released in April last year. The paper recommended more oversight of user-generated content on social media platforms and suggested Ofcom as the regulatory body.

Andrew Glover, chair of the Internet Service Providers Association, said in a statement: "There are a number of important questions that remain unanswered – especially in a post-Brexit environment – such as how Ofcom will use its new powers, how a regulator would deal with companies not based in the UK and ISP blocking – including how the UK reacts to technical developments such as DNS-over-HTTPS. ISPA will be working with its members on these and other points as we enter the next phase of consultation."

Vinous Ali, associate director of policy at techUK, commented: "Whilst the direction of travel is encouraging, much more work is needed to deliver clarity on questions of scope, process, legal but harmful content and enforcement."

Full duty-of-care legislation is expected to be published in spring.

Ofcom is already tasked with regulating television and radio broadcasters, but the internet is a very different beast.

Clunky government legislation has smashed against the lawless wild west of the internet before. The UK's notorious porn block was intended to stop under-18s from leering at internet smut, but after two years during which the prospect of picking up a porn pass from the local newsagents loomed over citizens, the regulation was finally canned.

There are other potential downsides to trying to legislate for the internet. Free speech advocates have warned of the slippery slope of policing content.

The US's controversial FOSTA-SESTA legislation of 2018 carved an exception into the "Section 230" liability shield, making companies liable for user-generated content linked to sex trafficking and coerced sex work. In practice, this has led to overzealous self-policing and made the internet more dangerous for sex workers.

However, children's charities have commended the Ofcom move. "Tech giants will only do all that they should to stop groomers abusing children on their sites if the penalties for failure are game-changing," said NSPCC CEO Peter Wanless.


"Ministers must now move urgently to get a proactive duty of care onto the statute books that gives Ofcom the powers to lift up the bonnet on social networks, impose hefty fines on rogue companies and hold named directors criminally accountable for putting children at risk."

Other countries have tried taking an increasingly litigious approach to the web. Germany's NetzDG law, in full force since 2018, requires social media platforms with more than two million registered German users to review and remove manifestly illegal content within 24 hours of receiving a complaint or be stung with fines of up to €5m (£4.2m).

Australia passed the Sharing of Abhorrent Violent Material Act in April 2019, introducing criminal penalties for social media companies, potential jail sentences for tech executives of up to three years, and financial penalties worth up to 10 per cent of a company's global turnover.

There are fears that such policing would be expensive to enforce and end up increasing barriers to entry for internet startups. The likes of Facebook and YouTube are able to employ thousands of content moderators. This could end up further cementing the supremacy of the big five – another hot regulatory topic in Europe. ®


[from https://ift.tt/2m5N8uC]
