How social media regulation could affect the press


By Alicia Ceccanese / CPJ Global Technology Researcher

The UK moved closer to regulating social media in December, when a parliamentary committee recommended major changes to the country’s online safety bill to hold internet companies accountable for content posted on their platforms. “We need to call time on the Wild West online,” said committee chairman Damian Collins. “What is illegal offline should be regulated online.” The bill, which will go before the UK parliament in 2022, aims to penalize companies that allow content related to crimes such as child abuse and online harassment; news reports and free expression groups have noted similar efforts in Kazakhstan, Australia, Indonesia, Chile, and Canada, among other countries.

Social media regulation matters to journalists who rely on platforms for their work, especially when legislation focuses on news or political speech. In 2021, the U.S.-based nonprofit Freedom House found that at least 24 countries had sought to govern how platforms moderate content. States like the UK, which has moved to prevent platforms from censoring journalistic content under its safety bill, face thorny questions about which work deserves protection and how the rules should be enforced.

Some journalists themselves want governments to regulate social media to help resolve issues that affect the press, such as online abuse, disinformation, or declining advertising revenue, but regulation can carry unintended consequences. Lawmakers in the United States, the UK, India, Pakistan, and Mauritius are among those considering measures that could discourage platforms from offering encrypted messaging, which helps journalists communicate securely. Legislation obliging platforms to share data with police would be bad news in countries where journalists are jailed for social media posts. Some social media laws, like Turkey’s, also affect news websites and search engines. Others have implications for news sites with comment sections.

At worst, authoritarians can seize on the regulatory trend to stifle reporting. In 2020, a report by the Danish think tank Justitia found that 25 countries had drawn on Germany’s 2017 Network Enforcement Act to “provide cover and legitimacy for digital censorship.” Such laws leave social media companies with a difficult choice: comply or leave the country.

Alicia Ceccanese of CPJ spoke to Kian Vesteinsson, research analyst for technology and democracy at Freedom House, and Jacob Mchangama, executive director of Justitia, about their respective research.

Each told CPJ how social media regulations can push platforms to remove more content:

Banning broad categories of content

Governments are outsourcing control of online content they don’t like to the platforms themselves, “essentially forcing tech companies to do the dirty work for them,” according to Mchangama.

In 2018, David Kaye, then the United Nations special rapporteur for freedom of opinion and expression, noted that broad and restrictive laws on topics such as extremism, blasphemy, defamation, and false information were being used to force companies to suppress legitimate discussion on social media.

A disturbing example:
Journalist Dung Le Van, right, broadcasts live on Facebook at a cafe in Hanoi, Vietnam, May 15, 2018 (Reuters / Kham)

Imposing short takedown windows

Germany requires platforms to remove “clearly illegal content” within 24 hours, or within seven days if the legality is unclear, and other countries have followed its example without adopting the same rule-of-law protections, according to Mchangama.

“Typically [it takes a court] over a year to deal with a single case of hate speech,” he said. “Some of those states then require social media companies to do roughly the same legal assessment in 24 hours.”

Under pressure, platforms remove more content, according to Vesteinsson. “Companies are overcorrecting,” he said.

Tight deadlines are driving companies toward solutions like artificial intelligence to automatically filter posts for potentially illegal material, according to the Washington, D.C.-based nonprofit Center for Democracy and Technology. But a recent analysis of leaked internal Facebook documents indicates that such filters have been ineffective, especially in some languages, as have poorly trained human moderators, according to The Associated Press and the international nonprofit newsroom Rest of World.

A disturbing example:
  • Information technology rules introduced in India in February require the removal of content within 36 hours of receiving notice from a government agency or court, according to the digital rights organization Electronic Frontier Foundation. CPJ has previously documented restrictions, imposed through an opaque process, on social media accounts sharing news and commentary about Kashmir.
Indian commuters walk past a Google advertising poster in Bangalore, India on April 6, 2018 (AFP / Manjunath Kiran)

Eroding intermediary liability protections

Best practice shields intermediaries like social media companies from lawsuits over someone else’s content, which allows “[companies] to moderate and remove content on their platforms” and protects “them from legal liability for the activities of their users,” Vesteinsson told CPJ. Exposure to liability makes them less likely to push back against demands for censorship and surveillance, he said.

Mchangama agreed. Laws that erode liability protections provide an “obvious incentive for platforms to say, ‘Better safe than sorry’” when governments make demands, he said.

A disturbing example:
  • On December 24, a Russian court fined Google nearly $100 million, the largest of several recent fines imposed on major platforms accused of failing to remove banned content, according to The Washington Post. Local access to Twitter has been slowed for the same reason under a law passed in July, according to Reuters. The nature of the content involved in each case was unclear, but regulators had separately warned social media companies not to carry information about anti-government protests earlier in the year.
The logo of Russian communications regulator Roskomnadzor is reflected on a laptop screen displaying the Google start page, May 27, 2021. (Reuters / Maxim Shemetov)

Requiring localization

Localization laws require social media companies to keep staff – often local nationals – and data inside the country, within reach of local authorities. Those representatives risk prosecution if the company does not follow government rules, according to a recent analysis by Rest of World.

“Companies [will] think twice about whether they want to challenge these governments [and] risk the freedom and safety of their employees on the ground,” Mchangama said.
