
Russia moves to ban Instagram, declare Meta an ‘extremist organization’

The Russian government said Friday it was blocking the popular social media app Instagram, taking further action against Meta — the parent company of Facebook, Instagram and WhatsApp — because of reports the previous day that Facebook had temporarily suspended its hate-speech rules to allow posts that called for the death of Russian leader Vladimir Putin. The country had previously blocked Facebook, which has a much smaller audience in Russia than WhatsApp or Instagram.

The incremental escalations over the past two weeks between Russia and the tech giants have forced the companies to rethink the ways they police speech online, rewriting their rules as they go in response to the fast-moving conflict. Social platforms are critical tools for the public to communicate and share information during wartime, but Russian propaganda outlets have also used them to spread disinformation about the war. And the companies are weighing pressure from world leaders to increase Russia’s isolation against potential retaliation by the country itself.

In the wake of Russian interference in the 2016 election and a global pandemic, companies including Facebook, Twitter and YouTube have moved away from a historically hands-off approach to policing the content on their platforms, creating new rules to attempt to halt the spread of misinformation that they said could cause real-world harm.

But the Ukrainian conflict has prompted a flurry of new rule-changing and policymaking as the companies have banned state media outlets and allowed some speech previously considered to be hateful.

“This is clearly a crisis information environment and tech companies are making many decisions on the fly,” said Graham Brookie, senior director of the Atlantic Council’s Digital Forensic Research Lab.

The companies say there is some precedent for the last-minute decisions and that it is necessary to stay nimble during fast-paced world events.

“I want to be crystal clear: Our policies are focused on protecting people’s rights to speech as an expression of self-defense in reaction to a military invasion of their country,” said Nick Clegg, Meta’s president for public affairs. “The fact is, if we applied our standard content policies without any adjustments we would now be removing content from ordinary Ukrainians expressing their resistance and fury at the invading military forces, which would be rightly viewed as unacceptable.”

Instagram head Adam Mosseri tweeted Friday that Instagram had 80 million users in Russia who will be cut “off from one another and from the rest of the world, as [about] 80% of people in Russia follow an Instagram account outside their country. This is wrong.”

Facebook also temporarily suspended its hate-speech policies last year, allowing Iranians to call for death to the country’s leader, Ali Khamenei, for two weeks during a period of government repression.

YouTube enacted a policy against “denying, minimizing or trivializing well-documented violent events” in 2019, and cited the Holocaust and the 2012 Sandy Hook school shooting in Connecticut as examples. On Friday, the company said it would take down Russian state media worldwide “in line with that policy.”

Companies have long resisted cracking down on Russian state-backed channels — despite their known propensity for spreading propaganda — because they feared being shut out of the country. The companies also worried about being perceived as inconsistent, because news outlets like PBS and the BBC in the United States and Europe also receive funds from government or other public sources. Instead, the companies chose in 2018 to label media outlets that receive the bulk of their funding from governments.

As a result, state-owned channels have hugely capitalized on social media. With more than 16 million followers for its English, Spanish and Arabic channels combined, the outlet RT has claimed to be the most-watched news network on YouTube and boasted more than 10 billion views over time. RT’s English-language YouTube channel gained 130,000 more followers in the weeks leading up to the Ukraine war.

The companies argue that in places where the government controls the news media, Western social media services are often one of the few places where people can organize and express opinions more freely. Services like Instagram and YouTube are hugely popular with the Russian public, and have been a place where some criticism of the invasion has been able to find a platform, despite harsh penalties for dissent inside Russia.

But the calculus for social media companies is quickly changing amid international condemnation of Russia’s invasion and Russia’s retaliation against Silicon Valley services, in what some are calling a new digital iron curtain. Increasingly, the companies are willing to pick a side.

State media outlets have inaccurately described Russia as liberating Ukrainian people who support Russia and protecting them from Ukrainian Nazis. The Russian government doesn’t refer to the war as a “war,” instead calling it a “special operation” to “demilitarize and de-Nazify” Ukraine.

(Ukraine’s government and its Jewish president were democratically elected, and a recent poll run by a Kyiv-based agency showed nearly 80 percent of Ukrainians oppose making concessions to Russia and 67 percent said they were willing to put up armed resistance against Russia.)

In shutting down outlets like RT and Sputnik globally, YouTube is knowingly risking a retaliatory shutdown in Russia. Facebook was also willing to risk a total shutdown of Instagram and WhatsApp when it hastily instituted the temporary policy allowing people to call for death to Putin and to Russian invaders earlier this week.

Former Facebook chief security officer Alex Stamos said that despite tech companies’ hesitancy to do so, in geopolitical events in which thousands of people are dying, “It’s okay to pick a side. In fact it’s the only reasonable thing you can do. Because if you don’t pick a side, you’re actually picking the side of the powerful over their victims.”

Tech companies have long championed protecting free speech and have been hesitant to take down political content as long as it wasn’t overtly violent. That hands-off stance has shifted in recent years, but researchers have repeatedly shown that tech companies struggle to consistently police content and often fail to enforce their rules. Some say the companies have been too permissive with state media.

The coronavirus pandemic is a major example, said the Atlantic Council’s Brookie. Tech companies for years said they would not block misinformation on their platforms because they did not want to be arbiters of truth, but then they began removing content regarding the coronavirus that experts said went against public health guidelines.

That extended to RT, whose initials once stood for Russia Today, which media watchers have pointed out was pushing different narratives about the coronavirus pandemic depending on who its audience was. Domestically, it was supporting mask-wearing and vaccines, while on its English, French, Spanish and German channels, it was pushing stories about how mask mandates were an attack on people’s freedoms. (Last year YouTube cracked down on two German RT channels for pushing Covid misinformation, prompting the Russian government to vow to take “retaliatory measures.”)

Facebook has been criticized by its own independent Oversight Board for having inconsistent rules. The company created an exception to its hate-speech rules for world leaders but was never clear which leaders got the exception or why. After it suspended the account of President Donald Trump in the wake of violence at the Capitol on Jan. 6, 2021, the Oversight Board said the decision was correct but that Facebook did so without a clear rationale or plan.

And for years, Russia has used both state media and covert operatives to promote a distorted view of coronavirus as well as the long-running civil war in Syria, which Russia has been directly involved in since 2015. In 2018, Facebook even took down a covert disinformation campaign about Syria that it said was tied to Russian intelligence.

RT’s YouTube channel frequently posted reports on people protesting Covid-19 mandates in Western countries like the U.S. and Australia. The site also recently posted videos alleging Polish officials were mistreating migrants crossing into the country from Belarus, without mentioning the Belarusian government’s efforts to push migrants over the border in retaliation for European Union sanctions.

The tit-for-tat between the companies and Russia started in the early days of the invasion, when Facebook began to fact-check misleading articles on Russian outlets. Facebook also quickly changed its hate-speech policies to allow praise of a previously banned neo-Nazi group in Ukraine that was fighting against Russia.

The Russian government asked Facebook to remove those fact checks, but Facebook refused, Clegg said in tweets last week. The company argued that its services are critical for activists and everyday Russians to communicate with their families. Russia’s Internet regulator then said it would begin limiting access to Facebook services.

Facebook, YouTube and others responded by blocking the ability of Russian state media services to buy advertising. Then the European Commission announced that it was banning the state media services RT and Sputnik and asked the tech companies to comply with the regional ban.

The tech companies complied, and also said they would be further limiting the reach of Russian government-backed outlets around the world.

A week ago Friday, Russia announced it was fully blocking Facebook. But the Russian Internet regulator did not extend the block to the more popular Instagram or WhatsApp.

On Thursday, a Facebook content moderator leaked new guidelines that showed that Facebook had decided to break its own rules to allow for some calls to violence against Russian invaders. The company confirmed the leak.

Russia’s top prosecutor said Friday that the government was opening a criminal case against Meta and is seeking the classification of the company as an “extremist organization and the prohibition of its activities” on Russian territory, alleging the platform was used to incite “mass riots accompanied by violence.”

The invasion and the flurry of demands about taking down certain content has sparked an emergency for the tech companies, so the relatively rapid changes in policy make sense, said Daphne Keller, who was associate general counsel for Google until 2015 and now directs the Program on Platform Regulation at Stanford University’s Cyber Policy Center.

“When it’s a crisis situation, doing something exceptional that isn’t already covered by law or isn’t already covered by the platform’s discretionary policies makes sense,” she said.

But the resulting Russian retaliation has negative downstream effects, Brookie said.

“Creating a digital iron curtain that shuts off the Russian people from a global information environment will make it harder to hold the Russian government accountable for its actions,” he said.

Craig Timberg, Ellen Francis and Mary Ilyushina contributed to this report.


