Tag Archive for: Misinformation

Facebook parent Meta says Russians targeting Ukrainians with misinformation and hacking attempts on Facebook


Facebook parent company Meta said Sunday night that it has taken down a coordinated Russian influence operation that was targeting Ukrainians across Facebook and Instagram. The company said the misinformation campaign had ties to another Russian network in the Donbas region that was previously banned from Facebook in April 2020.

In addition to the influence operation, Meta said it also took down a coordinated hacking group attempting to target and compromise accounts within Ukraine.  

“We took this operation down, we’ve blocked their domains from being shared on our platform, and we’ve shared information about the operations with other tech platforms, with researchers and with governments,” David Agranovich, director of threat disruption for Meta, told reporters.

Agranovich said the coordinated campaign used fake accounts to target high-profile Ukrainians including journalists, members of the military and public thinkers. Those behind the campaign operated fictitious personas and were also active on YouTube, Twitter, Telegram, and two Russian social media sites “to appear more authentic” and “avoid scrutiny,” Agranovich said.

The operation also ran a handful of websites, Meta said, which would publish claims about the West betraying Ukraine and Ukraine being a failed state. Agranovich said the content created by the influence operators was “primarily off of our platform.”

“The idea was they would write an article, posting that article onto their website as if they were a reporter or a commentator and then the accounts were really just designed to post links to their own websites and direct people off platform,” Agranovich said. 

While Meta described the influence operation as a “relatively small network” consisting of approximately 40 accounts, pages, and groups across Facebook — with fewer than 4,000 followers on Facebook and fewer than 500 on Instagram — the company would not say how many users interacted with the misinformation or how many times the posts were shared with others.

“What we’ve generally found is that the best proxy for the size of these operations ends up being the number of people who follow them,” Agranovich said. “In general, what we saw here…


Denise Simon on Cyber Warfare and Misinformation | Mind Matters – Walter Bradley Center for Natural and Artificial Intelligence


Social Media Promised To Block Covid-19 Misinformation; But They’re Also Blocking Legit Info Too

Sing it with me, folks: content moderation is impossible to do well at scale. Over the last few weeks, all of the big social media platforms have talked about their intense efforts to block misinformation about Covid-19. It appeared to be something of an all-hands-on-deck situation for employees (mostly working from home) at these companies. Indeed, earlier this week, Facebook, Google, LinkedIn, Microsoft, Reddit, Twitter, and YouTube all released a joint statement about how they’re working together to fight Covid-19 misinformation, and hoping other platforms would join in.

However, battling misinformation is not always so easy, as Facebook discovered yesterday. Yesterday afternoon a bunch of folks started noticing that Facebook was blocking all sorts of perfectly normal content, including NY Times stories about Covid-19. Now, we can joke all we want about some of the poor NY Times reporting, but to argue that its reporting on Covid-19 is misinformation would be, well, misinformation itself. There was some speculation, a la YouTube’s warning, that this could be due to content moderators being sent home and not being allowed to do their content moderation duties remotely over privacy concerns, but the company said that it was “a bug in an anti-spam system” and was “unrelated to any changes in our content moderation workforce.” Whether you buy that or not is your choice.

Still, it’s a reminder that any effort to block misinformation is going to be fraught with problems and mistakes. Trying to adapt rapidly, especially on a big (the biggest) news story with rapidly changing facts and a constant flow of new information (and misinformation), is going to run into problems sooner or later.


Techdirt.

When You Set Out To Block Misinformation, You Can Wind Up Blocking A Hero Like Li Wenliang

Combating disinformation and misinformation online is an admirable goal. However, we often criticize overly broad attempts to do so, noting that they could lead to censorship of important, accurate, and useful information. Here’s a somewhat tragic case study of that in action. You may have heard late last week about anger in China over the death of Li Wenliang, a physician who had tried to warn people about the new coronavirus well before most others realized how dangerous it was. Dr. Li eventually caught the virus himself and passed away, sparking tremendous anger online:

Since late Thursday, people from different backgrounds, including government officials, prominent business figures and ordinary online users, have posted numerous messages expressing their grief for the doctor, who contracted the new coronavirus, and their anger over his silencing by the police after he shared his knowledge about the virus. It has prompted a nationwide soul-searching under an authoritarian government that allows for little dissent.

“I haven’t seen my WeChat timeline filled with so much forlornness and outrage,” Xu Danei, founder of a social media analytics company, wrote on the messaging platform WeChat.

The “silencing,” if you haven’t heard the details, was that the police told him he was spreading misinformation online. Inkstone News (a subsidiary of the South China Morning Post) has a translated letter that the police gave to Dr. Li telling him to stop spreading “untruthful information online.” Dr. Li responded to the notification, saying he would stop his “illegal behavior” and that he “understood” that if he continued he would be “punished under the law.”

According to the law, this letter serves as a warning and a reprimand over your illegally spreading untruthful information online. Your action has severely disrupted the order of society. Your action has breached the law, violating the relevant rules in “Law of the People’s Republic of China on Penalties for Administration of Public Security.” It is an illegal act!

The law enforcement agency wants you to cooperate, listen to the police, and stop your illegal behavior. Can you do that?

Answer: I can

We want you to calm down and reflect on your actions, as well as solemnly warn you: If you insist on your views, refuse to repent and continue the illegal activity, you will be punished by the law. Do you understand?

Answer: I understand

Even the Chinese government appears to recognize that this whole setup was a problem:

The outpouring of messages online from sad, infuriated and grieving people was too much for the censors. The government even seemed to recognize the magnitude of the country’s emotion, dispatching a team to investigate what it called “issues related to Dr. Li Wenliang that were reported by the public,” though without specifics.

For many people in China, the doctor’s death shook loose pent-up anger and frustration at how the government mishandled the situation by not sharing information earlier and by silencing whistle-blowers. It also seemed, to those online, that the government hadn’t learned lessons from previous crises, continuing to quash online criticism and investigative reports that provide vital information.

Now, some might respond that stomping out disinformation online is quite different from Chinese government suppression of information. But no one can come up with a principled explanation of how it is actually different in practice. Stanford’s Daphne Keller, who studies exactly this stuff, makes the point pretty concisely:

Be careful what you wish for.


Techdirt.