Facebook parent Meta says Russians targeting Ukrainians with misinformation and hacking attempts on Facebook


Facebook parent company Meta said Sunday night that it has taken down a coordinated Russian influence operation that was targeting Ukrainians across Facebook and Instagram. The company said the misinformation campaign has ties to another Russian network in the Donbas region that was previously banned from Facebook in April 2020. 

In addition to the influence operation, Meta said it took down a coordinated hacking group attempting to compromise the accounts of people in Ukraine.

“We took this operation down, we’ve blocked their domains from being shared on our platform, and we’ve shared information about the operations with other tech platforms, with researchers and with governments,” David Agranovich, director of threat disruption for Meta, told reporters.

Agranovich said the coordinated campaign used fake accounts to target high-profile Ukrainians, including journalists, members of the military and public figures. Those behind the campaign operated fictitious personas and were also active on YouTube, Twitter, Telegram and two Russian social media sites “to appear more authentic” and “avoid scrutiny,” Agranovich said.

The operation also ran a handful of websites, Meta said, that published claims that the West had betrayed Ukraine and that Ukraine was a failed state. Agranovich said the content created by the influence operators was “primarily off of our platform.”

“The idea was they would write an article, posting that article onto their website as if they were a reporter or a commentator and then the accounts were really just designed to post links to their own websites and direct people off platform,” Agranovich said. 

While Meta described the influence operation as a “relatively small network” of approximately 40 accounts, pages and groups on Facebook, with fewer than 4,000 followers on Facebook and fewer than 500 on Instagram, the company would not say how many users interacted with the misinformation or how many times its posts were shared.

“What we’ve generally found is that the best proxy for the size of these operations ends up being the number of people who follow them,” Agranovich said. “In general, what we saw here…
