Facebook forced troll farm content on over 40% of all Americans each month

In the wake of the 2016 election, Facebook knew it had a problem. Pages and fake accounts created by the Kremlin-backed Internet Research Agency had spread through the social network and drawn massive engagement from real users. Facebook knew it had to get things under control.

But years later, Facebook’s own internal research teams revealed that troll farms were still reaching massive audiences, even if they didn’t have large direct followings. The company’s own algorithms pushed the troll content onto users who had not expressed interest in the pages, expanding the trolls’ reach exponentially. A report detailing the research was leaked to MIT Technology Review by a former employee.

When the report was published internally in 2019, troll farms were reaching 100 million Americans and 360 million people worldwide every week. In any given month, Facebook was showing troll farm posts to 140 million Americans. Most of those users never followed any of the pages; rather, Facebook’s content-recommendation algorithms had forced the content on over 100 million Americans each month. “A big majority of their ability to reach our users comes from the structure of our platform and our ranking algorithms rather than user choice,” the report said.

The troll farms appeared to concentrate on users in the US. While more people saw the content globally in raw numbers (360 million every week, by Facebook’s own accounting), the troll farms were reaching over 40 percent of all Americans each month.

The report, authored by Jeff Allen, a former data scientist at Facebook, reveals that the company’s prioritization of engagement led to the problem. Facebook, he said, knows very little about content producers; the identity of whoever posted a piece of content wasn’t being considered in the News Feed ranking algorithm.

“It’s a lot of extremely fancy collaborative filtering algorithms but all based around engagement,” Allen wrote. “When the content producers that win that system are exploiting communities on our platform rather than building and supporting them, it becomes clear that the ranking system does not reflect our company values. So much so that it is actually working against us.”
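The report doesn’t include Facebook’s ranking code, but the failure mode Allen describes is easy to illustrate. The hypothetical Python sketch below scores feed candidates purely on predicted engagement: the producer’s identity is available but never consulted, so a page recycling stolen viral content can outrank the original creator. The field names and weights are invented for illustration, not Facebook’s actual system.

```python
# A minimal sketch (field names and weights are invented) of a feed ranker
# that scores candidates purely on predicted engagement. The producer's
# identity is available but never consulted, so a page recycling stolen
# viral content can outrank the original creator.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    producer_id: str          # known to the system, but unused below
    predicted_likes: float    # outputs of engagement-prediction models
    predicted_shares: float
    predicted_comments: float

def engagement_score(post: Post) -> float:
    # Purely engagement-based: nothing about who produced the post matters.
    return (1.0 * post.predicted_likes
            + 5.0 * post.predicted_shares
            + 3.0 * post.predicted_comments)

def rank_feed(candidates: list[Post]) -> list[Post]:
    return sorted(candidates, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("p1", "original_creator", 120, 4, 10),
        Post("p2", "troll_farm_page", 150, 40, 25),  # scraped viral content
    ])
    print([p.post_id for p in feed])  # ['p2', 'p1']: the troll post wins
```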

Foreign influencers

Many of the pages are run from countries on the Balkan Peninsula and target audiences outside those countries, primarily Americans, the report says.

The popularity and reach of the troll farms led Allen to believe that Russian Internet Research Agency operatives were likely able to exploit the same techniques or use the same pages to reach American users. “If the Troll Farms are reaching 30M US users with content targeted to African Americans, we should not at all be surprised if we discover the IRA also currently has large audiences there,” Allen wrote.

What’s more, the troll farms were able to slip their content into Instant Articles and Ad Breaks, two Facebook programs that give partners a cut from sales of advertisements that run alongside page content. “In Instant Articles, there was a period when perhaps as much as 60% of Instant article reads were taking place on scraped content, which is the Troll Farms article writing method of choice,” Allen said. Facebook had been inadvertently paying the troll farms.

Facebook “had already been investigating these topics” when the report was published internally, a Facebook spokesperson told MIT Technology Review. “Since that time, we have stood up teams, developed new policies, and collaborated with industry peers to address these networks. We’ve taken aggressive enforcement actions against these kinds of foreign and domestic inauthentic groups and have shared the results publicly on a quarterly basis.”

Ars has sent Facebook additional questions, and we’ll update this story if we get a response.

Exploited communities

Users who saw troll farm content tended to fall into two groups, Allen wrote. “One camp doesn’t realize the Pages are run by inauthentic actors who are exploiting their communities. They tend to love these Pages. They like how entertaining the posts are and how they reaffirm their already-held beliefs,” he wrote. “The other camp does realize the Pages are run by inauthentic actors. They hate the ever-loving shit out of these Pages. They hate these Pages with a passion that even I find impossible to match.”

The latter group was actively telling Facebook about the problem. “Our users are literally trying to tell us that they feel exploited by these Pages,” Allen said. 

By way of example, Allen cited one user who discovered a troll farm page targeted at American Indians. The group stole artwork and sold reprints of it on T-shirts that frequently were never shipped to customers, the user said. “This whole group is a fraud ring,” the user wrote.

The troll farms highlighted in the report primarily targeted four groups: American Indians, Black Americans, Christian Americans, and American women. At the time the report was written, in October 2019, a majority of the top pages for many of those groups were run by troll farms, including all 15 of the top pages targeting Christian Americans, 10 of the top 15 targeting Black Americans, and four of the top 15 targeting American Indians. When MIT Technology Review published its story, five of the troll groups were still active: three targeted Black Americans, one targeted Christian Americans, and one targeted American Indians.

Much of the content these groups posted, while frequently stolen, apparently did not otherwise violate Facebook’s content rules. Still, that didn’t mean it was harmless, Allen said. “The bottom line is, regardless of if there is or is not IP violation taking place, posting strictly unoriginal content violates our policies and exposes our communities to exploitation,” Allen explained.

Simple fixes

Allen believed the problem could be fixed relatively easily by incorporating “Graph Authority,” a way to rank users and pages similar to Google’s PageRank, into the News Feed algorithm. “Adding even just some easy features like Graph Authority and pulling the dial back from pure engagement-based features would likely pay off a ton in both the integrity space and… likely in engagement as well,” he wrote.
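The leaked excerpts don’t spell out how Graph Authority would be computed, but a PageRank-style score over the follow graph is one plausible reading. The hypothetical sketch below computes such a score with plain power iteration over a made-up follow graph; the graph, damping factor, and iteration count are illustrative assumptions, not Facebook’s design.

```python
# A rough, hypothetical sketch of a PageRank-style "Graph Authority" signal.
# A page followed mostly by unconnected throwaway accounts earns little
# authority, so an authority-weighted ranker would demote its posts relative
# to pages followed by well-connected, organic users.

def graph_authority(follows: dict[str, set[str]],
                    damping: float = 0.85,
                    iterations: int = 50) -> dict[str, float]:
    """follows[a] is the set of accounts or pages that account `a` follows."""
    nodes = set(follows)
    for targets in follows.values():
        nodes |= targets
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for follower, targets in follows.items():
            if not targets:
                continue
            share = damping * rank[follower] / len(targets)
            for target in targets:
                new_rank[target] += share
        rank = new_rank
    return rank

if __name__ == "__main__":
    follows = {
        "alice": {"original_creator"},
        "bob": {"original_creator", "troll_farm_page"},
        "original_creator": {"alice"},
        "sock_puppet_1": {"troll_farm_page"},
        "sock_puppet_2": {"troll_farm_page"},
    }
    authority = graph_authority(follows)
    # "Pulling the dial back from pure engagement" would mean blending a score
    # like this into the ranking rather than ranking on engagement alone.
    print(sorted(authority.items(), key=lambda kv: -kv[1]))
```

In this toy graph, "original_creator" ends up with far more authority than "troll_farm_page", whose followers are mostly sock puppets that nobody else follows, which is the intuition behind demoting such pages without relying on engagement signals.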

Allen left Facebook shortly after writing the document, MIT Technology Review reports, in part because the company “effectively ignored” his research, a source said.
