Carnegie Mellon University
February 05, 2019

Heinz College Experts Discuss Troll Farms, Fake News

By Scottie Barsotti

Social media platforms have proven to be an efficient means of propagating disinformation and sowing division. That efficiency has become an urgent problem, one for which experts in Carnegie Mellon University's Heinz College of Information Systems and Public Policy prescribe technological, policy and human remedies.

The 2016 U.S. presidential election brought this new threat into the public consciousness, with Russia using Facebook and Twitter as frontlines and weapons in a new kind of information warfare.

"In 2014, Russia started to invest small amounts of money in troll farms. Nothing on the scale of a typical military operation. [The intelligence services would spend] a million here, a million there," said Ambassador Sarah E. Mendelson, Distinguished Service Professor of Public Policy at Heinz College and head of Heinz College in Washington, D.C. She remarks that until that point, Russian president Vladimir Putin didn't have much interest in social media as a tool. That changed after Putin saw uprisings of Russian citizens protesting corruption, including his own return to presidential power in 2011 and 2012; these citizen uprisings developed spontaneously and organically in part on social media.

Russian "troll farms" — groups of organized online agitators — identify grievances in other countries and then insert themselves into those debates with the aim of inflaming them, Mendelson said. Rather than promoting any one political ideology, professional Russian trolls instead focus on fanning Americans' emotions around heated topics such as gun control or immigration, and then pitting Americans against Americans. The tactic is — literally — divide and conquer.

"They're not making these things up. They're finding tensions that exist on Facebook or Twitter, and they're amplifying," Mendelson said. "It's pretty basic social marketing, using social media in ways that are hugely successful. And not terribly expensive, in the scheme of things, for the amount of chaos that they created."

In February of last year, 13 Russian nationals and three Russian organizations, including the Internet Research Agency in St. Petersburg, a company with ties to the Kremlin that the U.S. Intelligence Community has identified as a troll farm, were indicted by a federal grand jury for spreading disinformation with the intent to influence and interfere in the 2016 election. A separate indictment last July charged 12 Russian intelligence officers in connection with the hack of the Democratic National Committee.

And in October, a Russian national was charged in connection with efforts to interfere in the 2018 midterm elections.

"This is ongoing. This is an investment that they continue to make," Mendelson said. "And it's easy to recruit for these positions. There are many young people both inside and outside of Russia who are going to work for these organizations. Perhaps they need money and see it as easy money, but many of them have anti-Western feelings."

For Good or Bad, Social Media Is Powerful

Heinz College professor Ari Lightman said social media weaponization is a complex problem, intersecting data analytics, data security and privacy, technology, consumer behavior, policy, and ethics, and one that Heinz College is uniquely suited to address.

"As a school we focus on data. Some of us focus more on data for policy decisions, and some on data for business value, but the two are becoming increasingly intertwined," said Lightman, who teaches several courses that focus on understanding and harnessing the power of social media data.

"Facebook can be exploited due to its popularity and vastness," he said. Facebook has 2.23 billion monthly active users (204 million in the United States); Twitter has 335 million monthly active users (49.35 million in the United States.) "People are continually checking their News Feed, more often than they may check in on actual news, and that's an issue."

On Facebook, user preferences and content targeting often create "echo chambers" of like-minded people sharing content with each other and "filter bubbles" in which users rarely see viewpoints that oppose their own.

"It's very easy to unfriend people who have a different political belief than you, so you can become surrounded by people who think and believe like you. And then because Facebook wants to show you things that you will like, you'll be shown content that reinforces your belief system," Lightman said.

Bad actors who understand those mechanisms and user tendencies have used that knowledge to weaponize information in various ways, such as swaying public opinion or sowing chaos in the lead-up to an election. Between fake accounts and social bots — specialized computer programs that can autonomously post messages on social platforms — false information spreads with incredible velocity.

"Bots exacerbate the problem. People spread disinformation, and then bots spread it in a million directions," Lightman said.

It doesn't help that social media exists in a regulatory grey area. Lightman suggested that government and social media executives work together to find a way forward and combat the problem. First, he said, legislators and regulators need to ramp up their knowledge of the medium.

"I don't think legislators sufficiently understand the rules of engagement, the business model, the community or the social nuances associated with social media. They need to get educated," he said. "Then we need an alliance to be created between legislators, regulators and social media executives to understand the scope of this problem and identify mechanisms that don't require anything drastic like platforms getting shut down. Because social media provides a valuable utility, and that part of the story often gets swept under the rug because there's so much focus on the abuses."

Lightman added that even as public awareness of the problem grows, the solution goes beyond platforms simply deleting fake accounts or using machine learning techniques to identify fake news. Some of the responsibility falls on users to be more vigilant about disinformation and more cognizant of the information they share and how it might be misused.

To that end, Mendelson said U.S. schools need to be teaching digital literacy.

"We're already in a very different situation than we were [three] years ago. People know this is going on now," she said. "The Russians may be continuing to do it, but it is less successful than in 2016 when people didn't even know it was happening."