A Post and Courier Special Report
Troll Hunters

2 Clemson professors race to expose a shadowy force of Russian internet soldiers

1. The Amplifiers

Laughter is contagious. So is rage. A few clicks on the keyboard and you might see a post of a cat on a Roomba or a tweet about a politician you hate, and boom, the neurons in your brain erupt, your fingers tap again, repeating and endorsing that message.

The Russian trolls join you sometimes, especially when the messages stoke anger and disgust.

From their keyboards an ocean away, the trolls like and retweet your fury and cynicism, amplifying it. And, with the trolls' support, your rage spreads — rage that accomplishes their mission to inflame American discourse and sap the country's strength.

The Russian trolls deploy this contagion when they sense weakness. It happened in downtown Charleston in 2016, on the one-year anniversary of the Emanuel AME Church shootings.

Russian operatives set up a bogus group called Black Matters US and organized a rally in front of the church, a foreign attempt to sow discord here that hasn’t been previously reported.

Sometimes the trolls target people, such as Susan Collins, the Republican senator from Maine. In 2017, Twitter flared over her support of the Affordable Care Act. Russians piled on with posts calling her a “traitorous sellout.” The following year, Collins had a chance to grill Jack Dorsey, Twitter’s CEO, in a congressional hearing.

Collins asked: Why hadn’t Twitter warned her about the Russian trolls? Why, instead, did she have to learn about their attacks from two professors at Clemson University?

Clemson professors Patrick Warren and Darren Linvill work to find the methods used by Russian trolls on social media at the Social Media Listening Center at Clemson University on Wednesday Dec. 18, 2019. They have found Russian disinformation agents likely backed a new internet troll farm in Ghana. File/Grace Beahm Alford/Staff

2. The Game

Nine months before that congressional hearing, the two professors, Patrick Warren and Darren Linvill, were in a basement trying to stop an epidemic, one that had nothing to do with social media.

The basement was in Linvill’s parents’ house in Clemson, a nicely furnished place with patio windows that open onto Lake Hartwell. A painting with a big, orange Clemson tiger paw hung on the wall; the fridge was full of beer. Two other Clemson professors joined Warren and Linvill for a game of Pandemic.

It was January, and Warren’s wife had given him the board game for Christmas. Pandemic is a different kind of game. Instead of beating your opponent, you win by defeating a deadly disease together. On the box, it says: “Can your team save humanity?” A nerdy game, for sure, Warren and Linvill are quick to say. But they’re proud nerds, and homegrown ones at that.

Warren, 40, grew up in Walhalla, a small town in the foothills 20 miles north of Clemson. He graduated from the University of South Carolina, then earned a doctorate in economics from the Massachusetts Institute of Technology.

Linvill, 43, is from the town of Clemson itself, where his father, Dale, was an agricultural meteorologist at the university. Linvill went to Wake Forest as an undergrad and earned his Ph.D. at Clemson in education, staying on to teach communications.

With four professors together, the conversation sometimes veered away from the game. One time, they stopped for a half-hour to talk about moral theory. Politics was another frequent topic. And that night during their first Pandemic game of 2018, Linvill had been thinking hard about Russian trolls.

Special Counsel Robert Mueller was more than a year away from issuing his findings about Russian election interference. But new evidence had spilled into the public bloodstream. With Twitter’s help, the House Intelligence Committee had identified nearly 3,000 Twitter accounts run by Russian agents.

Linvill was curious. The internet had long been populated by trolls — people who start quarrels and post inflammatory messages. But the Russians seemed to have taken trolling to a new level. What communications strategies did they use? How did they build their audiences? How far did they reach into the U.S. body politic?

Answers were elusive. When Twitter suspended the accounts, the trolls’ tweets also vanished. But Linvill thought he had a workaround. Clemson had a sophisticated program called Social Studio. Corporations use Social Studio to identify when people mention their products on social media. Clemson was one of the few universities using it for academic purposes. Linvill thought it could recover the Russian tweets. He brought up the idea near the end of the game.

Warren listened. Yeah, that might work, he said. Social media posts are persistent, even after they’ve supposedly been erased from the web. The Russian tweets likely were still swirling in the cloud. You just needed a back door to retrieve them.

In the following months, Warren and Linvill opened that back door over and over. They downloaded tweets in batches, sometimes 50,000 at a time. When they were done, they had about 3 million tweets. At the time, it was the most comprehensive trove of Russian troll activity available.

Then they burrowed into the data and found themselves in a hall of mirrors.

A woman enters the four-story building then known as the "troll factory" in St. Petersburg, Russia, on April 19, 2015. File/AP

3. The Troll Factory

Navigating this hall required a step back into an older labyrinth.

During the Cold War, Soviet spin doctors worked hard to breed confusion. In the 1960s, they concocted fake stories about how the CIA orchestrated the Kennedy assassination; Soviet operatives infiltrated civil rights groups. In the 1970s, they planted false stories to undermine the Camp David peace accords. In the 1980s, they pushed a conspiracy theory that the U.S. military created the AIDS virus.

Their goals were clear: flood the American public with false and confusing information; distract people and spread paranoia and cynicism; do it over and over until the bonds of civil society frayed. In all, Soviet forces carried out more than 10,000 Cold War disinformation campaigns, intelligence experts later calculated.

Fast-forward to 2011, and Russia’s civic fabric showed signs of wear. An election had spawned allegations of fraud. Protesters in Moscow took to the streets. Demonstrators used social media platforms to organize, including Twitter and Facebook.

The protests were an embarrassing challenge to Vladimir Putin’s grip on power. Furious, Putin blamed then-Secretary of State Hillary Clinton for inciting the unrest. He also tasked deputies, including Yevgeny Prigozhin, “Putin’s chef,” with reining in the internet.

Prigozhin was a convicted thief and former hot dog stand owner. But after the Soviet Union collapsed, he built an array of shadowy companies. One company catered Putin’s dinners for dignitaries, hence the “Putin’s chef” moniker. In 2013, Prigozhin also formed the Internet Research Agency, which became Russia’s most notorious troll factory.

Over time, the Internet Research Agency hired hundreds of young people, paying them about $700 a month, more if they spoke English. They worked from a nondescript office in the Russian city of St. Petersburg. Inside, they crafted fake identities on Facebook, Twitter, Instagram and Russian social media platforms.

Many details about the Internet Research Agency’s inner workings were exposed while the troll factory was still in its infancy. In 2013, journalists from the independent newspaper Novaya Gazeta published a story headlined: “Where the trolls live. How internet provocateurs in St. Petersburg work, and who pays them.”

That story and its follow-ups in Russia revealed how the factory’s first targets were Putin’s domestic opponents. The trolls shifted their focus in 2014 to Ukraine, as Russian-backed forces invaded Crimea.

And then they set their sights on Louisiana.

•••

In September 2014, the tweets came in bursts:

“A powerful explosion from miles away happened at a chemical plant in Centerville, Louisiana #ColumbianChemicals,” said one from a fake account under the name Jon Merritt.

Twitter user @AnnRussela posted an image of the plant on fire.

Other operatives posted videos of smoke and a doctored screenshot of CNN’s homepage. A YouTube video purported to show ISIS claiming credit.

Local news outlets in Louisiana scrambled. Reporters quickly confirmed that nothing was amiss. Few Americans were fooled, but that didn’t stop the trolls. They planted fake stories about a salmonella outbreak in upstate New York and a police shooting in Atlanta. (#shockingmurderinatlanta.) On Facebook and Twitter, they highlighted incendiary posts about mass shootings and Obamacare.

In 2015, a New York Times reporter working in Russia traced these fake events to the Internet Research Agency, concluding: “Russia’s information war might be thought of as the biggest trolling operation in history, and its target is nothing less than the utility of the Internet as a democratic space.”

Despite the investigative journalism in Russia and abroad, the troll factory kept spinning false campaigns.

“One of the tragedies of 2015–16 was that the troll factory was exposed so brilliantly in Russia, but ignored so utterly in the United States,” according to the Digital Forensic Research Lab, a think tank that analyzes social media.

Then, with the 2016 U.S. presidential election looming, the Russians pivoted away from creating fake events toward a more insidious strategy.

The two Clemson professors would later see evidence of this scheme in their massive new database.

Clemson professor Darren Linvill discusses data collected as he follows the habits of Russian trolls on social media at the Social Media Listening Center at Clemson University on Wednesday Dec. 18, 2019. Grace Beahm Alford/Staff

4. Pandemic

A database of 3 million Russian troll tweets was a daunting haystack of information. Warren and Linvill could have used algorithms and other programming to label and organize the data. But two human brains are still better at detecting nuances in language. So, early on, they decided to read as many tweets as possible.

“We learned a lot more by going in and looking at every single word,” Linvill said.

It helped that they came from different academic fields. Warren was the data and statistics guy. His previous research included analyses of how open records laws reduce public corruption.

Linvill was the communications guy who had analyzed the impacts of the now-defunct Yik Yak social media platform.

The professors' database soon gave up the trolls' secrets.

Russian operatives often lifted photos of models for their profile photos, especially shots of young women, the professors found.

The trolls incubated their fake accounts slowly. Early posts were innocuous: tweets and retweets about favorite songs, foods and pets.

Then they entered a growth stage.

They followed hundreds and then thousands of people, prompting many to follow them back.

In short order, many of these bogus accounts had tens of thousands of American Twitter followers.

“Most of the people that are awful on social media are real,” Warren said. “And I think this is one thing that would surprise most people: The person who is most likely a Russian or Iranian or Chinese troll is the person who agrees with you.”

Linvill added: “People are persuaded by things they’re already inclined to believe, not by someone yelling at you. The trolls were trying to be your friends, not your enemies.”

Trolls built their audiences in clever ways, including hashtag games, the professors found.

In Twitter circles, users create catchy hashtags and challenge others to come up with pithy comments. Celebrities such as Jimmy Fallon also do this, urging fans to comment on #WorstGiftEver or #WhyImSingle. You win by collecting the most retweets, new followers and, in Fallon's case, funny responses to read on the air.

For Russian disinformation agents, hashtag games were chances to solidify bonds with real Twitter users. They mixed harmless hashtags, such as #ThingsILearnedFromCartoons, with more political ones, such as #WasteAMillionIn3Words (“Donate to #Hillary”).

“You had to read a bunch of tweets to realize they were playing an actual game, a game that most people aren’t familiar with,” Linvill said. “They were engaging with existing communities in a very specific way.”

The professors also found that many trolls could be divided into two categories — left-leaning and right-leaning. Left-leaning trolls often adopted personas of African American and LGBTQ activists. They typically supported Bernie Sanders and were critical of Hillary Clinton.

Right-wing accounts also targeted Clinton and used inflammatory hashtags such as #KILLARY and #Benghazi. One account ran a contest urging followers to submit videos of themselves urinating on a picture of Hillary Clinton.

“The prize was $10 over PayPal, and they had submissions,” Linvill said.

Whether left-wing or right, the trolls were skilled amplifiers. They turned up the volume of irritating tweets from real Americans, filling Twitter feeds with sarcasm and cynicism.

“They played off the real disasters in American political discourse, the real problems that we have in our society,” Linvill said.

Buried in their database was another revelation: Russian agents tried to amplify discord in Charleston. They even dispatched a real operative to the city to meet with local activists.

And they did this when the city was particularly vulnerable.

Doves are released at Emanuel AME Church in Charleston as part of a unity event on June 18, 2016. Russian trolls called for a fake rally the same day. File/Grace Beahm/Staff

5. Fake Charleston rally

In the spring of 2015, Michael Slager, a white officer from the North Charleston Police Department, gunned down Walter Scott, a black man, as Scott tried to flee. Russian agents peppered Twitter with indignant messages. One left-wing troll named @jaedenmassey wrote: “#IAmWalter What's wrong with them?! Are they trained to hunt us?”

Then, less than three months later, Dylann Roof killed nine people at Emanuel AME Church. Foreign trolls turned up the volume. One named @ina_malone tweeted with awkward grammar: “How someone could walk into the Charleston Church and shot people? #HateCrime.” Within hours of Roof’s capture, right-wing trolls such as @karenparker93 retweeted “#DylannRoof is a Real American.”

But the Russian influence in South Carolina extended beyond mere tweets.

A Post and Courier examination of the Clemson database revealed that foreign trolls also staged a fake rally in front of Emanuel AME.

•••

Russian operatives had created a bogus group called Black Matters US, which mirrored a real group, Black Lives Matter.

As with the legitimate group, the fake Black Matters US campaigned against racism and police brutality. It also had an overtly political side, posting YouTube videos promoting Bernie Sanders and calling for protests against Hillary Clinton.

In 2016, as the one-year anniversary of the Emanuel shooting neared, the Russian troll @rallycharleston urged people to protest.

Details about the rally also were sent to The Post and Courier, which published them. In a “Things to Do” article, the newspaper reported the group would hold “a peaceful rally near the church” on June 18, 2016. The paper's article mentioned that Black Matters US was “a civil rights organization that fights for equality and attempts to eliminate race-based discrimination.”

The Russian-inspired rally coincided with the Unity Walk, which was organized by the family members of one of the shooting victims, Daniel L. Simmons Sr. The Russian troll @rallycharleston tweeted: “Never forget the victims of white supremacy.”

Shakem Amen Akhet, a Charleston activist, said he remembers receiving Facebook messages in 2016 from a woman who claimed to be with Black Matters US. At the time, he and another activist, Muhiyidin Moye, had organized Black Lives Matter Charleston.

“The name Black Matters US kind of threw me off,” Akhet said.

Maybe it was an offshoot, he thought. But the group's web page didn't mention any specific people. At one point, he got messages from a man in Los Angeles who claimed to be with Black Matters US and wanted to meet him.

Shakem Amen Akhet said people with a group named Black Matters US urged him and Muhiyidin Moye to organize protests in Charleston in 2016. They declined, after growing suspicious. Black Matters US turned out to be a bogus group created by Russian operatives. File/Andrew J. Whitaker/Staff

The man eventually traveled to Charleston and met Akhet and Moye at the International Longshoremen’s Association Hall on Morrison Drive. Akhet said he found the Californian’s approach disconcerting.

“He was pressing, telling us we need to do a rally, we need to do a rally. He wanted us to do all the work. So we opted out,” Akhet said. (Moye was shot and killed in 2018 in New Orleans.)

In the end, only a handful of people showed up for the Black Matters US rally, while hundreds participated in the Unity Walk.

Though unsuccessful, the Black Matters US push in Charleston was a striking example of how Russian operatives went beyond social media to foster tensions by using real American activists. Russian agents staged other fake rallies. They included a "Miners for Trump" rally in Pittsburgh and anti-Clinton rallies in Florida, Special Counsel Robert Mueller said in his election interference report last year.

“What they did was very disingenuous,” Akhet said. The trolls “take a good cause, and then, because of their resources, try to change the narrative. People say they’re not doing any harm, but when it comes time to realize a solution — to heal wounds — they keep driving the wedge. They’re using tricknology."

Clemson professor Patrick Warren reviews data collected as he follows the habits of Russian trolls on social media at the Social Media Listening Center at Clemson University on Wednesday Dec. 18, 2019. Grace Beahm Alford/Staff

6. The Attack

Through 2018 and 2019, Warren and Linvill learned more about the trolls and their impacts, often feeling a mix of indignation and admiration.

The trolls’ discipline and patience were impressive, Warren said.

“It was like a military operation, very much planned out.”

And this planning paid off during a pivotal moment before the 2016 presidential election, the professors' database revealed. Warren would call it “the biggest one-day propaganda operation in the history of social media.”

On Oct. 6, 2016, the trolls suddenly changed their behavior.

“They turned the accounts they’d grown into amplification engines,” Warren said.

In a day-long burst, the Russian trolls churned out 18,448 retweets of 4,239 real Twitter accounts.

“They literally worked overtime,” Warren said.

The trolls hadn’t picked those real accounts at random. They chose Twitter users with extreme views on both the left and the right, views that served the trolls’ purpose of sowing discord.

Now amplified, the real Twitter users quickly generated more tweets and retweets, activity that prompted more like-minded people to follow each other — a more crowded and louder echo chamber.

Warren and Linvill calculated that the Oct. 6, 2016, troll burst added half a million additional followers to those 4,239 real accounts, all within a few days.

The impact was about more than numbers, the professors learned. The trolls had empowered America’s outliers, the people who held the most extreme views. They fostered a sense that outrage was more common than it actually was, that chaos was normal.

“Think about it,” Warren said. “There are town hall meetings happening across the United States every month, and how many involve people yelling at each other? Almost never, right? And, if they do, they make the news. Because it’s so rare.”

But looking into social media’s hall of mirrors made you think those outliers were the norm.

The burst also came amid another disinformation campaign: Russian agents had hacked into the computer of Hillary Clinton’s presidential campaign chairman and stolen emails from top Democratic Party officials. The Russians then gave the emails to Wikileaks, which released them.

Now it was the trolls' turn. With their newly amplified audience of outliers, they poured fuel on an internet fire. “BOOM! Wikileaks Confirms Hillary Sold Weapons To ISIS,” a troll named @ameliebaldwin wrote. This happened the same day The Washington Post published a video of Trump bragging about grabbing women’s privates. That story and the Wikileaks email hack vied for the nation’s attention. The Russian trolls pushed hard on the email story.

Later investigations found that trolls and automated bot accounts generated 1.4 million election-related tweets, which collected some 288 million impressions. Trolls also worked on Facebook and Instagram, reaching more than 126 million Americans, a 2017 Senate Intelligence Committee report said. Taken together, the trolls had injected a massive volume of vitriol into America’s public discourse.

“They were taking advantage of America’s open wounds and poking a stick in them,” Linvill said. “They wanted us to be disgusted with each other.”

7. Chronic Contagion

Dec. 18 was a quiet day on the Clemson campus. Students were on break, and Warren and Linvill had a chance to catch up on their troll research.

It was a big day in Washington, D.C., though, and on social media. The U.S. House of Representatives was voting to impeach President Donald Trump.

Linvill listened to some of the proceedings but had to turn them off. Republicans and Democrats were speaking past each other, just the kind of divisiveness the trolls had tried to foster.

The professors' lives have taken unexpected turns since that first game of Pandemic in the basement in early 2018.

Federal investigators have tapped their expertise. Brig. Gen. William Hartman, leader of the U.S. Army's Cyber Command, has called their work "invaluable." Congressional investigators also have called, including the one who prepped U.S. Sen. Susan Collins for the Twitter hearing.

After Collins referenced the Clemson professors, journalists began phoning them, often asking whether a particular Twitter account was a troll. Nearly every time, Linvill said, the suspect wasn’t a troll. It was just some jerk. A real one. 

“It’s important to remember that a lot of the voices you see on social media are just not representative of real discussions in the world,” Linvill said.

Clemson professor Darren Linvill discusses data collected on the habits of Russian trolls on social media at the Social Media Listening Center at Clemson University on Wednesday Dec. 18, 2019. Grace Beahm Alford/Staff

As the impeachment hearings wore on, Warren joined Linvill in the school’s Social Media Listening Center, which was really just a converted classroom with lots of large-screen monitors. Warren, wearing a sweater with a Clemson tiger paw, monitored his phone. He suddenly let out a groan: The University of South Carolina, Clemson’s in-state rival, had landed the state’s top football recruit. Linvill shook his head.

Warren put his phone away and pointed to the screens. One showed a word cloud with #impeachmenthearing in the largest letters. There in the classroom, they moved quickly into teaching mode. China and Iran have their own troll factories now, they said. In October, Daryl Morey, the general manager of the Houston Rockets, posted a tweet supporting Hong Kong protesters.

In response, pro-Chinese government trolls generated 1,000 new Twitter accounts a day for a week, with each one blasting Morey.

"The Chinese could dwarf what the Russians have done,” Linvill said. "The Chinese have the ability to throw vast resources at something as seemingly minor as a single, offensive tweet."

And the Russians haven’t quit.

In August, a troll named @IamTyraJackson garnered nearly 290,000 likes for a single tweet about a football player’s charity work. And then there's @Dany_xoo, who posted a video of a 2-year-old playing soccer on his father’s bed. It generated more than 430,000 likes.

Warren and Linvill reported the trolls to Twitter, which suspended their accounts.

But even after the troll accounts are gone, their effects linger like a virus that doesn’t go away, the professors said.

“The reason that it’s so diabolical is that it sticks,” Warren said. “It’s like an infection. It spreads into the social media ecosystem. And even when you take away that vector, you’ve done lasting damage to the body.”

Over time, the trolls foster a chronic feeling of disgust.

“Destroying empathy is their end goal, and disgust is the mechanism,” Warren said.

Infect enough people, Linvill added, and soon you have friends and neighbors calling each other animals. That's the kind of rhetoric you see before genocides and wars.

Given the stakes, the professors are pushing hard to make Clemson a center of social media research. In class, they try to teach their students to be more cautious social media users. The good news, Linvill said, is that young people tend to be savvier about ignoring the trolls. The bad news: Baby Boomers seem to get suckered more easily.

Warren’s phone rang.

It was a call from a congressional aide. Linvill said he had a similar call from a federal agency in the morning. Warren excused himself to take the aide's call. Classes were out, so the long hallway was quiet and empty, amplifying his footsteps. 

Reach Tony Bartelme at 843-937-5554. Follow him on Twitter @tbartelme.
