Social media bots are damaging our democracy
One need only look at the apparent suicide of Jeffrey Epstein, who had been implicated in an international child-sex-trafficking-ring investigation, to see the effects of social-media bots. Within moments of the announcement, Twitter was flooded with conspiracy theories surrounding Epstein’s death. Unsourced assertions and hypotheses spread through the network faster than the actual news did, thanks in part to prodigious retweeting by automated accounts.
Social bots are algorithmic software programs designed to interact with humans, sometimes convincingly enough to pass as human themselves, or to autonomously perform mundane functions such as reminding people to like and subscribe in a video’s comments. Think of them as chatbots with additional autonomy. In fact, one of the earliest bots was ELIZA, a natural-language-processing program developed at MIT in 1966. It was one of the first programs to even attempt the Turing Test.
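The core of ELIZA’s trick was simpler than it seemed: match a keyword pattern in the user’s message and reflect their own words back as a question. Here’s a minimal sketch of that technique in Python; the rules below are invented for illustration and are far cruder than Weizenbaum’s actual “DOCTOR” script.

```python
import re

# Toy ELIZA-style responder: each rule pairs a keyword pattern with a
# template that echoes the user's own words back as a question.
# These three rules are made up for illustration only.
RULES = [
    (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bi am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (.+)", re.I), "Tell me more about your {0}."),
]

def respond(message: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            # Strip trailing punctuation so the reflected clause reads cleanly.
            return template.format(match.group(1).rstrip(".!?"))
    return "Please, go on."  # default when no rule matches

print(respond("I feel ignored by everyone"))
```

The illusion of understanding comes entirely from the user, who fills in the intelligence the program lacks, which is part of why people still mistake far more sophisticated descendants of this trick for humans.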
As the internet emerged in the early 1990s and IRC (Internet Relay Chat) channels came into vogue, so too did bots. They were designed to automate specific actions, respond to commands and interact with humans in the channel, functions that have since been adapted to modern social-media platforms like Twitter and Facebook via their APIs. Twitch especially leverages bots in its operations, in part because its chat is built on the same technology as IRC. Their roles now span everything from responding to user queries to automatically moderating discussions to actively playing games. They’ve been put to use outside social media as well: Google’s web crawler is a bot, as is Wikipedia’s anti-vandalism system.
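The command-and-response pattern those IRC-era bots established survives almost unchanged in Twitch chat bots today: messages beginning with a prefix get routed to a handler. A rough sketch of that dispatch loop, with command names and replies invented for illustration:

```python
# Minimal sketch of the command-dispatch pattern used by IRC and
# Twitch chat bots: messages starting with "!" are routed to a
# registered handler. Commands and replies here are hypothetical.
COMMANDS = {}

def command(name):
    """Decorator that registers a handler under a command name."""
    def register(fn):
        COMMANDS[name] = fn
        return fn
    return register

@command("!uptime")
def uptime(args):
    return "Stream has been live for 2 hours."  # placeholder reply

@command("!echo")
def echo(args):
    return " ".join(args)

def handle(message: str):
    """Return a reply if the message is a known command, else None."""
    if not message.startswith("!"):
        return None  # ordinary chat; the bot stays silent
    name, *args = message.split()
    handler = COMMANDS.get(name)
    return handler(args) if handler else None

print(handle("!echo hello chat"))
```

In a real deployment this loop would sit behind a socket connection to the chat server and a rate limiter, but the dispatch logic itself has barely changed in 25 years.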
But on social media, they shine. A modest network of coordinating bot accounts on Twitter can massively expand the attention a tweet receives, influence the course of a thread, and either mitigate or multiply the impact of a media event. An April 2018 study by the Pew Research Center estimated that between 9 and 15 percent of all Twitter accounts are automated. What’s more, 66 percent of all tweeted links to popular sites were disseminated by bot accounts, and a staggering 89 percent of links to news-aggregation sites were bot-sourced.
Compared with humans, these bots are relentless. The same study found that the 500 most active (suspected) bot accounts were responsible for 22 percent of tweeted links to popular news sites while the 500 most active human accounts produced barely 6 percent of the same linked tweets.
And it’s not as though these bots are particularly subtle about what they’re doing. A separate Pew study from October 2018 found that 66 percent of Americans are aware that these bots exist, while a whopping 80 percent of those folks believe that bots are primarily used for malicious purposes.
But what Americans can’t seem to do is confidently identify bots when interacting with them. Only 47 percent of respondents to the survey were very or somewhat confident they could recognize a bot account, and a mere 7 percent were very confident. That’s fewer folks than the percentage of guys who think they could score a point off Serena Williams.
The fact that Americans are so gullible online does not bode well for us. “One of the big problems for the general public is we mostly believe what we see and what we’re told,” Frank Waddell, assistant professor at the University of Florida’s College of Journalism and Communications, told Engadget. “And this is kind of amplified on social media where there’s just so much information.”
Increasingly, bot networks are being deployed to spread misinformation, damaging the country financially. We’ve already seen automated activity roil the stock market. The so-called Flash Crash of May 6th, 2010, wherein the Dow dropped nearly 1,000 points (about 9 percent of its value) in minutes, was triggered by a flurry of automated trades from a single mutual fund. And in 2013, the Syrian Electronic Army hacked the Associated Press Twitter account and posted a false report that then-President Obama had been injured in a terrorist attack, briefly tanking the market until the hoax was revealed.
These bots are even more dangerous to our democracy. “Unfortunately, the news is mostly bad: these bots have been very effective in the past at shaping public opinion,” Waddell continued. “They can just do more tweeting and sharing than the average person, and they can do that by quite a large magnitude.” By flooding a discussion with their own content, they can shape the nature of public opinion, he explained.
He points to the 2010 election as one of the earliest examples of bots used to influence political discourse. “Some people call it astroturfing, other people call it Twitter bombs,” he said. “The whole purpose of it, from a political perspective, was to smear other candidates. It’s meant to promote one candidate while discrediting another.”
These influence campaigns can be downright insidious, Waddell argues. “Bots may be tweeting in a way that supports how users already feel; they might already be inclined to, let’s say, support or oppose gun control. And when you have Twitter bots tweeting consistently in line with [the user’s] beliefs, they may not realize that they’re being sucked into this false consensus being manufactured.” We’ve seen examples of this practice in the discussions surrounding Brexit, special counsel Robert Mueller’s report to Congress, and the Saudi government’s ham-fisted coverup attempt after the murder of US-based journalist Jamal Khashoggi. It keeps happening because it’s just so damn effective. Sometimes, it’s even welcomed.
Just as Twitter played an outsize role in the 2012 election and Facebook did in the previous 2008 cycle, Reddit commanded an inordinate amount of influence during the 2016 presidential race — specifically, far-right haven /r/The_Donald. As Saiph Savage, assistant professor of computer science at West Virginia University, and her co-authors found in their 2018 study, Mobilizing the Trump Train, social bots played a critical role in helping to motivate and mobilize the subreddit.