Social Media Usage Is At An All-Time High. That Could Mean A Nightmare For Democracy

Since Russia's expansive influence operation during the 2016 election, Americans' usage of social media has only increased — and drastically so, as a result of the pandemic. (Caroline Amenabar / NPR)

America's new socially distant reality has warped the landscape of the 2020 election.

Candidates aren't out knocking on doors, and U.S. election officials are bracing for a record surge in mail ballots.

But another subtler shift is also occurring — inside people's brains.

Four years after Russia's expansive influence operation, which touched the feeds of more than 100 million users on Facebook alone, Americans' usage of social media has only increased — and drastically so, as a result of the pandemic.

More people are more online right now than at any point in human history, and experts say the Internet has gotten only more flooded since 2016 with bad information.

"It's far, far worse in terms of quantity," says Steven Brill, a former journalist and now the CEO of NewsGuard, a browser extension that helps users discern the quality of what they're reading online.

In April, NewsGuard published a list of 36 websites that were peddling hoaxes related to the coronavirus. Just a month later, that list had ballooned to more than 200.

A study out last week from researchers at Carnegie Mellon University found that nearly half of the Twitter accounts spreading messages about the coronavirus pandemic are likely bots — automated accounts designed to make it appear that more humans are acting a certain way than truly are.

And in 2020, as in 2016, it's only a matter of time before the focus shifts away from the coronavirus and back to the presidential election.

"The same thing is going to happen with the political sphere. There's just no doubt about it," Brill says. "The great thing about the Internet is everyone can be a publisher. The really bad thing about the Internet is everyone can be a publisher."

More than ever before

Social media usage in the U.S. had already been on a steady increase over the past few years, even before the pandemic left millions of Americans stuck in their homes with many hours to kill on their phones and computers.


That was despite the barrage of negative publicity suffered by the social media giants — Facebook in particular — in the wake of the 2016 election.

Among the issues: Facebook CEO Mark Zuckerberg downplaying Russian interference efforts, the Cambridge Analytica scandal and a bevy of research that seemed to point to people being happier when they weren't so connected.

In the face of all that, Americans stayed online. Facebook's main platform saw a small dip in U.S. users in the time after the election, according to Edison Research, but Facebook also owns Instagram, which has seen a steady increase in overall users over the past four years.

The pandemic has supercharged those gains.

"We know that people especially rely on social apps in times of crisis and in times when we can't be together in person," Zuckerberg said on April 29. "And right now, we are experiencing both of those all around the world at the same time."

For the first time ever, Zuckerberg announced, more than 3 billion people used Facebook, Messenger, Instagram or WhatsApp in a single month.

Twitter also announced last month that the number of its daily users who saw ads grew 24% year over year in the first quarter.

The increased usage spans the globe, says Pinar Yildirim, a marketing professor at the Wharton School of the University of Pennsylvania who specializes in social media trends.

"If you look at Italy, if you look at Spain, if you look at the United States and Canada, you see the same patterns," says Yildirim.

"There are more users, in terms of unique users. There are more frequent visits," she says. "And then you look at the amount of time people are spending on social media — that's where you start seeing even more of an increase."

Spin cycle

One way that increased usage may be felt in the 2020 election, Yildirim says, isn't just in the amount of false information being seen and shared but in how polarized the American public is during the campaign cycle.

A study from last year found that people's usage of Facebook correlated with how polarized they were and how open they were to understanding the views or ideas of the opposing party.

Facebook had internal research pointing to the same conclusion, according to a new report from the Wall Street Journal, but the company reportedly brushed it aside.

"Our algorithms exploit the human brain's attraction to divisiveness," said one slide, from a 2018 presentation.

This idea isn't new. Much of Russia's influence operation on social media in 2016 wasn't about introducing new ideas or controversy but instead was about furthering racial and political divides already present in American culture.

"To put it simply, in this space Russia wants to watch us tear ourselves apart," said David Porter, an assistant section chief with the FBI's Foreign Influence Task Force, earlier this year.

More people spending more time on the platforms where this takes place likely will mean even more attempts at amplifying divisions and stoking discord.

No easy solution

Many federal policymakers, members of Congress and Big Tech leaders agree there are problems, and they are taking some action.

Shortly after the last presidential race, the social media companies went into damage control mode. There were multiple days of Capitol Hill hearings, and new policies were rolled out aimed at warning people before they clicked on conspiracy theories.

Facebook introduced a feature last year that tries to limit the spread of websites that are disproportionately popular on the platform, compared with the broader Web. Twitter said it was taking down more than a million suspicious and fake accounts a day.

And in that same April announcement in which Zuckerberg beamed about Facebook's increased global reach during the pandemic, he talked about how much the company has done to warn people about bad information.

"We partner with independent fact-checkers who have marked more than 4,000 pieces of content related to COVID as false, which has resulted in more than 40 million warning labels being seen across our services," he said.

Twitter announced a similar labeling approach for tweets "containing disputed or misleading information" related to the pandemic and even decided this week to add a fact-check label to one of President Trump's tweets for the first time.

But many experts aren't satisfied with the industry's work on the issue.

For one thing, there's research to suggest that the very act of seeing a headline, even if it's notated as false by the platform or by a fact-checker, can still contribute to people believing its claim.

"Fact-checking is predicated on the assumption that people will change their mind when confronted with correct information," writes Alice Marwick, a professor at the University of North Carolina, in a paper published in the Georgetown Law Technology Review.

"As we have seen, this ignores a wide variety of social and cultural factors, and is not supported by empirical evidence. ... In fact, fact-checking may have the opposite effect of making stories 'more sticky.' "

A number of organizations aimed at curing what some see as a media literacy problem, including Brill's NewsGuard, have popped up since the 2016 election.

While well-intentioned, they may be missing the true problem, which is how the platforms allow sensationalist misinformation to go viral, says Peter Pomerantsev, the author of the book This Is Not Propaganda, which details a number of information and influence operations on social media.

"So many studies have shown people can be super-educated and super-critically minded and ignore any evidence that goes against their identity," Pomerantsev says. "People will be very critical when they see something they don't like, and then they switch off their critical faculties when it agrees with their worldviews."

Plus, social media platforms are inherently addictive, Pomerantsev says, so he argues that people can't be blamed for not quitting those platforms cold turkey despite what they may be doing to the structure of democracy.

The real answer, he argues, is for the government to get involved in regulation.

"We don't really have any oversight. They're marking their own homework," Pomerantsev says. "And that's not good enough."

That position was a legal and political minefield before the pandemic — and between the coronavirus crisis and the election, no one expects movement on the issue in the U.S. government this year, or anytime soon.

This means that as the 2020 presidential election goes forward, posts will be posted, and tweets will be tweeted. And because of the pandemic, social media will play some role in the outcome, says Yildirim, of the University of Pennsylvania.

"If you asked me two months ago, I would have had very different predictions about social media's role on the election," she says. "But now it's become the primary source of information and social communication."

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Miles Parks is a reporter on NPR's Washington Desk. He covers voting and elections, and also reports on breaking news.
