Can Google and Facebook Sway Voter Behavior?
Certain researchers are calling for greater scrutiny of how politics and technology intersect.
Ethan Zuckerman said that algorithms, the rules governing your Facebook feed or the links that appear when you type something into Google, have politics.
After all, behind the code are humans. "It's actually politically very important that we interrogate them and ask the question, 'What is the news that we're getting out of these engines?'" said Zuckerman, who directs the Center for Civic Media at MIT.
Zuckerman said that while the specifics of many search engine algorithms are trade secrets, the way and the order in which results are presented are important.
"We know from some other search studies that it is possible to sway voter behavior with search results," Zuckerman said. "There have been a number of studies done where people have set up Google-like search engines, they're usually lightly-modified versions of Google, and they produce carefully curated search results."
In one study looking at a parliamentary election in India, manipulating search engine results had a measurable effect on how people decided to vote.
"We found we could easily shift people by 20 percent or more and, in some demographic groups, higher than 60 percent," said Robert Epstein, one of the co-authors of that study, which was published in Proceedings of the National Academy of Sciences. "[People] trust those search rankings partly because it's coming from a computer, and since it's coming from a computer, people think it must be objective and impartial."
And while there's no indication that internet companies are actively manipulating algorithms for political ends, Zuckerman believes the topic deserves greater scrutiny, as does German Chancellor Angela Merkel.
"It is possible that if Google wanted to swing the election you could imagine them trying to stack the results," Zuckerman said. "I don't believe that they are doing so - but it would be very, very difficult to verify."
One way to check up on search engines would be through a "distributed audit." That's a coordinated attempt to learn what an algorithm is doing (and isn't) by essentially throwing lots of data at it to see what results come back.
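The core of a distributed audit is simple: many volunteers submit the same query from different vantage points, then compare the ranked results they each received. As a minimal sketch (with hypothetical volunteer labels and URLs; the real studies involved far more sophisticated methodology), the comparison step might look like this, using top-k Jaccard overlap as a rough divergence measure:

```python
from itertools import combinations

def top_k_overlap(results_a, results_b, k=10):
    """Jaccard overlap of the top-k results from two vantage points."""
    a, b = set(results_a[:k]), set(results_b[:k])
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def audit_divergence(collected):
    """Mean pairwise top-k overlap across all vantage points.

    `collected` maps a vantage-point label to the ranked result list
    it received for the same query. A low score means the engine is
    returning noticeably different results to different users.
    """
    pairs = list(combinations(collected.values(), 2))
    if not pairs:
        return 1.0
    return sum(top_k_overlap(a, b) for a, b in pairs) / len(pairs)

# Toy data standing in for crowd-collected rankings (hypothetical URLs).
snapshots = {
    "volunteer_1": ["news.example/a", "blog.example/b", "wiki.example/c"],
    "volunteer_2": ["news.example/a", "wiki.example/c", "blog.example/b"],
    "volunteer_3": ["ads.example/x", "news.example/a", "spam.example/y"],
}
print(round(audit_divergence(snapshots), 2))  # → 0.47
```

A score near 1.0 would suggest everyone sees roughly the same results; a low score flags results worth investigating further, though personalization alone (location, language, history) can also explain divergence, so a real audit has to control for those factors.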
Zuckerman said calls for greater transparency could also yield insight into how algorithms are constructed. Or, he said, tech companies could better acknowledge that search engine or social media algorithms do shape public discourse, and that it may make sense for Silicon Valley companies to establish editorial policies or set up public ombudsmen.
"It's a very difficult position for people who said, 'I thought I was starting a technology company, I didn't mean to be starting an editorial newsroom,'" Zuckerman said. "You can imagine that's an awkward and difficult, and sometimes, pretty frustrating situation for them. It doesn't mean that they can avoid and duck that responsibility. It's probably a responsibility they need to take on."