Politicians using Facebook to promote their ideas and win the votes of constituents is nothing new—the practice has existed as long as the platform. But the 2016 presidential election was an entirely different ballgame.

Throughout the 2016 presidential election cycle, many believed that Russia was attempting to disseminate “fake news” via major social media platforms, primarily Facebook, in order to increase Donald Trump’s chances of beating Hillary Clinton. Mr. Trump trailed Secretary Clinton in nearly every poll released during the race, and few “experts” gave him a legitimate chance of winning.

Facebook Turns Russian Ads Over to FBI

If you aren’t aware, Facebook ads are perhaps the most effective form of digital advertising today, for two main reasons: Facebook knows more about its users than any other social media platform, and an enormous number of people use it. Russia appears to have been aware of this and ran Facebook ads from hundreds of accounts in an attempt to influence Americans.

The FBI’s investigation into Russia’s attempts to influence the election has uncovered new evidence from a deep dive into Facebook ads, and Facebook has turned those Russia-linked ads over to the bureau.

Late last week, Mark Zuckerberg, founder and CEO of Facebook, posted a video explaining what steps Facebook would be taking in the future to protect democracy.

This news comes on the heels of the revelation that Facebook had allowed advertisers to boost ads to audiences labeled “Jew haters” and other such anti-Semitic categories. Facebook was quick to clarify that these audiences were not created by humans but generated automatically by its algorithms.

Facebook’s Frankenstein Moment

Last week, Kevin Roose wrote an article for the New York Times called just that: “Facebook’s Frankenstein Moment.”

He wrote:

When Mark Zuckerberg built Facebook in his Harvard dorm room in 2004, nobody could have imagined its becoming a censorship tool for repressive regimes, an arbiter of global speech standards or a vehicle for foreign propagandists.

But as Facebook has grown into the global town square, it has had to adapt to its own influence. Many of its users view the social network as an essential utility, and the company’s decisions — which posts to take down, which ads to allow, which videos to show — can have real life-or-death consequences around the world. The company has outsourced some decisions to complex algorithms, which carries its own risks, but many of the toughest choices Facebook faces are still made by humans.

“They still see themselves as a technology middleman,” said Mr. García Martínez. “Facebook is not supposed to be an element of a propaganda war. They’re completely not equipped to deal with that.”
Now that Facebook is aware of its own influence, the company can’t dodge responsibility for the world it has helped to build. In the future, blaming the monster won’t be enough.

Can Facebook control the monster it has created, or has it grown beyond the point of regulation? Does Facebook need to have outside oversight of some sort?

These are genuine questions that Mr. Zuckerberg and his colleagues at Facebook will have to wrestle with in the years to come. This problem is not going to fix itself.

Chris Martin

Chris Martin is the Co-Creator and Chief Content Officer at LifeWay Social as well as a Content Strategist at LifeWay. He and his wife Susie live outside Nashville, TN.