In the weeks since the election results came in, more and more attention has been paid to the effect Facebook’s news feed algorithm had on the spread of fake news. Immediately after the election, Mark Zuckerberg called the idea that Facebook had affected the outcome “crazy,” but now he seems to be realizing that the idea might not be so crazy after all.
Facebook announced that it is taking direct steps to stop the spread of fake news, and users will play an integral part in keeping news feeds free of it. When someone spots a story they believe is wildly untrue, they can flag it, and a warning will appear to other users. Facebook also said it will begin working with third-party fact-checkers and suggest more accurate “related articles” to users. Beyond that, the company is developing an algorithm to identify fake news before people even spot it.
Facebook didn’t say exactly when the updates would roll out, but the social network has already begun blocking fake news sites from earning ad money, a move it hopes will starve out the people producing the stories.
“The problems here are complex, both technically and philosophically,” Zuckerberg wrote on the network. “We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible. We need to be careful not to discourage sharing of opinions or mistakenly restricting accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.”
So it looks like it will be up to us to keep our feeds full of accurate information.
What are your thoughts on fake news? Let us know @BritandCo!
(h/t Recode; photo via Justin Sullivan / Getty)