by Alec Williams
We find ourselves in an ever-changing media landscape, one that can directly and profoundly affect the outcome of an election. That media includes Facebook, or as you may know it recently: "that slime-covered alleyway of political rants, advertisements and that video of a guy dressed up as his dog's favorite toy."
There is an assumption, albeit a poor one, that the information you see on Facebook, cable news and elsewhere is to be trusted. That's just not the case. Recently, Facebook has been dragged into the political fray for not doing enough to mitigate the spread of fake news articles.
Mark Zuckerberg, CEO and founder of Facebook, however, does not believe that is Facebook's job, and he maintains that Facebook did not have an impact on the recent election.
“There’s a profound lack of empathy in asserting that the only reason why someone could have voted the way that they did is because they saw some fake news,” Zuckerberg said at a recent conference.
Now, I wholeheartedly believe that Zuckerberg is one of the smartest and sharpest minds in business right now, but I believe that statement is completely incorrect, and I also believe he knows it.
When Facebook first launched in 2004, Zuckerberg wanted to offer a website for college students to connect. That's a simple thing to offer, but over a decade later he, and many others, run a website that has connected more than one billion people. That is clearly going to have an effect on the way people think and, subsequently, on the way they vote, especially if what they are reading is repeatedly incorrect.
I'm not even arguing that Facebook got Donald Trump elected. I'm arguing that Facebook has grown too large for its own good and that it needs safeguards against misinformation.
Weeks before the election, Facebook's "trending" algorithms accidentally promoted a fake story about Fox News anchor Megyn Kelly pledging her support for Hillary Clinton. Clearly, misinformation on Facebook is not a partisan issue.
Even if Zuckerberg believes that Facebook is not creating "filter bubbles," in which people only read and hear what they want to, it is an issue we will need to address one day.
Americans wonder how we have gotten so polarized and divided. I would argue it is because we no longer hear dissenting views. Don't like your aunt's anti-abortion rant? Yeah, let's just unfollow her; I won't pretend I'm not guilty of such a thing myself.
So, when Donald Trump walked on stage after Election Day, having won the presidency, and we all asked each other, "How did this happen?" maybe we should have looked at our "unfollowed" lists.
Facebook is no longer just a college student's pet project; it is a multi-billion-dollar media filter, and one day it will need to be held to a different standard.