Social media is undeniably a huge part of our world today. And it’s time to stop writing it off as mere vanity for millennials and start considering the real ways it shapes our lives, politics and decisions.
Huge sites, namely Facebook and Google, have been facing public concerns about allowing blatantly fake news articles on news feeds and top-tier search results. While there is wider worry about fake news in general, there are more immediate concerns about fake news impacting the 2016 presidential election specifically.
On the morning after the election in November, a Google search for “results of presidential election?” returned a list of voting statistics provided directly by Google. Just below that, among the related news articles, the very first result was a piece from an obscure, unreliable site claiming that Donald Trump had won both the popular and electoral votes, complete with entirely fabricated numbers and statistics, its only source a random, evidence-free tweet.
Around the election, similar fake articles circulated all across Facebook. Some users have claimed that such misinformation swayed voters and ultimately tampered with election results.
To address these rising concerns about misinformation on Facebook, the CEO, Mark Zuckerberg, posted on his Facebook page in November offering transparency about Facebook’s position on the issue.
In the post, Zuckerberg first addresses claims that Facebook negatively influenced the election, insisting that fake news had next to no actual impact. At the end of the post, he points to the clear positive impact Facebook had on the election, helping more than 2 million Americans register to vote and prompting a similar number of people to go to the polls who might not have otherwise.
He also writes that only 1 percent of content on Facebook is actually fake news and hoaxes, and little of that is even about politics. Zuckerberg also touches on how complex the issue of “fake news” is. He mentions that most articles that are popularly labeled “fake” are mostly true, with a few incorrect or heavily biased “facts.” Moreover, even entirely true articles are often flagged as fake by people who disagree with the article’s stance.
Since Zuckerberg’s statement, Facebook has made some changes to how it handles fake news. It has strengthened methods for reporting articles with questionable sources, and users are now notified before sharing posts whose validity has been disputed.
Facebook is also removing financial incentives and ad-selling services from fake news sites and other suspicious content. (Pulling ad revenue from accounts that violate a platform’s values is not a foreign concept; YouTube implemented a similar demonetization plan earlier this year.)
So what the heck is to be done? Does this problem actually even matter?
Facebook has become big enough for people to project titles onto it that the site didn’t initially intend. Zuckerberg replied to a comment on his post: “Remember that Facebook is mostly about helping people stay connected with friends and family. News and media are not the primary things people do on Facebook, so I find it odd when people insist we call ourselves a news or media company in order to acknowledge its importance.”
Though Facebook didn’t ask to be a news site, should it still take responsibility for what its site has become and for the effects of that new role? If so, what should that “responsibility” look like?
People are often thoughtless, if not about everything in life, at least about what they consume online. And it’s tempting to say, “People are going to be stupid online. They won’t check the authenticity of news they trust, and that’s their problem for being idiots.”
But when Internet irresponsibility potentially affects things like national elections, it is no longer a personal issue. In a situation like that, millions of people’s failure to be correctly informed can affect millions of others’ lives.
Is it the job of sites like Google and Facebook to regulate which sources are authentic, if in no other area, at least when it comes to elections and politics? Would that count as censorship? The American people are typically quick to cry out against censorship and demand freedom of speech, but the idea of Facebook taking down fake information seems to hold more support than I would have thought.
Is Facebook doing the right thing with the little steps of reporting hoaxes and simply warning users about potential fake news? Should more drastic measures be taken?
People (particularly those who comment on Internet posts, whether young or old, rural or urban) like to view things in black and white: their view is right and obvious, and the other view is outrageous and wrong. That way of thinking frustrates me most because it ignores the basic complexity of these issues.
And this issue is certainly complex and new. With the widespread use of social media, misinformation can spread in unfamiliar ways; not that the traditional channels for spreading fake news have a real solution either.
I can’t offer an answer to this problem, other than to say it has the potential to be very serious. As a user of social media, you have an obligation to stay informed about this issue and to be smart about the information you consume from sites like Facebook.