It seems like Facebook can’t do anything right. In light of the recent US election results, some have accused Facebook of trapping users in “echo chambers” and legitimizing false information that could have swayed the election. Some want Facebook to change its news feed algorithm or introduce some level of human fact-checking or content curation. At the same time, Facebook has received heavy criticism for adding any human touch to its site. Facebook’s Safety Check feature was deployed in November 2015 for the Paris terrorist attacks, but not for a terrorist attack in Beirut that happened the same month. The omission caused a backlash, as employees were accused of deeming one crisis more “relevant” or “important” than the other. Facebook was also accused of screening conservative news articles out of the “Trending” pane of its site, a move that struck many as manipulative and unethical.
Facebook is a product and, in theory, if any of this really irritates users, they can opt out. In reality, Facebook is too omnipresent for that to be viable; it is already a staple of the global ecosystem. Research done by LIRNEAsia and replicated by Quartz shows that in developing countries such as Indonesia and Nigeria, Facebook has conceptually replaced the Internet, with at least a few million of Facebook’s 1.4 billion users saying that they “use Facebook but not internet”. Facebook, with its internet.org initiative and its support of Facebook-only mobile data plans, has done nothing to correct this belief. Sheryl Sandberg, Facebook’s Chief Operating Officer, stated, “people will walk into phone stores and say ‘I want Facebook.’ People actually confuse Facebook and the internet in some places.”
So does Facebook, with its massive reach, have a responsibility to tell the truth, or to be impartial? 60% of millennials rely on Facebook as a primary news source, but Facebook’s algorithms are imperfect. At its core, all tech is still created by humans, and humans are biased. As of 2015, Facebook’s demographic data indicated that its technical employees were 84% male and 95% white or Asian, while America is 49.2% male and 82.7% white or Asian. It’s difficult to believe that Facebook can write software that is truly impartial when there is so little diversity within the company, especially when Facebook is headquartered in one of the most liberal areas of the country.
Facebook has also drawn flak for allowing people to shield themselves in ideological bubbles. It’s unfortunately common to see status updates with requests like “unfriend me if you voted for Trump”, and Facebook’s algorithm is designed to show users content that they like, which, more often than not, means content they agree with. The Wall Street Journal’s “Blue Feed, Red Feed” interactive graphic is a particularly striking example of the stark difference between liberal and conservative Facebook. Under the topic “abortion”, the blue feed showed lines like “the war on women does not end” and “Trump impedes birth control”, while the red feed displayed stories about “victims of sex-selection abortion in China” and referred to a court case in which a woman murdered her child as “after-birth abortion”. Facebook was meant to be a social network and a source of entertainment, but, as in most social spaces, politics has seeped in. No one objects when the news feed algorithm serves users more cute animal videos, but given Facebook’s reach in the world, the news feed is now a powerful way to influence beliefs and actions.
Facebook is just one of the major tech companies that have the privilege of our attention. Twitter has algorithms for user discovery and recommended tweets, and Google is the industry champion of “relevant to you” search results. All of these outlets have a vested interest in showing us what we want, and what we want is to feel safe, comfortable, and unchallenged.
Facebook cannot keep burying its head in the sand and denying its influence, but ultimately, all media and media delivery systems are somewhat biased, and always will be. Though tech tools may have made it easier to avoid dissenting opinions, the means have always existed. Not many conservatives read the New Yorker, and not many liberals watch Fox News.
We can do better at bringing more diversity into Silicon Valley, representing new perspectives, and correcting general misconceptions, but Zuckerberg himself says it best: “I think we would be surprised by how many things that don’t conform to our worldview, we just tune out. I don’t know what to do about that.”
Women in Business, Bocconi Female Students Association