Social Media

Illustration by Joaquin Kunkel

Like, Comment but Don’t Share: How Facebook Took Over

What goes on behind the social media screen.

Nov 19, 2016

The aftermath of the U.S. presidential election has left many millennials shocked at a world so mismatched with their expectations. Post-election media coverage tells the same tale that it has for each of 2016's upset victories for conservatism — we should have seen it coming.
So why didn't we?
While social media already feels like a natural part of the modern routine, it is easy to forget that the content you see on Twitter, Facebook or Instagram is a product of who you know, where you live and, most importantly, what you like. On Facebook, an algorithm called EdgeRank determines what you see next on your News Feed. EdgeRank was the first mechanism that let users curate their own News Feeds, often without realizing they were doing so. The core algorithm has a simple formulation that ranks a post by summing over the connections you have to it. Each connection is weighted by three inputs: user affinity, action weight and time decay. Time decay is the simplest, ranking a post higher if the interaction is recent — a property that is exploited when old photos are bumped back to the top by new likes. Action weight determines the hierarchy of the ways a post can be interacted with; a like will contribute far less to a post’s rank than a share, for example.
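To make that formulation concrete, here is a minimal sketch of an EdgeRank-style score in Python. The action weights, half-life and variable names are illustrative assumptions, not Facebook's actual parameters.

```python
import math

# Illustrative action weights: a share counts for more than a like.
# These numbers are assumptions, not Facebook's real values.
ACTION_WEIGHTS = {"like": 1.0, "comment": 4.0, "share": 8.0}

def edge_score(affinity, action, age_seconds, half_life=86400):
    """Score one interaction (an 'edge') between you and a post."""
    decay = math.exp(-age_seconds / half_life)  # recent interactions count more
    return affinity * ACTION_WEIGHTS[action] * decay

def rank_post(edges):
    """A post's rank is the sum of the scores of every edge pointing at it."""
    return sum(edge_score(e["affinity"], e["action"], e["age"]) for e in edges)

# An old photo "bumped" by a fresh like: the new edge's weak time decay
# outweighs the post's age.
post_edges = [
    {"affinity": 0.9, "action": "like", "age": 60},            # close friend, a minute ago
    {"affinity": 0.2, "action": "comment", "age": 3 * 86400},  # acquaintance, three days ago
]
print(rank_post(post_edges))
```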
The most important input is user affinity, a measure of how often you interact with a given friend, page or source. It is determined almost entirely by your actions on social media: everything you like, comment on or otherwise interact with is put in the spotlight. The clearest way to see this in action is to look at a post that hundreds of your friends have liked. Next to the number of likes, you will see two names; assuming that all of your friends have liked the post, those are the names of your two best friends. This isn’t a lame magic trick — I just have faith in the algorithm. Facebook, and in this case I, assume that you interact with your best friends more than with anyone else. It assumes that this holds for most aspects of your life: that what you want to see and what you interact with are one and the same.
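As a toy illustration, the "two names" effect amounts to showing the likers you have the highest affinity with. The names and scores below are made-up placeholders, not anything Facebook exposes.

```python
# Hypothetical affinity scores: how often you interact with each friend.
affinity = {"Alice": 0.92, "Bilal": 0.88, "Carmen": 0.35, "Dana": 0.10}

def names_to_show(likers, affinity, n=2):
    """Pick the n likers you interact with most to display next to the like count."""
    return sorted(likers, key=lambda name: affinity.get(name, 0.0), reverse=True)[:n]

print(names_to_show(["Dana", "Carmen", "Alice", "Bilal"], affinity))
# ['Alice', 'Bilal'] -- presumably your two closest friends
```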
While Facebook has evolved beyond the simplistic EdgeRank (it now takes over 100,000 behavioural inputs into account when ranking posts), the core principle and output of the algorithm remain: what you choose to look at is what you will be shown in the future. That output is useful when you are waiting for new photos of a significant other, but its effect on our political expectations is far less apparent.
As your News Feed fills with the dank memes and political commentary you are most likely to click on, it also fills with the ideas you agree with most. Over time, your affinity for posts you agree with becomes a filter on what the algorithm will show you. In network studies this pattern is called homophily: the tendency of people to bond over the things and ideas they have in common. Eytan Bakshy, a member of the Facebook Core Data Science Team, studied this effect and nevertheless concluded that people see more diverse opinions on Facebook than they likely would in real life. In the study, Facebook claimed that homophily is outweighed by the posts of loose connections, or Facebook friends whom you rarely interact with in real life.
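That feedback loop can be sketched in a few lines. The sources, probabilities and affinity increments below are assumptions chosen only to show the dynamic: engagement raises affinity, and higher affinity means more exposure.

```python
import random

random.seed(1)

# Two hypothetical sources of political content and your starting affinity with each.
affinity = {"agreeable_page": 1.0, "dissenting_page": 1.0}
LIKE_PROB = {"agreeable_page": 0.6, "dissenting_page": 0.05}  # assumption: dissent is rarely liked

def pick_post(affinity):
    """Show a post with probability proportional to your affinity for its source."""
    sources, weights = zip(*affinity.items())
    return random.choices(sources, weights=weights)[0]

for day in range(365):
    source = pick_post(affinity)
    if random.random() < LIKE_PROB[source]:
        affinity[source] += 0.5  # every like raises affinity, so that source appears more often

share = affinity["agreeable_page"] / sum(affinity.values())
print(f"Share of the feed from the agreeable page after a year: {share:.0%}")
```

After a simulated year, nearly all of the feed comes from the page you already agreed with, even though both sources started on equal footing.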
Unfortunately, people rarely like dissenting opinions on social media — in fact, they rarely even give them an angry face. Instead, most choose to either ignore or actively remove content they disagree with. In a study by the Pew Research Center, 40 percent of those polled had blocked or unfriended someone over political disputes, and an overwhelming 83 percent said that they ignore political content they do not agree with. Unlike in the real world, nothing keeps them engaged, so while people may see more dissenting content on Facebook, they can simply choose to ignore it.
While it is easy to blame our surprise at political outcomes on a misguided faith in polls, our belief in statistics rarely overwhelms our experience. As our experience is increasingly defined by social media, so are our expectations. Yet we seem entirely blind to the expectation bias that social media imparts to us. Despite Facebook's results showing that only one out of six of the average user's friends disagrees with them politically, around 60 percent of people believe their feed represents either an even mix of opinions or mostly opinions different from their own.
Looking back, it is clear that this is a contributing factor to how we were so blindsided by the results. While I did not block or unfollow my conservative friends, I rarely gave their posts a second thought because of how infrequently they showed up in my feed. Facebook’s algorithms worked exactly as intended: the posts I interacted with most rose to the top and enclosed me in my very own filter bubble. Over the course of the election, social media gave me comfort and confidence that the world was what I expected it to be. On Nov. 8, the bubble popped, as bubbles inevitably do.
William Held is a contributing writer. Email him at feedback@thegazelle.org