At any one time, Facebook has 1,500 possible items it can insert in users’ News Feeds. It inserts just a small fraction of that total. Apparently, it’s okay for Facebook to change that algorithm to improve engagement and earn the company plenty of money, but it’s not okay for Facebook to choose posts that would make those reading their News Feed feel more cheerful—or depressed.
A recently published academic study has drawn negative reactions and accusations of unethical conduct. Researchers, working with Facebook, actively manipulated feeds to show posts with either more positive sentiments or more negative sentiments. The idea was to test whether emotional states could be transferred via social networks. Psychologists have already established that the principle of emotional contagion works in person. As a result of the study, they learned that emotional contagion also works online, which likely confirms the gut feeling of most people who have used social networks.
Why are people mad? Because psychologists made thousands of people feel sad without their informed consent. I’m happy that there are such strict standards for psychological research. I certainly wouldn’t have wanted to participate in Milgram’s famous study or the Stanford Prison Experiment. But I wouldn’t have minded if Facebook manipulated my News Feed and monitored my posts afterward. That said, informed consent would have been nice.
To probe a little deeper psychologically, maybe people’s anger here is misplaced. Many people are uncomfortable with how social networks shape our relationships and eavesdrop on the goings-on of our lives. This study is something people can point to, a flash point that allows people to express the more subtle sense of unease they feel with services like Facebook.
People may be upset about this study, but if anything, it’s the tip of the iceberg. For just a hint of the kind of observations and correlations that can be found in social networks, look no further than the dating site OkCupid, which often releases data about what affects people’s popularity on the site. Things like tattoos and views about religion and relationships affect compatibility ratings and how likely someone is to initiate communication with others. It’s even possible to game the system in order to get more dates and find love, which one PhD student tried (it worked!).
This was a psychological study with published results. Facebook is under no obligation to share what’s behind its algorithm. I’m sure the Facebook News Feed and Data Science teams, which the researchers thank in the study, have manipulated News Feeds far more, and to greater effect, than the researchers did. Facebook states that one factor in its News Feed algorithms is “How much you have interacted with this type of post in the past” [my emphasis]. Of all their criteria, this is the one that allows Facebook the most flexibility to create different predictive algorithms and unusual groupings. Is it just a coincidence that my News Feed today is cluttered with people all expressing the same opinion about a recent Supreme Court ruling? Or that if I click on one baby picture, a cascade follows?
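Facebook has never published the actual algorithm, so any concrete version of it is guesswork. Still, a toy sketch can show how much leverage a single “past interaction with this type of post” signal gives a feed ranker. Everything below is hypothetical: the `Post` fields, the 80/20 weighting, and the scoring formula are invented for illustration, not taken from Facebook.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    kind: str       # hypothetical post type, e.g. "photo", "status", "link"
    recency: float  # 0.0 (old) .. 1.0 (brand new)

def rank_feed(posts, interaction_history):
    """Order candidate posts by a toy engagement score.

    interaction_history is a list of post types the user has engaged
    with before; posts of frequently engaged types rank higher.
    The 0.8/0.2 weights are arbitrary, chosen only to show how a
    type-affinity signal can dominate recency.
    """
    counts = Counter(interaction_history)
    total = sum(counts.values()) or 1  # avoid division by zero

    def score(post):
        affinity = counts[post.kind] / total  # share of past engagement
        return 0.8 * affinity + 0.2 * post.recency

    return sorted(posts, key=score, reverse=True)

# A user who mostly clicks photos sees an older photo outrank a newer link.
candidates = [Post("A", "link", 0.9), Post("B", "photo", 0.5)]
history = ["photo", "photo", "photo", "status"]
ranked = rank_feed(candidates, history)
```

The point of the sketch is the feedback loop: one click on a baby photo raises the affinity term for photos, which surfaces more photos, which invites more clicks. A ranker built this way can reshape a feed dramatically without any of its weights being visible to the user.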
I don’t know the answer to those questions, because Facebook doesn’t share the specifics. That runs counter to what many in the industry are advising: transparency as a way to earn consumers’ trust. OkCupid’s nerdy transparency is just one example of how this leads to success and earns the company great marketing in the process. For Facebook, transparency would be much trickier. It need only look at situations like this one, which led to negative press, to imagine the reaction if its algorithms were released. Facebook founder Mark Zuckerberg famously stated that “privacy is no longer a social norm,” but it appears that statement doesn’t apply to the privacy-busting network itself—just yet.