Yes, we are. On a daily basis, in fact.
In case you hadn’t heard, Facebook went and did something a little bit strange recently. Facebook claims that the study they conducted – which involved collecting data from people without their consent – was for academic research purposes. However, a lot of people are pretty annoyed that Facebook didn’t inform them about the experiment.
Of course, some might say that it’s not the research that’s strange…it’s that they published it.
The purpose of the study, according to Facebook, was to find out how they could manipulate users’ emotions based on the posts they chose to show in their newsfeeds.
The study is titled “Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks”.
Responses to the experiment have been mixed with some feeling that Facebook had no right to use them as test subjects without prior consent, some saying that it’s a common practice on the internet today, and a few others whose conspiracy theories suggest Pentagon involvement.
The Notorious Facebook Emotional Study
The abstract of the study points out a few of their aims and assumptions, and I paraphrase:
- Emotional states can be transferred to others via emotional contagion (“emotional contagion” here meaning exposure to content – a post, photo, etc. – that carries some emotional impact), causing people to experience similar emotions without their awareness.
- Data from social networks, collected over a 20-year period, suggests that moods such as happiness or depression can be transferred through social networks.
- The experiment aims to confirm whether emotional contagion can in fact be passed from person to person, even if there is no physical contact involved (i.e., over the internet).
- When people were shown more negative posts, they produced fewer ‘happy’ posts, and when shown more ‘happy’ posts, they produced fewer negative posts.
- This shows that the emotions expressed by others on Facebook will affect our own emotional state. It also shows that in-person interaction and nonverbal (physical) cues aren’t necessary for passing emotional contagion.
Now, it’s easy to understand why people would feel uncomfortable with this. Especially considering that the study was conducted in 2012, in partnership with Cornell University and the University of California, San Francisco, and none of us were any the wiser.
However, even though you may feel a little uncomfortable with Facebook taking this liberty, what you should also know is that of the 3 million posts analysed in the study, the researchers were never allowed to see a post in its entirety.
Rather than consider full posts, they were only allowed to work off the occurrence of certain positive or negative words in the posts. Furthermore, the results from the study are (predictably) being blown way out of proportion.
The largest effect measured in the study was a mere two hundredths of a standard deviation (d = .02). The smallest was one thousandth of a standard deviation (d = .001). So with such a small result, was the study really a success?
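To put those numbers in perspective, here’s a quick back-of-the-envelope sketch. The baseline rate and standard deviation below are made-up figures for illustration, not values from the paper – only the effect size d = .02 comes from the study:

```python
# Hypothetical illustration of what an effect of d = 0.02 means in practice.
# Cohen's d expresses a difference in means in units of standard deviation.
mean_control = 5.0  # assumed baseline: positive words as % of all words posted
sd = 2.0            # assumed standard deviation across users (percentage points)
d = 0.02            # largest effect reported in the study

# The shift the experiment produced, in the original units:
shift = d * sd
mean_treated = mean_control - shift

print(f"Shift: {shift:.3f} percentage points")  # a 0.040-point change
print(f"Treated mean: {mean_treated:.3f}%")
```

Under those (assumed) numbers, showing people more negative posts moved their rate of positive words by four hundredths of a percentage point – a statistically detectable effect at this scale, but a tiny one per person.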
Adam Kramer, a Facebook employee and co-author of the study, released a statement to try to justify Facebook’s actions. In the statement he claimed that the study was conducted in part because Facebook cares about the emotional impact they have on people, and that the data they collected came only from a tiny sample group (0.04% of users) who were only minimally impacted by the study. He finished the statement by saying that Facebook was working on improving its internal review practices.
But I’m not really sold on this response. Facebook doesn’t mention what it will do to recompense the people involved in the study; it simply says that it won’t go about this sort of thing in the same way in the future.
The fact remains that a close inspection of the Facebook terms of service will show you that they are technically allowed to do this sort of thing all the time. The terms of service say that Facebook may use a person’s information “for internal operations, including troubleshooting, data analysis, testing, research, and service improvement”.
However, another shocking fact that came to light recently is that the terms of service document was only updated to include that wording four months after the experiment was conducted, which seems a rather unethical way of doing things.
The Facebook Algorithm
Also consider the fact that Facebook employs an algorithm to filter your newsfeed. It would be impossible to read every post from every one of your Facebook friends every day, which is why Facebook now shows a “Top Stories” newsfeed by default rather than the Twitter-style “Most Recent” feed.
What ends up in the “Top Stories” feed is determined by Facebook’s algorithm, which they tweak and update regularly. It’s put in place to make you like more stories and click on more ads. Which makes sense: Facebook wants you to see more happy posts.
The more happy posts you see, the happier you feel; the happier you are on Facebook, the longer you’ll stay on the site; and the longer you’re online, the more likely you are to click on a Facebook advertisement.
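As an illustration only – Facebook’s actual ranking algorithm is proprietary and far more complex, so every name, weight, and signal below is an assumption – a toy version of “score each story, show the highest scorers” might look like this:

```python
# Purely hypothetical sketch of feed ranking; none of these weights or
# signals are Facebook's. The idea: score each candidate story and
# surface the highest-scoring ones as "Top Stories".

def score_story(likes, comments, hours_old, positive_sentiment):
    """Toy engagement score: reward interaction and positivity, decay with age."""
    engagement = likes + 2 * comments   # assumed: comments weigh more than likes
    freshness = 1 / (1 + hours_old)     # newer stories rank higher
    return engagement * freshness * (1 + positive_sentiment)

stories = [
    {"id": "a", "likes": 10, "comments": 2, "hours_old": 1,  "positive_sentiment": 0.8},
    {"id": "b", "likes": 50, "comments": 5, "hours_old": 12, "positive_sentiment": 0.1},
    {"id": "c", "likes": 5,  "comments": 0, "hours_old": 2,  "positive_sentiment": 0.9},
]

top_stories = sorted(
    stories,
    key=lambda s: score_story(
        s["likes"], s["comments"], s["hours_old"], s["positive_sentiment"]
    ),
    reverse=True,
)
print([s["id"] for s in top_stories])  # freshest, happiest, most-engaged first
```

Notice how the sentiment multiplier nudges emotionally positive posts up the ranking – which is exactly the lever the emotional contagion study was pulling.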
Facebook and You
At the end of the day, what you need to remember is that Facebook collects and stores a huge amount of data about every person who has a profile. They know how old you are, what music you like, what type of events you go to – even whether you prefer looking at photos or videos in your newsfeed.
No matter what Facebook does, at this point, none of us are going to delete our accounts. No matter how we’re violated or manipulated, or how aware we become of Facebook’s access to and use of our personal information, we all keep logging in. Maybe a more interesting research paper would investigate why we all do this.