Facebook manipulates users’ emotions in the name of Science


According to Forbes Magazine, a team of Facebook data scientists is constantly at work thinking of ways to study human behavior through the social network. Facebook has no need to ask experiment participants to sign consent forms, because users already agreed to the site’s data use policy when they signed up. When the team releases papers about what it has learned from us, we often learn surprising things about the social networking giant, such as the fact that it can keep track of the status updates we never actually post.

This isn’t the first time Facebook has played around with manipulating its users. For example, Facebook got 60,000 people to rock the vote in 2010 who theoretically wouldn’t have on their own. But a recent study shows Facebook playing a whole new level of mind games with its users. As first noted by Animal New York, Facebook’s data scientists manipulated the News Feeds of over half a million users, removing either all of the positive posts or all of the negative posts to see how it affected their moods. If there was a week in January 2012 when you were only seeing photos of dead dogs or incredibly cute babies, you may have been part of the study. Now that the experiment is public, people’s mood about the study itself is best described as “disturbed.”

Researchers, led by Adam Kramer, concluded that emotions were contagious. “When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred,” according to the paper published by the Facebook research team. “These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”

The experiment ran for a week in January 2012, during which the hundreds of thousands of Facebook users who were unknowingly participating may have felt either happier or more depressed than usual, as they saw either more of their friends posting ’15 Photos That Restore Our Faith In Humanity’ articles or despondent status updates about losing jobs, getting screwed over by X airline, and already failing to live up to New Year’s resolutions.

The researchers — who may not have been thinking about the optics of a “Facebook emotionally manipulates users” study — jauntily note that the study undercuts the claim that looking at our friends’ good lives on Facebook makes us feel depressed. “The fact that people were more emotionally positive in response to positive emotion updates from their friends stands in contrast to theories that suggest viewing positive posts by friends on Facebook may somehow affect us negatively,” they write.

They also note that when they took all of the emotional posts out of a person’s News Feed, that person became “less expressive,” i.e., wrote fewer status updates. So prepare to have Facebook curate your feed with the most emotional of your friends’ posts if it feels you’re not posting often enough.

Facebook’s data use policy says Facebookers’ information will be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement,” making all users potential experiment subjects. And users know that Facebook’s mysterious algorithms control what they see in their News Feed. But it may come as a surprise to users to see those two things combined like this.

When universities conduct studies on people, they have to run them by an ethics board first to get approval. Those boards were created because scientists were getting too creepy in their experiments: leading subjects to think they were shocking someone to death in order to study obedience, and letting men live with untreated syphilis for study purposes. A 2012 profile of the Facebook data team noted, “Unlike academic social scientists, Facebook’s employees have a short path from an idea to an experiment on hundreds of millions of people.” This study was partially funded by a government body, the Army Research Office, and, via @ZLeeily, the PNAS editor on the article says the study did pass muster with an Institutional Review Board. We’ll see whether it passes muster with users.

 
