Facebook Manipulated Users' Moods Because It Could [UPDATE]

In a recently published paper, Facebook revealed that it manipulated the posts that some 689,000 users saw in order to test whether it could affect their moods.


It's hard to tell what's more alarming: that Facebook intentionally manipulated the moods of nearly 700,000 users for research purposes or that the social network doesn't understand why people are so concerned about it.

In a recently published paper, Facebook revealed that it manipulated the posts some 689,000 users saw on their newsfeeds in order to determine whether it could enact a process of "emotional contagion" through the dissemination of positive and negative information. The study, which was carried out alongside researchers from Cornell and the University of California, concluded that "[e]motions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."

Unsurprisingly, the news prompted concerns over the ethics of the study. In response, a Facebook spokeswoman said the aim of the manipulation was "to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow." She added that "there is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely."

UPDATE: Adam Kramer, the Facebook data scientist who co-authored the study, has responded to the controversy in a post on his Facebook page.


[via Forbes]
