Facebook conducted secret “mood experiments” on over 700,000 users

TECHi's Take
By Alfie Joshua | Opposing take: The Atlantic
  • Words 150
  • Estimated Read 1 min

In an experiment conducted in January 2012 and only now made public, Facebook injected the News Feeds of nearly 700,000 of its users with negative content to see whether it would make the posts those users wrote more negative. The researchers believe that it did: the mood of the posts shown in the subjects' News Feeds spread like a "contagion" into the posts the subjects themselves wrote. The inverse, too, was true, the researchers say. In short, Facebook deliberately messed with the moods of its users, who, allow me to remind you, are Facebook's bread and butter. Exposing users to the advertisements of Facebook's partners, both on and off Facebook, is the social media giant's only real business. So it's surprising that Facebook would conduct such an experiment, and even more surprising that it would be dumb enough to publish the results. The paper was published in the Proceedings of the National Academy of Sciences.

The Atlantic
  • Words 235
  • Estimated Read 2 min

Facebook's News Feed, the main list of status updates, messages, and photos you see when you open Facebook on your computer or phone, is not a perfect mirror of the world. But few users expect that Facebook would change their News Feed in order to manipulate their emotional state. We now know that's exactly what happened two years ago. For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post especially positive or negative words themselves.

This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine "emotional contagion," as this one did. This study is different because, while other studies have merely observed Facebook user data, this one set out to manipulate it.

The experiment is almost certainly legal. In the company's current terms of service, Facebook users relinquish the use of their data for "data analysis, testing, [and] research." Is it ethical, though? Since news of the study first emerged, I've seen and heard both privacy advocates and casual users express surprise at the audacity of the experiment.

Source

NOTE: TECHi Two-Takes are stories we have chosen from the web, paired with a short paragraph of our own opinion. Please check the original story via the Source button below.

Balanced Perspective

TECHi weighs both sides before reaching a conclusion.

TECHi’s editorial take above outlines the reasoning that supports this position.
