Scarlett Madison Scarlett Madison is a mom and a friend. She blogs for a living at Social News Watch but really prefers to read more than write. Find her on Twitter, Facebook, and Pinterest.

Facebook is defending its controversial “experiment”


If you have been following the news, you might have heard that Facebook conducted a secret experiment on its users. For those hearing about it for the first time: the experiment filtered posts in a user’s News Feed to display either more positive or more negative posts. The results showed that negative posts could influence a user to post something negative themselves, and the same held for positive posts. While the experiment was legal, since it was covered by Facebook’s terms and conditions that we all more or less agreed to when we signed up, many users were unhappy that they had been experimented on. Facebook has since spoken up to clarify the reasoning behind the experiment. In a post by Adam D.I. Kramer, the lead researcher on the study, the basic premise given was that Facebook simply wanted to understand its users better.

Ever since word got out that Facebook briefly manipulated some users’ News Feeds to see how their feelings changed, a number of questions have popped up: why did the company feel compelled to experiment in the first place? How noticeable was the effect? And was it worth the effort? As of today, we have some answers. Study co-author Adam Kramer explains that Facebook was worried people would stop visiting the social network if they saw too many emotional updates: a flood of negative posts could scare some people off, while a surge of positive vibes could leave others feeling left out. Neither fear was borne out, and Kramer stresses that the company “never [meant] to upset anyone.” He also suggests that Facebook won’t repeat history any time soon. The results of the circa-2012 field test may not have justified the “anxiety” it caused when it came to light two years later, he says. Facebook has also been refining its “internal review practices” ever since, and it’s taking the public’s current response into account. Kramer doesn’t say whether similar experiments will take place again, but it’s clear the company will tread more carefully if they do. As it stands, there was only just enough of a change to suggest that the altered News Feeds had any effect at all.


