Published On: Mon, Jun 30th, 2014

Facebook Is In Hot Water Over Secret Emotional Experiment

Are Facebook users being used as guinea pigs without their knowledge?

Facebook has owned up to conducting a psychological study back in January of 2012 without the participants’ knowledge. The world’s foremost social networking website manipulated the emotions of 689,003 users in a secret experiment.

In a study conducted by researchers from the University of California and Cornell University, Facebook placed either positive or negative information on a person’s home page to elicit corresponding emotional responses, a process known as “emotional contagion.” The study was published on June 17th in the Proceedings of the National Academy of Sciences, but was only noticed this past weekend.

Its purpose was to determine whether emotions can be transferred from one person to another without any direct personal contact. As in the saying, “laughter is contagious,” it has already been proven that the expression on a person’s face can affect the mood of someone who sees it.

When a user was shown fewer positive postings by his friends, he was more likely to make negative postings himself. And when fewer negative items were shown to a user, he was found to post more positive content.

The researchers stated, “Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.”

Facebook has defended the study by pointing out that all of its users agreed to its policy on the use of their data when they joined the site. With regards to a user’s information Facebook states that it reserves the right to use it, “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

But many academics find the study to be in violation of ethical standards. They point out that a participant in any such experiment must first give informed consent, which requires that he know what is being done.

James Grimmelmann, a professor of technology and the law at the University of Maryland, wrote in his blog, “Facebook knows it can push its users’ limits, invade their privacy, use their information and get away with it. Facebook has done so many things over the years that scared and freaked out people.”

He further explained that Facebook’s posted policies on privacy do not come close to providing the participants in the study with informed consent and that the research was not even productive as it failed to meet academic and professional standards.

The professor pointed out in his blog that at least when Facebook takes advantage of a user’s personal information for advertising purposes, the individual in question always knows when an ad has been placed on his page.

Adam Kramer, a Facebook data scientist who worked on the study, has posted a formal apology on his personal Facebook page. “We were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper. The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone,” he wrote.

In response to the outcry, Facebook stated, “We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

But the fallout continues, as a member of the British Parliament, Jim Sheridan, is calling for an investigation into Facebook’s conduct. “This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people. They are manipulating material from people’s personal lives and I am worried about the ability of Facebook and others to manipulate people’s thoughts in politics or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it,” he complained.

