Facebook Privacy

Just when you thought the NSA was the only one prying into your personal life, Facebook recently revealed that it's been doing the same thing.

Facebook recently published a study in the Proceedings of the National Academy of Sciences stating that during one week in 2012 (one week? yeah, right), the invasive company manipulated the news feeds of about 700,000 users. Some feeds were altered to show positive news and happy words, while others were filled with negative news and words. Facebook concluded that at the end of the week, people who saw happy posts posted happier updates, while people who saw sad posts posted sadder ones.

Doesn’t sound like rocket science, does it?

But not everyone is happy that their emotions have been toyed with.

“What many of us feared is already a reality: Facebook is using us as lab rats, and not just to figure out which ads we’ll respond to but to actually change our emotions,” Animalnewyork.com wrote in a blog post that drew attention to the study Friday morning.

On Sunday, the Facebook data scientist who led the study in question, Adam Kramer, said he was having second thoughts about this particular project. “In hindsight, the research benefits of the paper may not have justified all of this anxiety,” he wrote on his Facebook page.

“While we’ve always considered what research we do carefully,” he wrote, Facebook’s internal review process has improved since the 2012 study was conducted. “We have come a long way since then.”

From the Wall Street Journal:

Facebook’s Data Science team occasionally uses the information to highlight current events. Recently, it employed it to determine how many people were visiting Brazil for the World Cup. In February, The Wall Street Journal published a story on the best places to be single in the U.S., based on data gathered by the company’s Data Science Team.

Those studies have raised few eyebrows. The attempt to manipulate users’ emotions, however, struck a nerve.

“It’s completely unacceptable for the terms of service to force everybody on Facebook to participate in experiments,” said Kate Crawford, visiting professor at MIT’s Center for Civic Media and principal researcher at Microsoft Research.

Ms. Crawford said it points to a broader problem in the data science industry. Ethics are not “a major part of the education of data scientists and it clearly needs to be,” she said.

A Forbes.com blogger asked: “Is it okay for Facebook to play mind games with us for science? It’s a cool finding, but manipulating unknowing users’ emotional states to get there puts Facebook’s big toe on that creepy line.”

Slate.com called the experiment “unethical” and said “Facebook intentionally made thousands upon thousands of people sad.”

Mr. Kramer defended the ethics of the project. He apologized for wording in the published study that he said might have made the experiment seem sinister. “And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it,” he wrote on Facebook.

Clutchettes, does this news make you change the way you use Facebook?
