Furor erupts over Facebook's experiment on users
Almost 700,000 unwitting subjects had their news feeds altered as part of a study to gauge the effect on their emotions.
By Reed Albergotti, The Wall Street Journal
A social-network furor has erupted over news that Facebook (FB), in 2012, conducted a massive psychological experiment on nearly 700,000 unwitting users.
To determine whether it could alter the emotional state of its users and prompt them to post either more positive or negative content, the site's data scientists enabled an algorithm, for one week, to automatically omit content that contained words associated with either positive or negative emotions from the central news feeds of 689,003 users.
The research, published in June in the Proceedings of the National Academy of Sciences, sparked a different emotion -- outrage -- among some people who say Facebook toyed with its users' emotions and used members as guinea pigs.
"What many of us feared is already a reality: Facebook is using us as lab rats, and not just to figure out which ads we'll respond to but actually change our emotions," wrote Animalnewyork.com in a blog post that drew attention to the study Friday morning.
Facebook has long run social experiments. Its Data Science Team is tasked with turning the reams of information created by the more than 800 million people who log on every day into usable scientific research.
On Sunday, the Facebook data scientist who led the study in question, Adam Kramer, said he was having second thoughts about this particular project. "In hindsight, the research benefits of the paper may not have justified all of this anxiety," he wrote on his Facebook page.
"While we've always considered what research we do carefully," he wrote, Facebook's internal review process has improved since the 2012 study was conducted. "We have come a long way since then."
The impetus for the study was an age-old complaint of some Facebook users: that going on Facebook and seeing all the great and wonderful things other people are doing makes them feel bad about their own lives.
The study, Kramer wrote, was an attempt to either confirm or debunk that notion. Kramer said it was debunked.
According to an abstract of the study, "for people who had positive content reduced in their News Feed, a larger percentage of words in people's status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred."
The controversy over the project highlights the delicate line in the social media industry between the privacy of users and the ambitions -- both business and intellectual -- of the corporations that control their data.
Companies like Facebook, Google (GOOG) and Twitter (TWTR) rely almost solely on data-driven advertising dollars. As a result, the companies collect and store massive amounts of personal information. Not all of that information can be used for advertising -- at least not yet. In the case of Facebook, there is an abundance of information practically overflowing from its servers. What Facebook does with all its extra personal information -- the data that isn't currently allocated to the advertising product -- is largely unknown to the public.
Facebook's Data Science team occasionally uses the information to highlight current events. Recently, it used that data to determine how many people were visiting Brazil for the World Cup. In February, The Wall Street Journal published a story on the best places to be single in the U.S., based on data gathered by the company's Data Science Team.
Those studies have raised few eyebrows. The attempt to manipulate users' emotions, however, struck a nerve.
"It's completely unacceptable for the terms of service to force everybody on Facebook to participate in experiments," said Kate Crawford, visiting professor at MIT's Center for Civic Media and principal researcher at Microsoft Research.
Crawford said it points to a broader problem in the data science industry. Ethics are not "a major part of the education of data scientists and it clearly needs to be," she said.
Asked a Forbes.com blogger: "Is it okay for Facebook to play mind games with us for science? It's a cool finding, but manipulating unknowing users' emotional states to get there puts Facebook's big toe on that creepy line."
Slate.com called the experiment "unethical" and said "Facebook intentionally made thousands upon thousands of people sad."
Kramer defended the ethics of the project. He apologized for wording in the published study that he said might have made the experiment seem sinister. "And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it," he wrote on Facebook.
Facebook also said the study was conducted anonymously, so researchers could not learn the names of the research subjects.
Kramer said that the content -- both positive and negative -- that was removed from some users' news feeds might have reappeared later.
The emotional changes in the research subjects were small. For instance, people who saw fewer positive posts reduced the number of their own positive posts by only a tenth of a percent.
Comments from Facebook users poured in Sunday evening on Kramer's Facebook page. The comments were wide-ranging, from people who had no problem with the content, to those who thought Facebook should respond by donating money to help people who struggle with mental health issues.
"I appreciate the statement," one user wrote. "But emotional manipulation is emotional manipulation, no matter how small of a sample it affected."
Facebook users agree to terms of service that give the company wide leeway in how it can treat them.
Who among us are really shocked at this so-called news. It would surprise me to hear that they haven't been doing this type of research from the beginning and will continue to do it.
So yeah, duh
Are you kidding me?! Never mind that we are being and have continuously been manipulated by all manner of messages 24/7 since the 1920’s when Edward Bernays, the father of modern advertising, figured out that if you speak to consumers’ insecurities, you can sell more stuff. This concept quickly morphed from merely using consumers’ lack of self-esteem against them, to actually perpetuating and even creating more of those insecurities. Never mind that 99% of most people’s opinions are not their own and have merely been unquestioningly adopted from their parents, teachers, politicians, television, movies, magazines, and society as a whole. Never mind that the real issue is not whether or not others are trying to manipulate us, but that we are so manipulatable in the first place.
Because no matter what side of that debate you’re on, you’re still totally missing the point that we don’t have to pay any attention to what’s in our newsfeeds, or on our television screens or written in magazines. The question should never be about how to control those who control our minds. The question I want to ask is: why aren’t we freaking controlling our own damn minds in the first place?
It doesn’t matter what images are thrown at you if you exercise your ability to think for yourself, choose what you give your focus to and form your own, educated opinions. It doesn’t matter what some company does as long as you remember that you don’t have to buy their products. It doesn’t matter what the TV shows you as long as you remember that you have the power to change the channel and turn it off. And if you don’t like the content that’s being distributed, make your own. It’s never been easier to voice your opinion. It’s also never been easier to research all sides of any topic. There really is no excuse to live mindlessly anymore. There really is no excuse for handing that much power, the power over your own experience in this physical reality, over to others. But if you do, you really have no right to complain.
(The above are excerpts from a transcript I read about the Facebook debacle)
My, my, my! Don't you wonder how many others do similar things? And what else do companies do to manipulate us? Subliminal messaging is done through words, pictures and acts to manipulate the masses. MSN keeps you flipping to more pages just to see their advertising, and you do. You view all this, and whether it registers in your subconscious or you consciously read it, it affects you!
Now we know that not only do criminals of various intents mine social sites for victims who expose their lifestyle, possessions and weaknesses, but the sites themselves take advantage of members.
So much is done to manipulate the masses. Grocery stores move things around so you have to wander the store to find what you want, or put the most common purchases, like bread, at the far end of the store.
Enticements to subscribe to magazines and newsletters far in advance so they can have and hold your money now.
Think about this: write a comment and see the button below that says "Also post to Facebook"? Now you can post your anonymous comments directly to Facebook under your own recognizable name. You can then be identified and sorted by keywords or phrases into political, racial, religious or whatever categories they choose.
It goes on and on... Twitter, LinkedIn...
I already know I'm an oddball contrarian. I joined Facebook long ago but never used it very much at all.
Tough luck for all you sheep!
Copyright © 2014 Microsoft. All rights reserved.