If you’re feeling very happy, sleepy or angry, don’t attribute it to the weather. It might be Mark Zuckerberg’s team.
Facebook stirred up anger toward itself with news that the social media company, two years ago, manipulated the news feeds of nearly 700,000 users without their knowledge to determine whether friends’ posts can change your emotions.
One group of test subjects — remember the days when we used to call them customers? — received mostly upbeat posts. The other group received gloomy ones. The results, published in Proceedings of the National Academy of Sciences in March, found that a positive news feed inspired positive posts, while a steady diet of bad news prompted bad-news posts.
There is something a little creepy about being an involuntary lab rat. Unlike actual lab rats, the humans in this experiment weren’t injured — that we know of. But face it, we’re lab rats every time we visit a website, use a credit card or even shop for groceries.
Determining likes and dislikes is a staple of modern marketing. Privacy policies governing the use of our data are written in such legalese that few of us bother to sift through the fine print to figure out what we have really authorized companies to do with our information. Routinely, we click “accept” and “next” and go about our business. It’s just as well. We’d be amazed.
What makes Facebook’s experiment seem so unnerving is that Facebook toyed with emotions in a secret study. That makes it a bit different from asking a focus group to tell a researcher whether they prefer chocolate or vanilla ice cream. No drug company would conduct drug trials without explicit, informed consent. In social media, the rules are a lot murkier, and at Facebook, a company known for pushing privacy boundaries, the rules are darn near impenetrable.
But while we’re getting irked at Facebook, remember that even brick-and-mortar retailers appeal to feelings, even if it’s just pleasant background music to encourage sales or make stores inviting. And in politics, strategically worded “push” poll questions are designed to draw a certain emotional response and mold public opinion.
Facebook has apologized, but it still seems a bit clueless about the public outrage. It issued a convoluted statement suggesting that some people whose friends post positive content might feel left out of all the positivity and ... (wait for it) leave Facebook. Say what?
What did Facebook actually learn? “At the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week,” explained Facebook researcher Adam D.I. Kramer.
We think that means they got spit. But they certainly created negativity, and this time it is measurable.
— The Dallas Morning News