Facebook’s Been Running Psychological Experiments On You

Felt a little lower than usual, or happier than usual, for a week in January 2012? It might have been because of an “experiment” Facebook conducted on you.

Facebook identified 689,003 English-speaking users to run a psychological experiment on for the duration of a week. For one group, it manipulated the newsfeed to filter out a portion of posts carrying negative emotional words; for the other group, it filtered out posts carrying positive ones. The objective of the study: can we be emotionally influenced by what we see in our Facebook newsfeed?

And if so, how much?

The experiment has caused a huge amount of uproar on… uhh, Facebook. There’s post after post of people feeling that they’re being treated like guinea pigs and that manipulating someone’s mood is incredibly dangerous. These are all legit concerns. Someone battling depression, or worse, on the brink of suicide, being exposed to increasingly “negative” posts on Facebook for an entire week might be persuaded to do something that they normally wouldn’t.

Is what Facebook did ethical? Probably not.

Is what Facebook did legal? Absolutely.


Facebook’s terms of service, which every user agrees to, give the company a lot of wiggle room in what it can and cannot do with our data, and in the kind of information and updates we see on Facebook.

In fact, this is true for most (if not all) social networks. LinkedIn, Twitter, Google+, Tumblr, Pinterest: all of these sites are designed and engineered to influence us to click more, engage more, and interact more with them. The nature of their algorithms is never revealed, but one thing is always made clear: they’re doing all they can to show us as much relevant content as possible.

This practice, however, of showing one group one piece of content and another group something else isn’t new or ground-breaking. Marketers call this practice of split-testing content A/B testing: show one group of people one piece of content and analyze what emotion or action it elicits, show another group a different piece of content, and compare the results.
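To make the mechanics concrete, here’s a minimal sketch of how such a split test works: randomly assign each visitor to a variant, record what they were shown and whether they acted on it, and compare the rates. The variant names, headlines, and simulated behaviour below are purely illustrative, not anyone’s actual system.

```python
import random
from collections import defaultdict

# Illustrative only: two headline variants and a simple click metric.
VARIANTS = {
    "A": "You won't believe what Facebook did next",
    "B": "Facebook ran a psychology experiment on its users",
}

assignments = {}                                        # user_id -> variant
results = defaultdict(lambda: {"shown": 0, "clicked": 0})

def assign_variant(user_id):
    """Randomly (but stably) place each user in one bucket."""
    if user_id not in assignments:
        assignments[user_id] = random.choice(list(VARIANTS))
    return assignments[user_id]

def record_impression(user_id):
    variant = assign_variant(user_id)
    results[variant]["shown"] += 1
    return VARIANTS[variant]            # the headline this user actually sees

def record_click(user_id):
    results[assignments[user_id]]["clicked"] += 1

# Simulated traffic; in a real test this would be live user behaviour.
for uid in range(1000):
    record_impression(uid)
    if random.random() < 0.1:           # stand-in for a real click decision
        record_click(uid)

# Whichever variant ends up with the higher click-through rate "wins".
for variant, stats in results.items():
    ctr = stats["clicked"] / stats["shown"]
    print(f"Variant {variant}: {stats['shown']} shown, CTR = {ctr:.1%}")
```

The only real difference in Facebook’s case is what was being varied and measured: not a headline and a click, but the emotional tone of a newsfeed and the emotional tone of the posts users wrote afterwards.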

It’s primarily used to drive more sales, higher engagement, more sign-ups, and so on.

Stepping outside the realm of social networks, websites such as Amazon run a huge number of A/B tests to figure out what influences users to buy more. To extrapolate: you could say they’re testing what we respond to emotionally in order to get us to buy more. Other websites run A/B tests all the time to figure out which landing page will trigger more conversions, which graphic will get more clicks and shares, and which line of text will drive engagement.


Websites such as BuzzFeed and Upworthy A/B test their headlines to drive more click-throughs. The headlines are designed to draw out emotions from readers that prompt them to share the article and generate more “buzz” for BuzzFeed. That, in itself, is a manipulation of your emotions, isn’t it?


The truth of the matter is that websites A/B test all the time. For marketers this is common knowledge: the practice has become ubiquitous in a digital world where a variety of readily available tools make it easy to run such a test.

The only difference is that Facebook actually came out and talked about this particular test. Most other companies don’t, and never have. What they’ve learnt about their users through A/B testing tends to be closely guarded information; why would they want to let competitors gain insight from experiments they’ve run?

You’re not paying to use Facebook or Twitter. When you’re not paying for the product, you are the product. Your information, your memories, your experiences: that’s what you pay with each time you use a social network.

All that aside, however, this does raise a significant number of questions. If, by manipulating the content you see in your newsfeed, Facebook is able to influence your thinking, and by extension your actions, what else is it possible to do by manipulating the newsfeeds of users?

Could a political candidate who’s backed by a network like Facebook essentially be able to win more votes? Ask any dictator, any ruler, what one wish they’d like to be granted, and they’d undoubtedly wish for the power to influence the mood and emotion of their people.

Has Facebook, by experimenting on a group of close to 700,000 users, essentially proved that, if push comes to shove, it could sway the opinion of the 1.3+ billion people that use the service?

Emotional engineering is, and always has been, Facebook’s business model.

Two thoughts are playing in my mind about this entire debacle. The first: why would Facebook go public with such information? Why would they openly talk about such an obviously unethical experiment, one that proves they’re able to influence the mood of a large group of people? Why would they want this known?

And the second, which is far more unsettling: all said and done, we aren’t truly surprised that Facebook did this. We’ve come to expect them to blur the line between right and wrong, and to just shrug, shake our heads, and carry on.


Read more from a recent NBC news article: Facebook Manipulates Emotions: Business as Usual for Social Media Giant

(original article)
