Facebook identified 689,003 English-speaking users and ran a week-long psychological experiment on them. It manipulated the news feeds of one group to remove posts carrying negative emotion, and removed posts carrying positive emotion for the other group. The objective of the study: can we be emotionally influenced by what we see in our Facebook news feed?
And if so, how much?
The experiment has caused a huge amount of uproar on… uhh, Facebook. There's post after post of people feeling that they're being treated like guinea pigs, and that manipulating someone's mood is incredibly dangerous. These are all legitimate concerns. Someone battling depression, or worse, on the brink of suicide, who is exposed to increasingly "negative" posts on Facebook for an entire week might be persuaded to do something they normally wouldn't.
Is what Facebook did ethical? Probably not.
Is what Facebook did legal? Absolutely.
Facebook's terms of service, which every user agrees to, give the company a lot of wiggle-room in what it can or cannot do with our data, and in the kind of information and updates that we see on Facebook.
In fact – this is true for most (if not all) social networks. LinkedIn, Twitter, Google+, Tumblr, Pinterest – all of these websites are designed and engineered to influence us to click more, engage more and interact more with them. The nature of their algorithms is never revealed, but one thing is always made clear – they’re doing all they can to give us as much relevant content as possible.
This practice of showing one group one piece of content and another group a different piece, however, isn't new or ground-breaking. Marketers call this practice of split testing content A/B testing: show one group of people a piece of content and analyze what emotion or action it elicits, show another group a variant, and compare the results.
It's primarily used to drive higher conversions to sales, higher engagement, more sign-ups, and so on.
Stepping outside the realm of social networks, websites such as Amazon run a huge number of A/B tests to figure out what influences users to buy more. To extrapolate: you could say that they're testing what we emotionally respond to, in order to get us to buy more. Other websites run A/B tests all the time to figure out which landing page will trigger more conversions, which graphic will get more clicks and shares, and which line of text will influence engagement.
Websites such as BuzzFeed and Upworthy A/B test their headlines to drive more click-throughs. Those headlines are designed to draw out emotions from readers that prompt them to share the article and generate more "buzz" for BuzzFeed. That, in itself, is a manipulation of your emotions, isn't it?
The truth of the matter is that websites A/B test all the time. For marketers this is common knowledge, as A/B testing has become an everyday practice in a digital world where it's so easy to conduct such a test using the variety of tools available.
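To make the practice concrete, here is a minimal sketch of how an A/B test is typically evaluated: split users into two groups, show each a variant, count conversions, and run a two-proportion z-test to decide whether the difference is real or noise. The headline numbers below are invented for illustration.

```python
import math

def ab_test(conv_a, total_a, conv_b, total_b):
    """Two-proportion z-test comparing the conversion rates of variants A and B."""
    rate_a = conv_a / total_a
    rate_b = conv_b / total_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    pooled = (conv_a + conv_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (rate_b - rate_a) / se
    # Two-sided p-value from the standard normal distribution
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return rate_a, rate_b, z, p

# Hypothetical numbers: headline A converts 120 of 2,400 readers,
# headline B converts 168 of 2,400.
rate_a, rate_b, z, p = ab_test(120, 2400, 168, 2400)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  z={z:.2f}  p={p:.4f}")
```

With these made-up numbers, variant B wins with a p-value well under 0.05, which is exactly the kind of result that drives a marketer to ship the "winning" headline.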
The only difference is that Facebook actually came out and talked about its A/B test. Most other companies don't, and never have. What companies learn about their users through A/B testing tends to be closely guarded information; why would they let competitors gain insight from experiments they've run?
You’re not paying to use Facebook or Twitter. When you’re not paying for the product – you are the product. Your information, your memories, your experiences, that’s what you pay with each time you use a social network.
All that set aside, however, this does raise a significant number of questions. If, by manipulating the content you see in your news feed, Facebook is able to influence your thinking and, by extension, your actions, what else is possible by manipulating the news feeds of users?
Could a political candidate backed by a network like Facebook essentially get more votes? Ask any dictator, any ruler, what one wish they'd like granted, and they'd undoubtedly wish for the power to influence the mood and emotion of their people.
Has Facebook, by experimenting on a group of close to 700,000 users, essentially proved that, if push comes to shove, it can sway the opinion of the 1.3+ billion people who use the service?
Emotional engineering is, and always has been, Facebook’s business model.
Two thoughts are playing in my mind about this entire debacle. The first, why would Facebook go public with such information? Why would they openly talk about such an obviously unethical experiment that proves that they’re able to influence the mood of a large group of people? Why would they want this known?
And the second, which is far more unsettling, is that all said and done – we aren’t truly surprised that Facebook did this. We’ve come to expect it of them to blur the line between right and wrong, and for us to just shrug, shake our head, and carry on.
Read more from a recent NBC news article: Facebook Manipulates Emotions: Business as Usual for Social Media Giant
The beauty of hackers, says cybersecurity expert Keren Elazari, is that they force us to evolve and improve. Yes, some hackers are bad guys, but many are working to fight government corruption and advocate for our rights. By exposing vulnerabilities, they push the Internet to become stronger and healthier, wielding their power to create a better world.
Keren Elazari charts the transformation of hackers from cyberpunk protagonists to powerful hacktivists, lone rangers and digital Robin Hoods who are the unsung heroes of the digital frontier.
At F8, which will now be an annual event, Mark Zuckerberg and a few other team members spoke about their commitment to making the mobile experience better and allowing developers to “build,” “grow” and “monetize” their apps. The other substantive point Zuckerberg made was that Facebook would start to focus on “putting people first.” In large part this means addressing privacy concerns Facebook users have expressed and giving them more control over which kinds of information they share.
Here are some details about what Facebook unveiled and how it affects businesses and developers.
Facebook Changes that Affect All Users
1. A New Mobile Ad Network: Audience Network
Deb Liu, Facebook’s Product Management Director, announced a new Audience Network, a tool that will allow advertisers to place ads into third-party apps. This is one way that business owners with limited advertising budgets can drive sales using Facebook.
2. A New Mantra: “Move fast with stable infra”
The era of “breaking things” is over — for now. Facebook introduced a “Two-year Core API Stability Guarantee.” Zuckerberg says this will give confidence to developers. The guarantee extends to Login and Sharing features.
3. New Permissions Options: More Control for Users
Plenty of people don't like to use their Facebook credentials to log in anywhere but Facebook. Going forward, users will have more control over what information is shared if they do decide to log in via Facebook. Zuckerberg described a "checklist" of options that might include things like Friends List, Birthday, Email Address, Public Profile, etc. Users will be able to uncheck any or all of these items. Ultimately, this should build trust with users. Along with better permission options comes a mandatory review process for all Facebook apps that ask for more than the most basic information.
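From a developer's point of view, the checklist means an app can no longer assume it receives everything it asks for. A minimal sketch of the idea (the helper function is hypothetical; the permission names match Facebook's Graph API naming style at the time):

```python
# Hypothetical model of the granular login checklist: the app requests a set
# of permissions, the user unchecks some, and the app receives only the rest.
REQUESTED = {"public_profile", "email", "user_friends", "user_birthday"}

def granted_permissions(requested, declined):
    """Return the permissions the app actually receives after the user
    unchecks items on the login checklist."""
    return set(requested) - set(declined)

# The user unchecks Friends List and Birthday during login.
granted = granted_permissions(REQUESTED, ["user_friends", "user_birthday"])
print(sorted(granted))  # → ['email', 'public_profile']
```

The design consequence is that well-behaved apps must degrade gracefully when any permission in the checklist is declined, rather than failing outright.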
4. An Anonymous Login: More Privacy for Users
With the anonymous login, users will no longer have to share their identity or other personal information with applications that they don’t know much about or don’t “trust.” Facebook will still have information about users, but the apps won’t. This also means that users can try apps without having to install them. Apps will still get the benefit of Facebook identity verification and cross-platform sync, and people can easily upgrade to the full Facebook Login experience later on, according to Facebook. (Note: This change is still in beta.)
5. A Promise to Fix Major Bugs within 48 Hours: Fewer Bugs!
Facebook’s mantra before “move fast with stable infra” was “move fast and break things.” The problem with that mentality was that Facebook engineers would sometimes push out products/features that weren’t even close to perfect. Sometimes these “features” didn’t work at all, sometimes they caused problems for users and, often, for developers. It remains to be seen how the new mantra will change things, but we hope it’s for the better.
6. Versioning APIs: More Stability
Developers everywhere are rejoicing over this one. The gist of this is that Facebook will support APIs for two years. This will make the platform more stable and make the Facebook experience better for developers and users.
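In practice, versioning means a developer pins requests to a specific API version and keeps getting that version's behavior for the support window. A sketch of what version-pinned calls look like (the `graph_url` helper is my own illustration; the `graph.facebook.com/v2.0/...` path format is how the Graph API was versioned at the time):

```python
from urllib.parse import urlencode

GRAPH_BASE = "https://graph.facebook.com"

def graph_url(version, path, **params):
    """Build a version-pinned Graph API URL, so the app keeps hitting the
    same API behavior for the duration of the support window."""
    query = "?" + urlencode(sorted(params.items())) if params else ""
    return f"{GRAPH_BASE}/{version}/{path.lstrip('/')}{query}"

print(graph_url("v2.0", "/me", fields="id,name"))
# → https://graph.facebook.com/v2.0/me?fields=id%2Cname
```

Because the version is in the path, Facebook can ship a new `v2.1` without breaking every app still pinned to `v2.0` — that decoupling is what the two-year guarantee rests on.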
7. A General Commitment to Mobile: A Better Mobile Experience
Facebook announced three major features that are designed to support mobile applications:
• Mobile Like Button. The new Like button will make it much, much easier to engage with mobile content.
• "Send to Mobile." This is a new feature designed to allow users on desktop computers to send reminders to their mobile devices.
• The Message Dialog. This feature functions like a “Share privately” button. It will allow users to decide whom they are sharing content with.