Like? Unlike? How Facebook is toying with its users' emotions

Questions are being asked about the social media giant's use of 689,000 people as lab rats

Mark Zuckerberg

Defender: Facebook chief operating officer Sheryl Sandberg claimed that the controversial experiment had been poorly communicated

Joe O'Shea

Social media users are forever being accused of over-sharing, emotional incontinence and giving their followers TMI (too much information). But it seems that Facebook – with more than 1.2 billion active users worldwide – simply cannot get enough of our emotions, or of data on how it can track and manipulate them.

And as the fallout from its secret 'Thought Control' study into social media's "emotional contagion" effect continues to trouble the online networking giant, real and far-reaching questions are once again being asked about the direction social media is pushing us in and how it may be influencing our emotions and personalities.

Facebook's vast experiment, which is to be investigated by Ireland's Data Protection Commissioner, manipulated the emotions of 689,000 users without their knowledge or consent. It has been called everything from "Orwellian" and "disturbing" to "scandalous and spooky". In the UK, Labour MP Jim Sheridan said the study – and the ability of Facebook to use "thought-control" to carry it out – raised "very serious concerns". "If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it," he added.

The company's decision to publish the results of the study has backfired in a big way, with both the academics involved and Mark Zuckerberg's people – Facebook chief operating officer Sheryl Sandberg said the experiment was "poorly communicated" – scrambling to deal with the fallout.

Cornell University, the Ivy League college that processed the data collected by Facebook, has faced accusations that the study was part-funded by the US military, while Facebook stands accused of a massive overreach and a breach of privacy laws.

Facebook would only say that the research was done in collaboration with two US universities to gauge if "exposure to emotions led people to change their own posting behaviours". A spokesperson claimed there was "no unnecessary collection of people's data". "None of the data used was associated with a specific person's Facebook account," he said.

If we leave aside, for the moment, how Facebook gets to judge what represents "unnecessary collection of personal data", the study has excited academics around the world. What it did on an unprecedentedly massive scale was examine if social media could make people feel happier or sadder through a process of "emotional contagion".

Almost 700,000 randomly selected users had their home feeds – the flow of comments, videos, pictures and web links posted by other people – manipulated by the researchers over a one-week period in 2012. For half of the human lab rats, posts containing negative words were removed from their feeds. The other half had posts containing positive words or phrases blocked from their pages.

And the result? The study did find a small but measurable effect on the emotional content of status updates by the users. In other words, making feeds more negative led to more negative behaviour, and vice versa.

For psychologist Fergal Rooney, Facebook's vast experiment, thought to have used the largest sample size ever seen in such a study, was "fascinating and a little disturbing ... you would have to question the ethics of the study, the way it manipulated people's emotions without their consent or knowledge."

He added: "However, this is an area – how social media affects people's emotions and the way they interact with the online and real world – that we still know very little about. Social media is still a relatively new phenomenon. It's massive; we are talking about billions of people around the world. And while we suspect that it must be having an effect on people's personalities, on their emotions and relationships in a very wide sense, we don't have a lot of solid research or information on just what exactly it is doing to us.

"From that point of view, the findings are very interesting, they raise our understanding of what's going on but they also open up a whole new set of questions that we now need to look at".

Dr Rooney, a senior psychologist at St John of God's Hospital in Dublin specialising in relationships, said he was not surprised by the evidence of "emotional contagion", that social media users can be affected by positive or negative content of their feeds.

"I think people who have studied this area will have had a strong hunch that this would be the case," he adds. "And the Facebook study would seem to have confirmed suspicions. The value of it is that it brings us to a greater awareness of the potency of social media, how it affects the way we view the world and interact with people online and in our real lives.

"We just don't know, or can't measure at the moment, how the emotions we experience in the virtual world affect us when we close down the screen. Is it a real change, will it affect how we interact with other people, our partners? Or can people differentiate between their attitudes and emotions online?

"For instance, you see people show anger and a sometimes scary disregard for other people's feelings online that you would think they would never show in, say, a coffee shop. But could we see a blurring of the lines there, see that anger and terrible disrespect transfer into people's real lives?"

Dr Rooney believes the Facebook study – both the results and the way it was carried out – should represent a "major wake-up call" for people using social media. With reports of addiction in a significant number of social media users, especially among younger people, the hours we spend online and the emotional impact they are having on us are a new, unknown phenomenon.

"We need to be aware of how we behave towards others and how willing we are to share information about ourselves," Dr Rooney adds. "And there are issues of overuse, of needing to check in last thing at night or first thing in the morning. When you get to that stage, you need to ask yourself questions." The advice from the Dublin-based senior psychologist is to switch off occasionally.

Checking the small print

Think you didn't sign up for this? You may want to go back and have a look at the endless pages of terms and conditions – the things no one anywhere reads – you rushed past to hit 'click and confirm'.

Privacy experts in the US believe that while Facebook's mass experiment may be ethically questionable (and that's putting it lightly), the company could well have been within its legal rights.

Changes in the firm's terms and conditions – announced in 2011 – are believed to have paved the way for the mass experiment. However, the clause granting Facebook the right to use information about its customers "for internal operations, including ... research" was only added to its data use policy in May 2012, four months after the company experimented on hundreds of thousands of users to see if it could affect their emotions. As with the study itself, the rules on what Facebook can collect and what it then does with that information lie in a grey area.

And while the social networking giant may be safe from legal action by the human lab rats it used to conduct its experiment, the damage it has done to the trust we have in social media – and in the Facebook brand – is still to be quantified.