What Facebook’s social experiments really say

Tags: Opinion

Research tells the social network how to facilitate user interaction

Facebook’s 2012 emotional contagion experiment triggered so much public indignation that people forgot to ask the obvious question: If Facebook has a science team, is it doing something like that all the time, studying users as if they were lab rats?

To be fair, The Wall Street Journal attempted an answer. It talked to a former data science team member, Andrew Ledvina, who said, “Anyone on the team could run a test. They’re always trying to alter people’s behaviour.”

The interesting part, though, is what other tests it runs, apart from reducing the number of “positive” and “negative” posts in users’ news feeds to see how that changed the tone of their posts, as the data science group’s Adam Kramer did in the now-infamous experiment.

In fact, Facebook has never tried to hide it was doing this kind of research. It just hasn’t advertised it. The best place to start looking for the data science team’s insights is, you guessed it, the unit’s Facebook page.

It links to a lot of innocuous research, like this little study of how mothers on Facebook relate to their children or this one proving that true rumours are more viral than false ones. There is, however, plenty of more sensitive material, like this study of how close Facebook friendships really are. The scientist behind it, Moira Burke, surveyed about 4,000 people about their relationships with their friends and then matched the data to the server logs of the participants’ Facebook activity. Burke says in her description of the study that all those data were anonymised, but the approach still raises unpleasant possibilities.

And then there are behaviour-changing experiments like the 2012 one. During the 2010 US congressional election, 98 per cent of American users aged 18 and over were shown a “social message” at the top of their news feeds, encouraging them to vote and then report having done so by hitting a special button. Users could see which of their friends had voted. One per cent saw the same message, but without the pictures of their friends. The remaining one per cent saw nothing at all. The scientists found that the social message worked best and, by matching their data to voting records, even showed that it generated tens of thousands of votes.

The experiment was widely reported in 2012, after the team that ran it published the results in Nature. For some reason, nobody got mad about it, though one could imagine repressive regimes interested in 100 per cent voting turnouts using the technology to smoke out dissenters.

Much of the Facebook research is published. The easiest way to unearth it is to take the names of scientists from the data science team page and run a specialised Google Scholar search for them.

Kramer, for example, suggested using the number of positive words in Facebook posts as a measure of “gross national happiness” and analysed Facebook users’ “self-censorship”, that is, posts typed out but abandoned before publishing (he found that people do it more for group posts, when their audience is harder to define). In 2011, he also argued that Facebook “holds potential to influence health behaviours of individuals and improve public health.”

Facebook has no plans to stop this research activity. It has obvious commercial value because it tells the social network how to facilitate user interaction (like this study by Moira Burke, focusing on newcomers’ socialisation on Facebook). The academic community loves it, too: Facebook scientists’ research is widely quoted. Besides, it holds huge potential for governments seeking to gauge the public mood or influence it, as in the election experiment or in Kramer’s public health example. That may explain why Facebook chief operating officer Sheryl Sandberg pointedly refused to apologise for the 2012 experiment, saying only that the company “regrets” the fact that it was “communicated really badly.”

So for Facebook users who don’t want to be guinea pigs, the rule of thumb is to provide as little information about themselves as possible. That means getting an account under an assumed name and supplying false information on location, age, education, marital status and any other matters that the user considers private. This violates Facebook’s terms of use, but there’s nothing the social network can do about it. In fact, judging by my news feed alone, so many people are already doing this that the actual value of all that research is uncertain.

—Bloomberg

(Leonid Bershidsky is a Bloomberg View contributor and a Berlin-based writer, author of three novels and two nonfiction books)

