Mandie Sami reported this story on Monday, June 30, 2014 08:15:00
CHRIS UHLMANN: If you're one of Facebook's more than 1 billion users you might have been part of a social experiment without knowing it.
A new study has revealed that Facebook intentionally manipulated the news feeds of almost 700,000 users in order to change their emotional state.
And it claims it was done for research.
Facebook insists the study was legal, but as Mandie Sami reports, privacy experts say the experiment was dangerous and unethical.
MANDIE SAMI: When users sign up to Facebook, most know the social network will use their information for marketing purposes.
But many don't think that Facebook will use them as guinea pigs in an experiment they're not told about.
FACEBOOK USER: Facebook conducting these experiments without telling people is not okay.
FACEBOOK USER 2: Yeah I don't think it's really okay either; I think informed consent is always really important whenever an experiment is being conducted.
FACEBOOK USER 3: Keep it transparent if you're going to manipulate us.
MANDIE SAMI: A new study published in the Proceedings of the National Academy of Sciences has revealed that's exactly what Facebook did.
The researchers who carried out the study are affiliated with Facebook, Cornell University and the University of California, San Francisco.
For one week in January 2012, they manipulated what almost 700,000 Facebook users saw in their news feeds when they logged in.
Some people were shown content with only happy and positive words, while others were shown posts that contained sad and angry content.
The point of the experiment was to find out whether emotional states were contagious.
DAVID VAILE: The conclusion was, we can actually manipulate people's feelings by manipulating their news feed on Facebook.
MANDIE SAMI: That's David Vaile, the Co-convenor of the Cyberspace Law and Policy Community at the UNSW Law Faculty.
The study found more negative news feeds did lead to more negative status updates and, conversely, more positive news feeds resulted in more positive status updates.
David Vaile says that, while not illegal, the experiment was unethical.
DAVID VAILE: Any sort of university or established researcher would typically have to take this to an ethics panel and get ethics clearance, and the first thing they'd ask is, where's the informed individual consent?
So the issue here, as well as there not being informed consent for the individuals, is also the question of research ethics: you know, they're messing with real people's lives, and let's not forget that some, you know, negative experiences online have real consequences for people.
At worst case, people get desperate or… end up feeling depressed; people have committed suicide from terrible things that have happened to them online.
MANDIE SAMI: AM contacted Mia Garlick, Facebook's head of policy for Australia and New Zealand, to ask about the ethical concerns surrounding this experiment and whether informed consent was obtained.
She declined to comment and instead referred AM to Facebook's media enquiries line.
A spokesperson from Facebook is yet to respond.
CHRIS UHLMANN: Mandie Sami reporting.