HuffPost has published a story that claims:

A newly published paper reveals that scientists at Facebook conducted a massive psychological experiment on hundreds of thousands of users by tweaking their feeds and measuring how they felt afterward.

Is it true that Facebook permitted researchers to alter users' data feeds without the users' respective permissions?

SIMEL
  • The claim isn't altering of user data but showing different status updates in the newsfeed. – Christian Jun 29 '14 at 20:51
  • Who claims that data was "altered"? As far as I can tell the claim is that the way that data was selected for display was changed, but no post was altered. –  Jun 30 '14 at 00:35
  • @Articuno Yes, it sounds *very* different without that key word. Edited. –  Jun 30 '14 at 00:38
  • I would also change the wording of `without the users' respective permissions`, because they did give it in the blanket agreement they should've read (I suppose). Wouldn't something like "without their explicit / specific permission or knowledge" be better? Otherwise the answer is probably something along the lines of `no they did not, because their license allows them to do this`, even though the question is more if they actually did it ;) – Nanne Jun 30 '14 at 07:14
  • @Nanne Argh! https://www.youtube.com/watch?v=35rErQtJ6uA&feature=youtu.be&t=1m17s Technically, I agree, but I was trying to be implicitly precise. Please feel free to edit if you think it's vital, and thank you! –  Jun 30 '14 at 07:20
  • I would suggest something like `without the users explicit knowledge`, but I'm not sure what the best wording is :) – Nanne Jun 30 '14 at 07:22
  • Facebook is constantly tweaking their algorithms for choosing stories that go to the newsfeed anyway. In this case two groups of people have had slightly different tweaks. No one claims that the posts themselves were altered. – vartec Jun 30 '14 at 09:38

1 Answer

The experiment is published in the article "Experimental evidence of massive-scale emotional contagion through social networks" in PNAS.

The authors state:

The experiment manipulated the extent to which people (N = 689,003) were exposed to emotional expressions in their News Feed.

[...]

Two parallel experiments were conducted for positive and negative emotion: One in which exposure to friends’ positive emotional content in their News Feed was reduced, and one in which exposure to negative emotional content in their News Feed was reduced. In these conditions, when a person loaded their News Feed, posts that contained emotional content of the relevant emotional valence, each emotional post had between a 10% and 90% chance (based on their User ID) of being omitted from their News Feed for that specific viewing.

This clearly shows that the authors claim that they manipulated the feeds of hundreds of thousands of users.
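For illustration, the omission mechanism described in the quoted passage can be sketched in a few lines. This is a hypothetical reconstruction, not Facebook's actual code: the paper only says each user's omission chance was fixed "based on their User ID" and fell between 10% and 90%, so the hashing scheme below is purely an assumption about how such a mapping might work.

```python
import hashlib
import random

def omission_probability(user_id: int) -> float:
    """Map a user ID to a fixed omission probability in [0.10, 0.90].

    Hypothetical: hashing the ID is just one deterministic way to get a
    stable per-user probability, as the paper does not specify a method.
    """
    digest = hashlib.sha256(str(user_id).encode()).digest()
    fraction = digest[0] / 255  # deterministic value in [0.0, 1.0]
    return 0.10 + 0.80 * fraction

def filter_feed(user_id, posts, targeted_valence, rng=random):
    """Drop each post of the targeted emotional valence with the user's
    fixed omission probability; all other posts are always kept."""
    p = omission_probability(user_id)
    return [
        post for post in posts
        if post["valence"] != targeted_valence or rng.random() >= p
    ]
```

Note that the probability stays constant per user, while each call to `filter_feed` re-rolls the omission per post, matching the paper's "for that specific viewing".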

The authors explicitly thank the Facebook team in the acknowledgements, so we can safely assume that Facebook did help the researchers to manipulate the feeds:

We thank the Facebook News Feed team, especially Daniel Schafer, for encouragement and support; the Facebook Core Data Science team, especially Cameron Marlow, Moira Burke, and Eytan Bakshy; plus Michael Macy and Mathew Aldridge for their feedback.

On the issue of informed consent, the authors write:

As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.

They claim that agreeing to the Facebook policy when signing up constitutes informed consent; they did not explicitly request permission, nor did they inform the involved users afterwards. There is a blog post on Science-Based Medicine by David Gorski exploring the ethical issues behind this and the question of whether this actually constitutes informed consent.

Mad Scientist
  • +1, but can you post an admission by Facebook, if available, for chk? –  Jun 29 '14 at 18:40
  • The primary author works for the Facebook Core Data Science team; is that not enough of an admission? – Oddthinking Jun 29 '14 at 18:48
  • Note: the paper does state that they had informed consent, due to conformance to the Facebook Data Policy that all users agree to. This has been somewhat controversial. The [Pharyngula blog](http://freethoughtblogs.com/pharyngula/2014/06/29/whatever-happened-to-informed-consent/) reports that the editor understood that they had approval from an ethical review board. (Declaration: I wrote an email of complaint to PNAS yesterday.) – Oddthinking Jun 29 '14 at 18:52
  • @Oddthinking I find that formulation in the paper to be highly problematic, a clause in the ToS can constitute consent, but declaring it "informed" is quite a stretch. – Mad Scientist Jun 29 '14 at 19:02
  • @Fabian: they inform you about it in their ToS so, yes, you are informed. Obviously it is in your rights to choose to click "I accept" without actually reading what you are accepting... – nico Jun 29 '14 at 20:03
  • For instance, from FB Data Use Policy: *we also put together data from the information we already have about you, your friends, and others, so we can offer and suggest a variety of services and features. For example, we may make friend suggestions, pick stories for your News Feed, or suggest people to tag in photos.* – nico Jun 29 '14 at 20:09
  • While I don't think the paper conformed to the Helsinki Declaration or the COPE Code of Conduct, I understand there are other views and that this debate won't be resolved in the comments, so I noted it as controversial. I brought it up because the question title explicitly asks whether it was done with permission, the paper claims it was and this answer doesn't address the issue. – Oddthinking Jun 30 '14 at 03:48
  • @nico Informed consent means that you agreed to something and that you were aware of what you were agreeing to. – Taemyr Jun 30 '14 at 08:05
  • @Taemyr: they tell people their data may be used to "pick stories for your News Feed" and they accept. What may be controversial is giving out that data to someone else, but I bet that is nicely covered somewhere else in the ToS. Not much different from Google giving different search results to different people. – nico Jun 30 '14 at 08:42
  • @nico Let me reiterate: "informed consent" is a technical term that includes reasonable assumptions about how a person might interpret the actual wording of the agreement. So agreeing that my data might be used to "pick stories for your News Feed" does not automatically imply informed consent for them making biased selections from my data. – Taemyr Jun 30 '14 at 08:46
  • @Taemyr: if they "pick stories" that makes it immediately biased. The ToS are a legally binding contract as far as I am aware, so there is no "reasonable assumption" to be made there... – nico Jun 30 '14 at 10:30
  • @Nico We are not arguing whether legal consent has been given. We are arguing whether "informed consent" has been given. This is a piece of jargon meaning consent with a full understanding of the future implications of this consent. – Taemyr Jun 30 '14 at 10:34
  • @Taemyr the point is that outside of academia, 'informed consent' isn't at all required - online companies routinely do behaviour analysis, experiments, A/B testing, etc for their commercial needs, and they are allowed to do that freely without informed consent. They need your permission in order to get/use/store your data at all, the ToS covers that; but you have no say in how your news feed is shown, they can write whatever algorithms they want for it and there is no requirement that you get the same news feed as others. The only question is if the academic publisher accepts it as ethical. – Peteris Jun 30 '14 at 10:59
  • @Peteris Read earlier comments, Nico answers to Fabian's answer to Oddthinking's claim that the paper claimed that the users had given informed consent when they accepted the ToS. So the discussion is specifically about informed consent, and specifically in the context of academia. Informed consent is stricter than most legal standards, but misleading contracts can be invalid, especially when the contract is a click-through agreement rather than a negotiated agreement. IANAL; IIUC, for persons trying to take FB to court for this, the difficulty of showing damages would be a larger hurdle. – Taemyr Jun 30 '14 at 11:08
  • @Taemyr on that point it's quite obvious - yes, Facebook allowed researchers to alter data feeds without informed consent, they do it all the time. Informed consent is a farce. For every academic study that gets informed consent there are many marketing studies that use much more invasive methods and don't bother with that, they're being done every day at every major webcompany on all of us. FB would have done the experiment anyway, measuring the impact of filtering emotional posts on user retention and ad clicks. The only question is - do we want those results to be published in academia? – Peteris Jun 30 '14 at 11:26
  • Arguing the semantics of what "informed consent" means here is pointless, because at least two of the authors of the paper work at universities that receive federal funding, and thus are required to follow the federal government's definition of "informed consent", which includes a requirement that *every participant in a study know they're participating in a study.* By definition, Facebook's TOU and DUP are not informed consent. – KutuluMike Jul 01 '14 at 00:48
  • BTW: One *could* argue that this study did not *need* informed consent due to the minimal risk to the participants, and the chance that consent would alter the outcome, but that's not what they're claiming -- they're falsely claiming they *got* informed consent. – KutuluMike Jul 01 '14 at 00:54
  • The discussion about informed consent is better for [chat]. This question was only about permission. –  Jul 01 '14 at 14:26