Facebook, the prestigious social science field study that also lets you post pictures of your cat, has responded to concerns raised by users over an experiment it performed in early 2012 to determine whether altering the content of someone’s news feed could change their posting habits (and, presumably, their emotional state). The response was written by Adam Kramer, the Facebook data researcher who designed the experiment. It emphasizes that, over the course of the week the experiment ran, no posts were actually hidden from users; they were only filtered out of certain iterations of the news feed by the algorithms that already choose what to display whenever the feed is loaded. Kramer also downplayed the project’s effect, saying, “The result was that people produced an average of one fewer emotional word, per thousand words,” although, as the paper itself points out, that change was large enough to be considered statistically significant.
Kramer also issued an apology, saying, “I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.” He added that Facebook’s internal review policies have evolved and been updated since the experiment was conducted.
Further details about the paper, including questions about the soundness of the study design and Facebook’s decision to interpret its terms of service agreement as informed consent, are addressed by reporter Robinson Meyer in this write-up of the issue for The Atlantic.
Meyer communicated with Susan Fiske, the Princeton University psychology professor who edited the paper for publication. Fiske said the study had approval from the Cornell Institutional Review Board (which evaluates the ethical concerns of university research before it can begin), where one of the paper’s co-authors is a researcher. However, Fiske also noted that the board approved the research as a “pre-existing dataset,” which indicates that the data had already been collected before Kramer and his associates ever brought it to the board.