Are We Lab Rats?


Lab Rat

In “Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks,” a study published June 17, 2014 in the Proceedings of the National Academy of Sciences (PNAS), scientists manipulated Facebook users’ newsfeeds to see how overemphasizing positive or negative news would affect the tone of viewers’ posts.

They found, not surprisingly, that sadder newsfeeds made users post sadder comments and happier newsfeeds made for happier posts. More importantly, the study demonstrated that Facebook has the power to influence users’ emotions by manipulating their newsfeeds. This may mean that it can, in turn, affect the offline behavior of Facebook users.
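To make the manipulation concrete, here is a minimal sketch of the kind of feed filtering the experiment describes. The word lists, function names, and omission rate below are illustrative assumptions, not Facebook’s actual code; the real study classified posts with the LIWC word-count software and omitted matching posts probabilistically.

```python
import random

# Toy word lists standing in for the LIWC dictionaries the study used.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful"}

def contains_any(post, words):
    """True if the post contains at least one word from the given set."""
    return any(word in post.lower().split() for word in words)

def filter_feed(posts, suppress="negative", omit_rate=0.5, seed=None):
    """Return a copy of the feed with some emotional posts omitted.

    Posts matching the suppressed emotion are dropped with probability
    omit_rate; all other posts pass through untouched.
    """
    rng = random.Random(seed)
    target = NEGATIVE_WORDS if suppress == "negative" else POSITIVE_WORDS
    return [
        post for post in posts
        if not (contains_any(post, target) and rng.random() < omit_rate)
    ]

feed = [
    "I love this wonderful weather",
    "What a terrible, awful day",
    "Meeting at 3pm, bring the slides",
]
# A user in the positivity-skewed condition sees fewer negative posts.
print(filter_feed(feed, suppress="negative", omit_rate=1.0, seed=42))
```

With a partial omission rate, as the study used, a participant’s feed is only statistically skewed rather than completely purged, which helps explain why participants had no way of noticing the manipulation.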

Many have questioned the ethics of the study, prompting the authors to defend their research. They claim it passed Cornell’s ethical review, a claim that appears to be disingenuous.

One would assume this means it passed Cornell’s ethical review for studies with human subjects, Cornell’s counterpart to the American Psychological Association (APA) Code of Ethics. Both of these ethical codes regulate experiments to protect participants from unnecessary harm.

However, the authors of the Facebook study said, “Because this experiment was conducted by Facebook, Inc. for internal purposes, the Cornell University IRB [Institutional Review Board] determined that the project did not fall under Cornell’s Human Research Protection Program.” The publisher, PNAS, reports that “This statement has since been confirmed by Cornell University.”

According to Cornell’s official statement, “Because the research was conducted independently by Facebook and Professor Hancock had access only to results–and not to any individual, identifiable data at any time–Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.”

In other words, Facebook manipulated human participants without following the code of ethics. Researcher Hancock just analyzed the data.

According to Cornell’s own flowchart created to aid researchers in answering the question “Is your activity covered by Cornell’s Human Research Protection Program?” this Facebook study clearly qualifies as “research with human participants,” thus requiring review by the Human Research Protection Program.

The most important regulation for a study with human participants is informed consent. The APA Code of Ethics requires informed consent to be a document written in plain, understandable English and signed by all participants. In the document, researchers must disclose the purpose of the research, the procedures, the participants’ right to decline to participate or to withdraw, potential risks, and so on.

If any of these sections is found lacking, or the language too confusing to be plain English, the APA can shut the study down or try the researchers for ethics violations.

Informed consent is necessary whenever a researcher intervenes in the lives of human participants. Only in naturalistic research, where researchers merely observe, can humans be studied without informed consent.

The study’s authors claim their study “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”

First, Facebook’s Data Use Policy does not even mention social research. Second, even if such a vague reference existed, it would never meet the APA requirement for fully informed consent.

Cornell’s standards state, “Ethical standards also require that researchers not put participants in a situation where they might be at risk of harm as a result of their participation. Harm can be defined as both physical and psychological.”

Feeding someone negative thoughts has already been shown to be harmful in psychological studies. The power of suggestion is so strong that merely reading the potential side effects of a medication can make you feel ill, a phenomenon known as the nocebo effect.

But feeding users a stream of positive posts can be just as damaging to someone with low self-esteem. Facebook can already seem like a group of perfect people posting the highs of their lives. All that positive perfection could make users struggling with serious psychological problems feel lonely, envious, miserable or even suicidal.

Lastly, studies that involve deception are deeply frowned upon and must not be undertaken lightly. They require researchers to debrief their participants at the end of the session. In the Facebook study, experimenters deceived participants into thinking they were viewing their real newsfeeds when they were actually seeing censored versions.

Because of this deception, the researchers were ethically obligated to debrief each participant, at a minimum with a disclosure document much like informed consent.

According to the APA, researchers must set human participants back to the way they found them. For example, if during the debriefing they had discovered one teenager who was slipping toward clinical depression because of her negative newsfeed, they would be obligated to help set her back to the way she was before they conducted their study.

They failed in this final obligation, however. Not only did the researchers fail to inform participants that they were being studied, deceiving them with doctored newsfeeds, but they also never debriefed the almost 700,000 Facebook users involved. As a result, they don’t even know whether they harmed participants, let alone helped them get back to normal.

Cornell’s Jeffrey T. Hancock is one of the credited authors of the controversial Facebook study. Hancock is also listed on the website of the Pentagon’s controversial Minerva Initiative, a Department of Defense (DoD) research program that funds studies of how a small group on the Internet can give rise to mass civil unrest within a country.

Even the initiative’s categorization of online activists as “social contagions” suggests that those who criticize the government’s abuse of power need to be contained, silenced or eliminated.

The Minerva Initiative found the connection disturbing enough to distance itself from the study. Its representatives denied having funded this particular study in any way and assured the public that “the Minerva Research Initiative is committed to ensuring informed consent by its research subjects and abides by all human subject protection regulations for its domestic and international work alike.”

Talk about throwing Cornell and Facebook under the bus.

Regardless of how much the DoD was involved, we know such an initiative is right up the government’s alley. USAID ran a “Cuban Twitter” program, and the military has developed “sock puppet” software for creating fake online identities and flooding the world’s newsfeeds with propaganda.

They know the subtle nudging of a manipulated newsfeed is more effective than outright coercion when it comes to getting people to do what they want.

But this nudging comes at the cost of our freedoms. Even in this Facebook study, both the participant with the manipulated newsfeed and his friends whose posts were filtered had their freedoms violated. His friends lost their freedom of speech, their posts silenced because of the tone of their content. Meanwhile the participant lost his freedom of assembly, a subset of his friends forcibly removed from the social forum where they had tried to gather.

Social science and freedom of expression ought not to be tools our government uses to maintain its control.

That’s why this Facebook study is a sad example of academia and the Internet allying to aid the government in manipulating its citizens. But perhaps Facebook will be able to tamper with our newsfeeds to mollify these negative feelings.

Photo by Tatiana Bulyonkova used here under Flickr Creative Commons.
