The Role of Deception in Scientific Research
Deception May Be a 'Necessary Tool,' but New Methods Are Prompting Scientists to Reconsider
When Geoff Pearson, a sports-management and law professor at the University of Liverpool, wanted to study the behavior of rowdy soccer fans in the U.K.—"football hooligans" as they're known—he decided that just talking to them wasn't good enough. The well-behaved fans sometimes exaggerated their role in mayhem while violent ones played down theirs, Dr. Pearson says.
So he went undercover.
Over a roughly 10-year period, he joined the fans at football matches to watch how crowd behavior went from calm to rowdy. To maintain his guise, he regularly committed what he describes as minor criminal offenses, such as storming the field or bringing alcohol on the train to the game. He wrote down his observations while huddled in the restroom, talked into a recorder by pretending it was a cellphone and jotted down copious notes after matches. Most important, he didn't tell fans he was studying them.
He studied the emotions and behaviors of fans on the way to the games, in the stands and—because most violent behavior tended to occur outside the stadium—what happened after the matches, in pubs and public spaces. He was in the crowd and tear-gassed during the 1998 World Cup in Marseille, France, and hit with a water cannon at the 2000 European Championship.
Dr. Pearson's unusual research techniques are one extreme example of how deception can be used in the name of science. "Deception" is generally defined as when researchers intentionally act or withhold information from participants so that they hold false beliefs, according to Ruth Faden, director of the Johns Hopkins Berman Institute of Bioethics. It occurs in many forms and with varying degrees of ethical uncertainty.
One of the main tenets of ethical research is that participants should be informed that they're being studied and for what purpose. Deception can upset participants who feel misled. It can lead to mistrust of scientists or the medical system in general, experts say. Outrage over a study published using Facebook data is a recent example.
Often researchers use deception when they want to study behavior that people can't or won't honestly engage in if they know why they are being studied—such as whether they use illegal drugs. Other times researchers may be concerned that bias or expectations will color the results. A common example is the double-blind study of new drug treatments, in which neither the researcher nor the participant knows whether a participant is getting the new drug or a placebo.
New methods for research, such as the ease of collecting large amounts of data on the Internet, are prompting scientists to think even harder about deception.
Kypros Kypri, a professor in the school of medicine and public health at the University of Newcastle in Australia, and his colleagues in the U.K. have independently studied the drinking habits of tens of thousands of college students in New Zealand, the U.K. and Sweden. They found that just asking heavy drinkers about their alcohol use sometimes changes their behavior.
The researchers surveyed students about their drinking behavior and, based on the responses, identified thousands of heavy drinkers. They sent the students one or more follow-up surveys to see if they changed their behavior or sought help for their drinking.
While in most cases the students understood they were filling out surveys as part of a research study, at no point did the researchers tell them they were participating in an intervention—not even at the end of the process, which is usually when subjects learn the true purpose of a study. (In one case, the British team didn't even tell the students they were in a research study.)
Disclosing the goal at the beginning of the study would have changed the very behavior the researchers were studying—whether students would decide to cut down on their drinking after answering questions that might prompt them to think about whether they had a problem, Dr. Kypri says. Disclosing the true purpose at the end of the study would have done more harm than good because participants might have felt misled about the study, he says.
The researchers decided that not disclosing was ethical, because the intervention was subtle and unlikely to cause negative consequences.
In addition, they reasoned, the subjects were healthy college students, not individuals who might have been vulnerable because they didn't understand what was happening or couldn't take care of themselves.
However, when Dr. Kypri and his colleagues published an article about the ethics of their work in the American Journal of Bioethics in 2013, some scientists disagreed with them on issues including the researchers' belief that participants would have been upset about being misled about the purpose of the study.
Likelihood of public benefit is a "necessary condition" for deception, Dr. Kypri says, and another is that there wasn't any other way to answer the question. "I'm confident in the studies we've reported, that the research question is important for public health, that the risk to participants was very low, and that there was no other way to answer the question."
Whether deception is ethical comes down to whether the knowledge gained through deception justifies the trickery and ultimately is a matter of ethical judgment, says Rebecca Dresser, a professor of ethics in medicine at Washington University in St. Louis.
But there also are ways to minimize the consequences of deception. It should be avoided if there are alternative ways to get that information. It also should be time-limited, in that participants ought to be told about the deception at the end of the study, Dr. Dresser says.
Deception may not be as necessary as researchers think, Dr. Dresser says.
For instance, in a study published in 2000 in the journal Chest, participants were given asthma inhalers that registered when they were used. The data showed that most participants didn't use the inhalers regularly even though they said they did and instead dumped the medicine before they came in for a checkup.
Though deception may have been necessary to get more accurate information, and the information was useful for the researchers to know, Dr. Dresser says it is still important to question whether deception was needed. Why not just tell participants that their use of the inhalers would be monitored?
"I think that's respectful of people so they accept those conditions," says Dr. Dresser. "You can tell them what's going on, and you don't have to deceive them."
Dr. Pearson, the football hooligan researcher, says he remains convinced that going undercover was the only way to gain his insights, which—together with a body of research from other academics—have led to changes in the way crowds are now policed at soccer matches in Europe.
The researchers found that soccer crowds were more likely to become disorderly when there was a "high-profile" show of force by police, such as when they dressed in riot gear. It was more effective when police engaged in friendly conversation with the crowd.
These research insights led to more training in the "friendly but firm" approach, which was used by the Portuguese police in 2004 at the European Championship.
But he wouldn't necessarily repeat everything he did during his time as an undercover fan, such as the occasion when he challenged a fan from another team to a fight.
"I was 21 years old in the first year of my Ph.D. and making snap decisions in the field," says Dr. Pearson. "You do occasionally make mistakes."
Write to Shirley S. Wang at email@example.com