In part one of our blog post on controversial research, we revisited last year’s heated debate over whether papers outlining how to mutate the H5N1 virus into a contagious, aerosolized form of the flu should be published. We also listed some recent advances and techniques that, despite their potential benefits, have raised flags about ethical concerns and propensity for misuse.
In part two of our post, Labguru looks at the other side of this argument, offering four reasons society benefits from supporting research even when it carries serious risks to human life or raises dual-use concerns:
1. With great risks come great rewards
To date, 29 fatalities have been associated with manned space flight or training missions, along with a plethora of non-fatal incidents during spaceflight that put crews in danger. This makes space flight, even with today’s technology, a dangerous proposition at every phase of a mission. Yet few could dispute that the benefits of space exploration, and of the quest to understand the Universe, have been enormous. Research into preparing for, adapting to, and surviving in space has helped drive technologies like CAT scans and MRI machines, LED lighting, infrared satellite imaging, improvements in transportation and household products, and treatments for cancer and other diseases.
2. Academic freedom
This is the most theoretical, and contentious, of all the reasons for openness in research. There is a long-standing tradition in academia holding that freedom of inquiry, and the free exchange of information among scientists, are essential to advancing academic ideals. In the United States, this goes back to the 1940 Statement of Principles on Academic Freedom and Tenure, which protects professors from persecution based on the communication of facts, ideas, and research. This extends to the choice of research topics, such as the highly contested H5N1 paper.
3. Sometimes, there is simply no other choice
Virtually no treatment for a difficult-to-treat condition such as cancer arose without sacrifices made by patients in risky clinical trials, where drugs were tested to establish proper dosages, combinations, and chemical properties. For every Gleevec and Herceptin, which some scientists say are ushering in a Golden Era of cancer research, there was a medicine that didn’t work and a patient who died. The first polio vaccine saved millions of lives, but it was only 60-70% effective against some strains, and contamination concerns killed several children, spurring the development of the vaccine we use today. If we severely limited research based on the dangers posed to patients, many great future health care advances would be forfeited.
4. Scientists must stay one step ahead of potential misuse
The most famous example of this is the Manhattan Project, which developed the atom bomb during World War II and has been famously portrayed in many books and in the play Copenhagen. In August 1939, the famous physicist Albert Einstein signed a letter to US President Franklin Delano Roosevelt, delivered that October, detailing the newfound capability of harnessing atomic chain reactions to create extremely powerful bombs, and his fear that the German government was actively pursuing uranium research. With a tremendous degree of collaboration and efficiency, Roosevelt, with the help of the British and Canadian governments, established a collective of scientists scattered across a number of research sites. In just four years, the team was able to theorize, develop, construct, and test the first nuclear bombs in history. Fueled by the fear that the atom bomb would land in the wrong hands, this unlikely scientific feat helped change the course of history: the outcome of World War II, the establishment of the United States Atomic Energy Commission, and a continuing debate about the limits of nuclear power.
Today, this issue manifests in the question of whether we should research highly infectious diseases or bioweapons. Indeed, the Federation of American Scientists lists several benefits of studying bioweapons, including “threat assessment of acquiring, growing, modifying, storing, stabilizing, packaging, and dispersing BTA to determine various properties and capabilities.” Imagine if scientists had never studied dangerous flu mutations and one emerged naturally: we would be far behind in understanding its genetic structure or developing preventative measures. Regulations, and exactly the kind of debates the H5N1 paper engendered, are needed to make sure research is channeled properly and not abused. But it is important that scientists stay one step ahead in their knowledge of potentially devastating technologies and health threats.
What do you think about this heated topic? Do you feel there should be limits on the kind of research scientists conduct? Do you have other examples that make the case either way? Drop us a line in the comments section below.