In defense of stem cell research

Is embryonic stem cell research a waste of time and money? Many conservatives think so. During the recent Congressional debates on funding the research, Rep. David Weldon (R-FL), a physician, claimed, “If you actually read the medical journals, the promise and the potential appear to be in the ethically acceptable alternatives of adult stem cell research and cord-blood research.” Rep. Charles Boustany (R-LA), a heart surgeon, agreed: “Embryonic stem cells have not produced a single human treatment and have significant limitations. … Adult stem cells have been used to treat 58 human diseases.” Last fall, Daniel Allott raised similar objections in a Brainwash article, positing that the benefits of embryonic stem cell research are at least a decade off and concluding that, therefore, the research should not be funded.

These people are coming to the wrong conclusion. But more importantly, they are asking the wrong question. The whole point of scientific research is to learn about the world. To demand that scientists prove their research will be useful at the outset is to misunderstand the very nature and purpose of the scientific process.

It’s true that embryonic stem cells have not yet produced any human treatments. What opponents of the research don’t tell us, however, is that only the most wildly optimistic would have expected them to. Research on adult stem cells began in the 1960s, while embryonic stem cells were only isolated in 1998. It shouldn’t be surprising that we haven’t produced any human treatments in just seven years of experimentation!

And, in fact, embryonic stem cells do show more promise than their opponents would have us believe. To understand the debate, we need a bit of scientific background. As a fertilized egg grows and divides, the dividing cells begin to differentiate, or specialize. Most of the cells in your body are fully differentiated: they have settled into being one specific type and can no longer change. Stem cells, however, still have the ability to become different types of cells, and that’s why they hold so much promise scientifically.

Embryonic and adult stem cells differ in how many steps they’ve taken along the path to differentiation. Embryonic stem cells are pluripotent, meaning that they can develop into virtually any type of cell in the body. Most adult stem cells, on the other hand, are generally thought to be multipotent: they have partially specialized and can develop into several types of cells within a category. For example, hematopoietic stem cells can become any type of blood cell, but cannot become fat or skin cells.

Recently, research has shown that some adult stem cells may be closer to pluripotent than previously thought, and this has been seized on by conservatives claiming that embryonic stem cell research is unnecessary. However, we still don’t know whether any adult stem cells are truly pluripotent. They have other limitations as well: they are only present in small quantities, can be difficult to isolate, may contain accumulated DNA mutations from a lifetime of copying errors and exposure to toxins, and may not be able to multiply indefinitely. Embryonic stem cells offer solutions to all of those problems.

Furthermore, the promise of embryonic stem cells goes beyond the direct treatment of disease. For example, abnormalities in the early development and differentiation of cells are thought to be the culprit behind diseases such as cancer and many birth defects, and stem cell research may give us more information about these processes and provide ideas for treatment. An unbiased look at the literature shows that both embryonic and adult stem cells show significant promise, and claims that only the latter could be useful are unfounded.

In a sense, however, this is all beside the point. Funding science based solely on anticipated results is short-sighted. Many important scientific discoveries have been made while pursuing other goals, or simply investigating general phenomena with no goal in mind at all. Scientists like Maxwell, Einstein, Curie and Mendel laid the foundations of the modern world, yet none of them were primarily interested in practical applications for their work.

Indeed, it wouldn’t be the first time that a supposedly useless avenue of biological research proved immensely valuable. Around the middle of the last century, there was a great debate over which type of biological molecule made up our genetic material. Many scientists, led by Linus Pauling, suspected that proteins were responsible – after all, they had more possible building blocks (20 amino acids) and a wide variety of shapes. Proteins seemed much more likely to carry complex genetic information than DNA, with its mere 4 nucleotides and seemingly boring shape. Of course, it turned out that DNA was the winner: its four building blocks could be arranged to spell out an infinite variety of genetic information. But while Pauling was working on proteins, hoping to win the race to identify the genetic material, he made several other important discoveries, such as the structure known as the alpha helix. Protein structure is still a hot topic today, with applications in the study of diseases such as Alzheimer’s and cystic fibrosis. Though Pauling was attempting to solve a different problem, the discoveries he made were nonetheless a significant contribution.

That’s the way science works. It’s difficult, if not impossible, to predict ahead of time which avenues of inquiry are going to produce the best results. Even if a particular topic doesn’t pan out, the addition to the body of scientific knowledge gives later scientists more to work with. The pursuit of pure knowledge, not necessarily practical applications, is a great strength of the scientific process.

If the federal government is going to fund scientific research, it should fund true research: projects for which the results are not necessarily known ahead of time. Requiring researchers to show that their work will have immediate applications is guaranteed to stifle progress. If people like Boustany and Weldon had been in charge in the 1940s, perhaps Pauling would have been able to make his protein discoveries, but would Watson and Crick have been allowed to work on the “less promising” avenue of DNA?

Amanda Rohn is a writer in Falls Church, VA. She holds a degree in computer engineering and will attend medical school this fall. She has a weblog at Without Bound.