|Chromosomes / Photo by Can H. via Flickr|
The latest market research indicates that genetic testing as a pre-symptomatic screen for heritable diseases has taken off. These tests predict future disease risk based on chromosomal abnormalities and genetic mutations. Predictive testing is still a relatively novel phenomenon, yet it is spreading quickly through public health practice, particularly in developed markets. It increasingly appears alongside nutritional strategies and physical fitness training as a way to get ahead of disorders such as cystic fibrosis, Huntington’s disease, Down syndrome, phenylketonuria, sickle cell anemia and breast cancer.
These tests are typically performed on people considered healthy, and they entail identifying specific traits that serve as genetic indicators of disease risk later in life. The most common kinds of predictive testing are predictive diagnostic testing, genetic susceptibility testing and population screening. They tend to be most valuable to people whose family histories suggest a treatable genetic disorder. Much of this, including the tests themselves, falls under the heading of consumer genomics, which applies genome sequencing to classifying individuals by risk type.
Wellness genomics, meanwhile, is intended to help people make sound choices about sustaining good health, and it generally draws on clinical data and genotypic information. In tandem with preventive medicine, consumer and wellness genomics are fast becoming vital to the long-term personalization of healthcare, despite controversies surrounding the private companies that market these services. The field is no longer defined by at-home DNA kits, which a recent study found to be significantly unreliable; physicians are reaching the point at which they can make reliable predictions based on these genomics approaches.
A case in point is the prospect some experts say we are on the cusp of realizing: the ability to detect Alzheimer’s disease early with nothing more than a blood test, where “early” means about 20 years before symptoms appear. That would be a game changer in the battle against a monster like Alzheimer’s. “We are getting to this stage of precision medicine in many ways, which is great, but it does raise a number of practical concerns and ethical considerations that have to be worked out,” says Keith Fargo, director of scientific programs and outreach at the Alzheimer’s Association.
Psychologists, bioethicists and patients alike are now weighing similar concerns about Huntington’s disease, another incurable, degenerative brain disease. Highly accurate tests for it have existed since about 1993, but it is a hereditary condition with an average age of onset between 30 and 50. A recent British study, however, reported that over 80 percent of those at risk have simply never been tested, and therein lies the bigger caveat. “People bring their own experiences to receiving information and their desire to have that information,” explains Marni Amsellem, a New York-based clinical psychologist who works with cancer patients.
Some medical scientists worry that people who learn they are carriers for Huntington’s disease are considerably more likely than the average person to commit suicide. That parallels fears raised in the debate over whether the FDA should approve at-home HIV tests, which fed a related controversy about direct-to-consumer testing. The FDA approved them anyway, and research already suggests this fear is less warranted than people think. That research is balanced, however, against opposing findings that patients already prone to depression or anxiety should avoid at-home testing.
The point is that finding out you’re at risk for something is entirely different when there is also a definitive way to do something about it. To illustrate this point, experts often point to the well-known BRCA1 and BRCA2 gene mutations, both of which raise the likelihood of developing breast and ovarian cancer. Angelina Jolie is among the many women who have famously undergone prophylactic mastectomies or oophorectomies after receiving their results, vastly improving their odds of nipping these cancers in the bud.
|Celebrity Actress Angelina Jolie / Photo by Gage Skidmore via Flickr|
“They’re saying, ‘I want to have a future, and this is one way the odds suggest that I will have a healthier future,’” Amsellem explains. This kind of advantage is only likely to grow as medical science continues to develop new kinds of tests. For instance, an international research team that included experts from the Max Planck Institute for Infection Biology in Berlin, Germany, recently engineered a simple blood test that can estimate with fair accuracy a person’s risk of developing active tuberculosis. The test measures the activity of different gene pairs that play important roles in the inflammatory response.
The Institute’s Stefan Kaufmann says, “Using the pattern of activity of these ‘risk 4’ genes, we were able to determine the tuberculosis risk of infected individuals. That means we can say one year earlier who is likely to develop active tuberculosis.” The test is already applicable across the African continent, and with further refinement it could be applied worldwide. The surrounding controversies can obscure how worthwhile predictive testing has become, but the shortcomings of companies like 23andMe and Ancestry.com cannot be allowed to dictate whether we continue to pursue these advancements.