Consumer genomics kits are all the rage. On Black Friday and Cyber Monday last year, industry leader 23andMe sold 1.5 million kits. I can understand the appeal. For one, it’s fun to learn about where your ancestors came from and perhaps even pick up a surprising fact about your family heritage to share at cocktail parties. You might even discover a long-lost family member or two. More seriously, people want to find out which diseases they are more susceptible to and what steps they can take to mitigate their risk. Setting aside concerns about the accuracy and reliability of these tests, we are still left with a major potential pitfall: the privacy and security threats of amassing large quantities of biometric data.
We already share so much—perhaps too much—on social networks and with hundreds of companies as part of the ever-expanding reach of new digital tools and conveniences. We “pay” for these wonderful services with our personal information and preferences, which can make it feel like we’re getting something for nothing. Sharing our biometric data, even our DNA, is another step in this direction. The potential scientific and personal benefits are tremendous, but we have little ability to understand how these sharing decisions might come back to haunt us—individually and as a society.
Before we examine these problems, and then some possible solutions, let’s look at the positive breakthroughs this technology could bring about—and in some cases already is. The ability to analyze massive amounts of DNA and other physical data will allow healthcare professionals to identify large-scale trends, test the effectiveness of different health practices and policies across geography and demographics, and come up with new, evidence-based solutions. Patients, meanwhile, will be able to independently obtain important information about their health, becoming more informed and proactive participants in the healthcare system.
This depends, of course, on making sure that consumers look to trusted and reliable sources of information. Some genomics companies, such as Helix, are considering partnerships with third parties like the Mayo Clinic to ensure that their customers can properly interpret the genetic data they receive. And that leads to our familiar dilemma: While we usually say that we want more information and more responsibility, it can quickly become overwhelming and lead to bad choices, and/or handing these decisions over to companies with their own interests.
One obvious downside is the potential for insurance companies and employers to use genetic information to discriminate against those at higher risk of disease. This is not a new concern, and legislation already exists, in the form of the Genetic Information Nondiscrimination Act, to prevent discrimination on the basis of such data. The new spike in genomics testing only increases the opportunity for this kind of discrimination, though, so we must continue to be vigilant and update old laws as needed. The trend could escalate in disturbing ways, bringing us closer to the dystopian futures envisioned in science fiction classics like Gattaca. While I am generally a techno-optimist, we must be aware of the dangers. We already have the ability to screen for many diseases before conception, but with proper regulation and ethical leadership, there is no reason we must end up in a world where people are ranked according to their genetic potential.
DNA as currency
Another looming possibility is the transformation of genetic data into a new type of currency. As with almost all technological advances, this is not inherently good or bad. It is, however, a development that we must stay ahead of, before predatory actors can use it to take advantage of people. Do we have a special right to our own genetic data, in a way that we don’t to, for example, our browsing history? In a 2013 US Supreme Court case, the Court invalidated patents held by molecular diagnostic company Myriad Genetics on genetic material, ruling that naturally occurring DNA is not patent-eligible. Patents based on synthetically created DNA, however, were allowed to stand. Determining what information can be used for private gain and what cannot will become increasingly difficult as the field continues to evolve.
Further complicating the landscape, startups like LunaDNA and Nebula Genomics are attempting to turn the consumer genomics industry into a full-fledged economy. These “bio-brokers” want to give people the opportunity to rent or sell their data to biomedical institutions in exchange for financial compensation. One possibility is a co-op system, in which value accumulates as the data set grows, and individual contributors are paid dividends based on the size of their “genetic investment”—a startling phrase! Another business model would require consumers to sequence their entire genome for payment, since complete information is most valuable to researchers.
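The co-op model described above amounts to a simple pro-rata payout: each contributor’s dividend is proportional to their share of the pooled data set. A minimal sketch, with purely illustrative names and numbers (no real bio-broker’s formula is implied):

```python
# Hypothetical sketch of the co-op "genetic investment" dividend model:
# a research payout is split among contributors in proportion to the
# share of the data pool each one supplied. All values are illustrative.

def dividends(contributions, total_payout):
    """Split total_payout pro rata to each contributor's data share."""
    pool = sum(contributions.values())
    return {name: total_payout * amount / pool
            for name, amount in contributions.items()}

# Three contributors who supplied 1, 1, and 2 genome-equivalents of data
payouts = dividends({"alice": 1, "bob": 1, "carol": 2}, total_payout=100.0)
print(payouts)  # carol's larger "investment" earns a proportionally larger dividend
```

Under this scheme, a contributor’s return grows both with their own contribution and, in the co-op variant, with the overall value of the pool as it attracts more researchers.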
The more options there are for such genetic “transactions,” the greater the chances of attracting bad actors and suffering security breaches. We should not wait for a biometric analog to the Cambridge Analytica scandal to take stock of the industry’s privacy regulations and security standards. Any company that enters the space should be held accountable for the important, and deeply personal, data that it accesses. We must keep in mind that, regardless of our best intentions, data inevitably ends up in places we never intended, sometimes by design, other times by criminal hacking. As we continue to turn more and more of our lives into data—as with our finances and communications—we have to weigh the risk of where it could end up.
What should we do, more specifically, to address the security risks of this boom in biometrics testing? We have some examples of existing institutions updating regulations to catch up with the changes. In 2017, the US Food and Drug Administration (FDA) loosened regulations for consumer genetic test kits. The new policy allows companies that have obtained FDA approval for a prior test to begin selling subsequent tests before those tests are individually vetted. The move is an attempt to increase flexibility and adapt to a rapidly evolving marketplace while maintaining existing standards for safety and reliability. (All tests will still eventually be vetted, and high-risk tests are excluded from the new rules.)
There are still plenty of unanswered questions. How much of this field do we want to turn over to public oversight, and how much should remain in the hands of private companies? Should we establish a public database, making biometric data, or certain segments of it, into a non-profit public good, or will removing too much of the profit motive curtail needed advances? As usual, I will end on an open-ended note. I am hopeful that these innovations will open up exciting new horizons, as long as we move forward with equal doses of common sense and integrity.
*** This is a Security Bloggers Network syndicated blog from Blog | Avast EN authored by Avast Blog. Read the original post at: https://blog.avast.com/the-privacy-and-security-risks-of-consumer-genomics-kits-avast