We’re living in an age of unprecedented access to genomic data; all we have to do is send off a sample of saliva in the mail to a company like 23andMe or Ancestry to get a comprehensive report containing information about our ancestral lineage, diseases we might pass on to our children, and diseases we might develop during our lifetime. There’s no question that understanding genomic data affords a great benefit to many people, but there’s a tradeoff that’s critical to acknowledge, and it’s one of privacy. Where does our information go once in the hands of these companies? Can it really be deleted at the press of a button, as we are led to believe?
If you have been under the impression that you have control over your genomic data, even after it’s been tested by a company, you aren’t alone. Most people don’t realize that once a sample of DNA undergoes health-related genomic data analysis, federal law dictates that it must be saved. In other words, it would be illegal for a company like 23andMe or Ancestry to delete it. If this were more widely understood by the public, it might change the frequency and ease with which we hand over our DNA. This might be particularly true if we were more cognizant of the fact that our DNA doesn’t just contain information about ourselves, but about those related to us. Kristen V. Brown, a reporter with Bloomberg News, joins the podcast to discuss all of this and more.
Richard Jacobs: Hello, this is Richard Jacobs with the Future Tech and Future Tech Health podcast. I have Kristen V. Brown. She’s a future of health reporter for Bloomberg News, and we’re going to be talking about genomic data and the difficulty of deciding what to do with it once it’s quote-unquote out there. So Kristen, thanks for coming. How are you doing?
Kristen V. Brown: I’m doing great. Thanks so much for having me.
Richard Jacobs: So tell me what have you been reporting on in this arena that you want to talk about?
Kristen V. Brown: I think that we’re in this really interesting moment, right, where the human genome is more accessible than ever. You can pay $100, send away a tube of your spit, and find out about your ancestry, find out about your health: what diseases you’re at risk for, what diseases you’re at risk of passing on to your children. But that accessibility means the information is also potentially accessible to people we may not want to see it. I think that we’re in this really interesting moment where a lot of health stuff is not in the clinic. You don’t go to your doctor’s office for these tests; it’s on the consumer side. But most of the existing protections are within the health world. So we have these health things in the consumer space, and they don’t have the protections that health information typically has. So we’re seeing a lot of tension at this moment, because as things become more accessible, there aren’t the protections that typically exist to make sure the information doesn’t fall into bad hands.
Richard Jacobs: So what were some of your personal experiences? Did you have data at various places, and did you tell them, hey, delete it, and then they would or wouldn’t?
Kristen V. Brown: Yeah, so this is a really interesting thing, actually. A lot of people think, once my data’s out there, I can just get it back from the company. I think this is a lesson outside the world of DNA, too. If you share your data with a company, first of all, it’s unclear how many people they have shared it with. So you may be able to get it back from that company, but it’s unclear where else that data has traveled. The best way to protect your data and your privacy is always to not give anybody your data. But the interesting twist in the world of genomics is that if you are doing a DNA test that involves health in any way, that information is governed by CMS, the federal agency. One of the things they do is make sure the laboratories processing your data are up to snuff, right: that their machines are calibrated correctly, that they’re basically practicing good, sound science in the lab. And one of the requirements of those rules is that labs hang on to samples, so that if there’s ever a problem, they can rerun the test and see where the problem was. So it’s actually a rule that exists for good reason. It’s to make sure that the information you get back when you do any kind of lab test is quality, and that you can check that quality. But the side note there is that if you want to delete your data from, say, 23andMe, they actually legally can’t just delete it because you asked them to. I mean, they can delete it from certain places, but they’re required by federal law to hang on to some of your information so that there can be quality checks in the future if necessary.
Richard Jacobs: But that’s crazy, it’s your data.
Kristen V. Brown: Well, it’s not crazy. I mean, the federal rule exists for a reason. It’s to make sure the information you receive is quality, and to hold these companies accountable.
Richard Jacobs: What if you fill out a waiver and you just acknowledge: okay, I understand you can’t check on the quality; by deleting this, I agree that I will never challenge the quality of this data. I mean, that sounds like a more sensible thing to do.
Kristen V. Brown: Maybe, but I mean, you’re talking about changing federal regulations. So that piece is not up to 23andMe or any of those other companies. The companies are just complying with existing federal laws and regulations. But I think what’s crazier is that the companies do give you the illusion that you can delete your data. They all have a button that says delete data, or an email address you can write to ask that your data be deleted, and they do not disclose that they actually can’t delete everything. So I think that’s where it’s misleading: there is the perception that if you are no longer comfortable with these companies housing your data, you can delete it, but they actually can’t if they’re processing health information.
Richard Jacobs: Oh. So do you believe that change is possible, or do you think there’s nothing we can do?
Kristen V. Brown: I think that it’s very difficult to change federal regulations.
Richard Jacobs: Oh yeah, those regulations got there in the first place for a reason. I mean, what were you hoping to accomplish with your article about this? It’s like, you know, calling attention to this, which is very important. But then what?
Kristen V. Brown: I think the most important thing is not that doing a DNA test is bad. Some people learn really interesting information about their heritage that’s important to them. Some people learn interesting and important information about their health. But I think the most important thing is that people are aware of just how much they’re sharing. Because when you share your spit with a 23andMe or an Ancestry, you are sharing very sensitive information, not just about you but about people related to you. If my genetic information became public, somebody might also infer things about my brother or about my mother. So it’s even more exposing than other kinds of information we share on the internet with big tech companies, because most of that information doesn’t also expose people we’re related to. So I think the key here is that people understand the privacy trade-off they make when they share this information.
Richard Jacobs: All right. So in certain circumstances it can’t be totally deleted. Is it still subject to subpoena by law enforcement if needed, if they find out?
Kristen V. Brown: Law enforcement wouldn’t subpoena your actual genetic sample. I think that’s a tough question to answer, because it depends on how individual companies, like Ancestry or 23andMe, structure things.
Richard Jacobs: Hey, if you don’t know, that’s totally okay. I’m just pointing out that maybe there’s a possibility.
Kristen V. Brown: I just think it’s not useful to raise a question that is hard to answer well.
Richard Jacobs: So in terms of the sharing, are you able to tell these companies not to share data, or do they just do it anyway, buried in their terms of use? What did you find out in that regard?
Kristen V. Brown: Well, I mean, it’s similar to your information when you go to the doctor’s office. We have a misinformed idea that if I go to my doctor, my doctor is the only person who sees my information, but actually there’s a huge market for buying and selling de-identified health information. Some of it’s for good reason: for research, or to understand more about insurance billing. Any health information is extremely valuable. And then there are also cases in which your information is shared just as part of doing business. You need to share it with the cloud server that hosts the electronic medical records. So 23andMe and other companies always disclose in their privacy policies that they are sharing your information in the course of doing business. For example, they’re obviously sharing your information with the laboratory that processes the data, and with anybody who is hosting web content. So there are a lot of people who see your data just as part of doing business. And then the second piece, which you are able to opt out of, is whether you want your genetic information used for research.
Richard Jacobs: So do you think this data could just leak out because of data breaches, or could it be going to all kinds of third parties regardless?
Kristen V. Brown: When you share any data, there is a risk of it being exposed, and the more people see that data, the bigger the risk of exposure. The more times that data gets transferred from one place to another, the more risk there is. So again, I think this comes back to: it’s important to just know the risks when you share your data, especially today, in this world where every company is contracting with Amazon and Google cloud services, where there’s a third-party lab. There are a lot of people who touch our data when we share it, and the more connections there are, the bigger the risk of exposure.
Richard Jacobs: So what kind of questions do you think you could ask to determine how your data is being shared, and whether we should worry about it?
Kristen V. Brown: Again, these companies make clear that your data is being shared for research if you opt in, and most companies are very clear about who they have research partnerships with. They’re also clear that there are third parties they share your information with in the course of doing business. To my knowledge, these companies do not typically share the names of all of those third parties, because to some extent that might be like giving away the secret sauce of who they’re contracting with for all these things. So there’s not necessarily a way to see everybody who looks at your data.
Richard Jacobs: Well, maybe try going to one of these places. If you went to one of these places and you said to them: I read your terms, wonderful. You can’t disclose everyone you share it with, fine. But I have the stipulation that if you share it with anybody, it has to be pseudonymized. You could share the data portion but not the personal data.
Kristen V. Brown: In many cases, the data that is being shared is anonymized, or what’s usually called de-identified when we’re talking about health or medical information. But the thing about genetic information is that it’s inherently identifying. So just because my name, my sex, my address isn’t attached to it doesn’t mean that you couldn’t infer that it’s me. This is a particularly tricky thing about health information, and especially genetic information: it’s possible to re-identify somebody even if the genetic information has been de-identified.
Richard Jacobs: Yeah, I’ve heard that some mathematicians have worked out ways to do that in certain circumstances. I don’t know which ones, but sure, it’s possible. I guess there’s no hundred percent bulletproof way to do it, but people at least should be aware of the level to which they’re either protected or not protected.
Kristen V. Brown: Right. And I mean, that’s the thing. At this point, if you wanted to re-identify somebody from de-identified data, it would be a lot of work. You would have to be highly motivated for some reason to do that. It’s not just some bored 15-year-old doing it after school. But it’s hard to tell how easy or difficult that may be in the future as technology advances, and it’s hard to tell how revealing the genome may be in the future as we learn more from it. So again, the important thing is to just be aware that if you share your genetic data, your privacy is compromised. And I think that at this point, in my mind, the best thing is that we need more protections for how genetic information may be used if it is compromised. Right now there are some protections. We have the Genetic Information Nondiscrimination Act of 2008, which makes it so that an employer or a health insurer cannot discriminate against you based on your genetic information. But that law does not protect everything. It does not prevent life insurers from discriminating against you, for example. So I think we’re at this point where it’s become very clear that in the not-too-distant future, many, many people will have shared their genetic information. And that means that even if you have not shared your genetic information, somebody you are related to probably will have, and so your information will still be vulnerable. So I think the best tack at this point is probably less about protecting the privacy of that information and more about making sure that when privacy is compromised, that information cannot be used against someone.
Richard Jacobs: Well, there are countries that are moving toward, or already requiring, all people born in that country to be sequenced, and for that database to be kept. Like Kuwait: as I recall, in 2015 they passed a law requiring all citizens and permanent residents to have their DNA taken for a national database, and then they struck it down.
Kristen V. Brown: So some countries are doing that. It’s mostly small countries, and often places that are motivated to do so for health reasons. For example, Iceland has been very far ahead of the curve, but that’s largely motivated by the fact that they have a small population. They’re trying to prevent bad genetic matches; basically, they’re trying to prevent inbreeding within their population. And in Kuwait, and I think some of the other Arab nations that are doing this, I think preventing specific genetic diseases and also preventing close family marriages is motivating some of that work.
Richard Jacobs: Okay. So what do you think is going to be the near-term future of DNA collection, or other genomic data collection, and where do you think all this is going?
Kristen V. Brown: I mean, it’s getting cheaper, and people are interested. I think that in this current moment there may be a slowdown, because there’s so much awareness of privacy and fear of companies like Facebook and Google and how they’re using our information. But I think in the long term, more and more people are going to be interested in getting their genome sequenced or genotyped.
Richard Jacobs: You’re going to be doing more articles in this arena?
Kristen V. Brown: I mean, I write about this area all the time.
Richard Jacobs: Okay. So plenty more. Excellent. So where’s the best place for people to find more of your articles and your writing, and to see what you’re working on?
Kristen V. Brown: So I’m a reporter at Bloomberg and if you just type my name into the search bar at Bloomberg, you can find everything that I have written.
Richard Jacobs: Okay. Very Simple. Any closing thoughts on this? Now that you’ve gone through the process of trying to get your data deleted from various sources, like what kind of feeling are you left with?
Kristen V. Brown: I would just say that in this time where we are sharing so much information all the time, that it’s just important to be aware of how that information could be looked at or interpreted by someone else. If you want something to never be seen by anybody else, do not share it. That’s the only way to protect your privacy.
Richard Jacobs: Well, very good. Well, Kristen, thanks for coming on the podcast. I appreciate it.
Kristen V. Brown: You’re welcome.