One of the concerns I had when I did 23andMe many years ago was that someone in the future would be able to recreate me Jurassic-Park-style. Now I'm not an expert, but based on my reading of how they collect data, it turns out 23andMe doesn't actually have my DNA sequence data.
They do what is called "genotyping"[1] which is much cheaper than full "sequencing" [2], but which only probes for a limited set of known variants. So it's only partial information.
Since then 23andMe has launched a more expensive 23andMe+ Total Health offering, which does full sequencing, but like most people, I never subscribed to that package. In fact I had no occasion to interact with the company after the first result, which may be why they are struggling financially.
DNA testing is something most people might do at most once in their lives -- then they lose interest.
Err, so what was the one you were thinking of? I've seen The Island before, so would be potentially interested in something else along the same lines... :)
The point of this attack vector is to drop patients who are high risk BEFORE they get diagnosed so they can claim it isn't for any existing condition.
This is especially scary in small, closely related communities like Iceland or Utah where a relatively low number of collected samples can be extrapolated to large swaths of the population that didn't give their genetic info away.
Also that protection is relatively new in the US and is constantly being attacked politically so it isn't unreasonable to think about a world where it is removed like other longstanding health related protections.
The Genetic Information Nondiscrimination Act is almost as old as 23andMe and older than the prohibition on discrimination for preexisting conditions, so it would be strange to come up with a hack where you break the first to try to get around the second.
Also, since 1991, many US states had already passed laws that prohibit insurers' use of genetic information in pricing, issuing, or structuring health insurance.
If you want to hand wring about future changes you don't need to look at dubious inference of risk from DNA. Just consider cancellation of insurance policies if you come down with some illness that requires expensive long term treatment.
And they were prohibited from considering genetic information by the Genetic Information Nondiscrimination Act before they were prohibited from considering preexisting conditions more generally by the Affordable Care Act.
23andMe being a Google spin-off I wouldn't be surprised if they do a full sequencing regardless of whether you pay for it or not. At the very least I would expect them to store the samples indefinitely so they can do it once it's economically viable.
As I understand it, full sequencing is a _lot_ more expensive. 23andMe currently shows genotyping at around $100 while sequencing is $1,000. It's probably too expensive and time consuming for them to sequence everybody's samples unless customers pay for it.
Housing the samples in a controlled environment for an extended period of time is likely way more expensive than sequencing. I suspect the samples are discarded once genotyped.
I'm no biologist but the samples seem to survive being shipped by standard mail so it doesn't look like they would need very strict storage requirements. If for any reason they do, since they take very little space it shouldn't be too expensive per-sample. What am I missing?
Well, the honest answer is "it's complicated"
DNA-wise, yes: there are about 3 billion bases and they sample about a million of them, so roughly 1 in 3,000 bases.
The thing about those bases is:
1. most of them are the same in everyone
2. the ones that are different tend to be correlated with each other locally, and are thus captured by their ~1M-SNP assay
so
3. you can infer most of the stuff they didn't assay with pretty high confidence (see the toy sketch after this list). Not an "in theory" thing, more like "has been a typical thing to do in the statistical genetics field for at least a decade"
4. outside of the genome there is the epigenome, which may or may not be relevant; it undergoes very specific resets shortly after fertilization
also worth noting
5. bioinformatics is an imperfect, algorithm-based science. Reads are aligned according to error and difference profiles (i.e. string mismatches, and which kinds of mismatch are penalized most and least). So unless things are finely calibrated (deeply studying the read-alignment penalties for a particular data set, a level of analysis most bioinformaticians don't do), even a robust, bug-free read aligner correctly applying its penalties will produce bad alignments, false positives and false negatives
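To make point 3 concrete, here's a toy Python sketch of the idea (the reference panel and positions are made up; real imputation tools like Beagle or IMPUTE2 use panels of thousands of haplotypes plus hidden Markov models, so treat this as the one-slide version): because nearby variants travel together, matching a customer's typed positions against reference haplotypes lets you guess the untyped ones.

    # Toy imputation sketch (made-up panel; real tools use thousands of
    # reference haplotypes and proper statistical models).
    # Each haplotype lists alleles at 4 nearby positions; position 2 is
    # NOT on the genotyping array, the other three are.
    reference_panel = [
        ("A", "G", "T", "C"),
        ("A", "G", "T", "C"),
        ("A", "G", "T", "C"),
        ("T", "C", "A", "C"),
        ("T", "C", "A", "G"),
    ]
    TYPED = (0, 1, 3)   # positions the array measures
    HIDDEN = 2          # position the array skips

    def impute_hidden(observed):
        """Guess the untyped allele by majority vote among reference
        haplotypes that match the customer at the typed positions."""
        matches = [h for h in reference_panel
                   if all(h[i] == observed[i] for i in TYPED)]
        if not matches:
            return None, 0.0
        counts = {}
        for h in matches:
            counts[h[HIDDEN]] = counts.get(h[HIDDEN], 0) + 1
        best = max(counts, key=counts.get)
        return best, counts[best] / len(matches)

    customer = {0: "A", 1: "G", 3: "C"}   # what the array actually saw
    print(impute_hidden(customer))        # -> ('T', 1.0): untyped base recovered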
anyways I for one hope the future people will clone me from the bytestream
You see it all the time in videos, where they demand you not view with sound off. Every time I see it I can only think "No, I don't think I will. Get absolutely fucked."
Right, I've seen this occasionally. I don't really mind since it's usually because there is some unexpected audio that you might miss if you're watching on mute.
> However, 23andMe uses a laboratory that must follow regulations under the Clinical Laboratory Improvement Amendments, or CLIA. This means that some data, including your DNA, sex and date of birth will be retained in order to comply with these regulations. The company will no longer use that information, though. You can read more about the company's deletion process here.
I misinterpret 23andme results and believe I have a horrific genetic condition. I then delete my data, commit suicide, and my family sues the company.
I’m not saying this is what they are worried about, but it could be something along those lines. I work in big finance, and there is a LOT of regulation around data retention, and it’s a lot more nuanced than people think.
Should we be allowed to delete the data? Absolutely. This will likely be a landmark case setting precedent for the future.
I looked at the 1988 law; nothing in there said they had to keep DNA, and it wasn't even mentioned. They are lying or being very liberal with their interpretation of the law to benefit themselves financially.
it's easy to delete your account and "data", but their policy claims they keep a bunch of it around. I just emailed their privacy email address to find out if they can provide more information.
Gattaca here we come. Can't wait until I start getting denied for service/opportunity X because some computer somewhere used DNA from a distant relative to determine I'm a risk.
It's not just people who gave them data who should be worried. If a close relative gave their info to 23andme, they probably have enough data to associate your DNA with that relative. For instance my mom did 23andme and now 50% of my DNA is in this private commercial database without my consent and this data is completely unencumbered by HIPAA privacy restrictions.
Food for thought: they found the Golden State Killer using the DNA of a third cousin. So in a familial pool of hundreds of people, if even one gives up their DNA then you could (with effort) identify almost anyone else. Provided they're in a good spot on the tree; you can't do much with a 5th cousin.
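Rough numbers behind that (a simplified expectation; it ignores recombination variance and the minimum segment lengths real relative-matching uses, but it shows why a third cousin is workable and a fifth mostly isn't):

    # Expected fraction of autosomal DNA shared with a relative:
    #   share = (number of shared ancestors) * (1/2) ** (meioses on the path)
    relatives = {
        # name: (meioses on the connecting path, shared ancestors)
        "parent / child": (1, 1),
        "full sibling":   (2, 2),
        "first cousin":   (4, 2),
        "third cousin":   (8, 2),
        "fifth cousin":  (12, 2),
    }
    for name, (meioses, ancestors) in relatives.items():
        print(f"{name:15s} ~{ancestors * 0.5 ** meioses:.3%} expected shared DNA")
    # parent / child ~50%, first cousin ~12.5%, third cousin ~0.78%, fifth cousin ~0.05%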
Of course the Golden State Killer was an extreme case. He killed about a dozen people and sexually assaulted over 50 women in the 70s. But it makes you wonder about the future of criminology.
GDPR would be the obvious one, particularly the bits about them not being allowed to refuse service if you don't agree to data processing that isn't strictly necessary to provide the service (e.g. sharing data with insurance companies)
The article makes a point that 23andMe isn't bound by HIPAA, but even if it were, I wouldn't consider that adequate. The bar for collecting and holding PII, particularly medical, needs to be much, much higher than it is today.
A doctor I recently visited whipped out his iPhone and asked if I was okay with him recording our conversation so that some fly-by-night rando AI company could vacuum up our private conversation and spit out some LLM-generated summary of our visit. "Not to worry," he insisted, "they're HIPAA compliant!"
I probably should have walked out of the office right then and there, but instead I simply told him no, not under any circumstances may he record our private conversation and send it off to some third party over the Internet. He seemed a bit taken aback because I guess I am the only patient he's had push back on it. He tried saying that the service "really helped him" or something like that. It seemed like he was trying to make me feel bad for "making his job harder."
I simply replied that HIPAA compliance didn't prevent the last 5 or 6 letters I've received from both hospitals and insurance companies about "cybersecurity events" leading to the compromise of my PII. And not just any PII, mind you. It was my medical information, supposedly "protected" by HIPAA. These were major insurance companies and hospitals. And you want me to believe that some fly-by-night AI startup is going to somehow be a safe place for a goddamned fscking full audio recording of our private visit, just because they claim to be HIPAA compliant? Are you kidding me?
I've made it a point to start writing my representatives in government about these issues. They need to wake up and start doing something meaningful to protect the people who are being bamboozled by all the yahoos who play fast-and-loose with their privacy, especially medical PII.
I had a similar experience where I was also assured the data was “doubly protected, it’s secured by a password here and re-secured at the remote site.”
Besides immediately making me question their security, it is a great example of how people trust things without much thought. I've heard calls for statistics to be pushed over calculus to improve math literacy in the general population; perhaps some cybersecurity courses should be pushed over "learn to code" to improve tech literacy.
I’m not sure it really matters in practice at this point.
As a condition of getting a flu and covid vaccine, CVS made me agree to give them permission to share my medical history, test results, etc. with my employer and their affiliates.
Just thinking here: is it possible that's a catch-all disclosure agreement aimed at employers who require certain vaccinations (I know CVS offers TB shots, for example, which are mandatory for working with some older/vulnerable populations), and this agreement lets CVS send those records to employers when requested?
Either way, it's still a too-broad agreement, but my assumption is that CVS thinks it's easier to opt everyone in by default than to ask patients to opt in as needed, and then inevitably have some patients not opt in when they should have, and then deal with the resulting bureaucratic nightmare when the nursing home they work for calls and demands to see immunization records.
HIPAA is a joke in the first place. How to implement HIPAA compliance is entirely up to the company dealing with the data. There are no prescriptive standards to protect your data. Who isn’t HIPAA certified? It has to be the easiest thing to certify for from a technical perspective. Research teams run records through some NLP shit to depersonalize them, but we all already know it’s trivial to reverse engineer that data to its origin.
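A toy illustration of why "depersonalized" rarely means anonymous (all records here are invented; this is just the classic linkage attack on quasi-identifiers like ZIP, birth date and sex):

    # Made-up "de-identified" health records that still carry quasi-identifiers.
    deidentified = [
        {"zip": "02138", "dob": "1954-07-31", "sex": "F", "dx": "hypertension"},
    ]
    # Made-up public dataset (think voter roll or data-broker file).
    public_roll = [
        {"name": "Jane Doe", "zip": "02138", "dob": "1954-07-31", "sex": "F"},
        {"name": "John Roe", "zip": "02139", "dob": "1961-02-14", "sex": "M"},
    ]
    QUASI = ("zip", "dob", "sex")
    for rec in deidentified:
        hits = [p for p in public_roll if all(p[k] == rec[k] for k in QUASI)]
        if len(hits) == 1:            # a unique join re-identifies the record
            print(hits[0]["name"], "->", rec["dx"])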
HIPAA is a legal framework to describe lawful disclosure of health information- defining who and when, and what steps must be taken when unauthorized / impermissible disclosure happens.
It is technologically agnostic, because it applies whether your doctor is fully remote and everything uses electronic records, or if the provider is still using pen and paper and carrier pigeons.
For actual security details, there may be some regulations that came with the mandate for electronic records, but nothing in HIPAA itself. For that, you want to look for organizations that have a certification like SOC 2 or similar.
> HIPAA is not a joke, employees can be held personally liable for breaches
Okay, great. So which employees were held personally liable for these two breaches? I got "The Letter" telling me I was one of the victims for both of them.
It is possible your doctor doesn't fully understand the concerns here. Or maybe he does and doesn't give a shit. If it is the first case, maybe there is some hope - we can try to educate the doctors.
I don’t know how to accomplish this, but we need to educate as many people as we can about privacy
It's a technical problem. Computer is still Magic Box to most people. All they know is marketing. There's a lot of lies around software and it's trivial to get someone, anyone, to upload even very compromising information. Just say you're secure, that you got certification X and businesses A - F use you.
Hell, even tech companies fall for this. They'll upload their shit to some cloud without setting up proper guards because "oh well it's Atlassian". Hope you don't have anything too compromising in that Jira!
Don’t conflate doctor/patient confidentiality and data security. If someone broke into an office and stole medical records, that’s not a violation of doctor/patient confidentiality, even if the doctor chose crappy locks on their doors.
> Yet another example why "HIPAA compliant" means nothing.
"HIPAA compliant" doesn't mean nothing. It means a whole lot. It's just not relevant here, because - as mentioned at the beginning of the article - 23 and Me is not regulated under HIPAA.
If they say they will destroy it, then I later find it wasn't destroyed and it was used in some way that was harmful to me, doesn't that make the company massively liable?
That's the only reassurance I have. It would be like illegally collected evidence. Sure, you may have some evidence (my DNA), but it's not admissible because you're not supposed to have had access to it.
IIUC, there are hard rules about admissibility of evidence, and you're aware when an attempt is made to use evidence that way, and have an opportunity to argue it, in a very enforceable way.
Are there analogous rules around the DNA sent to 23andMe, would you have an enforceable opportunity to block uses of that, and would you even know when they're using it that way?
I looked it up, and I'm mistaken: in their "privacy" statement they say they retain your "Genetic Information" because it's required by law, which from my perusal of the law seems like an extreme stretch of what the law says. It should be enough that they retain your contact info and a brief statement that they took your genetic information; there is no reason to keep all the genome data.
DNA replaced latent fingerprints to place individuals at the scene of a crime. We know fingerprints have led to wrongful conviction. Partial DNA profiles can lead prosecutors to individuals who were not involved.
Legal changes that allow insurance companies to use genetic information to raise premiums or deny coverage. Not just for you but for your entire lineage.
Having worked at startups, my guess is that 23andMe doesn't even have the data a malicious company would want. Best case someone will buy them with some evil plan and realize the data is useless anyways.
Governments hunted down my family in the 30s without DNA, so I don’t think random PE firm is going to be able to do anything too evil with my 10 year old cheek cells or partial sequence that they couldn’t do without it. Honestly my name and face give me away.
Finding out that I had all the known markers for Parkinson’s 30 years before symptoms are expected to show up gave me an unbelievable head start and changed the trajectory of my life in several ways.
I read the TOS at the time. Would do again even with the data leak.
I’m glad I at least downloaded my data so I can have it forever and upload it where I like. I don’t regret having 23andme genotype it. I knew the pros and cons when I did it and the pros outweighed the cons for me.
Sending a private corporation your DNA is a bad idea the second you hear about it no matter what the "benefits" are. Why would people, especially those who frequent HN and are aware of the data privacy debacles throughout history even trust them with something like this?
I think that depends on when you used their service. In the last 5 years, then yes, probably very naive. 23andMe was founded at a different point in time, where things looked more optimistic, funding was a different game and we worried less about companies misusing our personal information.
It might not be a huge disaster, but to me the issue is that the company can't make any real promises about how they might profit from the DNA of its customers in the future. It's not a problem unique to 23andMe. I will never sign up to another social network because of Facebook's behavior. I'll never sign up to another service like Gmail, Outlook, YouTube or Reddit; having seen what those companies did and how they behaved, I can no longer trust any online service. The trust that existed in the early 2000s is gone, and the idea that if we didn't like something we could just leave and delete everything is gone with it. I don't envy anyone trying to bootstrap a new service; the previous generation of companies has poisoned the well.
In my opinion this type of stuff won't become a big problem for another couple decades. When DNA is used for more and more stuff.
Just keep in mind the Golden State Killer left his DNA behind in the 70s and was prosecuted in the 2010s, using DNA from a third cousin who used an online DNA service.
Maybe don't kill anyone and you're fine, maybe not. Time will tell.
Any sufficiently large corporation possesses the resources to gain trust by portraying itself as the 'nice guys,' unlike the others, which is often convincing enough to fool many.
The conviction that every corporation is inherently evil or can turn evil at any point in the future never seems to fail, but many people just aren't that skeptical.
UK citizen and 23andMe customer here. How likely is the sale of UK/EU customer data, or is it worth submitting a GDPR deletion request anyway? Get my data deleted before it's sold.
Whatever you do, do it soon, because it doesn't sound like they're long for this (corporate) world, and before they go they'll sell all that data to the (probably much more nefarious) vultures that are circling.
gdpr might help you with data in a web database or data warehouse but if they have anything outside of that you're still screwed. I doubt a failing company has the time, energy, or resources to comprehensively clean up your data everywhere. Definitely submit the request but don't expect it to prevent your info from being resold
GDPR covers all personal data, that would include any DNA. It also includes the prevention of creating profiles without your consent.
But as 23andMe is a US company, it is not under the jurisdiction of the GDPR. The legal situation isn't clear; the EU would claim some jurisdiction, but I (IANAL) think it's more like you go to the US, walk into a Walgreens and give up your data.
According to the GDPR, its jurisdiction is global via “public international law” and mutual government agreements, but you’re right that’s not entirely clear and they are claiming untested jurisdiction. The law says it applies to non-EU companies if the company establishes any marketing or sales presence either located in the EU, or markets or sells to EU residents (which might apply if the company so much as analyzes sales data by country), or if the company is “monitoring” the behavior of EU residents in any way, where monitoring does not seem to be defined in Article 4 so could mean a lot of things including doing anything with collected data or corresponding with customers.
I’m sure there are US companies that happen to sell to EU residents that happen to acquire some PII but don’t know and can’t correlate it with the EU, and so aren’t subject to the GDPR. But according to the law’s language, it seems as though something simple on a company’s website like using Google Analytics, which does identify and “monitor” the behavior of people by location, might trigger GDPR. I might expect 23AndMe to trigger applicability for multiple reasons, including that they are using DNA to identify regional heritage and relatives, the samples may be delivered with EU addresses on them, and the samples are as personally identifying as it gets. That’s on top of whatever the website, account registration, and sale process collects.
The problem with something like Google Analytics is that a company in the EU (an EU company, a US subsidiary, ...) exports PII to the US, which it can't do (the interpretation of the law isn't settled inside the EU, e.g. whether it's legal if GA doesn't store IPs, or whether using GA without consent is generally illegal).
And exporting data to the US is illegal because US companies can't guarantee that the EU citizen data is protected (which is the goal of the GDPR).
But then again, it is not clear if this applies if an EU citizen goes to a company in the US (real or website in US datacenter) and leaves their data there.
Notably, the GDPR applies depending on customer jurisdiction rather than company jurisdiction. If they’re serving EU (or UK) customers, the GDPR definitely applies.
> The GDPR is retained in domestic law as the UK GDPR, but the UK has the independence to keep the framework under review.
The UK GDPR. It’s like the GDPR, only with a Union Jack and a bulldog slapped on the side.
Now, in practice, companies seem significantly less scared of the ‘UK GDPR’ than its full-fat European progenitor (probably for good reason; even before brexit, ICO was one of the less aggressive regulators, with its largest GDPR fine ever only being 20mn pounds), and of course the EU has a number of _newer_ consumer protections in this general area (DMA, DSA, AI Act etc) which the UK has _not_ implemented, but, for the moment at least, the UK still has some degree of data protection.
Yes, because of "The GDPR applies to 23andMe because we market and provide the Personal Genetic Service in EU Member States through our UK, EU and International sites."
The problem is that the EU parliament thinks this does not work, because US companies can be (secretly) coerced into giving data to the US government, even without telling the affected EU citizens (the EU commission has a different view). And the EU citizens have no way of going to court over this. And a US company can't guarantee in any way to protect EU citizen data.
Which is also the reason that all the *Shields failed and were killed by EU courts [0]
The view of the parliament is that, as a company, you can't export personal data to the US at all, so 23andMe can put whatever they want on their website: either they don't export data to the US (my Walgreens example), or they do, in which case they do it illegally.
So I (again, IANAL) would say this is marketing speak aimed towards users and has no relevancy.
I agree that the EU–US data transfer frameworks are unlikely to provide complete privacy safety, and this is an open problem. However, I was addressing whether 23andme is subject to the GDPR or not, and it clearly is. The data transfer frameworks are what supposedly allows them to transfer data to the US and still be GDPR-compliant. But regardless of whether they are actually compliant or not, they are indisputably subject to the GDPR.
That's not how GDPR works. GDPR doesn't care where your company is registered or does business; if they process the personal data of EU citizens then GDPR applies.
I was an Estonian resident a while ago, and I wanted to delete data in my old VK.com account (a Russian company). They didn’t do anything, naturally, so I wrote to Estonian data protection inspector or something. They said that (surprise!) they can’t do anything either.
Things might be better now, but my bet is if you register a company in, say, Seychelles, and your business is purely digital, you can ignore GDPR all you want.
EU can, in theory, tell payment processors to stop working with you, but I haven’t heard of such cases. Even then it won’t help if you don’t sell anything (apart from user data).
Some EU countries have started blocking websites (by spoofing DNS) – this could actually work to put some actual pressure on non-compliant companies, but also is kinda too authoritarian for EU?
Tl;dr: GDPR has good intentions, it just doesn’t work right now if the data processor is not in EU.
Correction: replace "EU citizens" with "people in the Union". That's how GDPR describes the people it covers. It's where you are that matters for GDPR rather than citizenship.
No, if this is the case, they can't service EU citizens at all, because US companies can't hold any EU data: they can't protect EU citizen data.
The only way to service EU customers is if we assume that entering data on a US website is not the US company exporting data from the EU to the US. Just like when I go into a Walgreens in NYC as an EU citizen.
For the last decade US and EU companies have ignored the fact that it is/was mostly illegal to transfer EU citizen data to the US (it is currently legal but will be illegal again). Also, every EU company that exports data to the US (e.g. by using Mailchimp) needs to guarantee the safety of the data by auditing Mailchimp; no one does, and there have been no fines so far, but I assume there will be in the future.
"The EU parliament raised substantial doubts that the new agreement reached by Ursula von der Leyen is actually conform with EU laws, as it still does not sufficiently protect EU citizens from US mass surveillance and severely fails to enforce basic human digital rights in the EU. In May 2023 a resolution on this matter passed the EU parliament with 306 votes in favor and only 27 against, but so far has stayed without consequences."
Someone randomly walking into a Duane Reade in Seattle and purchasing a device would not reasonably be covered under the GDPR.
However if 23&me were targeting European citizens that would be different.
Despite what the adtech industry likes to claim online, Bobs Burger Joint in Baltimore does not have to be specifically concerned about abusing their customers data even if a customer happens to be an EU citizen.
Now if they shipped frozen burgers to France online then sure they would. If they sold “merch” in euros they would. But a local store with a physical premises trading in person? Not covered.
A European citizen living in Austin buying from Amazon though, could well be covered. Amazon do target EU citizens
It depends on the ToS they had at the time. When they started, they explicitly had protections (privacy, data handling) only for US customers, pointing to some local law, with no details on how data and samples from outside the US would be handled. And that's why I never used their service. I think the GDPR route is well worth a try, good luck.
Despite my curiosity, for privacy reasons I made the decision to not use 23andMe. (Basically - feels like information an insurer will inevitably want to use against me.) My wife did, however, and over the years our kids did too, for various reasons (an interest in genealogy, a kid with celiacs looking to trace the genetic component, etc).
Recently I was very surprised to look at the app on my wife's phone and see that they have a shadow account for me with a lot of details filled in, due to my wife/kids/siblings/cousins having used the service. I should not be surprised -- this is how they caught the Golden State Killer, after all.
When you say your wife "used 23andMe", do you mean she merely signed up for ancestry features (e.g. 23andMe Family Tree) without submitting her own genetic data (i.e. to see what happened when a relative submitted theirs), or that she also submitted her own genetic data?
While I agree with you that anyone worried about privacy was a bit short-sighted to use 23AndMe, it's the same deal with everything, everywhere. Your favorite social media platform surveils all your posts/pictures/videos and is probably training an AI model on them by now, even though when you signed up for Friendster in the 1940s all of today's AI developments were distant sci-fi fever dreams. Outlook and Gmail feast on your emails. Your car is sharing everything you do with the manufacturer and the dealer and probably the government. Your washer is using up huge amounts of data for some reason, and you likely didn't even set up the app it has for reasons unfathomable.
Data is money, so of course companies will pull out every stop to harvest it, monetize it, deprive you of control over it, and ransom you with it.
It really isn't. Lots of people are much more trusting than most tech people I know, who know exactly how easy it is to replicate data, keep data forever, and in general misuse data. Techies and ad people have known for decades.
This is a genuinely interesting ethical question. While our phenotypes (bodies) are separate, our genotypes are very much a shared resource (at least for read access); an extra special case are monozygotic twins, where if you obtain sample from one of them, you just mapped both.
Fortunately we don't see applications like "personalized poisons" yet, but it is likely inevitable.
If, say, an insurance company denies you some policy because of what they learnt from your relative's DNA, you suffered a concrete harm from that sampling decision.
Ironically, I think it’s this kind of attitude that creates an environment where the parent is rightfully upset.
We are not isolated units. Almost all our choices have impact on others. Lack of a shared culture creates societies where people are rightfully scared what the next isolated unit will do with their sensitive data.
If you can infer things about living relatives from a DNA sample (preexisting conditions, for example), then you should need their consent to release that sample. It’s not only your information.
Because I don't like what her DNA can say about me or my kids. This woman and I share enough DNA for this to be potentially pretty rough.
Secondly I don't trust anyone with that information because even though I trust how it might be used today, I don't know how it might be used in ten years.
The US Government already has my DNA. Because of 23andme, I was able to discover I had one copy of delta-32, and that’s pretty cool.
I was also able to find out where I came from and connect with distant relatives. To those who are tightly connected with their huge family, you’re privileged.
I’d be sad if this resource went away but I don’t fear it being used for nefarious purposes. I can rest assured the US government is already miles ahead toward that end.
If I was worried I wouldn't have furnished my DNA to a corporation with very little accountability in the first place.
I got enough out of the deal (instead of nothing from the government) that it was in my mind an acceptable tradeoff. No one's about to start cloning me.
Your DNA is not secret. You leave it everywhere you go. You have no reasonable expectation of privacy for your litter when you litter. It's only a matter of time and of tech before everybody has a copy of everybody's DNA.
That is the crux of the entire privacy argument. Why strive for privacy when "I have nothing to hide?" Also, how sure are you that having a copy of everyone's DNA data will become widespread? At a minimum, perhaps if you delay making the data easy to extract one can at least hold out hope that privacy laws will catch up. Of course, there's zero guarantee in that happening either.
Lastly, security through obscurity is not something to be relied upon. But it can work for a period of time.
There is a list of reasons several agencies in the US government, like the FBI, collect DNA from some people, but they don't have DNA for all US citizens and I don't expect the government to have mine. What nefarious purposes do you imagine the government has? Is matching suspected criminals against the crime database a nefarious purpose?
Would you care if 23AndMe sold your DNA & analysis to a private for-profit medical insurance data provider who could recommend hiking your price or denying coverage, based on your genetic markers, without having to tell the insurance company why and without having to share your DNA? This is one of the private business nefarious purposes I worry about, based on having a friend who worked in credit processing saying that they were looking for legal ways to sell purchasing habits to medical insurance companies.
Perhaps it was inadvertent diction, but your use of “imagine” appears to ridicule my opinion. Not cool.
I've done time with an individual who was (I believe) wrongfully convicted due to genetic genealogy. A lay jury watches Law and Order, hears "DNA", and will proverbially buy the Brooklyn Bridge from prosecutors.
Get too unpopular with those in power, and maybe your DNA can be traced to a shell casing for an unsolved assassination a continent away from you.
Annie Dookhan got thousands upon thousands wrongfully convicted on her doctored drug tests. Someone just like her could do it to you or someone else with your DNA test.
There are laws against insurers citing preexisting conditions to deny coverage, and most DNA is equivocal as to whether you’ll develop expensive maladies. So that doesn’t worry me either.
> There are laws against insurers citing preexisting conditions to deny coverage, and most DNA is equivocal as to whether you’ll develop expensive maladies.
In the US, those laws have been under persistent attack by Republicans since enactment, and there hasn't been a major election cycle where their repeal wasn't a campaign dog whistle[1].
And since when has for-profit industry required unequivocal evidence to strengthen their balance sheets and fatten their bottom lines?? These gamified business decisions are always beyond opaque and the burden of proof is always unfavorably shifted onto consumers in harm's way.
I’m confused by that. I didn’t ridicule you, no need to make negative assumptions. I’m simply asking what you know about “nefarious purposes”, given that the government certainly doesn’t admit having such intentions.
Okay, yes convictions can be messy and wrong, and juries can believe stuff from TV that isn’t true. Neither of those demonstrates government intent. None of the lawyers nor the juries nor the producers of Law and Order necessarily work for the government. You complained about my use of “imagine” and then threw out a completely hypothetical and vague scenario (three, actually). Even abuses of power by government employed individuals seeking some kind of retribution don’t demonstrate nefarious government purpose on the whole.
There are laws against wrongful convictions and untrue testimony and abuse of power too. Annie Dookhan went to prison, and convictions based on her false evidence are being dropped and overturned. Why do you choose to feel safe with insurance laws made by the government and not trial laws?
Personal experience. Unlike most, I have been wrongfully convicted on fabricated evidence but never denied insurance coverage.
I strongly encourage you to get in the habit of proofreading your posts for tone. You write with pique, a habit I find familiar, as I used to do the same when I was younger.
It’s not just what you say but how you say it, and tone can either further your contribution or get in the way.
I’m sorry my use of “imagine” offended you. I did not intend for that to be a slight, but I apologize that it came off that way nonetheless. I intended it to be an advance acknowledgement of the fact that it may be difficult to prove the government as a whole has intent to use DNA in questionable or “nefarious” ways. I was simply asking your reasons for making such claims.
I know the government does crappy things sometimes, even things that contradict its own laws. I’m still curious, piqued if you will, about how DNA can be used by the government against me, what things I/we should be potentially concerned about.
Personal experience is fair. It’s also the reason I lean towards fear of DNA being used against me by private for-profit companies more that I worry about the government.
I lack faith in the longevity of laws regarding preexisting conditions, both the one in PPACA and the one in GINA. One vice presidential candidate is currently advocating against continuing the preexisting condition protections. There's too much money in the insurance industry to keep up a bulwark for these protections.
> What nefarious purposes do you imagine the government has? Is matching suspected criminals against the crime database a nefarious purpose?
This is just strange.
Do you have no imagination whatsoever or have you never set foot in school or do you know literally nothing about history (maybe you were born yesterday and really quickly figured out how to write, I don't know)?
The government will try all sorts of immoral things; it is made up of people, after all, and a significant portion of people have no or very weak morals/compassion. The Tuskegee experiments, the human radiation experiments, the Edgewood Arsenal experiments, Project 112, Operation Sea-Spray....
I would be extremely surprised if the US government doesn't have all of 23andMe's data; it's simply too valuable for the black-ops side of government not to get by any means possible.
When you say "the US govt", do you mean whether DHS(/FBI/etc.) freely access the genetic data of ~15m users without warrant (or disclosure), or that various LE agencies have for a decade been requesting specific users to voluntarily share their ancestry/genetic data supposedly for cold cases [0], or whether third parties have also subpoenaed the genetic data of specific users (e.g. paternity suits), or whether 23andMe in future firesale/auction that data to other entities and whether there are limits on how they can exploit it? Anyway I guess some year soon there'll be a precedent case (like the 2019 FL Amazon Alexa voice recordings at the time of a murder), probably too late for public awareness.
Has 23andMe disclosed how many users' genetic data have been accessed by govt or LE agencies or third parties? Is it obligated to? Does this obligation still attach if/when 23andMe ceases to exist? Have any privacy researchers estimated what (ancestry-only?) data got resold on the darkweb since 2023?
Federal laws like HIPAA don't cover 23andMe or the genetic data(!) it holds on 14+m users, because as currently written "HIPAA does not protect data that's held by direct-to-consumer companies [outside of healthcare] like 23andMe" [1]. CA and FL consumer laws give some protection against the company, but no criminal protection against people selling it on the darkweb. Also, it's unclear whether any protections automatically attach in the event of a firesale of assets/bankruptcy.
Hence 23andMe was able to settle the 2023 breach of 6.9m users' data (ancestry data, not actual genetic data) lawsuit for a tiny $30m [2], no criminal penalty, no admission of negligence or wrongdoing, no executive resignations.
Compare to the (suspended) criminal conviction of the CEO in the 2020 Finland Vastaamo psychotherapy center data breach (only ~36,000 patients), for violating GDPR. [3]
Also, regarding the GDPR-related discussion on expectations about cloud sovereignty of genetic and ancestry data: the obvious implication is that if consumers want GDPR to effectively protect their data, they should at an absolute minimum insist on a cast-iron guarantee that sovereignty is enforced. With strong criminal penalties.
[1] https://customercare.23andme.com/hc/en-us/articles/202904610... [2] https://customercare.23andme.com/hc/en-us/articles/202904600...