How to delete your 23andMe data amid the company's turmoil

(lifehacker.com)

152 points | by gnabgib a day ago ago

156 comments

  • carimura a day ago ago

    I've been dealing with their support trying to delete my data. Here's the latest response [1]. The way I read it, they won't delete your genetic data, and it sure seems personally identifiable to me. Am I reading this wrong?

        [1] This is a follow-up from the 23andMe Team. Your 
        inquiry has been escalated to me for review. To clarify, 
        once you confirm your request to delete your account, we 
        will delete your data from our systems within 30 days, 
        unless we are required by law or regulation to 
        maintain data for a given timeframe.
    
        For example, your Genetic Information, date of birth, and     
        sex will be retained by 23andMe and our third party     
        genotyping laboratory as required for compliance with     
        applicable legal obligations, including the U.S. Federal     
        Clinical Laboratory Improvement Amendments of 1988 
        (CLIA), California Business and Professional Code 
        Section 1265, and College of American Pathologists 
        accreditation requirements.
    
        It is important to understand that the information stored     
        is distinct from the raw genotype data available within 
        your account. The raw data we receive from the lab 
        has not been processed by our interpretation software 
        to produce your individual-level genotype data (in 
        your account).
    
        You can read more about our retention requirements in the 
        retention of personal information section of our Privacy     
        Statement.
    • drdaeman a day ago ago

      As I get it, it's a federal requirement for a lab to keep genetic data for a while with no way for the specimen to do anything about it.

      So, it's a CDC thing, not exactly 23andMe's fault. Save for the fact that 23andMe advertised on their front page that it's easy to delete your data, with the small print somewhere out there saying that you can't really delete the actual data. To be entirely fair, it was there somewhere (I think in their help center, in some article about the data deletion process) when I went to check out their privacy policies - because that's how I learned about it and reconsidered buying a test - but I guess most people don't read the small print until the deed is done.

      My understanding is that they will delete your data on their side (leaving only a few things like payment receipts), but the lab won't because they legally can't.

      • carimura 19 hours ago ago

        Assuming for a second these federal requirements cited are a) valid, and b) for a good reason, it still says right there in the response that it's not just the lab, sadly.

            For example, your Genetic Information, date of birth, and     
            sex will be retained by 23andMe
        • ErikBjare 18 hours ago ago

          23andMe is the legal entity with a lab, so it could still be "just" the lab.

    • mikrl a day ago ago

      > For example, your Genetic Information, date of birth, and sex will be retained

      Quite possibly the most terrifying thing I’ve read recently.

      • carimura 19 hours ago ago

        it's ridiculous. unless I'm missing something, they basically have this entire fluff piece of a privacy policy, data control, deletion, etc, and then just keep your genetic info after deletion for sale to the highest bidder / nation state.

        • ErikBjare 18 hours ago ago

          CDC policy. They can't sell the data.

          • moi2388 17 hours ago ago

            Don’t worry, it will “leak” instead.

          • carimura 8 hours ago ago

            not sure their privacy policy [1] agrees

                If we are involved in a bankruptcy, merger, acquisition, reorganization, or 
                sale of assets, your Personal Information may be accessed, sold or 
                transferred as part of that transaction and this Privacy Statement will apply 
                to your Personal Information as transferred to the new entity.
    • cypherpunks01 a day ago ago

      I got an identical email, after asking numerous times for them to tell me when all information will be deleted, i.e. when do the compliance requirements expire for my specific account?

      They certainly don't seem interested in answering this question, no matter how many ways I phrase it. So much for "you are in control of your data", I guess it was all BS as some people predicted.

      • carimura 19 hours ago ago

        Ya I asked that also, as well as "is the information you are retaining personally identifiable to me?" but sadly I think I know the answer to that one....

        no response yet. I'm sure the privacy department is busy.

    • csl a day ago ago

      The way I read it, and I may be wrong, is that they will retain the _interpreted_ results, but not the raw and complete data.

    • 10u152 a day ago ago

      So they'll happily delete everything unless it has value to them. Charming.

      • psunavy03 a day ago ago

        It literally says they have legal obligations to retain data and cite the exact laws that require them to. Did you even read the post you replied to?

        • 10u152 a day ago ago

          I did read it. The response is we will delete unless… and it lists a bunch of possibly applicable laws.

          Based on the vagueness of the response (which law in particular, what are the details etc) I’d argue they won’t delete anything ever and claim that they thought they might run afoul of some law.

  • roughly a day ago ago

    I’m in a weird spot with 23andMe - when I signed up, I used a fake name as a fig leaf in case they decided to sell to insurance or whatever. Since then, several members of my immediate family have all signed up, so “the child of X and the sibling of Y” means that fig leaf is pretty useless now - except I can’t issue an actual CCPA request now, because of the fake name.

    All of this is super predictable, but I wasn’t nearly cynical enough 15 years ago when I mailed my spit to them.

    • randerson a day ago ago

      I wonder how many of your relatives now believe your parents had an additional child that they don't tell anyone about.

      • roughly 8 hours ago ago

        Yeah, the first sibling who joined after me texted me immediately when they saw my fake name there :-D

        (Although that turned out not to be the biggest family scandal that was turned up by the genetic testing - the cautionary click through on the "relatives" page is no joke...)

    • verisimi a day ago ago

      If you (and others in the replies) were to go and update and perfect your data, removing all these ambiguities (fake name, dob), you would then be in a position to ask them to delete it. I.e. you'd have to absolutely remove all doubt about who you are in order to then address your privacy concerns. Perverse, eh?

    • bee_rider a day ago ago

      I wonder if you can convince them through the customer service portal? People make typos all the time…

      • junon a day ago ago

        Doubt it. I assume they're under HIPAA regulations and it'd be a massive cost if they did it even once.

        • HeatrayEnjoyer a day ago ago

          HIPAA also includes the right to correct incorrect information in your records. 23 may have to get unconventional to verify the individual but they're a DNA lab and have everything they need to make a positive confirmation.

        • dekhn a day ago ago

          HIPAA doesn't apply to 23&me.

        • bee_rider a day ago ago

          There’s no HIPAA concern if you just want to delete your info, I think.

          • lcnPylGDnU4H9OF a day ago ago

            I don't know enough about HIPAA to know if it would be a violation for them to delete my data upon your request but, assuming it is, that would be a good case for requiring identity verification.

        • chimeracoder a day ago ago

          > I assume they're under HIPAA regulations

          You would be incorrect. HIPAA does not apply to 23 & Me (or, for that matter, to almost any direct-to-consumer product).

      • madaxe_again a day ago ago

        Nope. Same boat. They want ID in the name I signed up with to do anything, and I haven’t been able to access my account since they mass reset passwords after their breaches.

        • lamontcg 17 hours ago ago

          Amusing that they literally have your genetic sequence data, but they won't trust anything other than a government ID with a name that matches their records.

        • bee_rider a day ago ago

          I think it should be easier if the goal is just to get your data deleted. If you want to recover your account, that brings up some HIPAA concerns. But if you are just nuking it, that should be easier, right?

          • ErikBjare 18 hours ago ago

            No, security shouldn't be relaxed because you are "just nuking it".

            • bee_rider 13 hours ago ago

              Yes, it should be. It is a “we’re giving away your personally identifiable medical information” issue, if they give you access. It is a normal customer service issue if they are just deleting the account for you.

  • filchermcurr a day ago ago

    I lied about my birth date and apparently there's no way to delete your data without the fake date or a photo ID... with the fake birth date...

    sigh

    • bee_rider a day ago ago

      Have you tried emailing them a bit? It is worth a shot I think: you made a typo (people make them all the time), but you don’t really need to fully authenticate, because you are just making a deletion request anyway (not trying to access the data).

      (Also keep in mind, customer service people have to argue with assholes all day long; staying polite, clear, and on-target can go a long way. Stick to the topic and never give them an excuse to cut off communication.)

      • filchermcurr a day ago ago

        Yep. That's how I learned that I need the photo ID with the original date of birth. My last e-mail was asking if there is any other way to delete the account but so far I haven't heard anything.

        • xelamonster a day ago ago

          I think the GP was suggesting, and imo it's a good idea, that you should first ask them to update your birth date saying you entered it incorrectly. Then you can request deletion again and your ID should be accepted.

          • filchermcurr a day ago ago

            That was how I initially approached them. I told them I made a mistake with my birth date and asked if there was any way to correct it. They asked me to send a photo ID with my name and birth date, which I assumed was to replace the old date. I then realized they meant the ID had to have the fake date and we went 'round and 'round (and 'round) on that until I gave up on them understanding what I actually wanted to do. Now I'm trying the more direct 'Please help me delete the account.' approach. I'm not hopeful, but you never know!

            • bee_rider a day ago ago

              I think the more direct approach has a good chance. They can’t update your birthday, reasonably IMO, because they would be giving you access to the data after you failed the authentication. But they wouldn’t actually be leaking any info by just deleting the account.

    • j-bos a day ago ago

      Thanks, I want to get my genes sequenced but I'd also like to get my records deleted from the service provider. I guess it'll have to be under my real name?

  • FloatArtifact a day ago ago

    They might delete it from their database, but that doesn't change the fact that it's been sold and shared in ways we can't follow up on to have that information removed. There's no transparency. It implicates not only you, but your relatives and future generations.

    Genetic testing done through a hospital for a completely unrelated procedure can impact your life insurance (for example, genetic testing for a child). Minnesota state law prevents health insurance from changing as a result, but laws need to protect the right to know, not just regulate the right to use genetic information.

  • resters a day ago ago

    I tried to download my raw data recently and it took days. Seems like a lot of customers are trying to download it and cancel after the turmoil. I think 23andme has always been held hostage by its scientists who have stopped it from offering a lot of entertaining information about health related studies that are not considered methodologically sound enough to constitute health advice. Why not just add a "speculative or insufficiently replicated / peer-reviewed" section and let us have fun with our data!

    • _DeadFred_ a day ago ago

      Isn't it because there aren't ways around US laws regarding giving medical advice? That's my understanding why the places that do it are outside the US.

    • AStonesThrow a day ago ago

      I tell ya, it's a great party conversation that begins with "Hey, I'm a Libra, 3% Neanderthal, and I share a haplogroup with Genghis Khan! Let's go out for some tacos with extra cilantro, and a dark chocolate churro!"

  • marcell a day ago ago

    I used to work at 23andMe, AMA

    Previous: https://news.ycombinator.com/item?id=41575685

  • tamimio a day ago ago

    Glad I never did any of these tests. I refuse to use biometrics on my own iPhone, let alone send my whole DNA to some company.

    • jesseendahl a day ago ago

      Your iPhone doesn't even really store your biometric information; it stores mathematical models that can be used to check whether a fingerprint (Touch ID) or face (Face ID) matches the person who enrolled on the device (you).

      And that mathematical information is only stored in the Secure Enclave, which means even if the entire Operating System (iOS) is hacked, the attacker still would not have access to your biometric information.

      You should read this page. It goes into great detail about how much security there is around Touch ID and Face ID: https://support.apple.com/guide/security/face-id-and-touch-i...

    • yoavm a day ago ago

      Just a reminder that you're leaving your biometrics basically on every surface you touch, and your DNA pretty much everywhere you drop a hair.

      • lm28469 a day ago ago

        That doesn't mean sharing them online with whoever buys the data is good... Every time I change my clothes at the gym I'm fully naked; I wouldn't want that to be shown on live TV. It's really not that hard of a nuance to grasp.

        • yoavm a day ago ago

          True, but the analogy is far from fitting. Your nudity is shared with people in the gym; your biometrics are shared with anyone who has access to anything you've ever touched. Your nude picture being shown on TV isn't the same as yours-and-15-million-others' nude pictures being shown on TV, and while sending your nude photo to your boss can be harmful, it's hard to think of what harm can be caused by sending your DNA to... anyone.

          • lm28469 7 hours ago ago

            Sure, but unless you're a state-level threat, no one will collect your old coffee cups.

            Once your DNA is in some random company's DB it's there forever and available to whoever has the money

            > it's hard to think about what harm can be caused by sending your DNA to... anyone.

            Oh yeah? Is this something that was determined by some kind of god and will be true forever?

            https://theconversation.com/life-insurers-can-charge-more-or...

            > In Australia, life insurance companies can legally use the results of genetic tests to discriminate. They can decline to provide life insurance coverage, increase the cost of premiums, or place exclusions on an individual’s cover.

      • consteval 9 hours ago ago

        Yes, in meat space. You can't relate that meat space data to anything meaningful unless your DNA is also in some database.

        This is how they found the Golden State Killer. He left some DNA in the 70s. Worthless for a long time. But, a third cousin of his did a DNA test with a company, and the company provided the data to law enforcement, and they worked backwards to the killer.

      • programjames a day ago ago

        Yes, but most people aren't following me around with a plastic bag to grab that biometric data. And if someone were, I'd get pretty paranoid about what they're trying to do with it.

        • yoavm 17 hours ago ago

          Perhaps the reason why people aren't following you with a plastic bag is exactly because it is worthless? Sounds to me like if it wasn't, someone would have been following you. 23andme didn't follow anyone around — people sent it to them — and that doesn't make the data valuable all of a sudden.

      • deely3 a day ago ago

        Yes, but these biometrics don't automatically come with your social information, and they're quite hard to sell.

        • yoavm a day ago ago

          It seems like a year ago 23andme was hacked and almost 7 million records were leaked. Wikipedia doesn't list one case of someone being affected by it. Was this kind of data ever used to harm someone? It sounds to me like we're just speculating.

  • whalesalad a day ago ago

    I feel like that ship has sailed. Every software company I have ever worked for is dysfunctional in this regard. You might think your "delete my data" request succeeded but there is absolutely zero way to guarantee that it actually did, and chances are it didn't.

    • davedx a day ago ago

      Agree, this is pointless. For one thing, how many companies have the technical ability to remove specific records from all their database backups and logs? None that I’ve worked at.

      • al_borland a day ago ago

        I’m not expecting my data to get deleted from old backups or log files. I can see where that would be an issue.

        What I do expect is my data is deleted from the production database and thus won’t be in any future backups/logs/etc. I guess to that end, they would need to keep a record of delete requests to re-delete them if they ever need to restore from backup.
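
        Something like the sketch below would cover that bookkeeping (a toy Python illustration; the table and column names are made up):

            import sqlite3

            SCHEMA = """
            CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, email TEXT);
            CREATE TABLE IF NOT EXISTS deletion_requests (user_id INTEGER PRIMARY KEY);
            """

            def delete_user(db, user_id):
                # Hard-delete the row, but remember the id so the deletion can be
                # replayed if an old backup is ever restored.
                with db:
                    db.execute("DELETE FROM users WHERE id = ?", (user_id,))
                    db.execute("INSERT OR IGNORE INTO deletion_requests (user_id) "
                               "VALUES (?)", (user_id,))

            def reapply_deletions(db):
                # Run after restoring a backup: anything on the ledger is deleted again.
                with db:
                    db.execute("DELETE FROM users WHERE id IN "
                               "(SELECT user_id FROM deletion_requests)")

            db = sqlite3.connect(":memory:")
            db.executescript(SCHEMA)
            delete_user(db, 123)      # fine even if the row never existed
            reapply_deletions(db)     # idempotent; run after every restore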

        If there is a data breach in a year where the company’s user data ends up on the internet, I expect to not be in that user list.

        • whalesalad a day ago ago

          The problem is - imagine microservices - that data does not exist in one spot. And chances are no one actually knows 100% where the data lives. It probably lives in a prod db, an ETL data-lake-type platform (or two/three - and god knows if that has any kind of identifier to actually delete it), and, if you are big enough, chances are some 3rd party systems too. So even if you delete it from prod, it still exists somewhere.

          In a perfect world there would be some way to snap your fingers and delete it from every system - but we do not live in a perfect world. There is absolutely no incentive to build systems with this kind of requirement in mind. It's a waste of time and effort. Europeans will say "but hey wait! GDPR!" meanwhile the world keeps spinning and no one gives a shit.

          • geoka9 a day ago ago

            The big scary fine is absolutely an incentive, unless you are a huge company with an army of lawyers on standby. Some customers I've worked with submit GDPR requests on behalf of their users to all of their software vendors; there are even SaaS products for doing exactly that. They also get you to sign documentation to that effect if you want to keep them as a customer.

          • Eisenstein a day ago ago

            I don't understand. If they want the ability to ever update the data how do they expect to have it propagate if these information systems aren't connected? And if they are connected, why would 'delete entry' be harder than 'update entry'?

            • photonthug a day ago ago

              You’re assuming the data is actually handled with care, and lives in a system designed for this sort of thing, but it’s probably not.

              Some data scientist who wasn’t even supposed to be able to view stuff managed to copy data from prod to dev, maybe by accident. Then an engineer who intended to be testing an in place transform pipeline actually wrote a copy elsewhere, maybe once for each test, and didn’t even notice. This stuff happened off and on for like fifteen years while projects and people all changed and no one cares, because sure there’s some vague gestures in the direction of compliance, security, and privacy but it’s just a token effort because anyone who really starts asking questions is let go for not being a team player.

              Syncing updates to some “main” copy won’t make it easy to delete dupes.

              • al_borland 21 hours ago ago

                Not being able to delete every possible record that may have been used during testing doesn’t mean they shouldn’t delete the data they do know about in the main system. If I can change my username, password, or email address, and those functions work, a delete should also work.

                Why a prod db with sensitive user info is being used in testing is a whole separate issue. Those can be deleted in their entirety.

      • HowardStark a day ago ago

        Not that anyone is disagreeing, but it bears repeating: This is a lack of any real pressure from regulators, not a technical challenge. Or rather, there may be technical challenges but they absolutely can be overcome, and aren’t being tackled right now very simply because the business doesn’t care. As is so often the case, the business must be made to care.

        • Terr_ a day ago ago

          > This is a lack of any real pressure from regulators, not a technical challenge.

          Also, I think it's easy to misstep if we start thinking of it as a problem of "better regulators", since some of the blame lies on deeper legal aspects around (data) ownership, contracts, and what happens in bankruptcies.

          Even a company with great intentions may have difficulty ensuring the promises they made are kept long-term, especially if a bankruptcy court voids those promises in the name of repaying creditors.

        • Muromec a day ago ago

          GDPR mandates the ability to delete the data.

          • LegionMammal978 a day ago ago

            Not from all backups, or so I've heard.

            • samastur a day ago ago

              You heard wrong. It doesn't have to be immediate though.

              • LegionMammal978 a day ago ago

                https://news.ycombinator.com/item?id=41068881, https://news.ycombinator.com/item?id=37941653, https://news.ycombinator.com/item?id=36085044, https://news.ycombinator.com/item?id=34207919, https://news.ycombinator.com/item?id=32744415, https://news.ycombinator.com/item?id=32161041, https://news.ycombinator.com/item?id=31340987, https://news.ycombinator.com/item?id=31051129, https://news.ycombinator.com/item?id=31048828, ...

                My impression from all that I've heard is that you should have a backup retention policy, but otherwise there's no set upper bound on how long that may be. Not that the text of the GDPR breathes a word of it, though, everything's just a rat's nest of exemptions suggested by various authorities and other parties that haven't been tested in court.

                • samastur 14 hours ago ago

                  In general I don't particularly care what other people say on this topic and rely on the legal guidance I received during my work from the UK ICO and the Slovenian office, but even some of your links don't corroborate your point. The second one links to Verasafe's page, which clearly says that yes, you should delete it.

                  There's a lot of complaining about how difficult that can be, and about the fact that EU legislation in general often does not like to precisely prescribe its requirements (like what "reasonable" means), which can indeed be annoying.

                  You still need to remove it either directly or your retention policy for backups needs to be short enough that keeping it in backups for a while is judged as reasonable.

                  • LegionMammal978 11 hours ago ago

                    > In general I don't particularly care what other people say on this topic

                    Nor do I see why I should particularly listen to what you say on this topic, given that others have similarly claimed authority from their lawyers or from their local jurisdictions.

                    > The second one linking to Verasafe's page on which it clearly says that yes, you should delete it.

                    Right before the "But don’t panic! Enforcement authorities know how difficult it is to fulfil this obligation in practice." section, where it elaborates on your ability to claim that stripping data from backups is technically infeasible, in which case you must promise to delete the data on restoration. Just like I've heard from everyone else.

                    It's always seemed paradoxical to me that the GDPR is branded as this unyielding hammer against companies improperly storing your data, only for it to be riddled with amorphous holes on every axis. "Data is data, period, unless it's not on a live production system, in which case the written vague rules it abides by are swapped out for a new set of totally undefined rules!"

                    > You still need to remove it either directly or your retention policy for backups needs to be short enough that keeping it in backups for a while is judged as reasonable.

                    And how might I know a priori what's the longest 'reasonable' retention term that a business might be permitted by its jurisdiction? The whole nature of backups is that they're useless right up until they aren't, so the marginal value of each additional week is difficult to measure in the first place. And when most concrete talk of 'reasonableness' is seemingly done behind closed doors if at all, I have no idea just how far other jurisdictions' ideas of a reasonable term might differ from mine.

        • whalesalad a day ago ago

          Disagree. Waste of time and resources. Let the data sit and rot, who cares. We are humans not Germans.

      • yreg a day ago ago

        Data usually leaks from production though, no? So from that perspective it's not pointless.

      • dstroot a day ago ago

        On the other hand, very few organizations I have worked at could reliably restore backups (at least it was not tested regularly), and logs will eventually roll off.

    • jakjak123 a day ago ago

      I dunno, but delete works at most places I have worked, just because it saves money.

    • goalonetwo a day ago ago

      Exactly this. Especially for a currently failing company that has an incentive to NOT delete your data (because that's the only value they still have).

    • sfjailbird a day ago ago

      Not in the EU.

    • Yhippa a day ago ago

      I feel like at best we'll get a soft delete

    • ethbr1 a day ago ago

      Historically, yes.

      But don't the GDPR and CCPA et al. create liability around failure-to-delete after receiving a request?

      • ajsnigrutin a day ago ago

        Sure, but how will you know they didn't delete it?

            update users set deleted=true where uid=123345;
        
        And the data is "gone".
      • goalonetwo a day ago ago

        Good luck proving that your data was not deleted.

        GDPR, CCPA, etc. made it easy to send a request for deletion that will most probably be handled as a frontend gimmick. How much effort are they really going to put into going back into their backups and deleting all your entries? I'm pretty sure it's among the lowest roadmap priorities.

        • ethbr1 a day ago ago

          The financial penalties are pretty nasty.

          And it's amazing how financial liability has a way of getting things on a VP's feature radar that common sense doesn't.

          The reason it was haphazardly handled prior was that there was no liability. Who cared? (legally speaking)

          From working inside a T25 American retail company, I can say that we went top-to-bottom and rearchitected for traceability and hard deletes as a result of the CCPA.

        • yreg a day ago ago

          I have a feeling that it's also quite a difficult problem past some scale of infrastructure.

          If I ask Google to delete my data (EU citizen), I have trouble believing that they actually go through all of their cold storage backups where it was stored and make sure it's erased. At best I could believe that the process is designed in such a way that my soft-deleted data is unlikely to be recovered (intentionally or not) and maybe unlikely to be possible to link to my account.

          • anyfoo a day ago ago

            What they should do (I have no idea what they do) is to encrypt every record belonging to a user with an individual key. Live records, backups, everything. If a user wishes to be deleted, that live key is simply obliterated, making any data the user owns unrecoverable.

            Since the key is not used for end to end encryption, and backends still have access to the data (as long as the key lives), it has different requirements on how it needs to be protected. The biggest challenge is backing up the key itself, as losing it means losing access to all the user’s data by design. But backing up and obliterating a single key is much, much easier than doing so for a whole set of loosely associated data across many databases.
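
            As a rough sketch of that per-user-key idea (Python, using the cryptography package; the in-memory dict stands in for a real, well-protected key store):

                from cryptography.fernet import Fernet

                user_keys = {}   # stand-in for a small, well-protected key store

                def store_record(user_id, plaintext):
                    # Every record a user owns is encrypted with that user's own key,
                    # so the ciphertext can live in any database, log, or backup.
                    key = user_keys.setdefault(user_id, Fernet.generate_key())
                    return Fernet(key).encrypt(plaintext)

                def read_record(user_id, ciphertext):
                    return Fernet(user_keys[user_id]).decrypt(ciphertext)

                def forget_user(user_id):
                    # "Deleting" the user means destroying the key; every copy of
                    # their ciphertext, even in old backups, becomes useless noise.
                    user_keys.pop(user_id, None)

                blob = store_record(42, b"genotype data")
                forget_user(42)
                # read_record(42, blob) would now fail: the data is effectively gone.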

            • ygjb a day ago ago

              Practically speaking, it also makes using and querying that data and doing any kind of analytics much, much more expensive. It is done that way in some cases, but in the absence of a technical requirement to do so, there are cheaper approaches.

              • anyfoo a day ago ago

                Those are solvable problems. I could also argue how address space separation and more generally MMU protections make things so, so much more complex (they do!), yet we don’t question that one very much.

                There is no end to end encryption involved here, so you don’t need to resort to such voodoo as homomorphic encryption.

            • yreg a day ago ago

              Yes, I also expect that this is the way, but I think it makes the problem only partially smaller, since you still need to sync and back up the keys.

              Also, is an encrypted piece of data with a lost key truly deleted? What if the encryption gets cracked?

              I would say it is more deleted than toggling a `deleted` flag in the db and less deleted than burning the tapes in fire.

              • anyfoo a day ago ago

                > the problem only partially smaller, since you still need to sync and back up the keys.

                I mentioned that: it makes the problem much smaller, as you only have one single, small piece of data to back up and erase, instead of an ever-changing, many-faceted blob of distributed data.

                > Also, is an encrypted piece of data with a lost key truly deleted? What if the encryption gets cracked?

                Oh boy. If simple symmetric encryption gets “cracked”, then you have much larger problems.

                > I would say it is more deleted than toggling a `deleted` flag in the db and less deleted than burning the tapes in fire.

                For all practical purposes symmetrically encrypted data that lost its keys is considered “random” data. If you “erase” data on a device before you sell it, most often it will just throw away the key to the disk contents nowadays.

            • sfjailbird a day ago ago

              They already do this (the encryption-at-rest part). Deleting the data is still a hard requirement. Also, the keys are never seen outside of the centralized encryption service. Deletion is still a must.

              • anyfoo a day ago ago

                Encrypt with an individual key for each user. Throwing away the key is indistinguishable from deletion.

          • jll29 a day ago ago

            Before you make a deletion request, make a subject data request and see what they have on you; then request deletion; then make a subject data request again.

            • yreg a day ago ago

              The fact that they cannot access the data during a subject data request does not mean it has been deleted.

          • sfjailbird a day ago ago

            Google-scale companies have very capable people employed, both on the technical and legal side, who do nothing else than look for these kinds of oversights, and are empowered to make sure they get fixed.

            • goalonetwo a day ago ago

              Large companies fail in spectacular ways all the time. Google is super successful because they tapped into the biggest cash cow of all times. Not because the employees are somehow very capable and above any oversight.

            • yreg a day ago ago

              That's why they get fined all the time?

          • ygjb a day ago ago

            I can't speak for any other companies, but you don't need to speculate. You can search the internet and find several articles outlining that the correct strategy for businesses here is to delete the data from production systems, and then maintain a record of references to those deleted records such that a restored backup can ensure that deleted records are not put back into production.

            There is generally an expectation that data may be retained in backups for a specified retention period, but will not be used or restored. Beyond that, it is up to the regulator to determine if this meets the standard, but it's worth noting that there are notions baked into the text of the GDPR, and into its interpretations, that account for reasonable costs and efforts.

            Auditors can and do test and monitor for this, both using audit processes and demanding evidence, and by performing manual testing and experimentation.

        • jll29 a day ago ago

          Fines for non-compliance with GDPR regarding data of European citizens can amount to 4% of annual revenue:

            83(5) GDPR, the fine framework can be up to 20 million euros, or in the case of an undertaking, up to 4 % of their total global turnover of the preceding fiscal year, whichever is higher.
        • sfjailbird a day ago ago

          I have built systems for a lot of EU companies, and they all took GDPR compliance very seriously.

          Maybe some mom-and-pop shop would bodge it, but any serious business has legal counsel and wisely listens to them.

    • bluetidepro a day ago ago

      100% this. It's laughable if you believe those requests work as expected. Sure they may "delete" some surface level bs like your account or login, but there is no way it's 100% scrubbed in the way it's supposed to work.

      • toomuchtodo a day ago ago

        A lot of recourse is around intent and liability. I would like to believe my request is honored; in the event it is later proved to not have been honored, recourse is potentially available through legal and regulatory mechanisms.

        23andme didn't implement strong customer identity and auth mechanisms, for example, and it cost them ~$30M to settle their data breach liability [1]. Take action, keep receipts, and failing good faith actions, step back while regulators and the legal system whack whack whack with a hammer.

        [1] https://news.ycombinator.com/item?id=41536494 ("HN: 23andMe settles data breach lawsuit for $30M")

        • bluetidepro a day ago ago

          Oh nice, "~$30M to settle." That <$100 you get back in the class action will be amazing compensation. Sadly the legal route is a joke at this point.

          • bluetidepro a day ago ago

            > I'm happy if it contributes to the death of the org.

            But not the death of your data. That will be sold on to someone else.

            • a day ago ago
              [deleted]
          • a day ago ago
            [deleted]
  • kulesh a day ago ago

    I moved my DNA data from 23andMe to Genomelink ~5 years ago. Sort of saw it coming.

  • ethbr1 a day ago ago

    One instance where I am disappointed to be vindicated.

    Considered doing 23andMe at the hype peak, discovered they had avoided HIPAA requirements, read through their privacy policy, and marked them off the possibility list.

    It was pretty clear the delta between sequencing costs and price they were charging consumers equaled how much they thought they could make from your genetic information.

    And because they don't fall under HIPAA, your data is theirs after they get it.

    PS: Sequencing costs were also falling rapidly, so it isn't that expensive to get it done.

    • outworlder a day ago ago

      They do not do DNA sequencing. They do genotyping. It's far less detailed.

      • ethbr1 a day ago ago

        That too. And at full sequencing for <$1000 now, why not just pay for the whole shebang? It's not like someone is doing it monthly.

        • j-bos a day ago ago

          Do you know where one can get sequencing done at a HIPAA regulated provider?

          • burningChrome a day ago ago

            I would talk to your primary care person, they should know.

            I've had two members of my family die of ALS and was wondering what my odds are of getting it. One of the steps could be a full DNA sequence. In order to get to that step however, you have to do about six months of counseling, and several blood tests before they do the full DNA sequence. The counseling is to prepare yourself for the possibility of them essentially giving you a death sentence with the blood and DNA results.

            I never got that far. My father convinced me it's better not to know and live your life accordingly, rather than trying to live a life always looking over your shoulder.

            But my primary did have the information on how I could get it done, so I would start there.

  • Jerry2 a day ago ago

    The (consumer) company I used to work for also allowed their customers to "delete" their data. Deletion was implemented as a boolean field in the database: "deleted - true/false". We called it "soft deletion". And why was it implemented like this? It's because actually deleting data is hard. There is no single database and the data is distributed across many servers. It's also backed up in different places. Running the delete operation can be extremely costly and can also create service interruptions and data integrity issues. I think there was a script that was supposed to actually delete the entries, but it was not run very often; it was there for legal and compliance reasons.

    Just remember that when you request to delete some data on the internet, it doesn't actually get deleted (right away anyway). The best way to deal with this is not to give random sites your real information in the first place. However, that can be difficult or impossible when dealing with government, financial institutions or shopping sites.

    Edit: And just to address questions below, the actual delete script was not run daily. I don't know how often it was run (I was not an SRE) but I presume it was run at least once a month. I have no idea how other companies do this.

    • adrianmsmith a day ago ago

      > there was a script that was supposed to actually delete the entries ... was there for legal and compliance issues.

      Sounds like the laws worked in this case. They required data to be actually deleted, and it was due to those laws, and only due to those laws.

      • vasco a day ago ago

        No you don't understand, the script exists for plausible deniability, it even runs sometimes! And if you find out we didn't delete your data, we might even go out of our way to run it for you. Except if the script doesn't run anymore because it's been broken. Or because 5 microservices were added since the last time we "actually had to run it", and so even running it makes no assurance it actually deletes everything about you.

        But if an internal lawyer really puts their foot down, we might have an intern look at it for a couple of days.

        I'd bet a finger this is how it works in most companies, and I know I've seen worse versions.

    • ravenstine a day ago ago

      Many businesses would still use soft-deletion even if distributed data wasn't an issue. The company I work for has soft-deletion enabled because they want to be able to help customers who accidentally delete something. I wish we would just tell them "better luck next time", but obviously management will never say that.

      What annoys me more is how many companies give next to no insight into or control over data retention. It should be unambiguous how soon or often our data gets hard-deleted, if ever.

    • zapkyeskrill a day ago ago

      Heh, I once worked for a company that had an "is_deleted2" field... it indicated the record was "hard" deleted and not accessible anymore via the usual means!

    • lm28469 a day ago ago

      It's 2024; if you can't delete data without corruption or downtime, you're an absolute buffoon of an engineer.

      If anything, GDPR made painfully obvious how sloppy some devs/companies are.

    • williamdclt a day ago ago

      Let’s be clear that what you describe is absolutely not GDPR compliant, so it would be illegal if you do business in Europe.

      • tantalor a day ago ago

        Did you read the whole comment? They say there was a batch script to comply with legal requirements.

        • cwillu a day ago ago

          They said they thought there was a script, but it wasn't run very often.

        • a day ago ago
          [deleted]
        • anyfoo a day ago ago

          Didn’t seem sufficient to me at all, but I’m happy to be proven wrong.

          • Raidion a day ago ago

            I work for a company managing a team that has built this for GDPR compliance.

            Customer submits a deletion request. We have a fan out process that takes the deletion request and submits it to a bunch of different data locations. All of these must respond within 2 days (though the required time is 72h). Each of those data locations will queue up a job to remove access (soft delete) the data, and schedule a hard delete for 28 days in the future. If the customer says they don't actually want the data to be deleted, we cancel the data hard deletion and revert the soft delete. If nothing happens the hard deletion goes through.

            • anyfoo a day ago ago

              Thanks, that’s insightful. In this case, it seems sensible to me at least.

        • itake a day ago ago

          > but that was not run very often

          GDPR has strict rules about how long data can persist after the deletion request is made.

          • tantalor a day ago ago

            Who knows what "not very often" means. It could mean once a day or once a year. The point is that this could be made to be compliant with little extra effort, so pointing out "um actually it's not compliant" is not saying much.

  • kanzure a day ago ago

    I don't think I have ever seen a correctly implemented data deletion request system that worked well with the company's backups. If it's backed up, it's likely not getting deleted.

    • kccqzy a day ago ago

      I have seen plenty. The key is to take frequent backups and aggressively delete older backups once you know you won't be restoring from that backup. Also, don't appropriate backups to do other things, such as audit logs.
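
      The pruning side is easy to automate; a minimal Python sketch (the directory layout and the 30-day window are arbitrary examples):

          from datetime import datetime, timedelta, timezone
          from pathlib import Path

          RETENTION = timedelta(days=30)         # arbitrary window for illustration
          BACKUP_DIR = Path("/var/backups/db")   # hypothetical backup location

          def prune_old_backups(now=None):
              # Remove any backup older than the retention window, so data deleted
              # from production eventually ages out of the backups as well.
              now = now or datetime.now(timezone.utc)
              for backup in BACKUP_DIR.glob("*.dump"):
                  mtime = datetime.fromtimestamp(backup.stat().st_mtime, timezone.utc)
                  if now - mtime > RETENTION:
                      backup.unlink()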

  • outworlder a day ago ago

    Note that even if they delete the data, if you have close relatives that submitted their samples a company can still infer quite a lot from that.

  • renewiltord a day ago ago

    Don't know why you'd bother. I, and my friends, and soon my family are in All of Us. We'll be in every genomics dataset you want.

  • more_corn a day ago ago

    Since they are a California company, the data is subject to the CCPA. You can download your data, but more importantly you can request that they delete it. I highly recommend that everyone do so.

    I can think of no more sensitive biometric data than your DNA.

    • tombert a day ago ago

      > I can think of no more sensitive biometric data than your dna.

      I dunno, is that actually true? You leave DNA everywhere, don't you? If someone really wanted tombert's DNA, they'd just have to follow me onto the train and swab the pole I'm grabbing, grab the cup I was sipping on at McDonald's, or do any number of other things to collect dropped cells containing my DNA.

      • newdee a day ago ago

        Hardly the same.

        Your day to day DNA “leavings” aren’t neatly packaged up and associated with your other PII like name, location, email address, etc in a stolen, searchable dataset.

      • adamc a day ago ago

        If they know who you are, it's easy. If all they have is the DNA, that database will link it to you. They have caught several serial killers because close relatives were in DNA databases, which allowed homing in on a small number of living suspects.

      • AyyEye a day ago ago

        Following you probably costs more per hour than buying a whole country's DNA from a broker. And definitely costs more than a leaked dataset.

      • groby_b a day ago ago

        You do realize there's a difference between obsessing about tombert (or groby) vs hoovering up DNA at scale, right? Your insurance company probably won't follow you around personally, but if they can buy a bunch of DNA (yours included) for a few dollars/person and use that to strategically deny claims/increase costs, yeah, they'll sign up for that.

        The issue with digital data is almost never the individual targeting case. Cheap mass surveillance is the concern.

  • iwontberude a day ago ago

    Honestly it doesn't even matter. There is no proof the DNA is yours, because they do no validation of users' identities.

    • bdamm a day ago ago

      Speculative results with statistical likelihood are still highly valuable to the right buyers.

    • SoftTalker a day ago ago

      People are convicted all the time without any "proof" of guilt. It all goes to "beyond a reasonable doubt" and with enough circumstantial evidence, that "beyond" can be achieved.

  • a day ago ago
    [deleted]
  • CatWChainsaw a day ago ago

    When you ask a company to delete your data, you're actually asking them to pretend they deleted it by making it invisible to you. There's too much $$$ sloshing around for them to behave ethically.

  • chx a day ago ago

    I still find it astonishing that anyone would be so careless with their own and their close blood relatives' privacy as to hand over their genetic material to a private company. What were you thinking? You can't undo that and you can't ever change your DNA. You have no idea where it ends up at any time -- and that "any time" covers your lifetime and your close blood relatives' entire lifetimes too. These companies should never have been able to get a single customer, but I guess.

    And here we are 18 years later and some people still think they can delete this. What else do you believe in? The tooth fairy? Santa Claus? Come on.

    Also, what did you think they could tell you? An archaeogenetics teacher described this belief as "they think we throw a bone in the machine and it tells us it was half Hun, half Avar, half bear and spoke Slavic".

    Y'all surrendered an intrinsic part of the privacy of yourself, your sister, your brother, even your unborn children for snake oil -- and paid for the privilege. I can't even.

    commence the downvotes but you can't put the toothpaste back once it's been squeezed out.

    • programjames a day ago ago

      As a twin I've always been extra cautious about this kind of stuff. I don't think I have a right to give people my twin's biometric data. I even refrain from posting images of myself publicly---there are at most two pictures of me from the past five years floating around the internet. It astounds me how reckless others are with their relatives' private information.

    • uhtred a day ago ago

      I agree completely about it being careless etc but I am not astonished at all that so many people have done it.

      Have you seen how simple minded the masses are? They find it hard to think! They are barely sentient!

  • artursapek a day ago ago

    inb4 Blackrock lol

  • excalibur a day ago ago

    What, you mean sending your DNA to a random startup to have them analyze it for you was a BAD idea? #surprisedpikachu

    • criddell a day ago ago

      Why was it a BAD idea? What negative consequences have people faced so far? If they've also benefited from the service, how are they supposed to judge if it was a mistake or not?

      • excalibur a day ago ago

        Of course it was a mistake. That data will 100% be compromised, if it hasn't been already. If there's a way for it to be used against them it will be found.

        • yoavm a day ago ago

          How will it be used against them? What's so secret about my DNA, considering I leave traces of it everywhere I go?

          • programjames a day ago ago

            Obviously health insurance costs may go up, or on a national scale one country could target a virus at its enemy's population. Also, your rhetorical question is a red herring. It isn't about secrecy, it's about privacy.

            • yoavm 17 hours ago ago

              If we don't want insurance prices to be tied to customers' health conditions, let's make a law against that. I don't know how it is in the US, but I wouldn't be surprised if they're already using the results of blood tests etc. for pricing?

              Creating a virus to target another country sounds like sci-fi in the best case, or a conspiracy theory[0] in the worst case.

              [0] https://en.wikipedia.org/wiki/Ukraine_bioweapons_conspiracy_...

              • programjames 2 hours ago ago

                > If we don't want insurance prices to be tied to customers' health condition, let's make a law against that.

                That's not how private insurance works. Its entire purpose is to manage risk against unforeseen issues, not issues that are clearly foreseen but for some reason made a protected class. With better data, you can separate people out better, which means everyone basically just pays their own medical expenses. You could nationalize health insurance, but that's a heated debate in America right now.

                > Creating a virus to target another country sounds like sci-fi at the best case

                It could be done, today, with the data 23&Me has helpfully supplied and a few thousand dollars. I don't know why you would try to argue this point.

                -----

                You seem to be coming at this from an "every situation is a cooperative game" perspective. They're not. And in adversarial games, you need to limit the information other people have about you, even if you don't know how they will use it against you. So, even if I were wrong about how your DNA can be used against you, I would still be right about the need for privacy.

        • a day ago ago
          [deleted]
      • cokeandpepsi a day ago ago

        [dead]