Hi,

I’m not sure if this is the right community for my question, but as my daily driver is Linux, it feels somewhat relevant.

I have a lot of data on my backup drives, and recently added 50GB to my already 300GB of storage (I can already hear the comments about how low/high/boring that is). It’s mostly family pictures, videos, and documents since 2004, much of which has already been compressed using self-made bash scripts (so it’s Linux-related ^^).

I have a lot of data that I don’t need regular access to and won’t be changing anymore. I’m looking for a way to archive it securely, separate from my backup but still safe.

My initial thought was to burn it onto DVDs, but that’s quite outdated and DVDs don’t hold much data. Blu-ray discs can store more, but I’m unsure about their longevity. Is there a better option? I’m looking for something immutable, safe, easy to use, and that will stand the test of time.

I read about data crystals, but they seem to be still in the research phase and not available for consumers. What about using old hard drives? Don’t they need to be powered on every few months/years to maintain the magnetic charges?

What do you think? How do you archive data that won’t change and doesn’t need to be very accessible?

Cheers

  • Barx [none/use name]@hexbear.net · 2 months ago

    As a start, follow the 3-2-1 rule:

    • At least 3 copies of the data.

    • On at least 2 different devices / media.

    • At least 1 offsite backup.

    I would add one more thing: invest in a process for verifying that your backups actually work, for example a test system that you occasionally restore from your backups.

    Let’s say what you care about most is photos. You will want to store them locally on a computer somewhere (one copy) and offsite somewhere (second copy). So all you need to do is figure out one more local or offsite location for your third copy. Offsite is probably best but is more expensive. I would encrypt the data and then store it in the cloud for my main offsite backup. That way your data stays private, so it doesn’t matter that it sits on someone else’s server.

    I am personally a fan of Borg backup because you can do incremental backups with a retention policy (like Time Machine on Macs), the archive is deduplicated, and the archive can be encrypted.
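    Roughly, the workflow looks like this. The repo path and source directory below are just placeholders; the commands are echoed so the sketch is safe to run as-is (drop the "echo" prefixes to run them for real):

```shell
#!/bin/sh
# Hedged sketch of a typical Borg workflow; the repo path and source
# directory are placeholders, not a recommendation.
REPO=/mnt/backup/borg-repo

# One-time: create an encrypted repository ("repokey" stores the key in
# the repo itself, protected by a passphrase).
echo borg init --encryption=repokey "$REPO"

# Each run: Borg dedupes against existing archives, so this is incremental.
echo borg create --stats "$REPO::photos-{now}" "$HOME/Pictures"

# Retention policy, similar in spirit to Time Machine's thinning.
echo borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 12 "$REPO"
```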

    Consider this option:

    1. Your data raw on a server/computer in your home.

    2. An encrypted, deduped archive on that same computer.

    3. That archive regularly copied to a second device (ideally another medium) and synchronized to a cloud file storage system.

    4. A backup restoration test process that takes the backups and shows that they restore the important files, with the right counts, sizes, etc.
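    Step 4 can be as simple as a checksum manifest written at backup time and checked after a test restore. A minimal sketch, using throwaway directories in place of real data (with Borg, the restore step would be a "borg extract" into a scratch directory instead of the cp here):

```shell
#!/bin/sh
# Minimal restore-test sketch; directories are throwaway stand-ins.
set -e
SRC=$(mktemp -d)      # stand-in for the live data
RESTORE=$(mktemp -d)  # stand-in for a freshly restored backup
MANIFEST=$(mktemp)
printf 'family photo bytes\n' > "$SRC/photo1.jpg"
printf 'scanned document\n'   > "$SRC/doc1.pdf"

# At backup time: record a checksum manifest alongside the archive.
(cd "$SRC" && find . -type f -exec sha256sum {} + | sort) > "$MANIFEST"

# Simulate a restore (with a real backup tool this would be the restore step).
cp -R "$SRC/." "$RESTORE/"

# The actual test: every file must restore with the right content,
# and none may be missing from the manifest.
(cd "$RESTORE" && sha256sum --quiet -c "$MANIFEST") && echo "restore test: OK"
```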

    If disaster strikes and all your local copies are toast, this strategy ensures you don’t lose important data. Regular restore testing ensures the remote copy is valid. If you have two cloud copies, you are also protected against one provider screwing up and silently removing data before you notice and can fix it.

    • 8263ksbr@lemmy.ml (OP) · 2 months ago

      Interesting take on the test process. I never really thought of that. I just trusted rsync’s error messages. Maybe I’ll write a script to automate those checks. Thanks!