I’m writing a program that wraps dd to try and warn you if you are about to do something stupid, so I’ve been giving the man page a good read. While doing this, I noticed that dd supports sizes all the way up to quettabytes, a unit orders of magnitude larger than all the data on the entire internet.

This has caused me to wonder what the largest storage operation you have ever done is. I’ve taken a couple of images of hard drives that were a single terabyte each, but I was wondering if the sysadmins among you have had to do something with, e.g., a giant RAID 10 array.
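For reference, here’s a rough sketch of the kind of check I have in mind, in Python. Everything in it is illustrative rather than my actual program: it refuses to clobber a device that’s currently mounted, and it understands dd’s binary suffixes from K up to Q so it can estimate how much data is about to move.

```python
#!/usr/bin/env python3
"""Illustrative sketch of a dd wrapper: warn before doing something risky,
then hand off to the real dd. Not the actual program described above."""
import os
import subprocess
import sys

# GNU dd binary suffixes, K (2^10) up to Q (quetta, 2^100). Simplified:
# real dd also accepts b, c, w, and decimal variants like kB.
SUFFIXES = {s: 1024 ** (i + 1) for i, s in enumerate("KMGTPEZYRQ")}

def parse_size(spec: str) -> int:
    """Turn a dd-style size such as '4M' or '1Q' into a byte count."""
    if spec and spec[-1] in SUFFIXES:
        return int(spec[:-1]) * SUFFIXES[spec[-1]]
    return int(spec)

def mounted_devices() -> set:
    """Devices currently backing a mount point, per /proc/mounts (Linux only)."""
    with open("/proc/mounts") as f:
        return {os.path.realpath(line.split()[0]) for line in f}

def main() -> None:
    args = sys.argv[1:]
    opts = dict(a.split("=", 1) for a in args if "=" in a)

    # Warn if the write target is a device that is currently mounted.
    target = opts.get("of")
    if target and os.path.realpath(target) in mounted_devices():
        if input(f"{target} is mounted, overwrite anyway? [y/N] ").lower() != "y":
            sys.exit(1)

    # Estimate the transfer size when bs= and count= are both given.
    if "bs" in opts and "count" in opts:
        total = parse_size(opts["bs"]) * parse_size(opts["count"])
        print(f"about to move roughly {total} bytes", file=sys.stderr)

    sys.exit(subprocess.call(["dd", *args]))

if __name__ == "__main__":
    main()
```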

  • psmgx@lemmy.world · 4 months ago

    Currently pushing about 3-5 TB of images to AI/ML scanning per day. Max we’ve seen through the system is about 8 TB.

    Individual file? Probably 660 GB of backups before a migration at a previous job.