- cross-posted to:
- enshitification@slrpnk.net
cross-posted from: https://infosec.pub/post/45169245
DB = Dropbox, OD = OneDrive
Just when you think AI isn’t ruining something, it’s ruining something.
Enshittification strikes again.
It should be noted that this affects the Backblaze backup client that operates alongside their unlimited personal storage plan. I doubt these issues exist with B2 storage, where you can store whatever you want.
I’ve never used their personal backup plan because it’s for backing up a single machine. I have servers all over, so I just use them for their S3-compatible object storage, which is still a decent deal.
If you aren’t running it yourself, you’ll always be held hostage by toxic companies!
Time and time again, from account closures to account locking… if you value anything, you should really look at self-hosting it. Yes, there’s a learning curve, but now is actually a good time because you have Claude to help, though don’t expect that to last!
I use, and will only ever use, the free tier of Claude, but even that is actually pretty useful. Just don’t reuse the same chat: create new ones, delete the ones you no longer need, and you’ll rarely hit the usage limits.
I’ve used Claude to get my own AI server running on a low-power Beelink PC, and while I’m still learning, it runs pretty well, so I can now bounce between my own AI and Claude for the few issues I can’t solve.
I agree in general about self-hosting, but backup seems like a special case. Where do you back up your self-hosted data? An offsite copy of the backup is needed, and it should be automatic. For most people (who only have one site, their home) that’s not easy to arrange except through a cloud backup service.
I don’t really have much to back up, tbh. What I do have is backed up to another drive, but unfortunately I can’t host that offsite because I don’t have friends, lol.
In my case I work with a family member in another city. We connect via VPN (Tailscale works well too if you prefer that) and push the data we want to back up. Something like Nextcloud could be used too, although a regular file explorer works just fine once you’re connected.
Now mind you it’s mostly family photos, so not petabytes of data.
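A push setup like that can be a single rsync invocation over the VPN. A minimal sketch, assuming a Tailscale hostname of `family-nas` and made-up paths:

```shell
# Push local photos to a relative's machine over the Tailscale VPN.
# -a preserves permissions/timestamps, -z compresses in transit,
# --delete mirrors removals (omit it if the remote should keep everything).
rsync -az --delete /srv/photos/ family-nas:/backups/photos/

# To automate it, a nightly cron entry works fine:
# 0 3 * * * rsync -az --delete /srv/photos/ family-nas:/backups/photos/
```

rsync only transfers changed files, so after the first run the nightly job usually finishes in seconds for a photo collection.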
Hi, what kind of beelink do you have and AI do you run there? I have a similar setup and I’m very interested, I didn’t think it was possible to run it on that low end.
I bought the Beelink SER9 MAX, mainly because it’s the only one I saw with 64GB Ram.
It came with Windows, but I binned that for Debian, which runs Ollama and openclaw pretty well. I looked at others, but according to Claude, the A1 Pro I looked at only really supports Windows, and I emailed Minisforum, who confirmed it, so I kinda went off their PCs.
I’m not aiming for speed though, I’m aiming for learning, but the only issue I’ve run into a few times is:
⚠️ Something went wrong while processing your request. Please try again, or use /new to start a fresh session.
But running /compact seems to fix it. I’d love a few more Beelinks to run a cluster though.
Edit: I’m also messing with openclaw, and it can’t seem to spawn agents to do tasks. Not sure if that’s hardware related or a config issue. 🤷♂️
Not who you asked, but I got the Me2 cube from Beelink. It runs TrueNAS and has been reliable so far. The giant downside is NVMe drive cost, which is like 3-4x what it was when I bought it.
If you are just storing data, find the really old ProLiant cube servers on eBay, like an N40L. There’s only one fan, and if it works you can get 4 hard drives for cheaper than NVMe, install TrueNAS or XigmaNAS, and you have a slow but useful and reliable file server, with ECC memory and all kinds of useful things. 60W and quiet.
I’m currently running a U59 and storage is not a problem, although I have mechanical drives. I don’t know if it is the best or optimal solution, but I am using mergerfs and I have a HDD bay to split the data into different drives.
I decided against Backblaze for server backups because they charge for certain API calls, and I ended up exceeding the quota when I was testing with the free tier. I was experimenting with encrypted backups and not sure how I exceeded it, but it really put me off that I could potentially have a surprise bill from experimenting without exceeding my storage quota. I went with iDrive e2 specifically because they don’t have API fees and it has worked fine the last couple years. My storage utilization has grown and I’ve been charged extra, which is expected, whereas API calls would be harder to predict depending on what I do in a given month. For self-hosting, I want easy, predictable pricing and don’t want to deal with surprise bills. It’s enough of a chore to manage cloud spend at work without it being a headache at home too.
I don’t use any of the services they mentioned anyway, and nothing in this thread seems to even come close to the $99/year if you have a lot of data. Not going to be switching any time soon unless they end their unlimited backups entirely.
Thanks for the acronym definitions; it’s like they wanted to report this news without really reporting it…
Best solution is still a second NAS at a friends home.
Friend has a high maintenance cost though.
Ugh. How am I supposed to afford a friend in this economy?
Basically moved 5TB away from Backblaze when they started raising their prices… greedy fucks, every one of them
I left because their support was atrocious. I literally pointed out what the problem was on their end, but they didn’t give a damn and continued gaslighting me.
Fuck ’em
Ugh. Where do I go now? I really just ultimately want encrypted ZFS replication…
rsync.net offers ZFS send/receive, and I’ve been using it for 5 years now; it’s pretty great. It’s not super expensive per GB, but they require a minimum of 5TB if you want native ZFS support, which is $60/month.
You get access to a full FreeBSD VM which is very nice, because you can do things like metrics or a “pull” setup that pulls backups from your machines, so you’re more resilient against stuff like ransomware.
Sounds good but $60 per month is a lot of money.
Yes, it’s not the cheapest option, but I think it’s the only one if you need zfs send/receive. If you don’t need that, you can get less than 5TB for cheaper, or just go elsewhere.
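For reference, encrypted ZFS replication to a remote host is roughly this. A sketch only: pool, dataset, and host names are placeholders, and the raw send assumes ZFS native encryption is enabled on the dataset:

```shell
# Initial full replication. With native encryption, -w (raw send)
# ships the blocks still encrypted, so the remote never needs the key.
zfs snapshot tank/data@2024-01-01
zfs send -w tank/data@2024-01-01 | ssh user@remote zfs receive -u backups/data

# Later runs send only the delta between two snapshots:
zfs snapshot tank/data@2024-01-08
zfs send -w -i @2024-01-01 tank/data@2024-01-08 | \
  ssh user@remote zfs receive -u backups/data
```

The `-u` on the receiving side keeps the replicated dataset unmounted, which is what you usually want for a backup target.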
This would increase my yearly cost from $99 to over $10k.
You have over 70TB of data you want to back up? That’s a lot. How are you making backups of that for only $99/year?
I have a 190TB NAS, and Backblaze is $99 for unlimited. At least it was until now
Backblaze is definitely losing money on you every year, so good luck finding an alternative. I pay $100+/month just in power and network costs to have my own hardware colocated in a real data center, and that’s saving me money compared to renting 200TB anywhere else.
Oh, I definitely know I’m not a profitable customer for them. My home electricity bill for my NAS is already a multiple of my Backblaze subscription
Just price out S3-compatible storage and use backup software that can encrypt. Then it doesn’t matter who holds it.
Wasabi is reputable and has fair pricing. iDrive is well priced.
I’m still sending to B2 until the price actually changes for me.
I personally use Duplicati (and yes I’ve tested restores).
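That workflow with restic against a hypothetical S3-compatible bucket looks something like this (the endpoint and bucket names are made up; Duplicati is a GUI-driven equivalent):

```shell
# restic encrypts client-side, so the provider only ever sees ciphertext.
export AWS_ACCESS_KEY_ID=...            # provider access key
export AWS_SECRET_ACCESS_KEY=...        # provider secret key
export RESTIC_REPOSITORY=s3:https://s3.example.com/my-backups
export RESTIC_PASSWORD=...              # encryption passphrase; don't lose it

restic init                             # once, to create the repository
restic backup /home/me                  # incremental, deduplicated, encrypted
restic check                            # verify repository integrity
restic restore latest --target /tmp/restore-test   # actually test your restores
```

Because the encryption happens before upload, switching providers later is just a matter of copying the repository and changing `RESTIC_REPOSITORY`.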
The changes come as the company has experienced a 40X year-over-year increase in AI data stored on its servers and has increased focus on its accelerating AI business.
If this means they just want to throttle AI companies, I don’t care. Go forth and prosper, Backblaze.
If it doesn’t, statement retracted.
An individual storing 10TB on their “unlimited” cloud backup: $8/month
A company storing 10TB on their S3: $60/month
An AI company storing 10TB on their faster S3: $150/month, AND must use multiple petabytes (at least $30k/month)
It’s easy to see which kind of customer they like to have
Oh good point. Yeah you are probably right.
What is an “AI storage service”?
Does that mean you just store your info in AI weights/contexts and hope it can regenerate an approximation of what you put in?
I assume it’s where the AI companies store the stolen data used to train their LLMs.
It’s S3-compatible storage (B2) you sell to companies using AI for twice as much.
B2 storage is $6/TB/mo; AI storage is B2 storage at $15/TB/mo.
The AI storage offers unlimited free egress, whereas the regular storage does not.
It’s like selling special gold shovels during a gold rush that are better at shoveling gold.
It’s like the wedding or funeral tax, where all items cost extra for no reason, other than exploitation.
And white components. And baby food.
And luxury car parts, which are more often than not from ordinary cars.
My VW is running Bugatti parts, for example.
Oh god, I know that’s not possible, and here come the startups to pitch it.
Backblaze is a service I really depend on, and one I’ve recommended. However, they’re still not profitable, and investor money isn’t going to keep them afloat forever.
That’s not surprising with all of the data hoarders abusing the unlimited backups to store hundreds of terabytes.
“abusing” the “unlimited backups” to store “limited” terabytes.
If you can’t afford to offer unlimited backups, maybe put a limit?
How is that abuse? “Unlimited” is a pretty audacious plan to offer. Maybe Backblaze shouldn’t offer something impossible.
The software only allows local drives to be backed up, but some people use workarounds to make it back up a large NAS or server.
But that’s not who is being targeted with the changes Backblaze has made. By silently excluding sync folders, they’re casting a wide net and hoping it will catch those who use workarounds. It might, but in the process it reveals their comfort with deceptive business practices and harms users of the backup service who are not using workarounds.
Are they boosting their AI business in anticipation of breaking encryption and then training their models on everyone’s data? That’s what I would assume of a company I no longer trust.
About two weeks ago, Backblaze sent out an email
Headlines are clickbait. Literally the first line in the article. What more can they do than send an email?
The title is accurate.
You just failed to read the article past the first ten words.
However, roughly six months ago, Backblaze enacted a silent change that made its backup app stop uploading local data synced to “OneDrive, Google Drive, Dropbox, Box, iDrive, and others.”
So it seems like they should just “limit” the storage to a reasonable number of TB that is more than most desktops/laptops have, and less than NASes with hundreds of TB.
And I recall Backblaze stating that those users are a minority and aren’t a big concern. I used to do that, but when I attempted to restore 7TB and it took well over a month to restore what I needed, I switched to other solutions.
Yeah, screw those people. I can’t think of a single other reason a profit driven company would cut corners while storage prices rise due to AI companies.
I’m asking as a genuine question: where or how should people back up large datastores? Also, what counts as too large? I’ve heard Backblaze doesn’t cover NAS, so I wouldn’t be able to back up my 2TB ZFS RAID, but is that too much?
I want to do 3-2-1 for my homelab to preserve all the pictures in my Immich and the backups of my LXCs and VMs, but I’m just not sure how to go about it, and I was considering archives of those files + Backblaze…
A second offsite NAS with your friend? That’s what I did when I grew out of my old synology. My new NAS capacity is noticeably impacted by things like frequent local snapshots but I don’t need to back those up remotely and it saves space.
Local tape. If you need offsite, rotate tapes. If you need cloud, Amazon Glacier or equivalent (which are backed by tape, I assume).
Tape drives are expensive as fuck though.
Before the storage wars, it was cheaper to just build a second shitty NAS and back up there.
Don’t trust Backblaze or any service that claims “unlimited”.
The overwhelming feedback I’ve seen is to “KISS” and use some combination of restic/borg/kopia and rclone to sync data or local backups to cloud storage like https://www.hetzner.com/storage/storage-box/.
Restic/borg are more for whole-system backups, whereas kopia is more for data (a central kopia server/repository can deduplicate and version data from multiple machines). Rclone is good for syncing local backups to cloud services, or perhaps e2ee synchronisation between machines (though it doesn’t do versioning, and multiple machines will cause problems).
This is the most flexible, long-term, as you can just update the storage backend and transfer or re-upload everything as necessary.
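The “local backup + rclone to cloud” pattern might look like the following sketch. The remote names are placeholders you’d set up beforehand with `rclone config`; a `crypt` remote layered on top adds client-side encryption so the storage box only holds ciphertext:

```shell
# Assumed config: "box" is an sftp remote pointing at the storage box,
# and "box-crypt" is a crypt remote wrapping it, so files land encrypted.
# Mirror the local restic/borg repository to the remote; rclone only
# transfers what changed. Use "copy" instead of "sync" if the remote
# should never delete anything.
rclone sync /srv/backups box-crypt: --transfers 4 --progress
```

Swapping the storage backend later is just a new remote in `rclone config` plus a full re-upload, which is exactly the flexibility described above.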
For a NAS, you can use Backblaze B2, but they certainly aren’t the cheapest. B2 doesn’t have the limitations that the personal and business plans have, but you pay by the TB.
There are lots of cloud backup providers. Just make sure it supports the OS on your NAS. Any of them that claim to be unlimited will not truly be unlimited.
Yeah, a few hundred outliers can really ruin it. People: have some self awareness and don’t be a douchebag.
You’re talking to the crowd where if it can be done, it should be done, and bragged about. Sadly.
Hard to say that’s it if just having 2TB uploaded is enough to be considered in the top.
Especially if they’ve already started ignoring other cloud files in people’s backups