Basically it works like this:
- I have syncthing moving files between all my devices. The larger the device, the more stuff I move there[2]. My phone only has my keepass file and a few other docs, my gaming PC has that plus all of my photos and music, etc.
- All of this ends up on a Raspberry Pi with a connected USB hard drive, which has everything on it. Why yes, that is very shoddy and short-term! The Pi is mirrored on my gaming PC though, which is awake once every day or two, so if it completely breaks I still have everything locally.
- Nightly a restic job runs, which backs up everything on the Pi to an S3-compatible cloud[3] and cleans out old snapshots (30 days, 52 weeks, 60 months, then yearly)
- Yearly I test restoring a random backup, both on the pi, and on another device, to make sure there is no required knowledge stuck on there.
This was somewhat of a pain to set up, but since the Pi is never off it just ticks along, and I check it periodically to make sure nothing has broken.
[1] there is always weirdness with these tools. They don't sync how you think, or when you actually want to restore it takes forever, or they are stuck in perpetual sync cycles
[2] I sync multiple directories, broadly "very small", "small", "dumping ground", and "media", from smallest to largest.
[3] Currently Wasabi, but it really doesn't matter. Restic encrypts client-side; you just need to trust the provider enough that they don't completely collapse at the same time that you need backups.
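The nightly job described above could be sketched roughly like this. The repository URL, source path, password handling, and the exact yearly retention count are assumptions for illustration, not the commenter's actual setup:

```python
# Sketch of a nightly restic backup-and-prune job like the one described.
# Repository URL, paths, and --keep-yearly count are assumptions.
import subprocess

ENV = {
    "RESTIC_REPOSITORY": "s3:https://s3.wasabisys.com/example-bucket",
    "RESTIC_PASSWORD_FILE": "/etc/restic/password",
}

def backup_commands(source="/mnt/backup-drive"):
    """Yield the two restic invocations: backup, then snapshot pruning."""
    yield ["restic", "backup", source]
    # Keep 30 daily, 52 weekly, 60 monthly snapshots, then one per year.
    yield ["restic", "forget", "--prune",
           "--keep-daily", "30", "--keep-weekly", "52",
           "--keep-monthly", "60", "--keep-yearly", "100"]

def run_nightly(dry_run=True):
    for cmd in backup_commands():
        if dry_run:
            print(" ".join(cmd))
        else:
            subprocess.run(cmd, env=ENV, check=True)
```

In practice this would be triggered by cron or a systemd timer on the Pi.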
I never trust them again with my data.
Not backing up .git folders however is completely unacceptable.
I have hundreds of small projects where I use git to track history locally, with no remote at all. The intention is never to push them anywhere. I don't like to say these sorts of things, and I don't say it lightly: someone should be fired over this decision.
I had no idea that it was such a good bargain. I used to be a Crashplan user back in the day, and I always thought Backblaze had tiered limits.
I've been using Duplicati to sync a lot of data to S3's cheapest tape-based long term storage tier. It's a serious pain in the ass because it takes hours to queue up and retrieve a file. It's a heavy enough process that I don't do anything nearly close to enough testing to make sure my backups are restorable, which is a self-inflicted future injury.
Here's the thing: I'm paying about $14/month for that S3 storage, which makes $99/year a total steal. I don't use Dropbox/Box/OneDrive/iCloud so the grievances mentioned by the author are not major hurdles for me. I do find the idea that it is silently ignoring .git folders troubling, primarily because they are indeed not listed in the exclusion list.
I am a bit miffed that we're actively prevented from backing up the various Program Files folders, because I have a large number of VSTi instruments that I'll need to ensure are rcloned or something for this to work.
(As a side note, it's funny to see them promoting their native C app instead of using Java as a "shortcut". What I wouldn't give for more Java apps nowadays)
That's a good warning
> Backblaze had let me down. Secondly within the Backblaze preferences I could find no way to re-enable this.
This - the nail in the coffin
"The Backup Client now excludes popular cloud storage providers [...] this change aligns with Backblaze’s policy to back up only local and directly connected storage."
I guess Windows 10 and 11 users aren't backing up much to Backblaze, since Microsoft is tricking so many into moving all of their data to OneDrive.
This is, in disguise, another example of two people disagreeing about what "unlimited" means in the context of backup, even if they do claim to have "no restrictions on file type or size" [2].
[1] https://www.reddit.com/r/backblaze/comments/jsrqoz/personal_... [2] https://www.backblaze.com/cloud-backup/personal
I contacted the support asking WTF, "oh the file got deleted at some point, sorry for that", and they offered me 3 months of credits.
I do not trust my Backblaze backups anymore.
The one thing they have to do is back up everything, and when you see it in their console you should be able to rest assured they are going to continue to back it up.
They’ve let the desktop client linger, and it’s difficult to add meaningful exceptions. It’s obvious they want everyone to use B2 now.
I don't quite understand why it's still like this; it's probably the biggest reason why git tends to play poorly with a lot of filesystem tools (not just backups). If it'd been something like an SQLite database instead (just an example really), you wouldn't get so much unnecessary inode bloat.
At the same time Backblaze is a backup solution. The need to back up everything is sort of baked in there. They promise to be the third backup solution in a three layer strategy (backup directly connected, backup in home, backup external), and that third one is probably the single most important one of them all since it's the one you're going to be touching the least in an ideal scenario. They really can't be excluding any files whatsoever.
The cloud service exclusion is arguably even worse. Imagine getting hit by a cryptoworm. Your cloud storage tool is dutifully going to sync everything encrypted, junking up your entire storage across devices, and because restoring old versions is both ass and near impossible at scale, you need an actual backup solution for that situation. Backblaze excluding files in those folders feels like a complete misunderstanding of what their purpose should be.
</bzexclusions><excludefname_rule plat="mac" osVers="*" ruleIsOptional="f" skipFirstCharThenStartsWith="*" contains_1="/users/username/dropbox/" contains_2="*" doesNotContain="*" endsWith="*" hasFileExtension="*" />
That is the exact path to my Dropbox folder, and I presume if I move my Dropbox folder this xml file will be updated to point to the new location. The top of the xml file states "Mandatory Exclusions: editing this file DOES NOT DO ANYTHING".
.git files seem to still be backing up on my machine, although they are hidden by default in the web restore (you must open Filters and enable Show Hidden Files). I don't see an option to show hidden files/folders in the Backblaze Restore app.
I know this is beside the point somewhat, but: learn your tools, people. The commit history could probably have been easily restored without involving any backup. The commits are not just instantly gone.
But .git? Having it locally does not mean you have it synced to GitHub or anywhere reliable.
If you do anything then only backup the .git folder and not the checkout.
But backing up the checkout and not the .git folder is crazy.
My understanding is that a modern, default onedrive setup will push all your onedrive folder contents to the cloud, but will not do the same in reverse -- it's totally possible to have files in your cloud onedrive, visible in your onedrive folder, but that do not exist locally. If you want to access such a file, it typically gets downloaded from onedrive for you to use.
If that's the case, what is Backblaze or another provider to do? Constantly download your onedrive files (that might have been modified on another device) and upload them to backblaze? Or just sync files that actually exist locally? That latter option certainly would not please a consumer, who would expect the files they can 'see' just get magically backed up.
It's a tricky situation and I'm not saying Backblaze handled it well here, but the whole transparent cloud storage situation thing is a bit of a mess for lots of people. If Dropbox works the same way (no guaranteed local file for something you can see), that's the same ugly situation.
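On Windows, one way a backup tool could in principle tell these cloud-only placeholders apart is the FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS attribute bit that OneDrive's Files On-Demand sets on files whose contents live only in the cloud. A minimal sketch (the scanning helper and its behavior on non-Windows systems are assumptions):

```python
# Minimal sketch: detecting "cloud-only" placeholder files on Windows.
# FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS (0x00400000) marks files whose
# contents are not present locally and would be downloaded on read.
import os

FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS = 0x00400000

def is_cloud_placeholder(attributes: int) -> bool:
    """True if the attribute bits say the file is not hydrated locally."""
    return bool(attributes & FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS)

def placeholder_paths(root):
    """Yield paths under root that look like cloud-only placeholders.
    Only meaningful on Windows, where st_file_attributes exists;
    elsewhere the attribute defaults to 0 and nothing is yielded."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            st = os.stat(os.path.join(dirpath, name))
            if is_cloud_placeholder(getattr(st, "st_file_attributes", 0)):
                yield os.path.join(dirpath, name)
```

A backup client could use a check like this to skip (or explicitly flag) files it cannot read without first pulling them down from the cloud.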
Complete lack of communication (outside of release notes, which nobody really reads, as the article too states) is incompetence and indeed worrying.
Just show a red status bar that says "these folders will not be backed up anymore", why not?
I know the post is talking about their personal backup product, but it's the same company, and if they sneak in a reduction of service like this, as others have already commented, it erodes difficult-to-earn trust.
If you've got huge amounts of files in OneDrive and the backup client starts downloading every one of them (before it can reupload them again), you're going to run into problems.
But ideally, they'd give you a choice.
Trying to audit—let alone change—the finer details is a pain even for power users, and there's a non-zero risk the GUI is simply lying to everybody while undocumented rules override what you specified.
When I finally switched my default boot to Linux, I found many of those offerings didn't support it, so I wrote some systemd services around Restic + Backblaze B2. It's been a real breath of fresh air: I can tell what's going on, I can set my own snapshot retention rules, and it's an order of magnitude cheaper. [2]
____
[1] Along the lines of "We have your My Documents. Oh, you didn't manually add My Videos or My Music for every user? Too bad." Or in some cases, certain big-file extensions are on the ignore list by default for no discernible reason.
[2] Currently a dollar or two a month for ~200 GB. It doesn't change very much, and data verification jobs redownload the total amount once a month. I don't back up anything I could get from elsewhere, like Steam games. Family videos are in the care of different relatives, but I'm looking into changing that.
Preferably cheap and rclone compatible.
Hetzner storagebox sounds good, what about S3 or Glacier-like options?
First thing I noticed is that if it can't download a file due to network or some other problem, it just skips it. But you can force it to retry by modifying its job file, which is just an SQLite DB. It also stores and downloads files by splitting them into small chunks. It stores checksums of these chunks, but not a checksum of the complete file, so judging by how badly the client is written I can't be sure that restored files are not corrupted after the stitching.
Then I found out that it can't download some files even after dozens of retries because it seems they are corrupted on Backblaze side.
But the most jarring issue for me is that it mangled all non-ASCII filenames. They are stored as UTF-8 in the DB, but the client saves them as Windows-1252 or something. So I ended up with hundreds of gigabytes of files with names like фикац, and I can't just re-encode these names back, because some characters were dropped in the process.
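The kind of mangling described (UTF-8 filename bytes mis-decoded as Windows-1252) is easy to reproduce, and it shows why the damage is irreversible: Windows-1252 leaves several byte positions undefined, so some characters are dropped rather than merely garbled. The example name here is illustrative:

```python
# Reproducing the described mojibake: UTF-8 filename bytes mis-decoded
# as Windows-1252. The example filename is illustrative.
original = "фикация"
raw = original.encode("utf-8")

# Windows-1252 leaves bytes like 0x8F undefined, so a strict decode fails
# and a lenient one substitutes U+FFFD; that information is lost for good.
mangled = raw.decode("windows-1252", errors="replace")
print(mangled)  # garbled Latin text ending in a replacement character
```

Because of those replacement characters, `mangled.encode("windows-1252").decode("utf-8")` cannot round-trip back to the original name.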
I wanted to write a script that forces the Backblaze client to redownload files, logs all files that can't be restored, fixes the broken names, and splits restored files back into chunks to validate their checksums against the SQLite DB, but it was too big a task for me, so I just procrastinated for 3 years while continuing to pay the monthly Backblaze fees, because it's sad to let go of my data.
I wonder if they fixed their client since then.
That would be nice, they'd be able to get their history back!
Indeed, the commits and blobs might even have still been available on the GitHub remote. I'm not sure whether they clean them on some interval, but a bunch of stuff you "delete" from git still stays on the remote regardless of what you push.
No they are not. This is explicitly addressed in the article itself.
On macOS.
I assume when asking such a question, you expect an honest answer like mine:
rclone is my favorite alternative. It supports encryption seamlessly and is loaded with features. Plus I can control exactly what gets synced/backed up and when, and I pay for what I use (no unsustainable "unlimited" storage that always comes with annoying restrictions). There are never any surprises (which I experienced with nearly every backup solution). I use Backblaze B2 as the backend. I pay like $50 a month (which I know sounds high), but I have many terabytes of data up there that matters to me (it's a decade or more of my life and work, including long videos of holidays like Christmas with my kids throughout the years).
For super-important stuff I keep a tertiary backup on Glacier. I also have a full copy on an external hard drive, though those drives are not very reliable, so I don't consider it part of the backup strategy; it's more a convenience for restoring large files quickly.
Edit: on top of that I've built a custom one-page monitoring dashboard, so I see everything in one place (https://imgur.com/B3hppIW). I'll open-source it; the architecture is decent, I just need to clean up some secrets from the Git history...
I hope Backblaze responds to this with a "we're sorry and we've fixed this."
And, as a separate note, they shouldn't be balking at the amount of data in a virtualized OneDrive or Dropbox either, considering the user could get a many-terabyte hard drive for significantly less money.
It would be reasonable to say that if you run the file sync in a mode that keeps everything locally, then Backblaze should be backing it up. Arguably they should even when not in that mode, but it'll churn files repeatedly as you stream files in and out of local storage with the cloud provider.
Always prefer businesses who are upfront and honest about what they can offer their users, in a sustainable way.
If a company uses the word unlimited to describe their service, but then attempts to weasel out of it via their T&Cs, that doesn't constitute a disagreement over the meaning of the word unlimited. It just means the company is lying.
I still like Backblaze; they were nice back in the days when I was running Windows. Their desktop app is probably one of the best in the scene.
JottaCloud is "unlimited" for $11.99 a month (your upload speed is throttled after 5TB).
I've been using them for a few years for backing up important files from my NAS (timemachine backups, Immich library, digitised VHS's, Proxmox Backup Server backups) and am sitting at about 3.5TB.
Edit: spelling errors and cleanup
See Fossil (https://fossil-scm.org/)
P.S. There's also (https://www.sourcegear.com/vault/)
> SourceGear Vault Pro is a version control and bug tracking solution for professional development teams. Vault Standard is for those who only want version control. Vault is based on a client / server architecture using technologies such as Microsoft SQL Server and IIS Web Services for increased performance, scalability, and security.
You should try downloading one of your backed-up git repos to see if it actually does contain the full history. I just checked several and everything looks good.
The moment you call read() (or fopen() or your favorite function), the download will be triggered. It's a hook sitting between you and the file. You can't ignore it.
The only way to bypass it is to remount it over rclone or something and use "ls" and "lsd" functions to query filenames. Otherwise it'll download, and it's how it's expected to work.
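An illustration of the distinction being made here: directory enumeration and stat()-level calls touch metadata only, while open()/read() is the hook that triggers hydration (assuming the mount behaves like typical download-on-read filesystems). A sketch:

```python
# Sketch: enumerate names and sizes without ever open()ing the files.
# On hydrate-on-read mounts, this metadata-only access is roughly what
# "ls" does; reading file contents is what would trigger a download.
import os

def list_metadata_only(root):
    entries = []
    for entry in os.scandir(root):
        if entry.is_file(follow_symlinks=False):
            st = entry.stat(follow_symlinks=False)  # metadata, no read()
            entries.append((entry.name, st.st_size))
    return sorted(entries)
```

Whether a given virtual filesystem driver really treats stat as metadata-only is an assumption; some mounts may hydrate on other operations too.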
When you have a couple terabytes of data in that drive, is it acceptable to cycle all that data and use all that bandwidth and wear down your SSD at the same time?
Also, a high number of small files is a problem for these services. I have a large font collection in my cloud account, and oh boy, if I want to sync that thing, the whole thing proverbially overheats from all the queries it's sending.
Or that they're targeting the mass retail market, where people are technically ignorant, and "unlimited" is required to compete.
And statistically-speaking, is viable as long as a company keeps its users to a normal distribution.
Why should a file backup solution adapt to work with git? Or any application? It should not try to understand what a git object is.
I’m paying to copy files from a folder to their servers just do that. No matter what the file is. Stay at the filesystem level not the application level.
I assume you don’t think that, so I’m curious, what would you propose positively?
> Bob (Backblaze Help)
> Aug 5, 2021, 11:33 PDT
> Hello there,
> Thank you for taking the time to write in,
> Unfortunately .git directories are excluded by Backblaze by default. File
> changes within .git directories occur far too often and over so many files
> that the Backblaze software simply would not be able to keep up. It's beyond
> the scope of our application.
> The Personal Backup Plan is a consumer grade backup product. Unfortunately we
> will not be able to meet your needs in this regard.
> Let me know if you have any other questions.
> Regards,
> Bob The Backblaze Team
It's that to back up a folder on a filesystem, you need to traverse that folder and check every file in that folder to see if it's changed. Most filesystem tools usually assume a fairly low file count for these operations.
Git, rather unusually, tends to produce a lot of files in regular use; before packing, every commit/object/branch is simply stored as a file on the filesystem (branches only as pointers). Packing fixes that by compressing commit and object files together, but it's not done by default (only after an initial clone or when the garbage collector runs). Iterating over a .git folder can take a lot of time in a place that's typically not very well optimized (since most "normal" people don't have thousands of tiny files in their folders that contain sprawled out application state.)
The correct solution here is either for git to change, or for Backblaze to implement better iteration logic (which would probably require special handling for git, so it'd be more "correct" to fix git, since Backblaze's tools aren't the only ones with this problem).
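The "lots of loose files" point is easy to see directly: before packing, each object is a single file under .git/objects in a two-hex-character fan-out directory, while packed objects live under objects/pack. A rough counter (layout taken from git's loose-object storage format):

```python
# Rough count of loose (unpacked) git objects, as discussed above.
# Loose objects live under .git/objects/<2 hex chars>/<remaining hash>;
# packed objects live under .git/objects/pack and are skipped here.
from pathlib import Path

def loose_object_count(git_dir):
    objects = Path(git_dir) / "objects"
    count = 0
    for sub in objects.iterdir():
        if sub.is_dir() and len(sub.name) == 2:  # fan-out directory
            count += sum(1 for f in sub.iterdir() if f.is_file())
    return count
```

Run against an active repo that hasn't been gc'd recently, this number can easily be in the thousands, which is exactly what generic file-traversal tools choke on.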
Borg backup is a good tool in my opinion and has everything that I need (deduplication, compression, mountable snapshots).
Hetzner Storage Box is nothing fancy but good enough for a backup, and is considerably cheaper than the alternatives (I pay about 10 EUR/month for 5 TB of storage).
Before that I was using s3cmd [3] to back up to an S3 bucket.
This is a joke, but honestly nobody here should be directly backing up their filesystem; use the right tool for the job instead. You'll make the world a more efficient place, have more robust and quicker-to-recover backups, and save some money along the way.
So, in practice, you shouldn't have to download the whole remote drive when you do an incremental backup.
- - -
Hey, I tried restoring a file from my backup — downloading it directly didn't work, and creating a restore with it also failed – I got an email telling me to contact y'all about it.
Can you explain to me what happened here, and what can I do to get my file(s?) back?
- - -
Hi Jan,
Thanks for writing in!
I've reached out to our engineers regarding your restore, and I will get back to you as soon as I have an update. For now, I will keep the ticket open.
- - -
Hi Jan,
Regarding the file itself - it was deleted back in 2022, but unfortunately, the deletion never got recorded properly, which made it seem like the file still existed.
Thus, when you tried to restore it, the restoration failed, as the file doesn't actually exist anymore. In this case, it shouldn't have been shown in the first place.
For that, I do apologize. As compensation, we've granted you 3 monthly backup credits which will apply on your next renewal. Please let me know if you have any further questions.
- - -
That makes me even more confused to be honest - I’ve been paying for forever history since January 2022 according to my invoices?
Do you know how/when exactly it got deleted?
- - -
Hi Jan,
Unfortunately, we don't have that information available to us. Again, I do apologize.
- - -
I really don’t want to be rude, but that seems like a very serious issue to me and I’m not satisfied with that response.
If I’m paying for a forever backup, I expect it to be forever - and if some file got deleted even despite me paying for the “keep my file history forever” option, “oh whoops sorry our bad but we don’t have any more info” is really not a satisfactory answer.
I don’t hold it against _you_ personally, but I really need to know more about what happened here - if this file got randomly disappeared, how am I supposed to trust the reliability of anything else that’s supposed to be safely backed up?
- - -
Hi Jan,
I'll inquire with our engineers tomorrow when they're back in, and I'll update you as soon as I can. For now, I will keep the ticket open.
- - -
Appreciate that, thank you! It’s fine if the investigation takes longer, but I just want to get to the bottom of what happened here :)
- - -
Hi Jan,
Thanks for your patience.
According to our engineers and my management team:
With the way our program logs information, we don't have the specific information that explains exactly why the file was removed from the backup. Our more recent versions of the client, however, have vastly improved our consistency checks and introduced additional protections and audits to ensure complete reliability from an active backup.
Looking at your account, I do see that your backup is currently not active, so I recommend running the Backblaze installer over your current installation to repair it, and inherit your original backup state so that our updates can check your backup.
I do apologize, and I know it's not an ideal answer, but unfortunately, that is the extent of what we can tell you about what has happened.
- - -
I gave up escalating at this point and just decided these aren’t trusted anymore.
The files in question are four years old at this point, so it's hard for me to conclusively state anything. I guess there might have been a perfect storm of that specific file being deleted because it was due to expire before I upgraded to "keep history forever", but I don't think it's super likely, and I absolutely would expect them to have telemetry about that in any case.
If anyone from Backblaze stumbles upon it and wants to escalate/reinvestigate, the support ID is #1181161.
both services have internal backups to reduce the chance they lose data
both services allow some limited form of "going back to older version" (like the article states itself).
Just because the article says "sync is not backup" doesn't mean that is true; I mean, it literally is backup by definition: it makes a copy in another location and even has versioning.
It's just not _good enough_ backup for their standards. Maybe not even for the standards of most people on HN, but out there many people are happy with way worse backups, especially wrt. versioning: for a lot of (mostly static) media, the only reason you need version rollback is a corrupted version being backed up. And a lot of people mostly back up personal photos/videos and important documents, all static by nature.
Though
1. it doesn't really fulfill the 3-2-1 rule; it's only 2-1-1 (local, one backup on the MS/Dropbox cloud, one offsite). Before, when it was also backed up to Backblaze, it was 3-2-1 (kinda). So them silently stopping is still a huge issue.
2. newer versions of the 3-2-1 rule also say to treat the 2 not just as 2 backups, but as 2 "vendors/access accounts"; with the OneDrive folder pretty much being OneDrive-controlled, this is 1 vendor across local and all backups. Which is risky.
So my idea is that it's a competency problem (lack of communication), not malice. But it's just a theory, based on my own experience.
In any case, this is a bad situation, however you look at it.
As for GUIs in general... Well, like I said, I just finished several years of bad experiences with some proprietary ones, and I wanted to see and choose what was really going on.
At this point, I don't think I'd ever want a GUI beyond a basic status-reporting widget. It's not like I need to regularly micromanage the folder-set, especially when nobody else is going to tweak it by surprise.
_____
[1] The downside to the dumb-store is a ransomware scenario, where the malware is smart enough to go delete my old snapshots using the same connection/credentials. Enforcing retention policies on the server side necessarily needs a smarter server. B2 might actually have something useful there, but I haven't dug into it.
Interestingly, rclone supports that on many providers, but for Backblaze to support it, they would need to integrate rclone, connect to the providers via that channel, and request checks, which is messy, complicated, and computationally expensive. And that's assuming you won't be hitting API rate limits on the cloud provider.
Once growth slows, churn eats much of the organic growth and you need to spend money on marketing.
Doing a bait-and-switch on a percentage of your paying customers, no matter how small the percentage is, may be "viable" for the company, but it's a hostile experience for those users, and companies deserve to be called out for it.
So… Marketing has taken over, just as parent comment said. Got it.
When I back up my computer, the .git folders are among the most important things on there. Most of my personal projects aren't pushed to GitHub or anywhere else.
Fortunately I don't use Backblaze. I guess the moral is don't use a backup solution where the vendor has an incentive to exclude things.
You are using it to mean "maintaining full version history", I believe? Another important consideration.
For stuff I care about (mostly photos), I back them up on two different services. I don't have TBs of those, so it's not very expensive. My personal code I store on git repositories anyway (like SourceHut or Codeberg or sometimes GitHub).
It's the same reason why the postgres autovacuum daemon tends to be borderline useless unless you retune it[0]: the defaults are barmy. git gc only runs if there's 6700 loose unpacked objects[1]. Most typical filesystem tools tend to start balking at traversing ~1000 files in a structure (depends a bit on the filesystem/OS as well, Windows tends to get slower a good bit earlier than Linux).
To fix it, running
> git config -g gc.auto=1000
should retune it, and any subsequent commit to your repos will trigger garbage collection properly when there are around 1000 loose files. Pack file management seems to be properly tuned by default; at more than 50 packs, gc will repack into a larger pack.
[0]: For anyone curious, the default postgres autovacuum setting runs only when 10% of the table consists of dead tuples (roughly: deleted+every revision of an updated row). If you're working with a beefy table, you're never hitting 10%. Either tune it down or create an external cronjob to run vacuum analyze more frequently on the tables you need to keep speedy. I'm pretty sure the defaults are tuned solely to ensure that Postgres' internal tables are fast, since those seem to only have active rows to a point where it'd warrant autovacuum.
Nobody has turned the moon into a hard drive yet.
No, they are using it to mean “backed up”. Like, “if this data gets deleted or is in any way lost locally, it’s still backed up remotely (even years later, when finally needed)”.
I’m astonished so many people here don’t know what a backup is! No wonder it’s easy for Backblaze to play them for fools.
> git config --global gc.auto 1000
with the long option name, and no `=`.
TLDR: Despite claiming to back up all your data, Backblaze quietly stopped backing up OneDrive and Dropbox folders - along with potentially many other things.
For ten years I have been using Backblaze for my personal computer backup. Before 2015 I would back up files to one of two large external hard discs. I then rotated these drives between, first, my father’s house, and after I moved to the UK, my office drawers.
In 2015 Backblaze seemed like a good bet. Unlike Crashplan, their software wasn’t a bloated Java app, but they did have unlimited storage. If you could cram it into your PC, they would back it up. With their yearly hard drive reviews making good press and a lot of personal recommendations from my friends and colleagues, their service sounded great. I installed the software, ran it for several weeks, and sure enough my data was safely stored in their cloud.
I had further reason to be impressed when, several years later, one of my hard drives failed. I made use of their “send me a hard drive with my stuff on it” service. A drive turned up filled with my precious data. That for me was proof that this system worked, and that it worked well.
And so I recommended Backblaze for years. What do you do for backup? I would extol the virtues of Backblaze, and they made many sales from such recommendations.
There were a few things I didn’t like. The app could use a lot of memory, especially after a large import of photographs. The website, which I often used to restore single files or folders, was slow and clunky to use. The Windows app in particular had an early-2000s aesthetic and cramped lists. There was the time they leaked all your filenames to Facebook, but they probably fixed that.
But no matter, small problems for the peace of mind of having all my files backed up.
Backup software is meant to back up your files. Which files? Well, the files you need. Given everyone is different, with different workflows and file types, the ideal thing is to back up all your files. No backup provider knows what I will need in the future. The provider must plan accordingly.
My first troubling discovery was in 2025, when I made several errors, then did a push -f to GitHub and blew away the git history for a half-decade-old repo. No data was lost, but the log of changes was. No problem, I thought, I’ll just restore this from Backblaze. Sadly it was not to be. At some point Backblaze had started to ignore .git folders.
This annoyed me. Firstly, I needed that folder and Backblaze had let me down. Secondly, within the Backblaze preferences I could find no way to re-enable this. In fact, looking at the list of exclusions, I could find no mention of .git whatsoever.

This made me wonder - I had checked the exclusions list when I installed Backblaze 9 years before, had I missed it? Had I missed anything else?
Well, lesson learned, I guess. But then a week ago I came across this thread on Reddit: “Doesn’t back up Dropbox folder??”. A user was surprised to find their Dropbox folder no longer being backed up. Alarmed, I logged into Backblaze, and lo and behold, my OneDrive folder was missing.
I. Am. Fucking. Furious.
Backblaze has one job, and apparently they are unable to do that job. Back up my stuff. But they have decided not to.
Let’s take an aside.
A reasonable person might point out those files on OneDrive are already being backed up - by OneDrive! No. Dropbox and OneDrive are for file syncing - syncing your files to the cloud. They offer limited protection. OneDrive and Dropbox only retain deleted files for one month. Backblaze has one-year file retention, or if you pay per GB, unlimited retention. While OneDrive retains version changes for longer, Dropbox only retains version changes for a month - again, unless you pay for more. Your files are less secure and less backed up when you stick them in a cloud storage provider’s folder than when they just sit on your desktop.
And that’s assuming your cloud provider is playing ball. If Microsoft or Dropbox bans your account you may find yourself with no backup whatsoever.
For me the larger issue is that they never told us. My OneDrive folder sits at 383 GB. You would think that, having decided to no longer back this up, I might get an email, an alert, or some other notification. Of course not.
Nestled into their release notes under “Improvements” we see:
The Backup Client now excludes popular cloud storage providers from backup, including both mount points and cache directories. This prevents performance issues, excessive data usage, and unintended uploads from services like OneDrive, Google Drive, Dropbox, Box, iDrive, and others. This change aligns with Backblaze’s policy to back up only local and directly connected storage.
First, I would hardly call this change in policy an improvement; it’s hard to imagine anyone reading this as anything other than a downgrade in service. Secondly, does Backblaze believe most of its users are reading their release notes?
And if you joined today and looked at their list of file exclusions you would find no reference to Dropbox or OneDrive. No mention of Git either.
Here’s the thing: today they don’t back up Git or OneDrive. Who’s to say tomorrow they won’t add to the list? Maybe some obscure file format that’s critical to your workflow. Or they’ll ignore a file extension that just happens to be the same as one used by your DAW or 3D modelling software. And they won’t tell you. They won’t even list it on their site.
By deciding not to back up everything, Backblaze has made it as if they are backing up nothing.
But really this feels like a promise broken. Back in 2015 their website proudly proclaimed:
All user data included by default
No restrictions on file type or size
Protect the digital memories and files that matter most to you.
File backup is a matter of trust. You are paying a monthly fee so that if and when things go wrong you can get your data back. By silently changing the rules, Backblaze has not simply eroded my trust, but swept it away.
I wrote this to warn you - Backblaze is no longer doing their part, they are no longer backing up your data. Some of your data sure, but not all of it.
Finally let me leave you with Backblaze’s own words from 2015:
Unlimited, Simplified, Secure Personal Online Backup Cloud Storage
They promised to simplify backup. They succeeded - they don’t even do the backup part anymore.
I do wish it were a word that had to be completely dropped from marketing/advertising.
For example, there is no unlimited storage; hell, the visible universe has a storage limit. There is no unlimited upload or download speed, and what if, as you started using more space, they exponentially slowed the speed at which you could access it? Unlimited CPU time processing your request? Unlimited execution slots to process your request? Unlimited queue size when processing your requests?
Hence everything turns into a mess of assumptions.
I doubt they have those pipes, at least if all of their customers (or a sufficiently large number) actually made use of that.
The second question would be how long they would let you utilize your broadband 24/7 at max capacity without canceling your subscription. Which leads back to the point the person I replied to was making: if you truly make use of what is promised, they cancel you. Hence it is not a faithful offer in the first place.
Not important here, because Backblaze only has to match the storage of your single device. Plus some extra versions, but one year multiplied by upload speed is also a tractable amount.
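A quick back-of-the-envelope makes the "tractable" claim concrete. Assuming (hypothetically) a 20 Mbit/s residential uplink saturated around the clock, the most a single customer could push in a year is:

```python
# How much data one customer could upload in a year, assuming a
# hypothetical 20 Mbit/s uplink saturated 24/7. The speed is an
# illustrative assumption, not a figure from the thread.
UPLOAD_MBPS = 20                       # assumed uplink, megabits per second
SECONDS_PER_YEAR = 365 * 24 * 3600

bytes_per_second = UPLOAD_MBPS * 1_000_000 / 8
total_tb = bytes_per_second * SECONDS_PER_YEAR / 1e12

print(f"{total_tb:.1f} TB/year")       # roughly 78.8 TB at 20 Mbit/s
```

Tens of terabytes per customer per year is an upper bound the provider can plan around, which is quite different from an open-ended "unlimited" promise.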
Residential network access is oversold, like everything else.
The only difference with storage is there’s a theoretical maximum on how much a single person can use.
But you could just as well limit backup upload speed for similar effect. Having something about fair use in ToS is really not that different.
Of course, in countries where the internet isn't as developed as in other parts of the world this might make sense, but modern countries don't tend to do that, at least in my experience.
It shouldn't stress things to spend a couple of weeks relaying a terabyte in small chunks. The most likely strain is on my upload bandwidth, and yeah, that's the cost of cloud backup; more ISPs need to improve upload.
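The two-weeks-per-terabyte figure is easy to sanity-check. This sketch just computes the sustained throughput that schedule implies:

```python
# Sustained throughput needed to move 1 TB (decimal) in two weeks.
TB = 1e12                              # one terabyte in bytes
seconds = 14 * 24 * 3600               # two weeks

mbit_per_s = TB * 8 / seconds / 1e6

print(f"{mbit_per_s:.1f} Mbit/s")      # about 6.6 Mbit/s sustained
```

A sustained ~6.6 Mbit/s is well within even a modest residential uplink, so the bottleneck is the ISP's upload tier, not the backup service.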
Yes, indeed; most relevant in this case are probably "time" and "bandwidth" put together. Even if you saturate the line for a month, they won't throttle you, so for all intents and purposes the "data cap" is unlimited (or, more precisely, there is no data cap).
> more ISPs need to improve upload.
I was yelling the same things to the void for the longest time, then I had a brilliant idea of reading the technical specs of the technology coming to my home.
Lo and behold, the numbers I got were the technical limits of the technology I had at home (PON for the time being), and going higher would require very large and expensive rewiring, with new hardware and technology.
> the technical limits of the technology that I had at home (PON for the time being)
Isn't that usually symmetrical? Is yours not?
My parents have gotten hit by this. Dad was downloading huge video files at one point on his WiFi and his ISP silently throttled him.
A common term is "data cap": https://en.wikipedia.org/wiki/Data_cap
Wow, I knew that was generally true, didn't know it was true for internet access in the US too, how backwards...
> A common term is "data cap": https://en.wikipedia.org/wiki/Data_cap
I think most are familiar with throttling because most (all?) phone plans have some data cap at some point, but I don't think I've heard of any broadband connections here with data caps; that wouldn't make any sense.