

The data I'm backing up:

- An archive of 7z compressed files organized by year, with the last couple of years uncompressed
- 1.65 TB SMB share (my Automatic Ripping Machine target is here, along with lots of small documents and files)
- 0.46 TB NFS share for Proxmox Backup Server. This data is lots of small backup files that can reproduce VM and container block storage, and it probably changes a lot.

BackBlaze is a simple $5/TB/month with no ingress fees. There are also transaction fees (based on the number of API calls), but these are pretty minimal when using rclone's --fast-list option, so you can almost ignore them.

Amazon AWS S3 Target

Amazon S3 is not that simple… here's the S3 pricing page:

- S3 Standard – Infrequent Access: $12/TB/month
- S3 Glacier Instant Retrieval: $4/TB/month
- S3 Glacier Deep Archive: $0.99/TB/month
- S3 Intelligent-Tiering, which automatically moves your data into the best tier based on usage patterns

Egress from S3 is atrocious: in addition to retrieval fees, the data transfer fee to download your data to a location outside Amazon is $90/TB. However, the likelihood of losing two local copies is slim, and if a catastrophic event took out my laptop and local backups, $90/TB would be small potatoes in the grand scheme of things.

At first, I thought I'd back up my archive data (which rarely changes) to Glacier Deep Archive. At $0.99/TB/month it's cheaper than BackBlaze's $5, but there is a minimum storage commitment of 180 days (change or delete a file early and you'll pay for it upfront) and a retrieval delay of up to 12 hours. Plus, the API fees can cost a lot, especially for lots of small files: the Deep Archive tier charges $0.05 per 1,000 requests (put, copy, post, list). When you have a lot of files, this gets expensive fast. I got a huge bill… so I shut it down.
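To see why the request fees bite harder than the storage price suggests, here's a rough cost sketch. The per-TB prices, the $0.05-per-1,000-requests fee, and the $90/TB egress figure come from the numbers quoted above; the 4 TB total and the one-million-file count are hypothetical round figures for illustration.

```python
# Rough cost model for the tiers discussed above (a sketch, not a quote).
# Prices are from the post; the data sizes below are assumed examples.

def monthly_storage_cost(tb: float, price_per_tb: float) -> float:
    """Storage cost per month for `tb` terabytes at `price_per_tb` $/TB/month."""
    return tb * price_per_tb

def deep_archive_request_cost(num_files: int, price_per_1000: float = 0.05) -> float:
    """One-time request fees to upload `num_files` objects (one PUT each)."""
    return num_files / 1000 * price_per_1000

tb = 4.0  # hypothetical total backup size

print(f"BackBlaze B2:        ${monthly_storage_cost(tb, 5.00):.2f}/month")
print(f"Glacier Deep Archive: ${monthly_storage_cost(tb, 0.99):.2f}/month")

# Lots of small files is where Deep Archive's request fees show up:
print(f"Upload fees, 1,000,000 files: ${deep_archive_request_cost(1_000_000):.2f}")

# A full restore from S3 also pays egress on top of retrieval fees:
print(f"Egress for {tb} TB: ${tb * 90:.2f}")
```

The storage gap ($3.96 vs $20.00 per month at 4 TB) looks great for Deep Archive until the per-request and egress charges are added in, which is exactly the surprise-bill scenario described above.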

