Show HN: Freeze – Amazon Glacier GUI Client for Mac OS X (freezeapp.net)
160 points by sebcode on Oct 25, 2015 | 50 comments



I've been waiting for something like this for a good, long while. If it works half as well as the description implies, it's worth way more than $10.

Feedback:

* Help me figure out how to get an access key and secret for my account. That could be as simple as including a button that opens up Safari with the IAM console, or as awesome as working with IAM directly (a rough sketch of what that could look like follows this list). Right now, this is probably your product's biggest stumbling block, as Amazon's user experience is atrocious.

* "Do you want to initiate inventory retrieval for all non-empty vaults of region US East (N. Virginia)?" — I dunno, you tell me.

* They're always "OK" buttons, not "Ok"

* Inventory and Transfers visibility shouldn't be mutually exclusive
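
For what "working with IAM" could mean in practice, here's a rough Python/boto3 sketch (nothing the app does today): it creates a dedicated IAM user with a Glacier-only inline policy and hands back a fresh access key and secret to paste into Freeze. The user name and policy name are made up for illustration, and it assumes admin credentials are already configured for boto3.

    import json
    import boto3

    iam = boto3.client("iam")

    # Hypothetical dedicated user, so the app never sees the account's root keys.
    user = "freeze-glacier-client"
    iam.create_user(UserName=user)

    # Inline policy restricted to Glacier actions only.
    iam.put_user_policy(
        UserName=user,
        PolicyName="glacier-only",
        PolicyDocument=json.dumps({
            "Version": "2012-10-17",
            "Statement": [{"Effect": "Allow", "Action": "glacier:*", "Resource": "*"}],
        }),
    )

    # The access key / secret pair the app asks for.
    key = iam.create_access_key(UserName=user)["AccessKey"]
    print(key["AccessKeyId"], key["SecretAccessKey"])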


Thanks, that's great feedback. I admit the app is not as user friendly as it should be; I'll work on that in upcoming releases.


Why would anyone use Glacier for consumer needs when Google Drive offers almost the same price without any additional complexity? Glacier's 1TB is about $7 (and that's in the cheapest region), while Google Drive's 1TB is $10.

One advantage I can see is more flexible cost: you pay for what you use. Are there other differences?


Because Glacier prices exactly by the amount of data stored, while Google Drive charges in fixed tiers.


Well...maybe...because your data is far less likely to be indexed and used to push advertising to you when stored on Glacier? ;)


Amazon sells ads just the same. It's actually the 4th biggest ad network.

And they are the king of tracking users. Nobody does it better.


Better than Google? Really? Source?


Google only has volume.

If you need a source to know that Google has the least segmented audience, you are clearly not in advertising.

Why do you think Google pushed so hard for G+? Or why it keeps trying to keep people logged in while searching? Even with all that, they are still way behind.


It would be neat if this gave you real-time cost visibility too.


That was my first thought.

Let me queue up a set of actions and tell me what those actions will cost. Also, track my daily/weekly/monthly cost and tell me what the new rate will be after a given set of actions.


Thanks for the feedback. I've already thought about implementing a cost estimate calculation, and it's definitely on the roadmap for an upcoming release.


App costs more than my annual glacier bill :), but like others I've been looking for something like this for a while. I'll definitely support it.


How does it stack up to Arq [0]?

[0] https://www.arqbackup.com/


Arq is a full-featured backup solution. Freeze, on the other hand, is not really meant to be a backup solution; it's rather a file transfer client for Glacier that gives you a raw view of your vaults' inventories with upload/download features.


Thanks!


Has anybody used Glacier for home backups? I still don't understand the complicated billing for downloads; I'm afraid Amazon will charge me many thousands of dollars.


It's actually not too hard to calculate the storage costs, and it's really cheap. I've been using Glacier for backups since the service was launched. The calculation of the retrieval costs is a bit tricky though. But if you use Glacier for disaster recovery only, it shouldn't matter too much.
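
For a sense of scale, a back-of-the-envelope in Python using the roughly $0.007/GB-month us-east-1 storage price from 2015 (the 200 GB archive size is just an illustrative figure):

    gb_stored = 200                      # hypothetical photo/backup archive
    price_per_gb_month = 0.007           # us-east-1 Glacier storage price, 2015
    print(f"${gb_stored * price_per_gb_month:.2f} per month")   # $1.40 per month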


For what it's worth, you can set price limits on a per-vault basis, so you can never accidentally charge yourself thousands of dollars. If a retrieval request would push you over the cost limit you've set, they automatically throttle the request.

When I needed to do a retrieval several months ago, I was able to limit my expense to $7/pcm and just let my client retrieve as fast as it could.

Details are on the AWS blog here: https://aws.amazon.com/blogs/aws/data-retrieval-policies-aud...
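
The same kind of limit can also be set through the API. A minimal boto3 sketch (Python), assuming the boto3 SDK is configured; note that retrieval policies apply per account and region rather than per vault, and the 1 GB/hour cap is an arbitrary illustrative figure that the console would translate into a monthly dollar estimate:

    import boto3

    glacier = boto3.client("glacier", region_name="us-east-1")

    # Cap retrievals at roughly 1 GB/hour so a restore can never run up a surprise bill.
    glacier.set_data_retrieval_policy(
        accountId="-",   # "-" means the account that owns the credentials
        Policy={"Rules": [{"Strategy": "BytesPerHour",
                           "BytesPerHour": 1024 ** 3}]},
    )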


You'd have to be storing absurd amounts of stuff for it to add up to thousands of dollars. From their docs:

Glacier is designed with the expectation that retrievals are infrequent and unusual, and data will be stored for extended periods of time. You can retrieve up to 5% of your average monthly storage (pro-rated daily) for free each month. If you choose to retrieve more than this amount of data in a month, you are charged a retrieval fee starting at $0.01 per gigabyte. In addition, there is a pro-rated charge of $0.021 per gigabyte for items deleted prior to 90 days.


It's not storage that's expensive with Glacier, it's retrieval. If you pull data down too quickly, it's conceivable that it could add up to thousands. In addition to the retrieval fee, you have to pay for data transfer out, which is huge: $0.09/GB after the first GB in us-east-1.

Let's say you store 1TB and decide to download it all later. The retrieval fee is $9.73 (1024 GB × 0.95 × $0.01) and data transfer is $92.07 (1023 GB × $0.09). That's over $100 for just that one retrieval.
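
The same arithmetic as a tiny Python sketch, using the 2015 us-east-1 prices quoted above (5% of average monthly storage retrieved free, first GB of transfer out free):

    stored_gb = 1024
    retrieval_fee = stored_gb * 0.95 * 0.01    # $0.01/GB beyond the free 5%
    transfer_out = (stored_gb - 1) * 0.09      # $0.09/GB after the first free GB
    print(f"retrieval ${retrieval_fee:.2f} + transfer ${transfer_out:.2f} "
          f"= ${retrieval_fee + transfer_out:.2f}")
    # retrieval $9.73 + transfer $92.07 = $101.80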


If you can afford to recover your data over a period of about three weeks, though, you don't pay anything for retrieval.


The pricing page for Glacier is horrifically misleading. That "learn more" link digs deep into their FAQ for the real costs. Retrieving a small amount of data in a rush can cost hundreds of dollars. Glacier is not a backup product, it's an archival product. There is an enormous difference between the terms "backup" and "archival", and anyone who thinks of using an archival solution for consumer backup needs doesn't understand the differences and the costs involved when data actually has to be retrieved. Glacier is meant for corporations with deep pockets doing disaster recovery, not for consumers.


I'm using it as a last-resort backup for all our family photos. I have a local copy on a NAS drive as well, but if the house burns down I know I can get my data back, even if it costs a couple of hundred dollars to do so. The storage costs are pretty cheap but the retrieval costs are pricey. I hope I never have to retrieve anything from there, to be honest.


This looks great. Any way to push Time Machine backups to Glacier with the app?


Arq is the tool you need.


Files you back up with Arq are only visible through Arq. This application seems different:

> "No proprietary encoding of archive descriptions. No proprietary encryption or compression features that would make it complicated or even impossible to use other clients."

That makes it different and more usable (at least for me), even if its behavior isn't Time Machine-like.

A trial or a demo would be nice.


Arq's backup format isn't a secret; it's fully documented [1].

[1] https://www.haystacksoftware.com/arq/s3_data_format.txt


That matters little and helps little for a regular user. If you want to access the data from the AWS web interface, or from any other client (Transmit, etc.), you are out of luck if you use Arq.

Not being proprietary is an advantage. Freeze is as simple as it can be.


Correct me if I'm wrong, but de-dup and versioning can't be done with the standard format, which means Freeze can't do them.


Freeze is really meant to be a plain, simple file transfer client, although it has some helper features, like a compare mode that shows differences between a vault's inventory and a local folder, and highlighting of duplicates.

But it is not meant to be a replacement for a full-featured backup solution like Arq. I think some people like it a bit more low-level and simple, and that's what Freeze is for.


> but de-dup and versions can't be done with the standard format

I really do not know.


Can the backups be locally encrypted before sending them to the potentially insecure AWS cloud?


Freeze has no encryption or compression features built in, so you are responsible for preparing the data the way you want it stored on Glacier before you upload it.

I don't like built-in encryption features in file transfer or backup apps very much, because they are often proprietary and that can make it hard to switch to another client.

As a workflow example, I personally encrypt my archives with GnuPG, store them on external hard drives, and additionally upload them to Glacier for disaster recovery.
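
As an illustration of that kind of workflow (not Freeze's implementation), here's a minimal Python sketch that encrypts an archive with GnuPG and pushes the ciphertext with boto3's low-level Glacier client; the file name, vault name, and the choice of symmetric encryption are assumptions:

    import subprocess
    import boto3

    archive = "photos-2015.tar"          # hypothetical archive
    encrypted = archive + ".gpg"

    # Encrypt locally first; only ciphertext ever leaves the machine.
    subprocess.run(["gpg", "--symmetric", "--cipher-algo", "AES256",
                    "--output", encrypted, archive], check=True)

    glacier = boto3.client("glacier")
    with open(encrypted, "rb") as f:
        resp = glacier.upload_archive(vaultName="my-backups",   # hypothetical vault
                                      archiveDescription=encrypted,
                                      body=f)
    print("archive id:", resp["archiveId"])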


I agree, proprietary implementations are exactly the problem and the reason I don't use Arq, etc.

On the other hand, I find manual local encryption a bit cumbersome, workflow-wise...


Another possible thought for the roadmap would be some sort of pluggable architecture for encryption. Let third parties develop open encryption plugins that the app runs files through prior to uploading.


Indeed, something like the PGP plugin for Apple Mail would be nice.


All data in Glacier is encrypted. Here's the FAQ entry:

> Amazon Glacier handles key management and key protection for you. Amazon Glacier uses one of the strongest block ciphers available, 256-bit Advanced Encryption Standard (AES-256). 256-bit is the largest key size defined for AES. Customers wishing to manage their own keys can encrypt data prior to uploading it.


I've been looking for something like this for a while, to the point where I started writing it yesterday. Thanks for saving my time! :D

I've been running it for the past few hours to put backups of my photos on Glacier. I have zips of each album, which vary from a few megabytes to a few gigabytes; in total it's around 40GB. My internet connection isn't that reliable, so around half the uploads have failed with 'Connection Interrupted'. It would be great if they were automatically requeued (maybe at the end of the list?) and could be resumed without starting from scratch (I assume the uploads are multipart behind the scenes?).


You're welcome :)

Yes, Freeze uses multipart uploads with 16 MB chunks. Resuming works chunk-wise: for example, if you've uploaded 20 MB and then the connection drops, the upload continues at chunk number 2. When you see an upload restart from scratch, the connection was probably interrupted while the first 16 MB chunk was being uploaded.

About automatic retries: Freeze automatically retries 3 times, and after that you have to retry manually.

For unreliable and slow internet connections I recommend setting max parallel uploads to 1 and disabling the upload speed limit.

I plan to make the chunk size configurable in a future update. It makes sense to use a smaller upload chunk size for unreliable internet connections.
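
For anyone curious what chunked uploads with per-chunk retries look like against the raw Glacier API, here's a rough boto3 sketch (Python). It is not Freeze's code; the 16 MB part size and the 3-attempt retry loop just mirror the numbers above, the vault name is whatever you pass in, and the whole file is read into memory for brevity where a real client would stream parts.

    import hashlib
    import boto3

    MB = 1024 * 1024
    PART_SIZE = 16 * MB                  # matches the 16 MB chunk size above

    def tree_hash(data: bytes) -> str:
        """Glacier's SHA-256 tree hash: hash 1 MiB chunks, then combine pairwise."""
        hashes = [hashlib.sha256(data[i:i + MB]).digest()
                  for i in range(0, len(data), MB)] or [hashlib.sha256(b"").digest()]
        while len(hashes) > 1:
            level = []
            for i in range(0, len(hashes), 2):
                if i + 1 < len(hashes):
                    level.append(hashlib.sha256(hashes[i] + hashes[i + 1]).digest())
                else:
                    level.append(hashes[i])   # odd leftover is promoted unchanged
            hashes = level
        return hashes[0].hex()

    def upload(vault: str, path: str) -> str:
        glacier = boto3.client("glacier")
        data = open(path, "rb").read()
        upload_id = glacier.initiate_multipart_upload(
            vaultName=vault, archiveDescription=path,
            partSize=str(PART_SIZE))["uploadId"]
        for offset in range(0, len(data), PART_SIZE):
            part = data[offset:offset + PART_SIZE]
            byte_range = f"bytes {offset}-{offset + len(part) - 1}/*"
            for attempt in range(3):     # retry each chunk up to 3 times
                try:
                    glacier.upload_multipart_part(vaultName=vault, uploadId=upload_id,
                                                  range=byte_range, body=part)
                    break
                except Exception:
                    if attempt == 2:
                        raise
        return glacier.complete_multipart_upload(
            vaultName=vault, uploadId=upload_id,
            archiveSize=str(len(data)), checksum=tree_hash(data))["archiveId"]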


Yesterday I uploaded 12GB to Glacier and the big surprise for me was the price per upload request:

$0.055 per 1,000 requests

The command line tool [1] split my 12GB into 1 MB chunks and generated around 12k requests:

$0.055 per 1,000 requests × 12,694 requests = $0.70

So bigger chunks are cheaper, but probably generate more retries.

[1] - https://github.com/uskudnik/amazon-glacier-cmd-interface
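
Rough arithmetic for how chunk size trades request cost against retry granularity, using the $0.055 per 1,000 requests price quoted above (the chunk sizes are just examples):

    size_gb, price_per_1k = 12, 0.055
    for chunk_mb in (1, 4, 16, 64):
        requests = size_gb * 1024 / chunk_mb
        cost = requests / 1000 * price_per_1k
        print(f"{chunk_mb:>2} MB chunks -> {requests:>6.0f} requests, ${cost:.2f}")
    # 1 MB -> 12288 requests ($0.68); 16 MB -> 768 requests ($0.04)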


Looks very neat. I did use Glacier for backing up my photos, and it got a lot friendlier once the feature to upload to S3 and set an archiving policy that automatically moves objects to Glacier was introduced. However, when Amazon started offering unlimited storage in their Cloud Drive for $60 a year, my usage of Glacier ended. This still looks like a neat implementation, and if I have a use for Glacier in the future I'll happily try it out.


This looks really good. Does anyone know of any Windows or Win+Linux options like this?


There's a cross-platform solution built in Java, Simple Amazon Glacier Uploader (SAGU) [simpleglacieruploader.brianmcmichael.com]. If you're on Linux there's a nice CLI option too [https://github.com/MoriTanosuke/glacieruploader].

They're not half as sleek though.


Looks sleek, but any chance for a trial period?


It's $10 and the functionality is obvious. Just buy it.


And you can always get a refund as well.


Lack of a trial doesn't bother me too much for $10, but I dislike buying things via the Mac App Store, as it can't be automated via Homebrew Cask.


I was looking for trial period too.


Wish this was a Web app.


How does this compare with Tarsnap?





