Many Amazon API endpoints can't be called from a browser (they don't send CORS headers), and all of them require an AWS account.
Providing an AWS account that people could use is a security nightmare, and many of the operations have charges associated with them.
If I provided my AWS credentials, the author could proxy through his server and make requests on my behalf, but I wouldn't want to hand over my credentials. At that point I might as well just play with the aws-cli, which exposes pretty much all of the same calls anyway.
I was considering writing something similar: a "semi-mocked" implementation of all of AWS, getting you most of what you get from running OpenStack but from one static binary running in userland and with no persistence.
My use-case for it was that I'd write it all in Erlang, and then each public port on a mocked EC2 instance would have a single Erlang process hooked up to it to respond to messages coming to that instance-IP:port. So you could set up something that looked like a thousand-node distributed VPC, from the perspective of an AWS client talking to the API, and it would all sit in a couple MB of RAM and near-zero CPU.
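A toy sketch of that process-per-port idea, in Python threads rather than Erlang processes (the ports and canned banners here are made up for illustration — a real mock would answer with protocol-appropriate responses):

```python
import socket
import threading

def serve_mock_port(host, port, response, ready):
    """One lightweight listener standing in for a single open port
    on a mocked instance: read whatever arrives, answer with a canned banner."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(5)
    ready.set()  # signal that this "instance port" is live
    while True:
        conn, _ = srv.accept()
        with conn:
            conn.recv(4096)          # ignore the request payload
            conn.sendall(response)   # reply with the canned banner

# Spin up a tiny "fleet": each (port, banner) pair acts like one
# instance endpoint. Scaling this to thousands of ports costs almost
# nothing, which is the appeal of the approach.
fleet = {50022: b"SSH-2.0-mock\r\n",
         50080: b"HTTP/1.0 200 OK\r\n\r\nmock\r\n"}
for port, banner in fleet.items():
    ready = threading.Event()
    threading.Thread(target=serve_mock_port,
                     args=("127.0.0.1", port, banner, ready),
                     daemon=True).start()
    ready.wait()
```

Connecting to any port in the fleet then behaves like probing a live instance, even though nothing is actually running behind it.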
This!
A local mock of AWS is sorely lacking for most services. The best ones I know of are Amazon's own local DynamoDB and the independent FakeS3 tool, which exposes an HTTP API and saves the blobs and metadata as local files in a transparent way.
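This isn't FakeS3 itself, just a toy sketch of the same shape — an HTTP front end with plain files behind it, so you can inspect the stored blobs directly on disk (the bucket/key names and temp directory are illustrative):

```python
import http.server
import tempfile
import threading
import urllib.request
from pathlib import Path

# Blobs land here as ordinary files, FakeS3-style.
ROOT = Path(tempfile.mkdtemp(prefix="blobstore-"))

class BlobHandler(http.server.BaseHTTPRequestHandler):
    def _target(self):
        # Map /bucket/key onto a local file; keys stay human-readable on disk.
        return ROOT / self.path.lstrip("/")

    def do_PUT(self):
        length = int(self.headers.get("Content-Length", 0))
        target = self._target()
        target.parent.mkdir(parents=True, exist_ok=True)
        target.write_bytes(self.rfile.read(length))
        self.send_response(200)
        self.end_headers()

    def do_GET(self):
        target = self._target()
        if target.is_file():
            body = target.read_bytes()
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep output quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), BlobHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

# PUT then GET a blob, the way an S3 client would against a custom endpoint.
req = urllib.request.Request(f"{base}/mybucket/hello.txt",
                             data=b"hi", method="PUT")
urllib.request.urlopen(req)
body = urllib.request.urlopen(f"{base}/mybucket/hello.txt").read()
```

After the PUT, the blob sits at `ROOT/mybucket/hello.txt` as a plain file — that transparency is what makes this style of mock pleasant to debug.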
I've been waiting for someone to come along and say "I did this!"
Okay so I didn't do this, but I have seen several folks where I work create some stubbed AWS services to facilitate testing. I'll have to ask them if this turned out to be worth the (gargantuan) effort.
Edit: Looks like there are a couple[1] instances[2] where people have done this to some degree. Many of them are very old, though, and the one built by a team where I work only implements SNS/SQS, so there's a pretty big gap for other interfaces. It seems like the recommended approach is to stub with something like EasyMock rather than stand up an actual API implementation for integration testing. I'm not sure I'm satisfied with that answer; on the other hand, it's easy for a custom implementation to differ wildly from the actual AWS API behavior... Seems risky to base integration testing results on that?
I can't speak for all the libraries linked above, but I've used fake-s3 with a Rails app and s3rver with a Node app. They both worked great for development and testing. I'd definitely use 'em again.
I think it's worth noting that some of these libraries have different intended goals. For example: Riak CS is intended as an alternative that you can use in production, while s3rver is meant for testing.
+1 on just using the CLI, but you could also use STS to generate a token with a locked-down policy client-side, and the risk of proxying would be much smaller.
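For example, a session policy attached to the temporary credentials (the bucket name here is hypothetical) could allow nothing but read access to a single bucket:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-sandbox-bucket",
        "arn:aws:s3:::example-sandbox-bucket/*"
      ]
    }
  ]
}
```

The effective permissions are the intersection of this session policy and the caller's own IAM policy, so even a leaked or proxied token can't rack up charges on anything outside that bucket.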
There was a similar HN post a few weeks ago, but for Elasticsearch.
I'm kind of thinking that a command-line tool for Elasticsearch, similar to the AWS CLI, isn't a bad idea. In general, when I can't remember an AWS API call, instead of googling I just use the AWS CLI. I find it faster than Google or a reference like this. Anyone else?