Show HN: Bash-My-AWS – CLI Commands for AWS (bash-my-aws.org)
155 points by mike-bailey on Jan 1, 2020 | 63 comments



It's probably my fault if you haven't heard of Bash-My-AWS.

Bash-My-AWS is a simple but extremely powerful set of CLI commands for managing resources on Amazon Web Services. They harness the power of Amazon's AWSCLI, while abstracting away the verbosity. The project implements some innovative patterns but (arguably) remains simple, beautiful, readable and easily extensible.

The project started in 2014 and while many hundreds of hours have gone into it, far less has gone into promotion.

I'm speaking about it at LinuxConf and have created a documentation site at https://bash-my-aws.org

https://linux.conf.au/schedule/presentation/144/


Here's the video from my talk on Bash-my-AWS at LinuxConf 2020 on the Gold Coast, Australia.

https://www.youtube.com/watch?v=UbH_cg7Ev1Q


For anyone on this thread who is interested: I run https://getcommandeer.com which is a tool to manage your AWS and IaC infrastructure from a desktop GUI. I love bash-my-aws, as we are about to release Bash, Docker Compose, and Terraform runners. We already have Serverless and Ansible runners. They enable you to run your command-line tools from a GUI, so that you can instantly switch between AWS accounts/regions and even LocalStack. Because it is a desktop app, under the hood we are really running CLI tools mixed in with some of the AWS JS SDK.


Why does your main website (linked in your comment) require JS to run? I've not seen anything that warrants that requirement.

I wish people would stop making websites that require JS.


Spoiler alert: if you don’t want to run JavaScript in your browser, you’re not going to enjoy this application.


Yeah, it works terribly without a monitor as well. The site does not work headless. What was I thinking? It's a desktop application. The site is where you download the app.


To be a little more constructive with my answer: the website needs JavaScript for our chat service, our newsletter service, the ability for users to purchase licenses for the app, and the night/dark mode switch. The codebase also shares components with our desktop app, as both are Vue, with the desktop app using Electron. We will look at making the site HTML-only at some point, but it is not a focus of our dev team. If you download the app, you will see that it is a first-rate experience for managing many AWS services. The app itself probably has 1,000 man-hours or more put into it. Would love to hear insight into how we can make it even more powerful for the community.


This looks like an awesome project!

Meta note: All things considered, Amazon has it pretty good. They put out a barely usable, bare-bones, but fully functional tool in awscli. Paying customers of AWS have to perform the engineering effort to make the API more usable, and some even open-source their projects like this. AWS is an incredible business model.


I have sometimes questioned whether I should be spending my personal time developing an open source tool so tied to a single company's services.

My reasons for continuing include:

- I prefer to use command line over ClickOps

- Using Bash-My-AWS makes me more effective at work

- The emergent UX is equally applicable to other services (e.g. bash-my-github, bash-my-spotify)

- The intrinsic satisfaction from creating

- Helping improve the experience for others


I looked; I couldn't find packages with those two names, although there seem to be some CLIs for controlling Spotify. Were you referring to specific packages somewhere for GitHub and Spotify?


bash-my-github and bash-my-spotify are two things I've made a start on but are not public yet. They've been on the backburner for a while due to competing priorities.

I was able to write a simple command that returned all songs a friend and I had in common in our public playlists.

I forget the exact syntax but it was something roughly along the lines of:

  $ sort <(
      user-playlists alice | playlist-tracks 
      user-playlists bob | playlist-tracks
    ) | uniq --repeated
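That duplicate-finding pattern stands on its own: concatenate two duplicate-free lists, sort, and keep only the repeated lines. A self-contained sketch, with plain text files standing in for the (not yet public) `playlist-tracks` output:

```shell
# Lines common to two lists: merge, sort, keep only repeated lines.
# Assumes neither input list contains internal duplicates.
common_lines() {
  sort "$1" "$2" | uniq --repeated
}

printf '%s\n' "Song A" "Song B" "Song C" > alice.txt
printf '%s\n' "Song B" "Song D" > bob.txt
common_lines alice.txt bob.txt   # prints: Song B
```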


Thanks for the tool and reasons for building it. I will check it out tomorrow!


More charitably, AWS provides a CLI for most resources and commands and (in large measure) either provides backwards compatibility or a long deprecation window. This allows hackers to build syntactic sugar on top of their infrastructure.

I love that a community has an option to build components and share the same. It has made my work much more productive.

But we certainly agree on that last point: it’s an incredible business model.


AWSCLI backward compatibility has been so good, I've never had a Bash-My-AWS command fail due to a change.

AWS CLI v2 previews were released in November 2019, and while v2 may contain some breaking changes, I wouldn't be surprised if all the commands BMA uses continue to work as normal.

https://aws.amazon.com/blogs/developer/aws-cli-v2-installers...


Anything above bare bones will be opinionated. IMO this is the best approach for an infrastructure provider: maximum freedom, but also providing a UI for simpler access.


The annoying part about AWS putting so little effort into smoothing out the rough edges in their tooling, UX and services is that the whole ecosystem of helpers/wrappers developed to make the services usable by humans who need to get something done (rather than deep-dive again and again into the many rabbit holes of AWS's idiosyncrasies) inevitably only covers some tiny subset of the services a typical shop is going to use.

Not to take anything away from the author of this project - Bash-my-aws looks fantastic - but it only helps you with a few core services. Same appears to be true of the commandeer tool that has also been mentioned in this thread. And the same is true for localstack, and on and on.

I really wish AWS would devote some resources to filling in these gaps themselves, and comprehensively.


How is this different from all of the Git extensions that people publish?


Verbosity doesn’t make it unusable; it just gives it a steeper learning curve.


turning the flywheel


Coming from Google Cloud, I couldn't deal with the atrocity that is awscli, so I ended up eventually implementing the bare minimum of shell wrappers to at least start, stop, ssh into, rsync files to and from, etc, my aws instances _by name_, not by instance ID. Took me a couple of hours to cobble it together.

Google cloud CLI offers all of this out of the box. Why Amazon wants to make such basic commands difficult, I'll never understand.
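For reference, a by-name lookup like the one described needs only a tag filter on `describe-instances`. A minimal sketch (the function name and `Name` tag convention are illustrative, not from any particular tool):

```shell
# Resolve a running EC2 instance's ID from its Name tag, so wrappers
# for start/stop/ssh can take a human-readable name instead of an ID.
instance_id_by_name() {
  aws ec2 describe-instances \
    --filters "Name=tag:Name,Values=$1" \
              "Name=instance-state-name,Values=running" \
    --query 'Reservations[].Instances[].InstanceId' \
    --output text
}

# e.g. aws ec2 start-instances --instance-ids "$(instance_id_by_name web-1)"
```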


This looks pretty great. While the AWS CLI is very comprehensive, I always struggle to remember which flags are needed for each command, and it's not very consistent.

One thing I've not been able to work out with bash-my-aws yet is how to easily switch between regions and accounts. I noticed you can use `region` on its own to set the current default region, but I'm often working with multiple regions, and it'd be a pain to have to run `region us-west-1` separately each time I want to use a different region. I couldn't see a way to just specify a region for a given command (e.g. how you'd do `aws ec2 describe-instances --region us-west-1`). I guess you could do this with the environment variable `AWS_DEFAULT_REGION=us-west-1 instances` but that's a bit verbose.

Similarly with AWS accounts: I use multiple AWS accounts, which are accessed with different access keys, defined as profiles in my ~/.aws/config. Normally I'd use these with the AWS CLI like `aws ec2 describe-instances --profile production`, but I couldn't see any way in the docs to use or set this.
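Since the AWSCLI (and anything that shells out to it) reads `AWS_PROFILE` and `AWS_DEFAULT_REGION` from the environment, one-off overrides work without touching your defaults. A tiny hypothetical helper to cut the typing:

```shell
# Run any command under a given profile and region via env vars,
# which awscli-based tools pick up automatically.
with_aws() {  # usage: with_aws <profile> <region> <command> [args...]
  profile=$1 region=$2
  shift 2
  AWS_PROFILE=$profile AWS_DEFAULT_REGION=$region "$@"
}

with_aws production us-west-1 sh -c 'echo "$AWS_PROFILE in $AWS_DEFAULT_REGION"'
# prints: production in us-west-1
```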


They're good questions. I can tell you how I manage regions and accounts but am interested in learning how people think Bash-my-AWS might better support users in this regard.

The AWSCLI, as well as the SDKs, all support grabbing Regions and account credentials from environment variables.

For Regions, I tend to use the following aliases:

  alias au='export AWS_DEFAULT_REGION=ap-southeast-2'
  alias us='export AWS_DEFAULT_REGION=us-east-1'
  alias dr='export AWS_DEFAULT_REGION=ap-southeast-1'

I normally work in a single Region and swap when required by typing the two-character alias.

To run a script or command (doesn't have to be Bash-my-AWS) across all Regions I use region-each:

  $ region-each stacks | column -t
  example-ec2-ap-northeast-1  CREATE_COMPLETE  2011-05-23T15:47:44Z  NEVER_UPDATED  NOT_NESTED  #ap-northeast-1
  example-ec2-ap-northeast-2  CREATE_COMPLETE  2011-05-23T15:47:44Z  NEVER_UPDATED  NOT_NESTED  #ap-northeast-2
  ...
  example-ec2-us-west-2       CREATE_COMPLETE  2011-05-23T15:47:44Z  NEVER_UPDATED  NOT_NESTED  #us-west-2

For AWS accounts, I type the name of the account and I'm in. For accounts using an IdP (LDAP/AD-backed corporate logins) I generate aliases so I have tab completion and simple naming.

In accounts that are only set up to use AWS keys, I use aliases that export credentials kept in GPG-encrypted files. Last time I looked, AWS docs suggested keeping these long-lived credentials in plaintext files readable by your account. That's asking for trouble IMO, especially if they're kept in a known location that a compromised node library could exfiltrate them from.
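A sketch of that GPG approach, assuming the encrypted file contains `export` lines (the path and function name are illustrative, not BMA's):

```shell
# Decrypt AWS keys into the current shell only when needed, instead of
# leaving them in a plaintext ~/.aws/credentials file.
# The encrypted file is expected to contain lines like:
#   export AWS_ACCESS_KEY_ID=AKIA...
#   export AWS_SECRET_ACCESS_KEY=...
aws_creds_load() {
  eval "$(gpg --quiet --decrypt "$HOME/.aws/creds/$1.gpg")"
}

# alias production='aws_creds_load production'
```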

AWSCLI v2 beta includes support for SSO, so it's probably a good time to look at how BMA could include support for auth.


Interesting that this requires ‘jq’ when JMESPath is built into the AWS CLI already.

http://jmespath.org/


jq is only used in three of the >120 functions (for sort-keys functionality). All the rest use JMESPath.

If anyone can help with a solution I'd be delighted to remove the dependency on jq.

https://github.com/bash-my-aws/bash-my-aws/blob/b74d92a902bb...


I would definitely keep the jq dependency there, especially if you plan to expand the codebase to provide ECS and Fargate commands. You will quickly run into use cases where JMESPath is not capable of parsing the JSON outputs of the various task-definition commands.


JMESPath has quite a few limitations; even the official AWS CLI documentation states that for the more advanced stuff `jq` is probably the go-to tool.

https://docs.aws.amazon.com/cli/latest/userguide/cli-usage-o...

"For more advanced filtering that you might not be able to do with --query, you can consider jq, a command line JSON processor. You can download it and find the official tutorial at http://stedolan.github.io/jq/."


I find jq’s language a lot nicer than JMESPath and tend to use it whenever possible.


My HN comment detailing several limitations of JMESPath: https://news.ycombinator.com/item?id=16400320


Just the other day I was looking for an official docker image that includes the AWS CLI. On top of that, and mainly, I was looking to find more documentation or tooling to better automate the deployment of new AWS projects.

Does anyone here have any experience of (starting from scratch or with no AWS resources) setting up policies/users/resources/configurations via something similar to the Deployment Managers of GCP and Azure?.. preferably something declarative or via templates?

Bash-my-AWS looks like a great step towards the goal I have in mind but I may also just be unaware of other tooling or AWS capabilities.


Like it or not (I do...) terraform is the de-facto industry standard, and pretty much the only mature cloud resources management tool I'm aware of.

It is unwise IMHO to use CloudFormation currently unless you're provisioning resources so obscure they didn't yet make it to tf aws provider.

BTW your Dockerfile pretty much boils down to:

    FROM alpine:3.10

    RUN apk add --no-cache \
        python3

    RUN pip3 install awscli

    COPY config /root/.aws/
    COPY credentials /root/.aws/


Have a look at CDK. It's a framework on top of CF, made by AWS, that lets you use Python/JavaScript/etc. I've been trying it out recently to move away from TF and it's a promising alternative.


My problem with CF is the CF part, not the yaml. It takes just a few times getting stuck in a rollback loop to hate CF forever.

Especially when you contact AWS support and they tell you the only thing you can do is wait.


Ansible and Serverless are also very powerful IaC tools that let you deploy on top of CloudFormation but give you a much nicer way to do so. Terraform does require state, which is a pain point for some. Ansible lets you just run its scripts, and you don't have to worry about state in S3 or DynamoDB.


> unless you're provisioning resources so obscure they didn't yet make it to tf aws provider.

Isn't there precedent for terraform getting support for things before cloudformation?


I'd say it's more and more common for CF not to support a given resource or pattern than anything else.

We've got custom resources _everywhere_ instead and only just started on our journey of using TF instead. CDK is trying to drive up adoption though I've not used it yet so can't provide any opinions.


If you don’t want to copy your credentials into the container, you can supply them via env vars when you `docker run` commands in the container.
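That is, instead of COPYing credentials into the image, pass them through at run time. A hypothetical wrapper (the image name is illustrative):

```shell
# Forward AWS credentials from the calling environment into the
# container at run time, keeping them out of the image layers.
awscli_docker() {
  docker run --rm \
    -e AWS_ACCESS_KEY_ID \
    -e AWS_SECRET_ACCESS_KEY \
    -e AWS_DEFAULT_REGION \
    my-awscli-image aws "$@"
}

# e.g. awscli_docker sts get-caller-identity
```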


I would strongly recommend using CloudFormation through a typed proxy like Troposphere. I would also not recommend using Terraform at all, since you will run into warts and fundamental issues quickly. I have done projects with both, and my current blessed workflow is a custom Python driver which uses CF via Troposphere and minimal boto3 as glue. Also, I work at AWS.


Several of the warts in Terraform were fixed in 0.12.

While I think the HCL DSL was a mistake and prefer the CloudFormation YAML, CloudFormation has its share of warts as well, and the TF community has been doing better than CF in staying up-to-date with the AWS API updates - which reflects quite poorly on AWS actually.

> would not recommend to use terraform at all since you will run into warts and fundamental issues

It's not a good look to be employed by the 800 pound gorilla and bash your company's competitor without mentioning specifics.


0.12 fixed some warts and introduced others. It's a buggy mess, but it at least has better coverage than CF.


Have you tried the CDK yet instead of using Troposphere?


Gruntwork has a lot of open-source tooling around AWS, and their new guides are pretty great for some of what you're mentioning.

https://gruntwork.io/guides/

I am in no way affiliated with them other than being a customer


If I’m understanding you correctly, I think you want CloudFormation?


Thank you again for this direction!

I just finished a POC that generates 90% of the AWS services I use per client/project/application. The remaining 10% is DNS stuff that I can easily do by hand, but with a few clicks I get everything provisioned with much less human error (buckets, Lambdas, API Gateways, Cloudfront distributions, etc.)

The formation definition is ~1000 lines of JSON, but it explicitly describes everything I need and it takes in parameters - it's wonderful! Thank you again!


Thanks, AWS CloudFormation looks like what I've experienced with other cloud service providers.


There is also AWS Cloud Development Kit, which generates CloudFormation from Typescript, C#, Java, or Python.

https://aws.amazon.com/blogs/developer/getting-started-with-...


The other alternative is terraform:

https://www.terraform.io/


At work we are using Terraform to manage everything related to AWS resources, including accounts, IAM policies and groups. We also used the Serverless framework and CloudFormation, but Terraform is what works for us and I can recommend it as a main IaC tool.


We use Pulumi to manage both our GCP and AWS resources, and we really like it.

You might consider using Terraform directly if you want something more mature.


Interacting with individual DNS records in Route 53 is very hard using AWSCLI. I wrote a Python wrapper around the Route 53 API to make it easier to do command line records management (and also to do dynamic DNS with your own Route 53 hosted domain):

https://github.com/ericfitz/r53
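For a sense of why record management is clumsy with the raw CLI: even listing one record means resolving the hosted zone ID first. A sketch using only standard awscli commands (domain and function name are illustrative, not r53's interface):

```shell
# List the A records for a name: first resolve the zone ID, then
# filter the record sets with a JMESPath query.
r53_a_records() {  # usage: r53_a_records <domain> <record-name>
  zone_id=$(aws route53 list-hosted-zones-by-name \
    --dns-name "$1" --query 'HostedZones[0].Id' --output text)
  aws route53 list-resource-record-sets \
    --hosted-zone-id "$zone_id" \
    --query "ResourceRecordSets[?Name=='$2.' && Type=='A']"
}

# e.g. r53_a_records example.com www.example.com
```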


Is this primarily needed because the AWS CLI is not good enough at listing resources in the desired format (JSON, JMESPath, YAML, table...)?


AWSCLI is amazingly flexible and powerful.

Bash-My-AWS thinly wraps AWSCLI commands that would otherwise be too long to type. So you're still using AWSCLI and can improve your skill with it by inspecting the source of Bash-My-AWS functions.

https://news.ycombinator.com/item?id=21931298


If you want to easily manipulate your AWS environment from the command line use the AWS cmdlets for PowerShell. The fact that PowerShell cmdlets work on objects instead of text makes them miles better than this or the AWS CLI because you don't spend most of your time figuring out how to wrangle text into meaningful output.


Windows users have commented on how Bash-My-AWS's innovative use of unix streams reminds them of PowerShell.

The listing functions output lines of tokens, with the first token being the resource identifier. Piping that output into functions for that resource type results in only the resource IDs being used.

  $ instances
  i-03dfa28fc8235df7b  t3.nano  running  prometheus  2019-12-31T14:10:45.000Z  ap-southeast-2a  vpc-9def06f8
  i-0fd7a4c81051f2718  t3.nano  running  huginn      2019-12-31T14:10:44.000Z  ap-southeast-2a  vpc-9def06f8
  i-0abcd6e9c302f35bb  t3.nano  running  rails-demo  2019-12-31T14:10:47.000Z  ap-southeast-2b  vpc-9def06f8

  $ instances | grep rails-demo | instance-asg | asg-capacity
  rails-demo-AutoScalingGroup-14SBR6O3W1FBL  0  1  2

  $ instances | grep rails-demo | instance-asg | asg-
  asg-capacity              asg-launch-configuration  asg-processes_suspended   asg-stack
  asg-desired-size-set      asg-max-size-set          asg-resume                asg-suspend
  asg-instances             asg-min-size-set          asg-scaling-activities

  $ instances | grep rails-demo | instance-asg | asg-desired-size-set 2
  $ instances | grep rails-demo | instance-asg | asg-capacity
  rails-demo-AutoScalingGroup-14SBR6O3W1FBL  0  2  2
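The convention behind those pipelines is small enough to sketch without AWS at all: each command accepts resource IDs as arguments, or takes the first whitespace-separated token of each stdin line (the helper name here is illustrative; BMA's real helper is `__bma_read_inputs`):

```shell
# Emit resource IDs: from arguments if given, otherwise from the
# first column of each line on stdin.
read_ids() {
  if [ $# -gt 0 ]; then
    printf '%s\n' "$@"
  else
    awk '{print $1}'
  fi
}

echo "i-03dfa28fc8235df7b  t3.nano  running  prometheus" | read_ids
# prints: i-03dfa28fc8235df7b
```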


Do you have any insights on how someone who is used to the text-only world of Bash can transition to using PowerShell cmdlets?

The problem I run into is that it just feels like so much typing to me. I have to read documentation. All the attributes HaveReallyLongNamesThatContainCapitalLetters. By the time I've made my beta version of the command I want to run, I feel like I need to open a text editor to finish it. Maybe add some error checking. Some comments too. Maybe a unit test or three. And now I have an entire project and all I wanted to do was add a line of text to the end of a file.

Part of the problem on my part is my own ignorance of the APIs and what commands are available to me. But it all seems too verbose to use practically. The PowerShell language seems very good for what you would write a shell script to do, but for interactive commands, I have a hard time believing that people use it. It's just so verbose.


We are starting to solve the command line problem in Commandeer. https://getcommandeer.com/iac-running-suite In the next few weeks we will be rolling out a Bash Runner. This is a preview of the Bash Runner Page - https://imgur.com/Eruzzv7


Hasn’t AWSCLI supported toggling the command output to either text, json or csv for quite some time now, or have I misunderstood your comment here?


Bash-My-AWS wraps AWSCLI as thinly as possible and makes use of JMESPath and the text output.

The result is you have a simple set of commands that don't require you to type hundreds of characters.

  instances() {
    local instance_ids=$(__bma_read_inputs)
    local filters=$(__bma_read_filters $@)

    aws ec2 describe-instances                                            \
      $([[ -n ${instance_ids} ]] && echo --instance-ids ${instance_ids})  \
      --query "
        Reservations[].Instances[][
          InstanceId,
          InstanceType,
          State.Name,
          [Tags[?Key=='Name'].Value][0][0],
          LaunchTime,
          Placement.AvailabilityZone,
          VpcId
        ]"                                                               \
      --output text       |
    grep -E -- "$filters" |
    LC_ALL=C sort -b -k 6 |
    column -s$'\t' -t
  }


You don’t get the impedance mismatch of text to objects that bash has when dealing with the complexity of AWS resources.


I have developed something similar on top of the AWS CLI that incorporates a bunch of integrations with other tools like the cloudinit and various bits of Batch-related instrumentation: https://github.com/kislyuk/aegea


The first thing I'd do after installing this is to alias prefixing "aws-" to every command. I like namespacing things, as I suck at remembering names...


I don't like to rely on my memory either! I forget the names of commands and use tab completion to list them.

You can just type bma[TAB][TAB] and it will list them all.

If you know the type of resource you are working with, you can use TAB completion to see its commands:

  $ stack-
  stack-arn            stack-exports        stack-tag-apply
  stack-asg-instances  stack-failure        stack-tag-delete
  stack-asgs           stack-instances      stack-tags
  stack-cancel-update  stack-outputs        stack-tags-text
  stack-create         stack-parameters     stack-tail
  stack-delete         stack-recreate       stack-template
  stack-diff           stack-resources      stack-update
  stack-elbs           stack-status         stack-validate
  stack-events         stack-tag
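For completeness, the bash machinery behind a listing like that is `compgen`, which reports what TAB would offer. A standalone sketch with stub functions standing in for the real BMA commands:

```shell
# Bash completes function names out of the box; compgen shows what
# TAB would offer for a given prefix. Define a few stubs, then list:
stack-arn() { :; }
stack-create() { :; }
stack-delete() { :; }

compgen -A function stack-
# lists the three stub functions, one per line
```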


What really sells me on this tool is the ability to examine the underlying awscli command and transformations. I’ll be giving this a go in the new year!


Thanks!

The intent has always been to enhance rather than replace AWSCLI (which is an amazing tool!).

If you're ever wondering how a Bash-My-AWS command works, use `bma type` (it even supports tab completion for all the commands).

  $ bma type instances
  instances is a function
  instances () 
  { 
      local instance_ids=$(__bma_read_inputs);
      local filters=$(__bma_read_filters $@);
      aws ec2 describe-instances $([[ -n ${instance_ids} ]] && echo --instance-ids ${instance_ids}) --query "
          Reservations[].Instances[][
            InstanceId,
            InstanceType,
            State.Name,
            [Tags[?Key=='Name'].Value][0][0],
            LaunchTime,
            Placement.AvailabilityZone,
            VpcId
          ]" --output text | grep -E -- "$filters" | LC_ALL=C sort -b -k 6 | column -s' ' -t
  }



