
Show HN: Bash-My-AWS – CLI Commands for AWS - mike-bailey
https://bash-my-aws.org/
======
mike-bailey
It's probably my fault if you haven't heard of Bash-My-AWS.

Bash-My-AWS is a simple but extremely powerful set of CLI commands for
managing resources on Amazon Web Services. They harness the power of Amazon's
AWSCLI, while abstracting away the verbosity. The project implements some
innovative patterns but (arguably) remains simple, beautiful, readable and
easily extensible.
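
To give a flavour of the pattern (an illustrative sketch only, simplified from the style of the project's real commands, not its actual code):

```shell
# Illustrative only: the pattern is short memorable functions wrapping long
# AWSCLI invocations. This simplified sketch is not the project's actual code.
instances-sketch() {
  aws ec2 describe-instances \
    --query "Reservations[].Instances[].[InstanceId, InstanceType, State.Name]" \
    --output text
}
```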

The project started in 2014 and while many hundreds of hours have gone into
it, far less has gone into promotion.

I'm speaking about it at LinuxConf and have created a documentation site at
[https://bash-my-aws.org](https://bash-my-aws.org)

[https://linux.conf.au/schedule/presentation/144/](https://linux.conf.au/schedule/presentation/144/)

~~~
bob33
For anyone on this thread who is interested: I run
[https://getcommandeer.com](https://getcommandeer.com), a tool to manage your
AWS and IaC infrastructure from a desktop GUI. I love bash-my-aws, as we are
about to release Bash, Docker Compose, and Terraform Runners. We already have
Serverless and Ansible Runners. They enable you to run your command-line
tooling from a GUI, so that you can instantly switch between AWS
accounts/regions and even LocalStack. Because it is a desktop app, under the
hood we are really running CLI tools mixed in with some of the AWS JS SDK.

~~~
dastx
Why does your main website (linked in your comment) require JS to run? I've
not seen anything on it that warrants that requirement.

I wish people would stop making websites that require JS.

~~~
Hamuko
Spoiler alert: if you don’t want to run JavaScript in your browser, you’re not
going to enjoy this application.

~~~
bob33
Yea, it works terribly without a monitor as well. The site doesn't work
headless. What was I thinking? It's a desktop application; the site is where
you download the app.

~~~
bob33
To be a little more constructive with my answer: the website needs JavaScript
for our chat service, our newsletter service, the ability for users to
purchase licenses for the app, and the night/dark mode switch. The codebase
also shares components with our desktop app, as they are both Vue, with the
desktop app using Electron. We will look at making the site HTML-only at some
point, but it is not a focus of our dev team. If you download the app, you
will see that it is a first-rate experience for managing many AWS services.
The app itself probably has 1,000 man-hours or more put into it. I would love
to hear insight into how we can make it even more powerful for the community.

------
nahikoa
This looks like an awesome project!

Meta note: All things considered, Amazon has it pretty good. They put out a
barely usable, bare-bones, but fully functional tool in awscli. Paying
customers of AWS have to perform the engineering effort to make the API more
usable, and some even open-source their projects like this. AWS is an
incredible business model.

~~~
failmode
I have sometimes questioned whether I should be spending my personal time
developing an open source tool so tied to a single company's services.

My reasons for continuing include:

- I prefer to use the command line over ClickOps

- Using Bash-My-AWS makes me more effective at work

- The emergent UX is equally applicable to other services (e.g. bash-my-github, bash-my-spotify)

- The intrinsic satisfaction from creating

- Helping improve the experience for others

~~~
NotSammyHagar
I looked; no one has created packages with those two names that I could find,
although there are some CLIs for controlling Spotify, it seems. Were you
referring to specific packages somewhere for GitHub and Spotify?

~~~
failmode
bash-my-github and bash-my-spotify are two things I've made a start on, but
they are not public yet. They've been on the backburner for a while due to
competing priorities.

I was able to write a simple command that returned all songs a friend and I
had in common in our public playlists.

I forget the exact syntax but it was something roughly along the lines of:

    
    
      $ sort <(
          user-playlists alice | playlist-tracks 
          user-playlists bob | playlist-tracks
        ) | uniq --repeated

------
m0zg
Coming from Google Cloud, I couldn't deal with the atrocity that is awscli, so
I eventually ended up implementing the bare minimum of shell wrappers to at
least start, stop, ssh into, rsync files to and from, etc., my AWS instances
_by name_, not by instance ID. It took me a couple of hours to cobble
together.

The Google Cloud CLI offers all of this out of the box. Why Amazon wants to
make such basic commands difficult, I'll never understand.
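
A wrapper in that spirit might look like the sketch below (a hedged guess, not m0zg's actual script; the "Name" tag and the "ec2-user" login are my assumptions):

```shell
# Hypothetical sketch: ssh into an EC2 instance by its Name tag instead of its ID.
# Assumes a "Name" tag and an "ec2-user" login; neither is from the original post.
aws-ssh() {
  local ip
  ip=$(aws ec2 describe-instances \
    --filters "Name=tag:Name,Values=$1" "Name=instance-state-name,Values=running" \
    --query "Reservations[].Instances[].PublicIpAddress" \
    --output text)
  ssh "ec2-user@${ip}"
}
```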

------
wjoe
This looks pretty great. While the AWS CLI is very comprehensive, I always
struggle to remember which flags are needed for each command, and it's not
very consistent.

One thing I've not been able to work out with bash-my-aws yet is how to
easily switch between regions and accounts. I noticed you can use `region` on
its own to set the current default region, but I'm often working with
multiple regions, and it'd be a pain to have to run `region us-west-1`
separately each time I want to use a different region. I couldn't see a way to
just specify a region for a given command (e.g. how you'd do `aws ec2
describe-instances --region us-west-1`). I guess you could do this with the
environment variable, `AWS_DEFAULT_REGION=us-west-1 instances`, but that's a
bit verbose.

Similarly with AWS accounts: I use multiple AWS accounts, accessed with
different access keys that are defined as profiles in my ~/.aws/config.
Normally I'd use these with the AWS CLI like `aws ec2 describe-instances
--profile production`, but I couldn't see any way in the docs to use or set
this.

~~~
failmode
They're good questions. I can tell you how I manage regions and accounts, but
I'm interested in learning how people think Bash-my-AWS might better support
users in this regard.

The AWSCLI, as well as the SDKs, all support grabbing the Region and account
credentials from environment variables.

For Regions, I tend to use the following aliases:

    
    
      alias au='export AWS_DEFAULT_REGION=ap-southeast-2'
      alias us='export AWS_DEFAULT_REGION=us-east-1'
      alias dr='export AWS_DEFAULT_REGION=ap-southeast-1'
    

I normally work in a single Region and swap when required by typing the
two-character alias.

To run a script or command (doesn't have to be Bash-my-AWS) across all Regions
I use region-each:

    
    
      $ region-each stacks | column -t
      example-ec2-ap-northeast-1  CREATE_COMPLETE  2011-05-23T15:47:44Z  NEVER_UPDATED  NOT_NESTED  #ap-northeast-1
      example-ec2-ap-northeast-2  CREATE_COMPLETE  2011-05-23T15:47:44Z  NEVER_UPDATED  NOT_NESTED  #ap-northeast-2
      ...
      example-ec2-us-west-2       CREATE_COMPLETE  2011-05-23T15:47:44Z  NEVER_UPDATED  NOT_NESTED  #us-west-2
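
For the curious, a region-each style loop can be sketched in a few lines (an assumption about its shape, not BMA's actual implementation):

```shell
# Hedged sketch of a region-each style loop; not BMA's actual implementation.
# Runs the given command once per region, tagging each output line with "#region".
region-each-sketch() {
  local region
  for region in $(aws ec2 describe-regions --query "Regions[].RegionName" --output text); do
    AWS_DEFAULT_REGION="$region" "$@" | sed "s/$/  #${region}/"
  done
}
```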
    

For AWS accounts, I type the name of the account and I'm in. For accounts
using an IdP (LDAP/AD-backed corporate logins) I generate aliases so I have
tab completion and simple naming.

In accounts that are only set up to use AWS keys, I use aliases that export
credentials kept in GPG-encrypted files. Last time I looked, the AWS docs
suggested keeping these long-lived credentials in plaintext files readable by
your account. That's asking for trouble IMO, especially if they're kept in a
known location that a compromised node library could exfiltrate them from.
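
A sketch of that pattern (the file path, helper name, and file contents are my assumptions, not BMA's):

```shell
# Hypothetical sketch: the encrypted file holds lines like
#   export AWS_ACCESS_KEY_ID=...
#   export AWS_SECRET_ACCESS_KEY=...
# Decrypting and eval-ing them loads the keys into the current shell only.
aws-creds() {
  eval "$(gpg --quiet --decrypt "$HOME/.aws-creds/$1.gpg")"
}
```

An alias per account (e.g. `alias prod='aws-creds prod'`) keeps it to one word.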

AWSCLI v2 beta includes support for SSO so it's probably a good time to look
at how BMA could include support for auth.

------
Terretta
Interesting that this requires ‘jq’ when JMESPath is built into the AWS CLI already.

[http://jmespath.org/](http://jmespath.org/)

~~~
failmode
jq is only used in three of the >120 functions (for sort-keys functionality).
All the rest use JMESPath.

If anyone can help with a solution I'd be delighted to remove the dependency
on jq.

[https://github.com/bash-my-aws/bash-my-aws/blob/b74d92a902bb54ab7527eb40e79d397b3895bf0c/lib/stack-functions#L326](https://github.com/bash-my-aws/bash-my-aws/blob/b74d92a902bb54ab7527eb40e79d397b3895bf0c/lib/stack-functions#L326)

~~~
kafana
I would definitely keep the jq dependency there, especially if you plan to
expand the code base to provide ECS and Fargate commands. You will quickly run
into use cases where JMESPath is not capable of parsing the JSON outputs of
the different task definition commands.

------
TheSpiciestDev
Just the other day I was looking for an official Docker image that includes
the AWS CLI. On top of that, and mainly, I was looking to find more
documentation or tooling to better automate the deployment of new AWS
projects.

Does anyone here have experience (starting from scratch, or with no AWS
resources) setting up policies/users/resources/configurations via something
similar to the Deployment Managers of GCP and Azure? Preferably something
declarative or via templates?

Bash-my-AWS looks like a great step towards the goal I have in mind but I may
also just be unaware of other tooling or AWS capabilities.

~~~
avip
Like it or not (I do...), Terraform is the de facto industry standard, and
pretty much the only mature cloud resource management tool I'm aware of.

It is unwise IMHO to use CloudFormation currently unless you're provisioning
resources so obscure they haven't yet made it into the Terraform AWS provider.

BTW your Dockerfile pretty much boils down to:

    
    
        FROM alpine:3.10
    
        RUN apk add --no-cache \
            python3
    
        RUN pip3 install awscli
    
        COPY config /root/.aws/
        COPY credentials /root/.aws/

~~~
Galanwe
Have a look at CDK. It's a framework made by AWS on top of CloudFormation
that lets you use Python/JavaScript/etc. I've been trying it out recently to
move away from TF, and it's a promising alternative.

~~~
Aeolun
My problem with CF is the CF part, not the YAML. It only takes getting stuck
in a rollback loop a few times to hate CF forever.

Especially when you contact AWS support and they tell you the only thing you
can do is wait.

------
efitz
Interacting with individual DNS records in Route 53 is very hard using the
AWSCLI. I wrote a Python wrapper around the Route 53 API to make command-line
record management easier (and also to do dynamic DNS with your own Route 53
hosted domain):

[https://github.com/ericfitz/r53](https://github.com/ericfitz/r53)
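
To illustrate the pain: even a single A-record upsert via the raw AWSCLI needs a JSON change batch. In the sketch below, ZONE_ID, the helper name, and the record values are placeholders of mine, not part of efitz's tool:

```shell
# Sketch of a raw-AWSCLI Route 53 upsert; ZONE_ID and the helper name are
# placeholders, not part of efitz's r53 tool.
r53-upsert-a() {
  local name=$1 ip=$2
  aws route53 change-resource-record-sets \
    --hosted-zone-id "$ZONE_ID" \
    --change-batch "{
      \"Changes\": [{
        \"Action\": \"UPSERT\",
        \"ResourceRecordSet\": {
          \"Name\": \"${name}.\",
          \"Type\": \"A\",
          \"TTL\": 300,
          \"ResourceRecords\": [{\"Value\": \"${ip}\"}]
        }
      }]
    }"
}
```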

------
alpb
Is this primarily required because AWS CLI is not good enough at listing
resources in desired format (json, jsonpath, yaml, table..)?

~~~
failmode
AWSCLI is amazingly flexible and powerful.

Bash-My-AWS thinly wraps AWSCLI commands that would otherwise be too long to
type. So you're still using AWSCLI and can improve your skill with it by
inspecting the source of Bash-My-AWS functions.

[https://news.ycombinator.com/item?id=21931298](https://news.ycombinator.com/item?id=21931298)

------
dopylitty
If you want to easily manipulate your AWS environment from the command line
use the AWS cmdlets for PowerShell. The fact that PowerShell cmdlets work on
objects instead of text makes them miles better than this or the AWS CLI
because you don't spend most of your time figuring out how to wrangle text
into meaningful output.

~~~
dvtrn
Hasn’t the AWSCLI supported toggling the command output between text, json
and table for quite some time now, or have I misunderstood your comment?

~~~
failmode
Bash-My-AWS wraps AWSCLI as thinly as possible and makes use of JMESPath and
the text output.

The result is you have a simple set of commands that don't require you to type
hundreds of characters.

    
    
      instances() {
        local instance_ids=$(__bma_read_inputs)
        local filters=$(__bma_read_filters $@)
    
        aws ec2 describe-instances                                            \
          $([[ -n ${instance_ids} ]] && echo --instance-ids ${instance_ids})  \
          --query "
            Reservations[].Instances[][
              InstanceId,
              InstanceType,
              State.Name,
              [Tags[?Key=='Name'].Value][0][0],
              LaunchTime,
              Placement.AvailabilityZone,
              VpcId
            ]"                                                               \
          --output text       |
        grep -E -- "$filters" |
        LC_ALL=C sort -b -k 6 |
        column -s$'\t' -t
      }

------
ak217
I have developed something similar on top of the AWS CLI that incorporates a
bunch of integrations with other tools like the cloudinit and various bits of
Batch-related instrumentation:
[https://github.com/kislyuk/aegea](https://github.com/kislyuk/aegea)

------
iCarrot
The first thing I'd do after installing this is to alias an "aws-" prefix
onto every command. I like namespacing things, as I suck at remembering names...

~~~
failmode
I don't like to rely on my memory either! I forget the names of commands and
use tab completion to list them.

You can just type bma[TAB][TAB] and it will list them all.

If you know the type of resource you are working with, you can use TAB
completion to see its commands:

    
    
      $ stack-
      stack-arn            stack-exports        stack-tag-apply
      stack-asg-instances  stack-failure        stack-tag-delete
      stack-asgs           stack-instances      stack-tags
      stack-cancel-update  stack-outputs        stack-tags-text
      stack-create         stack-parameters     stack-tail
      stack-delete         stack-recreate       stack-template
      stack-diff           stack-resources      stack-update
      stack-elbs           stack-status         stack-validate
      stack-events         stack-tag

------
pensatoio
What really sells me on this tool is the ability to examine the underlying
awscli command and transformations. I’ll be giving this a go in the new year!

~~~
failmode
Thanks!

The intent has always been to enhance rather than replace the AWSCLI (which
is an amazing tool!).

If you're ever wondering how a Bash-My-AWS command works, use `bma type` (it
even supports tab completion for all the commands).

    
    
      $ bma type instances
      instances is a function
      instances () 
      { 
          local instance_ids=$(__bma_read_inputs);
          local filters=$(__bma_read_filters $@);
          aws ec2 describe-instances $([[ -n ${instance_ids} ]] && echo --instance-ids ${instance_ids}) --query "
              Reservations[].Instances[][
                InstanceId,
                InstanceType,
                State.Name,
                [Tags[?Key=='Name'].Value][0][0],
                LaunchTime,
                Placement.AvailabilityZone,
                VpcId
              ]" --output text | grep -E -- "$filters" | LC_ALL=C sort -b -k 6 | column -s' ' -t
      }

