LocalStack – A fully functional local AWS cloud stack (github.com)
392 points by manojlds on Mar 27, 2017 | 39 comments



Huh. One of the biggest 'drawbacks' of using AWS as a production platform is that making your development environment look like production is hard.

Having to deploy to test is cumbersome, and having a cost associated with each test can definitely introduce some 'stress' and encourage people not to test incrementally.

I wonder if this changes that. Having services like S3, Lambda and SQS available locally sounds super interesting.

I will definitely keep an eye on this. One other thing I would like to see is CloudFront, which can be very hard to start with due to its opaqueness. Having to wait 20+ minutes between configuration changes is very demotivating.


> making your development environment look like production is hard.

The hard part is the cost IMHO


This is only for automated testing. They're real services.


Really impressive, although the title sounds a bit misleading: it's only mocking stuff, from what I can see (which makes sense).

I know there are some actual open-source reimplementations of bits of the AWS products, such as S3. Might be worth integrating those into it.


"Minio" as a drop-in local replacement for S3 has been great for local development. Small Golang binary.


Minio is awesome; I use it in production to store many TBs of data and have never had any issues. The clustering functionality could use some additional work (or I just haven't understood it right), and it would be nice to be able to generate multiple keys/users.


Out of curiosity how have you found the performance of clustered Minio? I've been checking out Riak S2 but the simplicity of Minio is attractive. Thanks!


I don't think Minio is clustered.



Ah. I stand corrected. I'm leery of their consistency model in that mode, but could just be that I haven't dug into it far enough to make sense out of it.


> I'm leery of their consistency model in that mode

Amazon S3 is eventually consistent, so is this really any worse?


For authorization/authentication policy updates it isn't.


How would you say minio compares to the real thing in terms of performance and semantics?


I haven't used it, but there's Eucalyptus as well. API compatible with AWS. Has ELB, EC2, S3, and some other pieces.

https://en.m.wikipedia.org/wiki/Eucalyptus_(software)


There are distributed filesystems (?) that offer an S3-compatible API such as Ceph http://docs.ceph.com/docs/master/radosgw/s3/


Another option along the same lines:

https://github.com/eucalyptus/eucalyptus


Interesting that Atlassian co-publishes this on GitHub while keeping the main development on Bitbucket.


GitHub has become the de facto standard place for open source. I've heard it said that if you care about contributors or users of open source, you should publish to GitHub even if that's not where you develop it. I guess Atlassian have realised this too...


I understand what you're saying, and unfortunately it's true, but it's also silly, because GitLab, Bitbucket and others can use OAuth APIs to make contribution easier. I, for example, use Google Auth on GitLab, but GitHub didn't have that back when I signed up. If it had, I would have used it there too.


I guess GitHub got mindshare early and then became the entrenched option, so it's less about technical merits than anything else. Anecdotally, though, while GitHub has its flaws, in many ways I prefer its UX to that of GitLab and Bitbucket. For example, Bitbucket's overview page isn't very useful to me at all, and yet it's the default repo page.


Am I missing the documentation somewhere? I see directions for setup and installation but nothing about what's supported for each mock. Obviously they haven't rebuilt all the services from scratch to run on my local machine, so what can I do here? If I create an S3 bucket, will the API return that bucket in my next ListBuckets call? If so, is that persistent? Etc.


My concern with relying on things like these for testing is they're not supported by AWS, so behaviours can easily diverge without you realising.

I think we've got a lot of mileage in the past out of just mocking request/responses when testing AWS interactions. The only time we've ever used a pseudo-service was when testing against the dynamoDB JAR they provide with the SDK.


That is my first concern as well, but testing and bug counts are not an all-or-nothing thing. If something like this can reduce bug counts in a complicated distributed system while reducing costs versus testing against "the real thing", it may still very well be worth it.


Yup. There's nothing to stop someone from doing a lot of local iterations for speed and then, once satisfied, running the tests on real AWS.

Would save some money too.


I tried LocalStack once a few months ago to mock SQS and found it really slow to boot; it had a pretty high memory footprint, crashed very often (it was running inside their official Docker container, as I recall), and lacked documentation.

If you just need to mock SQS, I recommend using elasticmq. I've used it for a few months now and haven't had any problems at all so far.

https://github.com/adamw/elasticmq/blob/master/README.md https://hub.docker.com/r/expert360/elasticmq/


This is something I hadn't really thought about before. Aside from this, how have AWS devs done local development (have they?).


Same way as with any other API, really: mocking and/or real dry-run calls where supported (which is most of the API in AWS's case).

Then a staging version of the real environment for final testing.


Dry run calls for AWS just check IAM permissions according to docs. Am I missing some sort of dry run feature?


You just have dev instances and production instances. It's definitely not perfect and scales terribly as the team grows.


Looks interesting, so far we've been using elasticmq[1] (SQS) and s3rver[2] (S3) to do e2e integration testing on local and staging. We spin them up using azk[3].

[1] https://github.com/adamw/elasticmq [2] https://github.com/jamhall/s3rver [3] http://www.azk.io/


Has anyone figured out how to create lambdas? I can't seem to create one via the CLI because I don't have the correct IAM role setup?


Thank you for confirming the accuracy of the AWS Lambda test model.


This looks excellent. I used moto to do some unit testing of a Python application and was wondering if anything existed to make unit testing other codebases easier. There's always the risk of divergence from AWS's real behaviour, but I think it's a small risk; generally AWS seems to stay fairly consistent for legacy users when new features are introduced.


AWS supports an official DynamoDBLocal. I wish they supported local implementations of their other services as well.


I applaud everyone using such deep testing!


Could this be used for learning AWS instead of starting the clock on the free year of AWS?


If you want to learn the ins and outs of AWS, I would recommend using the real thing. You can always create a new account when you have something in production and want to take advantage of free-tier pricing.


Why are we calling this "local" and not a "private cloud"?


Did you read the page linked to?

Or: Because it's not a "private cloud".



