For owners of small websites running on DigitalOcean, GCP or AWS, how do you handle DDoS and DoS attacks?
For context, while exploring the load testing tool Siege running on a VPS, I was able to bring down multiple sites running on shared hosting, and some running on small VPSes, simply by setting a high enough number of concurrent users. This is not a DDoS, but it goes to show how easy it is to cause damage. Note: I only brought down sites that I own, or those of friends with their permission.
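For reference, a Siege run along these lines is all it took (the URL is a placeholder; only point this at sites you own or have explicit permission to test):

    # 200 concurrent users, benchmark mode (no delay between requests), for one minute
    siege -b -c 200 -t 1M https://example.com/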
What tools are useful in fighting DDoS attacks and script kiddies? Mention free and paid options.
What are the options to limit damage in case of an attack? How do you limit bandwidth usage charges?
There was a previous discussion on this topic 6 years ago: https://news.ycombinator.com/item?id=1986728
The simple advice for layer 7 (application) attacks:
1. Design your web app to be incredibly cacheable
2. Use your CDN to cache everything (there's a small sketch after this list)
3. When under attack, seek to identify the site (if you host more than one) and the page that is being attacked, then force-cache it via your CDN of choice.
4. If you cannot cache the page then move it, so the attack traffic ends up hitting a cacheable 404 at the old URL.
5. If you cannot cache or move it, then have your CDN/security layer of choice issue a captcha challenge or similar.
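As a sketch of what points 1 and 2 look like at the origin, assuming nginx sits in front of the app (the domain, paths, and port here are hypothetical):

    server {
        listen 80;
        server_name example.com;                 # placeholder domain

        # Static assets: long-lived headers so the CDN (and browsers) cache them hard
        location /static/ {
            root /var/www/example;               # hypothetical docroot
            add_header Cache-Control "public, max-age=31536000, immutable";
        }

        # HTML: even a short TTL lets the CDN absorb a flood while staying fresh
        location / {
            proxy_pass http://127.0.0.1:8000;    # hypothetical app server
            add_header Cache-Control "public, max-age=60";
        }
    }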
The simple advice for layer 3 (network) attacks:
1. Rely on your security layer of choice; if it's not working, change vendor.
On the L3 stuff, when it comes to DNS I've had some bad experiences (Linode, oh they suffered), some pretty good experiences (DNS Made Easy), and some great experiences (CloudFlare).
On L7 stuff, there are a few things no one tells you about... like if your application backs onto AWS S3 to serve static files, the attack can be on your purse, as the bandwidth costs can really add up.
It's definitely worth thinking about how to push all costs outside of your little realm. A Varnish cache or an nginx reverse proxy with a file system cache can make all the difference, sparing both your bandwidth bill and your app servers.
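A minimal sketch of the nginx route, with hypothetical paths, sizes, and upstream (proxy_cache_path belongs in the http block):

    # On-disk cache: repeat requests are served from local disk rather than
    # from S3 or the app servers, so an attack fills a cache instead of a bill
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=appcache:100m
                     max_size=50g inactive=7d use_temp_path=off;

    server {
        listen 80;

        location / {
            proxy_cache appcache;
            proxy_cache_valid 200 301 7d;                   # keep good responses a week
            proxy_cache_use_stale error timeout updating;   # serve stale under stress
            proxy_pass http://127.0.0.1:8080;               # hypothetical upstream
            add_header X-Cache-Status $upstream_cache_status;
        }
    }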
I personally put CloudFlare in front of my service, but even then I use Varnish as a reverse proxy cache within my little setup to ensure that the application underneath it is really well cached. I only have about 90GB of static files in S3, and about 60GB of that is in my Varnish cache. That means that when some of the more interesting attacks are based on resource exhaustion (and the resource is my pocket), they fail: they're probably just filling caches and not actually hurting.
The places where you should be ready to add captchas, since they really are uncacheable (see the rate-limiting sketch after the list):
* Login pages
* Shopping Cart Checkout pages
* Search result pages
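The captcha itself is best issued by your CDN/security layer, but a per-IP rate limit at the origin blunts simple floods on these endpoints too. A hypothetical nginx sketch (the paths and numbers are made up; limit_req_zone also lives in the http block):

    # Per-IP rate limit on the uncacheable endpoints; simple floods get 503s
    # long before they exhaust the app servers
    limit_req_zone $binary_remote_addr zone=uncacheable:10m rate=5r/s;

    server {
        listen 80;

        location ~ ^/(login|checkout|search) {
            limit_req zone=uncacheable burst=10 nodelay;    # small bursts OK, then reject
            proxy_pass http://127.0.0.1:8080;               # hypothetical upstream
        }
    }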
Ah, there's so much one can do, but generally... designing to be highly cacheable and then using a provider who routinely handles big attacks is the way to go.