Taco Bell Programming (widgetsandshit.com)
28 points by sebmellen 9 months ago | hide | past | favorite | 7 comments



I read this years ago and it really resonated with me at the time. Why reinvent the wheel, right?

Reading it again now, it seems to advocate for going from one extreme to another without exercising judgement. A shell script might be fine in one context and irresponsible in another... but I guess nuance and circumstance don't make for interesting blog posts.


Cobbling together a script is great for one-off work if it gets the job done. What it's not good for is your core business: anything that will evolve and need refinement over a long period, with continuous feature additions and changes. A brilliant engineer might start with a Python script and evolve it into a Django app mid-project, but that same person could also simply switch to Django early on.


Yeah, I totally agree, but I do think there are times when starting with the absolute most basic implementation that meets the business needs is the way to go. That could be a simple bash script, a Python script without Django, etc.

In my experience, the key is making those tradeoff decisions in an educated manner. Ideally, product and engineering are in sync and have a somewhat informed view of the future, so appropriate tradeoffs between velocity (simple bash script) and extensibility (Python with Django) can be discussed.


Definitely true at startups, where the real question is what else you could or should be doing with that time, and where a pivot a few months later may mean the code is mostly discarded for a new adventure anyway.


What happens if your network drops in the middle of crawling?

This is really just demonstrating "happy path" code, which is fine for one-offs and glue code IMO; really, anything where the cost of rerunning is negligible.

Imagine trying to scrape the entire internet with the code samples provided though.
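For what it's worth, a happy-path crawl can be made somewhat restartable without graduating to heavier tools. A minimal sketch (the `retry` helper is a hypothetical illustration, not from the article; `wget -c` resumes partial downloads and `--tries`/`--timeout` are real wget flags):

```shell
#!/usr/bin/env bash
# retry: rerun a command until it succeeds, up to a fixed number of attempts.
retry() {
  local n=0 max=5
  until "$@"; do
    n=$((n + 1))
    [ "$n" -ge "$max" ] && return 1
    sleep 1   # crude backoff before the next attempt
  done
}

# Hypothetical usage for one crawl step; -c resumes a partial download
# if the network dropped mid-transfer:
# retry wget -c --tries=3 --timeout=10 "$url"
```

This keeps the one-off spirit of the original pipeline while making a dropped connection a retry instead of a full rerun.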

Everything is easy until something breaks, something unexpected happens, or your workload grows large enough. That's why those tools were invented.

Now, if you're just doing a simple web scrape of, like, a single HTTP website with no JS? Yeah, xargs and grep are fine; we do it all the time. This is especially the case for local build tooling in most projects.
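That kind of local pipeline is exactly the pattern the post champions. A minimal sketch, using a throwaway demo tree (the `/tmp/tb_demo` paths and file contents are made up for illustration):

```shell
#!/usr/bin/env bash
# Build a tiny demo source tree to search.
mkdir -p /tmp/tb_demo/src
printf '// TODO: handle errors\nint main(void) { return 0; }\n' > /tmp/tb_demo/src/a.c
printf 'static int x;\n' > /tmp/tb_demo/src/b.c

# The classic find | xargs | grep pipeline:
# -print0/-0 keep filenames with spaces intact; -l lists matching files only.
find /tmp/tb_demo/src -name '*.c' -print0 | xargs -0 grep -l 'TODO'
```

No framework, no daemon, and rerunning it from scratch costs nothing, which is the whole argument for this style at small scale.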

Building the Windows kernel, though? Oh man, I hope you have an artifact cache.


Why not step it up a notch and write all software with nothing but `if/fi`, `cat`, `while/do/done`, and pipes/redirections?


My takeaway is that you should use reliable tools, and as few as possible. Read that way, the reason not to do what you suggest is that `find` and `xargs` are more reliable than anything I'd realistically be able to achieve with ifs and cats.



