A coworker of mine dislikes it because it bundles everything into a single binary. For example, to get ACME DNS-01 challenges working for certificate issuance, I need to compile in a Google DNS-specific plugin.
But then it... just works. Try the same with most other web servers/proxies and you're in for a world of pain. Having this much functionality bundled into a single binary is as much a curse as it is a blessing.
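For context, the plugin situation looks roughly like this: you build a custom binary with xcaddy and then reference the DNS provider in the Caddyfile's tls block. A sketch, assuming the caddy-dns Google Cloud DNS module; the project ID is a placeholder:

```caddyfile
# Built beforehand with:
#   xcaddy build --with github.com/caddy-dns/googleclouddns
example.com {
	tls {
		# Solve the ACME DNS-01 challenge via Google Cloud DNS
		# (provider name and option per the caddy-dns module docs;
		# "my-gcp-project" is a hypothetical value)
		dns googleclouddns {
			gcp_project my-gcp-project
		}
	}
	respond "hello"
}
```

Once the plugin is compiled in, that handful of lines is the entire DNS-01 setup.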
That said, having your own little 'Cloudflare Workers' in the form of Nginx Unit with wasm sounds great. Not sure Caddy can do that.
For me, the promise of Caddy and especially tools around it like FrankenPHP make the "everything in a single binary" idea the MORE enticing option, not less.
Sure, we already have repeatable infrastructure, containers, etc., but I also love the idea of just building and shipping a PHP app binary that includes the webserver. It makes server provisioning even less of a priority, especially if I have reasons not to use serverless or PaaS tools.
It's great until you want to include a non-standard plugin and need to compile your own binaries.
Now that single binary deployment requires you to compile the software yourself. Caddy has nice tooling for this but it'd be far more convenient to just drop a dll/so file in the right directory.
Single binary deployments are great if someone else did the compiling for you. If you need to compile yourself, it truly doesn't matter whether you ship a single binary or a directory or whatever.
I was in the same boat as you and wanted to try out what Caddy is capable of. I was immediately convinced. So many features, where you expect them. Consistent configuration language. Environment interpolation, everywhere. Flexible API. It’s really all there.
At first glance it doesn't look convincingly better than a generic, manually polished nginx configuration. Are there any other benefits to Caddy?
If you choose to start the project with docker compose, you’ll notice how it will immediately bring up a fully functional reverse proxy setup with TLS support on localhost; set the SITE_DOMAIN environment variable to your proper domain instead, and you’ll find that configured as well, along with a proper, ACME-issued certificate. Add a bit more effort, and you’ll also get mTLS for all services automatically.
All of this is more or less doable with nginx, I’ve done it often enough. But read the Caddyfile and tell me this isn’t miles ahead in clarity.
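A sketch of what such a Caddyfile can look like. The `{$SITE_DOMAIN:localhost}` syntax is Caddy's environment-variable interpolation with a default value; the upstream service name `app:8080` is a hypothetical docker compose service:

```caddyfile
# Serves on localhost with a locally-trusted cert by default;
# set SITE_DOMAIN to a public domain and Caddy obtains an
# ACME certificate for it instead.
{$SITE_DOMAIN:localhost} {
	reverse_proxy app:8080
}
```

The same file works unchanged in local dev and production; only the environment variable differs.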
It does all the Let's Encrypt stuff for you. certbot is not a massive hassle if you're just serving the one domain, of course, but I really liked Caddy for that when I was setting up a redirect server (corps do love buying TheirBrand.everytld haha).
Set the config up with CI/CD and you can now just edit the config and git push, knowing Caddy will handle the rest.
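A redirect vhost of that sort is a few lines per site block (domain names below are hypothetical), and Caddy obtains certificates for every listed address automatically:

```caddyfile
# Catch-all redirects for the extra TLDs the company bought;
# {uri} preserves the original path and query string.
theirbrand.net, theirbrand.shop {
	redir https://theirbrand.com{uri} permanent
}
```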
It's a fine project right up to the point where you need additional functionality that's split out into one of the plugins. Since Go applications don't support loadable .so plugins in practice, you have to build your own binaries or rely on their build service, and that puts the responsibility for supporting and updating such a custom configuration on you.
So no setting up unattended-upgrades and forgetting about it.
Eh, it's a bit overhyped imo, although I do like the config format and built-in ACME. My production clusters all run nginx, though, and give me minimal fuss with a lot of flexibility.
The problem with the validation part is that a human has to go through all the code written by the AI. Guess what: understanding someone else's code takes almost as much effort as writing it yourself. Therefore you need just about the same number of humans to verify the AI's code.
Not enough time to even understand the requirements of a 40h job; 10h would be the minimum, unless you are just checking style-guide compliance and obviously missed null checks. For this reason I feel code reviews are a bit silly and pair programming is way better (pairing with a mix of sync and async work), but that is an aside. In addition, a code review being short is usually because the coder and reviewer are both very competent. Once AI enters the chat, the reviewer needs to look very closely at every line.
Code reviews are more of a formality, and a pointless one. There's little to no value in someone reviewing a piece of code without knowing the background business problem it's solving.
Sure, but a "superficial" review is what most corps currently run on. Certain crucial code gets looked at more thoroughly. If there are any bugs, QA catches them.
It’s going to be the same with AI. It pumps out a lot of code. Human does spot checks and probably uses other AI to help with the code review. A real human does the really hard, critical code directly. A more robust QA team makes sure it all works.
I think you are right in the near future but not today. I don’t think we are quite there yet with AI but the pace of improvement is breathtaking and producing code (and more generally “logic” which could be code but also machine code, system design, LLM prompts and so on) is such a pot of gold the AI companies will go for it.
Yeah, that's because then they don't really look at it and certainly don't really understand it.
In some ways, reviewing code is harder than actually writing it.
I really like the "reverse centaur" metaphor by Doctorow for the automation AI will bring us: humans having to double-check the stuff AI wrote for correctness, at AI pace.
AI might erode some of the gains we got from camera technology, because the better it gets at manipulating images, the less humans trust digital images as evidence.