> He is going to download Blender because someone on Reddit said it was free, and then stare at the interface for forty-five minutes.
This hits home. Not because I did it as a kid; I'm a bit old for that. But because I've done this exact thing two or three times. You stare and know, just know, that somewhere in this byzantine interface there is the raw power to do lots of cool 3D stuff. But damn. It's quite an interface.
> That is not a bug in how he’s using the computer. That is the entire mechanism by which a kid becomes a developer. Or a designer. Or a filmmaker. Or whatever it is that comes after spending thousands of hours alone in a room with a machine that was never quite right for what you were asking of it.
Yeah. For me it was an old, beat-up 286 that I couldn't get anyone to upgrade, and a loving devotion to MS-DOS, old EGA Sierra games, TSR programs, TUIs, GeoWorks, and just not being able to get enough of it.
When I finally saved up enough to buy a 486 motherboard, I installed Linux because it seemed cool (and was cool) and never looked back. But that 286 sparked my obsession with computers that has influenced almost every aspect of my life.
I still love to revive old hardware and push it beyond its limits. Mostly because I think it's fun, but also because it's dirt cheap or free. Back then I made an old GPS system play Monkey Island, play MP3s, and read e-books. Reinstalled lots of old Android phones and tablets. Made photo frames out of them. Made webcams out of them. Transformed old laptops into Chromebooks. Made lots of old NAS devices work again. Stuff like that.
> In most cases an IPO isn't worth it for founders because an IPO means you lose operational control.
This is counterintuitive to me.
If you’re acquired, you’re giving up ownership and you tend to lose operational control unless you have agreements in place that say otherwise.
With an IPO it seems like you have a better chance to retain control: you can control the share allocations going into an IPO to give you solid voting power. While you’re accountable to a board of directors and theoretically accountable to stockholders, in reality management often runs the show, at least until the board runs out of patience with bad earnings.
The problem is that if you go public as a small company, it can be hard to survive. You need to meet expectations every time you do an earnings call or watch your stock get crushed, and you'll never be given another chance. The compliance burdens and costs are also a lot higher.
You don’t really see companies under $10 billion going public anymore. That may continue to be the case, but it’s terrible for entrepreneurs.
That’s just not true. At the end of the 90s the US had a budget surplus and there was a discussion of how we were going to handle it.
Then George W. Bush enacted a big tax cut in 2001 that no one remembers because it was heavily weighted toward the top 1%. Suddenly we didn’t have a surplus problem anymore.
That wasn't necessarily Bush personally, who was never the sharpest knife in the drawer, but his strategists. The Republicans had convinced themselves that surpluses encouraged government spending, so by wiping them out and "starving the beast," as they put it, the resulting financial crunch would create a need to slash spending, cut social welfare, and reduce the size of government.
Actually now that it's set out like that, the strategists were just as much in la-la land as Bush was.
But they never did starve anything. Even DOGE didn't cut a significant fraction of the budget. They just eliminated a bunch of ideological enemies and quit early.
They didn't even pretend to reduce the entitlement programs that they claim to hate, but are fiercely defended by the elderly, who overwhelmingly vote for them.
Here they show the debt increasing through the 90s, but by less than in most other decades. I don't know if it accounts for inflation, though; if not, inflation would have eroded the real value of that debt somewhat. Either way, it seems like they didn't use any of the surplus to pay down the debt.
It’s basically a royalty model. That’s common in some industries and with some products. I haven’t looked lately but both Unity and Unreal Engine had royalty models; game devs would pay either a fixed per-unit fee or a percentage of revenue after a certain volume of sales.
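The two royalty shapes mentioned (a fixed per-unit fee, or a percentage of revenue past a threshold) can be sketched as below. The threshold, rate, and fee numbers here are hypothetical illustrations, not the actual terms of Unity or Unreal, which vary and change over time.

```python
def percentage_royalty(gross_revenue: float,
                       threshold: float = 1_000_000.0,
                       rate: float = 0.05) -> float:
    """Royalty as a percentage of gross revenue above a threshold.

    Below the threshold, nothing is owed; above it, only the excess
    is subject to the rate. All numbers are hypothetical.
    """
    return max(0.0, gross_revenue - threshold) * rate


def per_unit_royalty(units_sold: int,
                     free_units: int = 100_000,
                     fee_per_unit: float = 0.20) -> float:
    """Fixed fee per unit sold beyond a free volume tier (also hypothetical)."""
    return max(0, units_sold - free_units) * fee_per_unit


# A flop owes nothing; only hits generate royalty revenue,
# which is why the vendor needs some fraction of customers to succeed.
print(percentage_royalty(500_000))    # flop: 0.0
print(percentage_royalty(2_000_000))  # hit: 50000.0
print(per_unit_royalty(150_000))      # 50k units past the tier: 10000.0
```

The asymmetry is the point of the model: the vendor's cost of distributing the engine to a failed project is near zero, so the successful minority can subsidize everyone else.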
To be viable as a business plan, this requires that a certain percentage of your customers have commercially successful products.
Here’s the thing though: anyone who has a high volume of sales will want to shed the royalty. This could be by negotiating different terms or just rewriting to avoid the component or service that wants the royalty.
For Unity and Unreal, it’s pretty common knowledge that AAA studios have separately negotiated licenses, presumably to reduce or eliminate the per-unit royalty. Some studios write their own engine, though that has its own costs.
For vibe coding I have real doubts about this model. There's effectively no moat and no defensive IP (i.e., patents), so anyone making enough revenue to owe serious royalties will probably end up hiring SWEs to rewrite their software and avoid them.
The difference is obvious: it doesn't cost Epic anything if you download their engine, flail around for 5 years, and release a buggy bomb. 5 years of tokens would cost a lot.
> Senior engineer looks under the hood, sees 500k lines of incomprehensible spaghetti mess with emoji comments everywhere, runs out the door and never looks back.
Senior engineering _consultant_ looks at those 500k lines of incomprehensible spaghetti mess and sees $$$: months or years of contracts and likely very dysfunctional management that is willing to pay multiple times the cost of full time employees to keep the burn on a non-payroll line and/or keep the “AI first” story rolling on.
> Senior engineering _consultant_ looks at those 500k lines of incomprehensible spaghetti mess and sees $$$: months or years of contracts and likely very dysfunctional management that is willing to pay multiple times the cost of full time employees to keep the burn on a non-payroll line and/or keep the “AI first” story rolling on.
That's not been my experience. Even pre-AI, when I was asked to find a bug in some hacked-together codebase, sticker shock was often the result.
"What do you mean, billing for a week? The guy who created this is an actual software engineer and you're billing just as much as he did!"
I've got a list of small ex-clients who won't get work from me anymore, unless they are happy with "Here's my weekly rate, 1 week minimum".
Hourly rates don't work on a client who considers $200/m to be overpaying for software development services.
I suspect there’s a middle ground that involves either keeping tests more proprietary or a copyright license that bars using the work for AI reimplementation, or both.
I think it’s entirely reasonable to release a test suite under a license that bars using it for AI reimplementation purposes. If someone wants to reimplement your work with a more permissive license, they can certainly do so, but maybe they should put the legwork in to write their own test suite.
Management often has a perverse short-term incentive to make labor feel insecure. It's a quick way to make people work harder ... for a while.
Also, “AI makes us more productive so we can cut our labor costs” sounds so much better to investors than some variation of “layoffs because we fucked up / business is down / etc”
I prefer pushing the constraints to motivate a different solution instead of asking them to do an unmotivated exercise.
“Google Sheets is a great solution for two people, but let’s say the department expands and now it’s ten people. How does this change your answer?”
It’s easy to break Google Sheets as a workflow by increasing the number of users, adding complex business logic, etc.
It’s interesting to see what candidates come up with and how they think. Sometimes the solutions are genuinely interesting. Mostly they’re not, which is okay. If you don’t give yourself the opportunity to learn as an interviewer, you’re missing out.
Knowing Google, there’s a good chance it will turn out like AMP [0]: concerning, but only spotty adoption, and ultimately kind of abandoned/irrelevant.
While I'm glad AMP never got truly widespread adoption, it did get adopted in places that mattered -- notably, major news sites.
The number of times I've had to translate an AMP link that I found online before sending it on to friends, in the hopes of reducing the tracking impact, has been huge over the years. Now there are extensions that'll do it, but that hasn't always been the case, and they aren't foolproof either.
I do hope this MCP push fizzles, but I worry that Google could just double down and expose users to less of the web (indirectly) by only showing results from MCP-enabled pages. It'd be like burning the Library of Alexandria, but at this point I wouldn't put it past the tech giants.
AMP lives on, mostly as AMP for Email, used by things like Google Workspace for performing actions within an email body (allow-listed JavaScript, basically).
Don't forget the all-important last step: abruptly killing the product, no matter how popular or praiseworthy it is (or heck: even profitable!), if unnamed Leadership figures say so; vide killedbygoogle.com
> You don't talk about all the assembly high level languages make, or at least it's no longer how people view things.
Speak for yourself. I routinely look at assembly when worrying about performance, and occasionally drop into assembly for certain things. Compilers are a tool, not a magic wand, and tools have limits.
Much like LLMs. My experience with Claude Code is that it gets significantly worse the further you push it from the mean of its training set. Giving it guidance or writing critical “key frame” sections by hand keeps it on track.
People who think this is the end of looking at or writing code clearly work on very different problems than I do.