Hacker News | jonathaneunice's comments

Cursor.

All in on tab completion and its other UI/UX advances (generate, chat, composer, ...)


Exactly right. Cursor makes it easy to get to "adequate." In the hundreds of places where I'm not an expert or don't have a strong opinion, that's regularly as good as, and frequently better than, my first pass. Especially since it never gets tired, whereas I do.

It's a great intern, letting me focus on the few but important places where I add specific value. If this is all it ever does, it's still enormously valuable.


> The scarcest resource [in primitive society] wasn't food. It was protection.

Interesting meta-claim.


Not so different from modern society. Safety == freedom.


Now that's an interesting perspective; it makes me think of the (A)GPL and Free software as compared to Open Source software.


A single example is insufficient, especially when that example is Drupal.

In a former life I evaluated WordPress, Drupal, and some other open source CMSs. Drupal seemed remarkably terrible. Maybe I just wasn't the target user...but UI, UX, and DX all seemed blighted and the ecosystem much less vibrant than WP's. Came away thinking it was a vestige of yesteryear, and not something in which to invest any time, energy, or money.

Maybe that's a harsh take on Drupal, but it's just one already-seemingly-in-decline property from which one PE firm extracted value. If over-extraction and under-investment really is a systemic problem of PE, there must be many more, much better examples.


If you look at the chart, Drupal starts declining immediately upon being purchased. Hard to imagine Vista was able to destroy the community that quickly; seems more likely they bought high on Acquia, or even Acquia was doing something sneaky to juice its sale price.


It's stable and mature.

Maybe you don't hear the constant clickity-clack of endless change as a result. It's also true that Ruby is most popular for writing web apps and services, where it and Rails continue to shine.

webdev FWIW is a ginormous part of this little internet thing. You know, those Web-based applications and services that run the entire global economy and human society, such as it is.


My goodness, the world never changes. Saw the same "new enterprise software much worse than the incumbent, terrible UI/UX, everyone hates it" dynamic play out *30 years ago* with SAP R/3. When I check back in 2054, expect it will not be any different. Technology changes, but people and organizational dynamics largely do not.


But it is verifiable. We could quibble over "who judges novelty," but I bet if there were regular examples of it doing so, and there were some community agreement the ideas were indeed suitably novel, we'd pretty quickly shout "existence proof!" and be done.


Depends on the LLM, perhaps, and/or the problem being solved. I get very good output from 10K–25K token submissions to Anthropic's Claude API.
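For what it's worth, a minimal sketch of the kind of long-prompt call I mean, using Anthropic's Python SDK. The model name and the chars-per-token heuristic are illustrative assumptions on my part, not anything Anthropic guarantees:

```python
def rough_token_count(text: str) -> int:
    """Crude estimate: ~4 characters per token for English prose (a heuristic, not exact)."""
    return max(1, len(text) // 4)

def ask_claude(prompt: str, model: str = "claude-3-5-sonnet-20241022") -> str:
    """Send one large prompt to the Claude Messages API and return the text reply.

    Requires ANTHROPIC_API_KEY in the environment; the import is deferred so
    the rest of this sketch runs even without the SDK installed.
    """
    import anthropic  # pip install anthropic
    client = anthropic.Anthropic()  # picks up ANTHROPIC_API_KEY automatically
    response = client.messages.create(
        model=model,
        max_tokens=2048,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.content[0].text

# Sanity-check that the prompt lands in the 10K-25K token range before sending.
prompt = "Summarize this codebase:\n" + ("def f(): pass\n" * 5000)
size = rough_token_count(prompt)
print(size, 10_000 <= size <= 25_000)
```

Calling `ask_claude(prompt)` with a key set would fire a real request; here only the size check runs, which is the point: staying inside that window is mostly a matter of measuring before you submit.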


Every new system wants to be a mainframe when it grows up. VMS, Unix, Linux, NT...they all started "small" and gradually added the capabilities and approaches of the Bigger Iron that came before them.

Call that the mainframe--though it too has been evolving all along and is much more of a moving target than the caricatures suggest. Clustering, partitions, cryptographic offload, new Web, Linux, and data-analytics execution environments, most recently data streaming and AI--many new use modes have been added since its 1960s-and-'70s inception.


> Every new system wants to be a mainframe when it grows up. VMS, Unix, Linux, NT...they all started "small" and gradually added the capabilities and approaches of the Bigger Iron that came before them

macOS started on the desktop, moved from there to smartphones, and from there to smartwatches. Linux also moved ‘down’ quite a bit. NT has an embedded variant, too (https://betawiki.net/wiki/Windows_NT_Embedded_4.0, https://en.wikipedia.org/wiki/Windows_XP_editions#Windows_XP..., https://en.wikipedia.org/wiki/Windows_IoT).


True. Every new system wants to be just about everything when it grows up. Run workstations, process transactions, power factories, drive IoT, analyze data, run AI...

"Down" however is historically a harder direction for a design center to move. Easier to add features--even very large, systemic features like SMP, clustering, and channelized I/O--than to excise, condense, remove, and optimize. Linux and iOS have been more successful than most at "run smaller, run lighter, fit into a smaller shell." Then again, they also have very specific targets and billions of dollars of investment in doing so, not just hopeful aspirations and side-gigs.


In 1981, maybe, but there's a long time lag between design and flight for spacecraft. Cheap/light/sturdy dot matrix printers weren't yet available in the 1970s when the Shuttle was being designed. Nor had the idea of using commercial/off-the-shelf (COTS) components yet taken root. That would come years after the STS was already built and in service.


But the story says the drum printer was selected at the last minute as an "interim" solution?


Fair point. But not so "last minute" that a pre-existing military design couldn't be investigated and reworked, custom print heads cast, etc. That puts it back to...what? 1979 or 1980? In that era, "let's build it to our exacting specifications, high tolerances, and unique mission requirements" pervaded NASA / aerospace engineering and procurement. However much we admire it today, COTS was not their way, and wouldn't be for at least another decade.

