Hacker News | doug_durham's comments

I became a manager so I could solve bigger problems. Good managers do dive into the details. It's a mistake to think that as a manager, you don't have to concern yourself with the minutiae. You still have to do the homework and the deep thinking. You just don't have to write the code.

I'll push back against this a little bit. I find any type of deliberative thinking to be a forcing function. I've recently been experimenting with writing very detailed specifications and prompts for an LLM to process. I find that as I go through the details, thoughts will occur to me. Things I hadn't thought about in the design will come to me. This is very much the same phenomenon as when I was writing the code by hand. I don't think this is a binary either/or. There are many ways to have a forcing function.

I think it's analogous to writing and refining an outline for a paper. If you keep going, you eventually end up at an outline where you can concatenate what are basically sentences together to form paragraphs. That's roughly where we are now: if you spec well, you'll get decent results.

I agree, I felt this a bit. The LLM can be a modeling peer in a way. But the phase where it goes to validate / implement is also key to my brain. I need to feel the details.

In what way is "AI being shoved down your throat"? Did you think that SwiftUI was shoved down your throat? Did you think that CoreData was shoved down your throat? Perhaps develop a more nuanced critique.

> In what way is "AI being shoved down your throat"?

Ask Microsoft, they have much more experience with that.

> Did you think that SwiftUI was shoved down your throat?

On a scale of 1 to 10, it has been shoved down our throats at level 1 or maybe 2. Thankfully it's optional.

> Did you think that CoreData was shoved down your throat?

No.

> Perhaps develop a more nuanced critique.

I believe most people who use Xcode know perfectly well what I'm talking about.


How are you forced to use it in Xcode? If you don't opt to use it then you don't see it.

I saw multiple comments on HN complaining about Firefox adding AI. I use FF every day and what happened is there was a single popup asking you if you want to opt in to try using it next to an icon you can hide. In the year since I said no to both I haven’t been bothered once.

People just like to complain


> In what way is "AI being shoved down your throat"?

This is a very strange question. It would be more correct to ask "In what way is AI NOT being shoved down your throat?"

> Did you think that SwiftUI was shoved down your throat?

Yes

> Did you think that CoreData was shoved down your throat?

No


Copilot being added to the Xbox app on iOS is the latest ridiculous example I've seen of AI being shoved down everyone's throat.

It really is getting ridiculous. Atlassian has this other totally useless AI called Rovo that invents events, meetings, and notes when it tries to summarize a tree of documents, and offers random useless "suggestions" for Jira docs...

Most people don't have the time to "self host". I could easily self host, but I don't because it's not worth my time.

Unless you are in the business of writing flight control software, OS kernels, or critical financial software, I don't think your own code will reach the standards you mention. The only way we get "correct under all conceivable scenarios" software is to have a large team with long time horizons and large funding working on a small piece of software. It is beyond an individual to reach that standard for anything beyond code at the function level.

Ample evidence of production software being produced with the aid of AI tools has been provided here on HN over the last year or more. This is a tiresome response. A later response says exactly what they produce.

Most of what I see are toys. Could you point us to examples of production software from AI? I feel like I see more "stop spamming us with AI slop" stories from open source than production software from AI. Would love some concrete examples. Specifically of major refactors or ground-up projects. Not "we just started using AI in our production software," because it can take a while to change the quality (better or worse) of a whole existing code base.

I imagine people who are shipping with AI aren’t talking about it. Doing so makes no business sense.

Those not shipping are talking about it.


So, “trust me bro”? When people find a good tool, they can’t stop talking about it. See all the tech conferences.

Absence of evidence, while not the only signal, is a huge fucking signal.


From what I've seen it's not even "trust me bro", but "we are having so much fun 'building', we don't have time for anything else".

"Us"??? Most of "us" don't need to be convinced that AI as a software development tool has merit. The comment literally two below my comment says that they develop banking software. At this point you can be confident that most of the software that you use that has had recent updates has been developed with the aid of AI. Its use is ubiquitous.

I didn't say AI as a software development tool doesn't have merit. I asked what production software was being produced from, or predominantly with, AI tools. I just see a lot more examples of "stop the slop" than I do of positive stories about AI being used to build something from scratch. I was hoping you had a concrete example in all of the hay. Are my expectations, based on the hype, too high?

That wasn't supposed to be an opportunity for you to get defensive, but an opportunity for you to show off awesome projects.


Sounds like you’ve got multiple ways to write off any example you’re given charged up and at the ready.

I was just asking for a non-confounded example of what was claimed. But okay.

Ok, the web portal/learning management site for the university I work at. I’m part of a small team of 5 devs but not a single one of us has developed without the use of AI tooling in two years.

I’d say it’s rarer to find a dev who doesn’t use AI tools in their arsenal these days, that’s why your question sounds so odd to me.


There will always be a niche for any form of expression. However technologies change practice. It is your responsibility to be able to solve problems that balance performance, cost, schedule, and quality. Use the right tool for the job.

One of the tools requires constant use to justify its existence, though.

A better formulation is "every feature is a liability". Taking it to the line of code level is too prescriptive. Occasionally writing more verbose code is preferable if it makes it easier to understand.

> A better formulation is "every feature is a liability". Taking it to the line of code level is too prescriptive.

Amount of code is a huge factor but maybe not the best wording here. It's really a question of complexity, where the amount of code is a contributing metric but not the only one. You can very easily have a feature implemented in a too-complex way and with too much code (esp. if an LLM generated the code, but also with human developers). Also, not every feature is equal.

> Occasionally writing more verbose code is preferable if it makes it easier to understand.

I think this is more a classic case of "when a metric becomes a target, it ceases to be a good metric" than the metric itself being bad.


This sounds wrong; features have to be the value of your code. The required maintenance and the slowdown in building more features (technical debt) are the liability, which is how I understood the relationship to "lines of code" anyway.

Wrong or not, the industry embraced it.

I can sort of understand it if I squint: every feature is a maintenance burden, and a risk of looking bad in front of users when you break or remove it, even for users who never used that feature. It's really a burden to be avoided when the point of your product is to grow its user base, not to actually be useful. Which explains why even Fisher-Price toys look more feature-ful and ergonomic than most new software products.


You need to explicitly tell it to debate you. Also, you cite an example of someone using ChatGPT to discuss personal issues. We are talking about technical discussions here.

You attribute more literary depth to Asimov than really existed. He was a chemist who liked to write speculative fiction. The three laws gave him a logical framework to push against in his stories. That's really all the depth there is to it. That said, I love Asimov and I love the robot stories.
