Hacker News: prmph's comments

Careful there. I've resolved (and succeeded somewhat) to tone down my swearing at the LLMs, because, even though they are not sentient, developing such a habit, I suspect, has a way of bleeding into your actual speech in the real world.

It does. But then, it's how I talk to myself. More generally, it's how I talk to the people I trust the most. I swear, curse, and insult; it seems to shock people if they see me do it (to the LLM). If I ask Claude or ChatGPT to summarize the tone and demeanor of my interactions, however, it replies "playful", which is how I'm actually using the "insults".

Politeness requires a level of cultural intuition to translate into effective action at best, and is passive aggressive at worst. I insult my llm, and myself, constantly while coding. It's direct, and fun. When the llm insults me back it is even more fun.

With my colleagues I (try to) go back to being polite and die a little inside. It's more fun to be myself. Maybe that's also why I enjoy AI coding more than some of my peers seem to.

More likely I'm just getting old.


To be honest “no dummy” is how you would swear at a 4-year-old.

I often use things like: “I’ve told you no a billion times, you useless piece of shit”, or “what goes through your stupid ass brain, you headless moron”.

I am in full Westworld mode.

But when that thing gets me fired for being way faster at coding than I am, at least I’d have that much less frustration. Maybe?

mostly kidding here


They all are. And once the context has rotted or been poisoned enough, it is unsalvageable.

Claude is now actually one of the better ones at instruction following I daresay.


In my tests it's worst with adding extra formatting or output: https://aibenchy.com/compare/anthropic-claude-opus-4-6-mediu...

For example, sometimes it outputs in markdown without being asked to (e.g. "**13**" instead of "13"), even when asked to respond with a number only.

This might be fine in a chat environment, but not in a workflow, agentic use case, or tool usage.

Yes, it can be enforced via structured output, but within a string field of a structured output you might still want to enforce a specific natural-language response format, which can't be defined by a schema.
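For workflows like that, one rough workaround is to validate on the client side. A minimal sketch (the function name and the exact cleanup rules are my own illustration, not from any particular SDK):

```python
# Hypothetical sketch: post-validate a model reply that was asked for
# "a number only". Strips stray markdown emphasis (so "**13**" still
# passes) and rejects anything that isn't a bare integer afterwards.
import re

def parse_number_reply(reply: str) -> int:
    # Remove common markdown decoration: *, _, and backticks.
    cleaned = re.sub(r"[*_`]", "", reply).strip()
    if not re.fullmatch(r"-?\d+", cleaned):
        raise ValueError(f"not a bare number: {reply!r}")
    return int(cleaned)
```

On failure you can retry the request or fall back, rather than letting "**13**" leak into downstream tooling.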


No, you can't do real work on a $350 windows machine. No way such a setup is suitable for anything beyond browsing a tab or two and connecting to servers using SSH.

And the whole shittiness of the experience will distract you even from attempting real work: the horrible touchpad, the bad screen, the forced Windows updates when you're trying to start the machine to do something urgent, ads in Windows, the lack of proper programmability of Windows (unless you use WSL)... Add the fact that the toy is likely to break in a year or two. These issues exist on far more expensive Windows machines; how much more a $350 machine.

Leaving Windows machines and the OS behind more than a decade ago has been a continuing breath of fresh air. I have several issues with Apple devices and macOS (as I have with Linux too), but on the whole they are far better than Windows. The only things about Windows that I miss on Macs are the file explorer and window management; not sure why Apple stubbornly refuses to copy those.


A lot of $350-ish Windows machines also don’t have SSDs but instead eMMC storage, which is dog slow and will make modern SSD-mandatory Windows feel even more awful to use.

If Windows/Linux/x86 is non-negotiable and that’s your budget, I would never in a million years recommend anything brand new. This is when you go pick up a $350 used midrange ThinkPad on eBay. It won’t outperform a Neo in terms of CPU and battery life but I guarantee it’ll be a better experience than the garbage routinely sold at this price point.


Of course you can. You can do real work on an $80 Amazon Fire. Yes, some things will be potentially impossible or frustrating but that's also true of the MacBook Neo, just a bit higher of a bar. A lot of this also depends on your definition of "real work".

$350 USD can get you a decent laptop with an SSD, 16GB RAM, and something like an Intel N100 or N95. And they're pretty comparable to a decent Intel Skylake CPU, which is still pretty usable.

https://www.amazon.com/NIAKUN-Computer-Processor-Keyboard-Fi...

https://www.amazon.com/AOC-Computer-Processor-Laptops-Window...

Yes, the Neo has a faster CPU but it also has less RAM and less storage and costs more and has less ports. Besides ray traced games what can the Neo do that the others can't? They'll take longer but they'll get there.

And if you're willing to go used? That $350 goes a lot further.


> Yes, the Neo has a faster CPU but it also has less RAM and less storage and costs more and has less ports.

8GB on Apple Silicon is far better than 16GB on Wintel, and I don't even trust the quality of 16GB of RAM on a bottom-of-the-barrel Windows machine.

Would you prefer a machine that is still good 7 years from now with less ports, or one with more ports that you have to replace in 2 years? Yes it is more expensive now, but over 7 years it is an absolute bargain.


16 GB physical RAM is just better. Apple isn't magic. Gimme a break. Both devices have SSDs for fast swapping and have RAM compression. You can't spin up a VM that has 8GB RAM on the Neo, you can't load a large spreadsheet or do a decently sized digital painting. I could maybe buy a claim that 8GB is better on Mac than 8GB on Windows.

Why would you have to replace it in 2 years? How do we know Apple will even be offering updates to Neo in 7 years? Will 8GB still be usable in 7 years really? 8GB is barely on the fence already.

I wouldn't be surprised if Apple drops the Neo from software support in less than 7 years.


The ThinkBook 14 Gen 6 at Costco for $380 has a single thread passmark score of 2800. The laptop I use to develop most of my SaaS products, with IDEs and claude open etc, has a score of 2000. I run Linux, but win10 iot runs fine on it too.

> No, you can't do real work on a $350 windows machine.

Sigh. I mean, even absent the obvious answers[1], that's just wrong anyway. You're being a snob. Want to run WSL? Run WSL. Want to run vscode natively? Ditto. Put it on a cheap TV and run your graphical layout and 3D modelling work. I mean, obviously it does all that stuff. OBVIOUSLY, because that stuff is all cheap and easy.

All the complaining you're doing is about preference, not capability. You're being a snob. Which is hardly weird, we're all snobs about something.

But snobs aren't going to buy the Neo either. Again, the business question here is whether the $350 junk users can be convinced to be snobs for $600.

[1] "Put Linux on it", "All of your stuff is in the cloud anyway", "It's still a thousand times faster than the machine on which I did my best work", etc...


You mean that machine from 30 years ago that was running 30 year old software that has nothing in common with today’s development? And how well does Linux run on 4GB?

So weird to see this kind of flaming more than a decade after it got stale and silly. I mean, yeah, kinda: a 64MB K6-300 was pretty great!

But as to the 4G quip, that's showing some ignorance of where the market is. The value segment is filled with devices like this: https://www.amazon.com/HP-Stream-BrightView-N4120-Graphics/d...

That's a 16G windows box which will happily run multiple VMs for whatever your deployment environment is, something the Neo is actually going to struggle with. The Jasper Lake CPU is indeed awfully slow, but again for routine "dev" tasks that's just not a limit.

You would obviously refuse out of taste, but if you were actually forced to use this machine to do your job... you absolutely could.


But this has no real SSD. Back to external SSD like on Apple devices?

Are they going to extend my subscription time as a result? It ends today, but I was locked out an hour or so ago, and I'm not sure if that was actually due to this outage.

All the vibe coding is clearly not working out too well.


What about using the file name itself as the metadata storage?

I have used this approach with exiftool to add custom tags for “album”.

https://stackoverflow.com/a/68284922

Here is my source file for it. It was so long ago I don’t recall all the details but you can retrieve this information using exif commands.

https://github.com/jmathai/elodie/blob/master/configs/ExifTo...
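The filename scheme above can be sketched roughly like this (the `[key=value]` convention and the helper names are my own illustration, not exiftool's):

```python
# Hypothetical sketch: pack simple key=value tags into a file name and
# parse them back out. Tags are appended as [key=value] blocks before
# the extension, e.g. "beach[album=Holiday 2021].jpg".
from pathlib import Path

def encode_tags(stem: str, tags: dict, suffix: str = ".jpg") -> str:
    """Append tags as [key=value] blocks before the extension."""
    packed = "".join(f"[{k}={v}]" for k, v in sorted(tags.items()))
    return f"{stem}{packed}{suffix}"

def decode_tags(name: str):
    """Split a file name back into its stem and tag dict."""
    stem = Path(name).stem
    tags = {}
    # Peel [key=value] blocks off the end, right to left.
    while stem.endswith("]") and "[" in stem:
        stem, _, block = stem.rpartition("[")
        key, _, value = block.rstrip("]").partition("=")
        tags[key] = value
    return stem, tags
```

This inherits the problems mentioned below: filesystem length limits, forbidden characters, and ugly names, but the metadata survives any copy or sync that preserves file names.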


That works, but it has strict length limits and is visibly ugly. Fine for very limited cases, I guess.

Indeed, that's why it bothers me that major technology vendors, especially Apple, would like to do away with the very concept of files, at least for non-power users.

They make it seem like data is bound up with apps and should not have an independent existence. They also make it hard to import/export data, except via backups, which is a very crude way of taking control of your personal data.

I'm working on a tool to work around these issues to the extent possible, allowing users to extract their information, as granular files, from device backups into their personal digital library. For immutable data, archival is ok, but for editable data, the biggest challenge is how to make the extracted data "live", i.e., available and editable on the devices again, preferably in the same apps used to create them. There seem to be no good solutions.


Apple keep their casual users on a short leash, but for those willing to tinker, iOS device backups are actually quite good for that archive use case, as you can extract most of your important data in the form of SQLite database files. The whole process is high friction and very user-unfriendly, and open documentation of these databases is lacking, but the fact that this route still exists at all is encouraging (in the sense that all hope is not yet lost). They could very easily have hidden all these files behind an Apple-managed encryption key. On the other hand, this niche affordance may serve to placate the power users who would be most likely to cause noise and revolt if our personal needs were not met. On the gripping hand, the device backup mechanism via iTunes feels so ancient that they probably just haven’t thought about it.
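Once a database file has been pulled out of a backup, even the standard library is enough to start poking at it. A minimal sketch (the helper name is my own, and real backup layouts and file names vary):

```python
# Hypothetical sketch: inspect a SQLite database extracted from a
# device backup by listing its tables from sqlite_master.
import sqlite3

def list_tables(db_path: str) -> list:
    """Return the names of all tables in a SQLite database file."""
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
        ).fetchall()
    finally:
        conn.close()
    return [row[0] for row in rows]
```

From there it's ordinary SQL against whatever schema the app used, which is where the missing open documentation really bites.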

> The whole process is high friction and very user-unfriendly, and open documentation of these databases is lacking

Yeah, that's what I want to abstract away. There are tools that do this to some extent, but they stop at backup extraction, without helping you to integrate the extracted data into a proper personal digital library with easy retrieval.


I couldn't disagree more.

I have found KDE excellent and intuitive from the get go without much customization. To me GNOME is very primitive in comparison and ugly too.

KDE is the DE that made me shed the bias against Linux UIs as having that crummy look that sets them apart from commercial desktops.

Sure, it has issues (which mostly crop up when you are doing deep customization), but for the basics I don't think any other Linux DE even comes close.


I won’t disagree with you, KDE is certainly usable from a clean install. But calling GNOME primitive in comparison feels off to me. It was actually KDE’s applets and overall fit and finish that pushed me toward GNOME.

Never expected someone to call GNOME straight up ugly. IMO it's currently the most stylish DE out there by far (comparing to the default look of other DEs). Opinions, huh.

GNOME is stylish, but what really sells it for me is that the apps match the aesthetic and feel cohesive.

Brainer d(aemon). Totally cool.

I have always seen it as Brain Nerd

Technically how will vibe code be identified? And how does one determine the level of human involvement that would make code copyrightable? What of the prompts? Are those copyrightable? What about the architectural and tactical design of the code if I do those myself?

I don't vibe code; I am firmly in charge of the architecture and code style of my projects, and I frequently give detailed instructions to the AI tools I use. But, to me, this is leading to a weird place. Why would the result of using a tool to create something new not be copyrightable simply due to the specific tool used?

I think this whole hullabaloo is self-inflicted. Code, or any other creative work, should stand on its merits. There is no issue with copyright and no issue with the ship of Theseus. The current copyright approach is still applicable: code (or any other creative work) that appears to be lifted verbatim from another work could be a copyright violation. Work that is sufficiently original (irrespective of how it was created) is likely not a copyright violation.


It's the courts' opinions that count. And they say that copyright only attaches to human creative work, and that does not include LLM output.

I can see there's going to be some huge court fights over this in the next ten years - there's no way some of the big media companies are going to be OK with their content being public domain, and no way are they going to just miss out on being able to produce it so cheaply with an LLM.


Indeed, 8GB is plenty, even for serious work and coding, if you use the machine well.

If you think getting more and more RAM solves every performance problem, I've got news for you: People are having beachballs on machines with 32GB and more.


I agree generally that on a Mac you can 'get by' with 8GB, and for the target audience of this device, and how they'll likely use it, it's totally acceptable.

But if it's for serious work, this is not the device. 'Managing' the software to 'use the machine well' to get serious work done is unacceptable in 2026. It needs to just work and disappear into the background. I have enough to think about, and micromanaging the running software is out of the question.


> 'Managing' the software to 'use the machine well' to get serious work done is unacceptable in 2026

I agree, I just don't think the rush to get more and more RAM and storage is the root of the problem.

Why on earth does a browser need more than 10 GB to display web pages?? Why does macOS keep piling/hiding trash that should be deleted in "System Data"?

And, if you need to keep device backups, put them on an external drive; that's what those things are for.


Web pages are very complicated and there's no pressure on people to make less complicated ones, nor is there any way there could be pressure on them.

Images, complicated CSS, JavaScript ads, they can all use lots of memory!


It depends on how you define "serious work". Is it to get the best results possible, or is it to tax a computer as much as possible? Programmers would usually answer the latter, while users would answer the former.

That's why programmers put their stuff into Kubernetes which go into virtual machines, which go into eleven layers of javascript abstraction which go into twelve thousand node packages, which go into something else to end up with something with very basic functionality, which usually doesn't work very well.

Other pro computer users are focused on the results, so they use professional office software, calendars, communications, photo and video editing and effects, photo-realistic 3D editors, and studio-level audio and music editing software. All of which lives perfectly fine in 8GB of RAM.


As always - it depends on the kind of ostensible "serious work" you do.

I've got 32GB and often work with legacy .NET Winform/WPF applications on a Macbook. That means spinning up a Windows 11 ARM distro virtual machine and running Microsoft Visual Studio. The VM has 8GB of ram allocated to it, and based on qemu-system memory pressure, it hovers around ~4-6GB of that.

I also do a lot of colorgrading and video editing with longform 4K videos using Davinci Resolve - scrubbing in an uncompressed format would absolutely thrash the hell out of your swap with only 8GB.


As much as I'd like to be more efficient, modern toolchains absolutely need these kinds of numbers for big projects. My 48GB system will OOM trying to link clang unless I'm extremely careful. The 64GB system is a bit more forgiving, but I still have to go for lunch while it's working.

Sure, might be ambitious to do that sort of workload on a budget conscious laptop, but it'd be nice y'know?


If you're trying to link clang, this laptop is not for you. It's for people that would consider a chromebook for their use case.

Usually the problem then is more fundamental.

Rust exists. If you insist on using (or need to use) languages with horrendous build architectures like C++, then you probably need a proper build server then anyways.

I don't have Xcode on my MacBook and have resolved not to do iOS development any time soon (although ideally I'd have wanted to dabble in it sometimes), because I've accepted I don't want to run the rat race of always needing beefier and beefier machines to keep up with Apple's bad habit of bloating it up with each version for no good reason.

I don't run local LLMs on my machine, since even with 100s of GB of RAM, I hear the performance you can expect is abysmal.

I think it is a good idea to put pressure on hardware and software vendors to make their products more efficient.


Rust has similar issues with memory usage during linking as C++.

I can use a build server when I want one, but that's not always appropriate. Local builds are useful.


>People are having beachballs on machines with 32GB and more.

Well, sure, because the beachball means the main thread is hung, and that can happen for many reasons unrelated to memory pressure.

