I use Linux at home (with a HiDPI screen) and macOS for work. The screen works well with both computers. I mostly just use a text editor, a browser, and a terminal, though.
Linux has bugs, but macOS does too. For a dev like me, the Linux setup feels more comfortable.
Same here. I stick to 100% scaling and sidestep the whole HiDPI issue. A single USB-C cable connects my laptop to the laptop stand, and the stand connects to the monitor, keyboard, and mouse.
I know people will say meh, but coming from a world of hurt with drivers and Windows-based soft modems (I was still on dial-up as late as 2005!), I think it's amazing that everything just works plug and play.
Compare that with my experience on Windows: maybe I did something wrong, I don't know, but the external monitor didn't work over HDMI when I installed Windows without a network connection. It may have been a coincidence, but it didn't work until I connected to the Internet.
I use a fork of sqlx in SQLPage [1]. My main complaint about it is runtime errors (or worse, values decoded as garbage) when decoding SQL values to the wrong Rust type.
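To make the failure mode concrete, here is a minimal sketch (not SQLPage's actual code; it assumes sqlx with the sqlite feature plus tokio, and the query and column name are made up). Asking for the wrong Rust type compiles fine and only surfaces as a problem when the row is read:

```rust
use sqlx::{sqlite::SqlitePoolOptions, Row};

#[tokio::main]
async fn main() -> Result<(), sqlx::Error> {
    // In-memory SQLite database, just for illustration.
    let pool = SqlitePoolOptions::new()
        .connect("sqlite::memory:")
        .await?;

    // The column is TEXT, but nothing stops us from asking for an i64 below.
    let row = sqlx::query("SELECT 'not a number' AS price")
        .fetch_one(&pool)
        .await?;

    // This compiles fine; the type mismatch only shows up at runtime.
    // Depending on the driver, it is a ColumnDecode error or a silently coerced value.
    let price: Result<i64, sqlx::Error> = row.try_get("price");
    println!("{price:?}");

    Ok(())
}
```

The compile-time checked `query!` macros avoid some of this, but they need the SQL to be known at build time, which doesn't help when the queries only arrive at runtime.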
In the second handpicked example they give, GPT-4.5 says that "The Trojan Women Setting Fire to Their Fleet" by the French painter Claude Lorrain is renowned for its luminous depiction of fire. That is a hallucination.
There is no fire at all in the painting, only some smoke.
There have always been cycles of hype and correction.
I don't see AI going any differently. Some companies will figure out where and how models should be used, and they'll see some benefit. (IMO, the answer will be smaller local models tailored to specific domains.)
It will be held up as a prime example that a whole market can self-hypnotize and drive the society it is built upon out of existence, to be cited against all future pundits of this very economic system.
Running inference for a model, even when you have all the weights, is not trivial.
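A back-of-envelope sketch of one reason (the 70B parameter count and byte sizes below are illustrative assumptions, not tied to any specific model): the raw weights alone often exceed a single GPU's memory, before you even get to the KV cache or sharding across devices.

```rust
// Rough estimate of how much memory the weights alone need.
fn weight_memory_gb(params: f64, bytes_per_param: f64) -> f64 {
    params * bytes_per_param / 1e9
}

fn main() {
    let params = 70e9; // assumed 70B-parameter model
    println!("fp16 weights: ~{:.0} GB", weight_memory_gb(params, 2.0)); // ~140 GB
    println!("int4 weights: ~{:.0} GB", weight_memory_gb(params, 0.5)); // ~35 GB
    // On top of this come the KV cache (which grows with context length and
    // batch size), activation memory, and the work of splitting the model
    // across devices if it doesn't fit on one.
}
```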