Hacker News | lovasoa's comments

The author of this project is also the author of Redis. He knows what he is doing.

Running inference for a model, even when you have all the weights, is not trivial.


I use Linux at home (with a HiDPI screen) and macOS for work. The screen works well with both computers. I mostly just use a text editor, a browser, and a terminal, though.

Linux has bugs, but macOS does too. I feel like for a dev like me, the Linux setup is more comfortable.


Same here. I stick to 100% scaling and sidestep the whole HiDPI issue. I even have a single USB-C cable that connects my laptop to the laptop stand, and that laptop stand is what connects to the monitor, keyboard, and mouse.

I know people will say meh, but coming from a world of hurt with drivers and Windows-based soft modems (I was on dial-up even as late as 2005!), I think the idea that everything works plug and play is amazing.

Compare that with my experience on Windows: maybe I did something wrong, I don't know, but the external monitor didn't work over HDMI when I installed Windows without a network connection. Maybe it was a coincidence, but it didn't work until I connected to the Internet.


What do you call the opposite of greenwashing? When you want to show that you are burning as much energy on training models as everyone else.


Because we can


I use a fork of sqlx in SQLPage [1]. My main complaint about it is runtime errors (or worse, values silently decoded as garbage) when decoding SQL values to the wrong Rust type.

* [1] https://sql-page.com/


A consortium of investigative journalists investigated and attributed the attack to Ukraine's secret service: https://en.wikipedia.org/wiki/Nord_Stream_pipelines_sabotage


No. The LLM's answer is correct.



> No it is not…

That’s a queue, not a stack. The LLM response was correct.


But a stack is commonly LIFO, not FIFO?!
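For reference, the LIFO/FIFO distinction the thread is arguing about, sketched in Rust: a `Vec` used as a stack versus a `VecDeque` used as a queue.

```rust
use std::collections::VecDeque;

fn main() {
    // A stack is LIFO: the last element pushed is the first one popped.
    let mut stack: Vec<i32> = Vec::new();
    stack.push(1);
    stack.push(2);
    stack.push(3);
    assert_eq!(stack.pop(), Some(3)); // most recently pushed comes out first

    // A queue is FIFO: the first element enqueued is the first one dequeued.
    let mut queue: VecDeque<i32> = VecDeque::new();
    queue.push_back(1);
    queue.push_back(2);
    queue.push_back(3);
    assert_eq!(queue.pop_front(), Some(1)); // oldest element comes out first
}
```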


This reads like a line from a QwQ or Qwen3 CoT chain :)


In my opinion, the clean way to implement this is with methods instead of attributes for name, desired_speed, etc.
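A minimal sketch of that idea in Rust; the `Agent` trait, `Car` type, and field names here are hypothetical illustrations, not from the project under discussion. Exposing `name` and `desired_speed` as methods rather than plain attributes lets each implementation derive them from other state instead of storing them.

```rust
// Hypothetical trait: accessors are methods, not public fields.
trait Agent {
    fn name(&self) -> String;
    fn desired_speed(&self) -> f64;
}

struct Car {
    model: String,
    speed_limit: f64,
}

impl Agent for Car {
    fn name(&self) -> String {
        // Computed on demand from other fields.
        format!("car:{}", self.model)
    }
    fn desired_speed(&self) -> f64 {
        // Derived rather than stored as a separate attribute.
        self.speed_limit * 0.9
    }
}

fn main() {
    let car = Car { model: "sedan".into(), speed_limit: 100.0 };
    println!("{} wants {} km/h", car.name(), car.desired_speed());
}
```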


In the second handpicked example they give, GPT-4.5 says that "The Trojan Women Setting Fire to Their Fleet" by the French painter Claude Lorrain is renowned for its luminous depiction of fire. That is a hallucination.

There is no fire at all in the painting, only some smoke.

https://en.wikipedia.org/wiki/The_Trojan_Women_Set_Fire_to_t...


AI crash is gonna lead to decade long winter


There have always been cycles of hype and correction.

I don't see AI going any differently. Some companies will figure out where and how models should be used, and they'll see some benefit. (IMO, the answer will be smaller local models tailored to specific domains.)

Others will go bust. Same as it always was.


It will be held up, against all future pundits of this economic system, as a prime example of how a whole market can self-hypnotize and ruin out of existence the very society it is based upon.


What you're saying is they love to hallucinate... and AI will help them get there.

God help us all


On the bright side, at least we'll be able to warm our hands by the waste heat of the GPUs.


> AI crash is gonna lead to decade long winter

Possibly.

I am reminded of the dotcom boom and bust back in the 1990s.

By 2009 things had recovered (for some definition of "recovered") and we could tell what did and did not work.

This time, though, for those of us not in the USA, the rebound will be led by Chinese technology.

In the USA, no one can say.


This is just amazing

