pcf's comments

The reason it's increasingly an "echo chamber" is because liberals are so offended by actual free speech that they stopped posting there. To blame conservatives for this development is illogical.


> They go looking for confirmation, rather than new information. This is why they're hard to untangle.

This applies to most readers of most things, not just fringe content on the Left or the Right.

Most people are stuck in their confirmation biases, and few make an intellectual effort to look at topics from multiple angles and via multiple media outlets on various sides of the political spectrum.


Do you think this bias is part of the replication crisis in science?


The UK is moving rapidly towards 1984, so it only seems fitting that Orwell's letters and various material should just be... memory-holed.

And at the very end of the day, no one will understand why "He loved Big Brother" was not a happy ending.


The publishers are probably embarrassed that they turned down the book because of their pro-Stalinist vision. Holodomor? What Holodomor?


How can you label the "alt-right" and anti-vaxxers "extremism", let alone compare them to terrorism?

Seems like you're the one living on a different planet.


It is: https://www.mactrast.com/2024/04/whatsapp-signal-threads-and...

Still possible to access with VPNs, though.


Not sure what you were referring to exactly, but GPT-4o gives me these two similar quotes:

"The more corrupt the state, the more numerous the laws." - Tacitus (Roman historian)

"Show me the man, and I’ll show you the crime." - Lavrentiy Beria (head of Stalin's secret police)


Ah yeah, the general sentiment was definitely "there are laws that everyone breaks every day, so they can always get you on something". Mostly because I remembered the anecdote about the USSR outlawing fax machines that no business could do without, so they could always charge any business with a crime.


Out of curiosity – why do you wear them in the shower? Are they meant to withstand that?


The first few times were by accident, but once I realized they are durable enough I started wearing them sometimes while showering if I've got a good audiobook or YT video that I don't want to put down.

I think they're supposed to withstand some degree of moisture, but I don't believe they're designed specifically to be submerged. However, one of mine (gen 2 airpods) got fully submerged for maybe 3 seconds in the bathtub but it managed to start working again when I let it dry out.

I'm not saying I recommend others treat their AirPods as if they're water resistant but, in my experience, all the generations of AirPods can take a bit of a water beating. The only ones I've never done this with are any of the Pro models.


In some brief testing, I found that the same models (Llama 3 8B and one more I can't remember) run MUCH slower in LM Studio than in Ollama on my MacBook Air M1 2020.

Has anyone found the same thing, or was that a fluke and I should try LM Studio again?
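If anyone wants to compare the two backends with the same prompt, here's a rough throughput harness (just a sketch: `tokens_per_second` and `fake_generate` are hypothetical names I made up, and the whitespace split is only a crude stand-in for a real tokenizer; swap `fake_generate` for a call into Ollama's or LM Studio's local HTTP API to measure each):

```python
import time

def tokens_per_second(generate, prompt):
    """Time one generate(prompt) call and return rough tokens/sec."""
    start = time.perf_counter()
    text = generate(prompt)
    elapsed = time.perf_counter() - start
    n_tokens = len(text.split())  # crude whitespace proxy for token count
    return n_tokens / elapsed

# Stand-in "model"; replace with a request to each backend's local server.
def fake_generate(prompt):
    time.sleep(0.1)           # pretend inference latency
    return "word " * 50       # pretend 50 generated tokens

rate = tokens_per_second(fake_generate, "hi, which model are you?")
print(f"{rate:.0f} tokens/sec")
```

Run it once against each backend with identical prompts and model files, otherwise the comparison isn't apples-to-apples.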


Just chiming in with others to help out:

By default LM Studio doesn't fully use your GPU. I have no idea why. Under the settings pane on the right, turn the slider under "GPU Offload" all the way to 100%.


That froze the whole computer and even stopped both the internal and external trackpads from registering clicks.

The model is Dolphin 2.9.1 Llama 3 8B Q4_0.

I set it to 100% and wrote this: "hi, which model are you?"

The reply was a slow output of these characters, a mouse cursor that barely moved, and I couldn't click on the trackpads: "G06-5(D&?=4>,.))G?7E-5)GAG+2;BEB,%F=#+="6;?";/H/01#2%4F1"!F#E<6C9+#"5E-<!CGE;>;E(74F=')FE2=HC7#B87!#/C?!?,?-%-09."92G+!>E';'GAF?08<F5<:&%<831578',%9>.='"0&=6225A?.8,#8<H?.'%?)-<0&+,+D+<?0>3/;HG%-=D,+G4.C8#FE<%=4))22'*"EG-0&68</"G%(2("

Help?


Maybe so the web browser etc. still has some GPU without swapping from main memory? What % does it default to?


Two replies to parent immediately suggest tuning. Ironically, this release claims to feature auto-config for best performance:

“Some of us are well versed in the nitty gritty of LLM load and inference parameters. But many of us, understandably, can't be bothered. LM Studio 0.3.0 auto-configures everything based on the hardware you are running it on.”

So parent should expect it to work.

I see the same issue: on an MBP with 96GB (M2 Max with a 38‑core GPU), it seems to tune by default for a base machine.


Make sure you enable the GPU using the slider. By default it doesn't run at full speed.


Yeah, me. Even without other applications running in the background and without any models loaded, the new 0.3 UI is stuttering and running like a couch-locked crusty after too many edibles on my MacBook Air 2021, 16GB. When I finally get even a 4B model loaded, inference is glacially slow. The previous versions worked just fine (they're still available for download).


Don’t forget to tune your num_batch
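If you're adjusting it through Ollama's HTTP API rather than a Modelfile, the batch size goes in the request's options object, alongside the other llama.cpp-derived runtime parameters Ollama accepts (the value 256 here is illustrative, not a recommendation; POST the body to http://localhost:11434/api/generate):

```json
{
  "model": "llama3",
  "prompt": "hi, which model are you?",
  "options": { "num_batch": 256 }
}
```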


On macOS it actually uses fewer resources, and you can use Chrome extensions.


On Mac? Just tell them to use Safari; why install the same browser twice?

