Hacker News | opponent4's comments

> You'd have to be specific what you mean by AGI

Well, they obviously can't. AGI is not science, it's religion. It has all the trappings of religion: prophets, sacred texts, an origin myth, an end-of-days myth and, most importantly, a means to escape death. Science? Well, the only measure of "general intelligence" would be comparison to the only one we know of, the human one, and we have absolutely no means by which to describe it. We do not know where to start. This is why, when you scrape the surface of any AGI definition, you only find circular definitions.

And no, the "brain is a computer" is not a scientific description, it's a metaphor.


> And no, the "brain is a computer" is not a scientific description, it's a metaphor.

Disagree. A brain is Turing complete, no? Isn't that the definition of a computer? Sure, it may be reductive to say "the brain is just a computer".


Not even close. Turing completeness does not apply to the brain, plain and simple. That concerns algorithms, and your brain is not a computer, as I have mentioned. It does not store information. It does not process information. It just doesn't work that way.

https://aeon.co/essays/your-brain-does-not-process-informati...


> Forgive me for this introduction to computing, but I need to be clear: computers really do operate on symbolic representations of the world. They really store and retrieve. They really process. They really have physical memories. They really are guided in everything they do, without exception, by algorithms.

This article seems really hung up on the distinction between digital and analog. It's an important distinction, but the article glosses over the fact that digital computers are a subset of analog computers. Electrical signals are inherently analog.

This maps somewhat neatly to human cognition. I can take a stream of bits, perform math on it, and output a transformed stream of bits. That is a digital operation. The underlying biological processes involved are a pile of complex probabilistic+analog signaling, true. But in a computer, the underlying processes are also probabilistic and analog. We have designed our electronics to shove those parts down to the lowest possible level so they can be abstracted away, and so the degree to which they influence computation is certainly lower than in the human brain. But I think an effective argument that brains are not computers is going to have to dive into why that gap matters.


It is pretty clear the author of that article has no idea what he's talking about.

You should look into the physical Church-Turing thesis. If it's false (all known, tested physics suggests it's true), then we're probably living in a dualist universe. This means something outside of material reality (souls? hypercomputation via quantum gravity? weird physics? magic?) somehow influences our cognition.

> Turing complete does not apply to the brain

As far as we know, any physically realizable process can be simulated by a Turing machine. And FYI, brains do not exist outside of physical reality… as far as we know. If you take issue with this formulation, go ahead and disprove the physical Church-Turing thesis.


That is an article by a psychologist, with no expertise in neuroscience, claiming without evidence that the "dominant cognitive neuroscience" is wrong. He offers no alternative explanation of how memories are stored and retrieved, but argues that large numbers of neurons across the brain are involved, implying that neuroscientists think otherwise.

This is odd because the dominant view in neuroscience is that memories are stored by altering synaptic connection strength in a large number of neurons. So it's not clear what his disagreement is, and he just seems to be misrepresenting neuroscientists.

Interestingly, this is also how LLMs store memory during training: by altering the strength of connections between many artificial neurons.
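That principle can be sketched in a few lines. This is a toy single-weight example I'm adding for illustration (not from the article or an actual LLM): training changes a connection strength by gradient descent, and afterwards the learned mapping persists in the weight itself rather than in any separate memory store.

```python
# Toy illustration: "memory" stored as a changed connection strength.
# One weight is trained by gradient descent to map input 2.0 to
# target 6.0 under squared-error loss.

w = 0.5                      # connection strength before training
x, target, lr = 2.0, 6.0, 0.1

for _ in range(100):
    pred = w * x
    grad = 2 * (pred - target) * x   # d/dw of (w*x - target)**2
    w -= lr * grad

print(round(w, 4))  # converges toward target / x = 3.0
```

The update rule here is w ← 0.2·w + 2.4, whose fixed point is exactly 3.0, so the weight settles there; the "memory" of the training example is just that number.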


I've gotta say, this article was not convincing at all.


A human is effectively Turing complete if you give the person paper, a pen, and the ruleset, and a brain clearly stores information and processes it to some extent, so this is pretty unconvincing. The article is nonsense and badly written.
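The "ruleset" is just a transition table. Here's a minimal sketch (my own hypothetical machine, for illustration) of stepping a Turing machine exactly the way a person with pen and paper would: read the symbol under the head, look up the rule, write, move, change state.

```python
# Minimal Turing machine simulator. The example machine appends a 1
# to a unary number: scan right over 1s, write a 1 on the first
# blank, then halt.

def run(rules, tape, state="start", head=0, max_steps=1000):
    tape = dict(enumerate(tape))          # sparse tape, blank = "_"
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip("_")

rules = {
    ("start", "1"): ("1", "R", "start"),  # skip over existing 1s
    ("start", "_"): ("1", "R", "halt"),   # append a 1, then halt
}

print(run(rules, "111"))  # → 1111
```

Nothing in the loop is beyond a patient human with scratch paper, which is the point: the hard part of computation is the ruleset, not the substrate executing it (modulo the finite-tape caveat raised below).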

> But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.

Really? Humans don't ever develop memories? Humans don't gain information?


Probably not actually Turing complete, right? For one, its memory is not infinite.


> And no, the "brain is a computer" is not a scientific description, it's a metaphor.

I have trouble comprehending this. What is "computer" to you?


Cargo cults are a religion, the things they worship they do not understand, but the planes and the cargo themselves are real.

There's certainly plenty of cargo-culting right now on AI.

Sacred texts, I don't recognise. Yudkowsky's writings? He suggests wearing clown shoes to avoid getting a cult of personality disconnected from the quality of the arguments, if anyone finds his works sacred, they've fundamentally misunderstood him:

  I have sometimes thought that all professional lectures on rationality should be delivered while wearing a clown suit, to prevent the audience from confusing seriousness with solemnity.
- https://en.wikiquote.org/wiki/Eliezer_Yudkowsky

Prophets forecasting the end-of-days, yes, but this too from climate science, from everyone who was preparing for a pandemic before covid and is still trying to prepare for the next one because the wet markets are still around, from economists trying to forecast growth or collapse and what will change any given prediction of the latter into the former, and from the military forces of the world saying which weapon systems they want to buy. It does not make a religion.

A means to escape death, you can have. But it's on a continuum with life extension and anti-aging medicine, which itself is on a continuum with all other medical interventions. To quote myself:

  Taking a living human's heart out without killing them, and replacing it with one you got out a corpse, that isn't the magic of necromancy, neither is it a prayer or ritual to Sekhmet, it's just transplant surgery.

  …

  Immunity to smallpox isn't a prayer to the Hindu goddess Shitala (of many things but most directly linked with smallpox), and it isn't magic herbs or crystals, it's just vaccines.
- https://benwheatley.github.io/blog/2025/06/22-13.21.36.html


This was, still is, and for the foreseeable future will be bad advice. Stay on Windows 10 as long as you can; with LTSC IoT, that's 2032. We will figure something out then.

It doesn't work. Right now the main issue is Wayland vs. X, where Wayland is not working and will never work because its underlying ideas and goals do not align with those of a desktop. Someone described X as ALSA and Wayland as PulseAudio, and we are waiting for the PipeWire to arrive. Maybe Phoenix will swoop in to save the day, maybe something else will.

Also, hardware and software issues will always be there because the incentives are not there.

I swear, Linux-on-the-desktop adherents sound like they have some sort of Stockholm Syndrome, though of course plain cognitive dissonance explains it just as well.


Is the IoT version of Windows 10 a full desktop? I tried it on my Raspberry Pi when it was released for free but I didn't get past booting. I dropped it because I didn't understand what it was, or what I was supposed to do with it and I wasn't curious enough to find out. This is your suggestion for my grandmother over some simple Linux distro with a browser?


> Wayland is not working and will never work because the underlying ideas and goals do not align with that of a desktop.

Can you elaborate on this?

I don't use Wayland because it lacks something I need (unprivileged Scroll Lock LED control) but I'm curious about what else keeps people from using it.


This was true for a majority of users even fairly recently, but the 'niche' for which Linux is a better option has grown such that those who are worse off with Linux (better off with Windows) are becoming the 'niche'.


>Wayland is not working

What's your source on that? I have been using it for years. Skill issue?


I'm still on X because I'm too comfortable with dwm, and I've never had problems with Wayland the few times I've tried KDE/Gnome. That said, blaming the user for lacking skill at basic stuff like copy/paste between applications or screen sharing isn't constructive. Those features should just work and shouldn't require skill. Building a niche window manager from source is a different discussion.


Alright, good point. Sorry for my bad first comment, I am just a bit defensive after all these years of people saying Wayland is unusable. The problems you describe with screen sharing are likely because of a missing xdg-portal implementation (which implementation you need depends on your setup).

>Those features should just work and shouldn't require skill.

Maybe, yeah. But we are on Hacker News; I assumed people would be open to hacking on things, but the comments are always very much not like that.


> That being said, there are some serious costs and reasonable reservations to AI development.

Neither this nor the discussion here so far mentions ethics. It should.

According to recent reports, AI now consumes more water than the global bottled-water industry. These datacenters strain our grids, and where needs can't be met they employ some of the least efficient ways to generate electricity, producing tons of pollution. The pollution and water problems hit poorer communities hardest, as the more affluent ones can afford much better legal pushback.

Next, alas, we can't avoid politics. The shadow that Peter Thiel and a16z (who named one of the two authors of the Fascist Manifesto one of their patron saints) cast over these tools is very long. These LLMs are used as a grand excuse to fire a lot of people, and to manufacture fascist propaganda on a scale never seen before. Whether or not these were the goals when Thiel & gang financed them, it is undeniable they are now indispensable to the rise of fascism in the United States. Even if you say "but I am using code-only LLMs", you are still stuffing the pockets of these oligarchs.

The harm these systems cause is vast and varied. We have seen them furthering suicidal ideation in children and instructing them on executing these thoughts. We have seen them generating non-consensual deepfakes at scale including those of children.

