The ability to "want" to reproduce is not necessary to worry about the impacts of replication and evolution. Biological viruses can only respond to external cues, never do anything by themselves, and certainly don't harbour "wants" or other emotions in any meaningful sense, but their replication and evolution have massive effects on the world.



That's an extremely poor analogy.

Computer hardware does not spontaneously multiply based on external factors. If you're talking about the software propagating by itself, it would still need full access not just to the originating machine, but to the remote machine to which it is attempting to propagate.


Viruses passively hijack the mechanisms of their much more sophisticated host organisms, getting them to import the virus and actively read and act upon its genetic code. Is it really such a stretch to imagine a sufficiently convincing software artifact similarly convincing its more complex hosts to take actions which support the replication of the artifact? I genuinely don't see where the analogy breaks down.


You're misunderstanding the difference between current LLMs and viruses, and the complexity gap between them. Viruses are incredibly old things, shaped by evolution to self-propagate. That drive is coded into their genetic structure in ways that lie completely outside the scope of what anyone can hope to do with current AI. An LLM has no parameters or self-organizing internal mechanisms for behaving in any meaningful way like a virus.


