> with a non-reversible function f into f(x) then you are losing information.

A non-reversible function f does not necessarily lose information. Some non-reversible functions, like the one-way functions used in cryptography, can be injective or even bijective: they are computationally infeasible to invert, which makes them practically irreversible while retaining all information in the mathematical sense. Non-injective functions, by contrast, lose information both mathematically and computationally: distinct inputs map to the same output, so no amount of computation can recover the input. It's important to distinguish these two cases to avoid conflating computational irreversibility with mathematical loss of information.
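
A minimal sketch of the distinction, in Python; modular exponentiation stands in for a one-way function, with toy parameters that are illustrative rather than secure:

  # Case 1: injective yet hard to invert. f(x) = g^x mod p is a bijection
  # on {1, ..., p-1} when g is a generator, so no information is lost,
  # but inverting it is the discrete logarithm problem (for large p;
  # real systems use ~2048-bit primes, not p = 101).
  p, g = 101, 2
  f = lambda x: pow(g, x, p)

  # Case 2: non-injective, so information is lost mathematically.
  # Given h(x) = 3, x could have been 3, 13, 23, ...; nothing can recover it.
  h = lambda x: x % 10

  assert len({f(x) for x in range(1, p)}) == p - 1  # all outputs distinct
  assert h(3) == h(13) == h(23)                     # collisions: input lost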

On the arguments modeling inference as simply some function f: the specific expression OP used overlooks that each subsequent application would follow some backpropagation, implying a new f' at each application, which renders the claim invalid.
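
A minimal sketch of that point, assuming the weights really are updated between applications as described; the one-parameter model and squared-error loss here are hypothetical:

  w = 1.0                       # model parameter
  f = lambda x, w: w * x        # the "inference function" at current weights

  x, target, lr = 2.0, 5.0, 0.1
  y1 = f(x, w)                  # first application, with the original weights

  grad = 2 * (f(x, w) - target) * x  # d/dw of the loss (f(x) - target)^2
  w = w - lr * grad                  # backpropagation step updates w

  y2 = f(x, w)                  # second application: effectively a new f'
  print(y1, y2)                 # 2.0 vs 4.4 -- same input, different function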

At that point, at least chaos theory is at play across the population of natural language, if not some truth that has been expressed but not yet considered.

This also invalidates the subsequent claim about the convolved functions; I think all the GPUs might have something to say about whether the bits changing the layers are random or correlated.


If a hash can transform an input of any size into a fixed-length string, then that implies irreversibility by the pigeonhole principle: there are more possible inputs than outputs, so distinct inputs must collide. It's impossible, not infeasible.
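
A minimal sketch of the pigeonhole argument, using a hypothetical 8-bit toy hash (the first byte of SHA-256) so the guaranteed collision shows up quickly:

  import hashlib

  def toy_hash(data: bytes) -> int:
      """First byte of SHA-256: a fixed 8-bit output, 256 possible values."""
      return hashlib.sha256(data).digest()[0]

  # 257 inputs into 256 buckets guarantee at least one collision; a repeat
  # typically appears much sooner (birthday bound).
  seen = {}
  for i in range(257):
      h = toy_hash(i.to_bytes(2, "big"))
      if h in seen:
          print(f"collision: inputs {seen[h]} and {i} both hash to {h}")
          break
      seen[h] = i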

Hashes with that property are just a special case of one-way functions.


