> a stable diffusion-like algorithm which ingests the HTML/CSS/JS specification documents and the requested website response and "simply" outputs the adequate framebuffer

Hah, that's a fun idea! Setting aside efficiency, though, neural networks aren't usually Turing-complete, so arbitrary JS isn't going to work. But I could imagine building a very strict, minimalist browser engine (think XHTML and Scheme rather than HTML and JS) and learning a transformation between the two.

And for performance to be attainable, rather than an NN you could learn a bunch of syntax transformation rules between the two.
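
To make "syntax transformation rules" a bit more concrete, here's a toy Python sketch of the kind of rule I have in mind: pattern-match on a tiny XHTML-like tree and rewrite it directly into layout boxes. The tree shape, the 16px line height, and the 8px glyph width are all made up for illustration.

    # Toy sketch: hand-written "syntax transformation rules" that rewrite a tiny
    # XHTML-like tree straight into layout boxes (no real CSS). The node shape
    # and the 16px line height / 8px glyph width are invented for illustration.

    def layout(node, x=0, y=0):
        """Rewrite (tag, children) into ([(x, y, w, h, text), ...], next_y)."""
        tag, children = node
        boxes = []
        for child in children:
            if isinstance(child, str):
                # rule: a text child becomes one box, one line per string
                boxes.append((x, y, 8 * len(child), 16, child))
                y += 16
            else:
                # rule: block elements stack vertically, slightly indented
                child_boxes, y = layout(child, x + 4, y)
                boxes.extend(child_boxes)
        return boxes, y

    doc = ("body", [("p", ["hello"]), ("p", ["world", "again"])])
    print(layout(doc)[0])

A learned version would presumably discover rules of roughly this shape rather than having them hand-written.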




Precisely, fun is the operative word. As I was writing, I was thinking that perhaps you would still need to run client-side non-layout JS. However, "attention is all you need": it seems there are architectures for Turing-complete neural networks [1].

I wouldn't worry about performance: Nvidia breaks world records with the H100 [2], Intel is going for 6 GHz processors [3]; for performance you just have to be patient.

[1] Jorge Pérez et al., "On the Turing Completeness of Modern Neural Network Architectures", 2019, https://arxiv.org/pdf/1901.03429.pdf

[2] https://blogs.nvidia.com/blog/2022/09/08/hopper-mlperf-infer...

[3] https://www.tomshardware.com/news/intel-teases-8-ghz-raptor-...


The neural network architectures in use are all Turing complete; it's just that halting is a precondition of meeting their reward function.

It would be interesting to see if a search engine's worth of raw data, and enough training on Chrome's output, could build a JS interpreter. I'm skeptical but don't see why not in principle.
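
If someone wanted to try, the training pairs themselves would be cheap to harvest: run small JS snippets through a real engine and record what they print. A rough Python sketch of that idea (assuming `node` is on the PATH; headless Chrome would play the same role in spirit, and the snippet generator here is deliberately trivial):

    # Rough sketch: harvest (JS program, observed output) training pairs by
    # running tiny random snippets through a real JS engine and recording stdout.
    # Assumes `node` is installed and on PATH.

    import random
    import subprocess

    def random_snippet():
        a, b = random.randint(0, 99), random.randint(0, 99)
        op = random.choice(["+", "-", "*"])
        return f"console.log({a} {op} {b});"

    def make_pair(snippet):
        out = subprocess.run(["node", "-e", snippet], capture_output=True, text=True)
        return snippet, out.stdout.strip()

    pairs = [make_pair(random_snippet()) for _ in range(5)]
    for prog, result in pairs:
        print(prog, "=>", result)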



