Hacker News
sdenton4
87 days ago
| on:
Post Apocalyptic Computing
Lots of models are trained with dropout, which is kinda like bitrot at very high rates...
clayhacks
87 days ago
But that’s during training, not during inference. It’s also more structured in where the dropout happens. I do think that points to models being somewhat resilient, but LLMs haven’t existed long enough for a good test.
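The train/inference distinction here is the key point: standard (inverted) dropout zeroes random activations only in training mode and is a no-op at inference. A minimal NumPy sketch (the function name and shapes are illustrative, not from either comment):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each activation with
    probability p and scale survivors by 1/(1-p) so the expected
    activation is unchanged; at inference, pass x through untouched."""
    if not training or p == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng(0)
    mask = rng.random(x.shape) >= p  # True = keep this unit
    return x * mask / (1.0 - p)

x = np.ones((4, 8))
train_out = dropout(x, p=0.5, training=True)   # random units zeroed, rest scaled to 2.0
eval_out = dropout(x, p=0.5, training=False)   # identity: no units dropped
```

This also illustrates why dropout is only a loose analogy for bitrot: the corruption is applied transiently per training step (teaching the weights redundancy), whereas bitrot would permanently damage the stored weights that inference depends on.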