Hacker News
Why Embedded Models Must Hallucinate: A Boundary Theory (RCC) (effacermonexistence.com)
1 point by formerOpenAI 7 hours ago | 2 comments
LLMs don't hallucinate – they hit a structural boundary (RCC theory) (effacermonexistence.com)
5 points by formerOpenAI 1 day ago | 3 comments
RCC: Why LLMs Still Hallucinate Even at Frontier Scale (Axioms Included) (effacermonexistence.com)
2 points by noncentral 2 days ago | 7 comments
RCC: A Boundary Theory Explaining Why LLMs Still Hallucinate (effacermonexistence.com)
2 points by noncentral 3 days ago | 4 comments
Are LLM failures – including hallucination – structurally unavoidable? (RCC) (effacermonexistence.com)
4 points by noncentral 4 days ago | 4 comments
RCC: A boundary theory explaining why LLMs hallucinate and planning collapses (effacermonexistence.com)
2 points by noncentral 5 days ago | 3 comments