Hacker News
Brain Floating Point (Bfloat16) (wikipedia.org)
3 points by ml_basics on May 28, 2023 | 1 comment



bfloat16 is probably familiar only to ML practitioners; it's a reduced-precision floating-point format designed for ML models. I was surprised to learn that the "b" stands for "brain", after Google Brain, the team at Google that developed it along with many other advances in machine learning.
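The format itself is simple: bfloat16 keeps float32's sign bit and all 8 exponent bits but only the top 7 mantissa bits, so a rough conversion is just dropping the low 16 bits of a float32. A minimal sketch in Python (truncation only; real hardware typically uses round-to-nearest-even, and the helper names here are illustrative):

```python
import struct

def float32_to_bfloat16_bits(x: float) -> int:
    """Truncate a float32 to bfloat16 by keeping its top 16 bits.

    bfloat16 has the same 8 exponent bits as float32, so this preserves
    the value's magnitude/range and only loses mantissa precision.
    """
    bits32 = struct.unpack("<I", struct.pack("<f", x))[0]
    return bits32 >> 16

def bfloat16_bits_to_float32(b: int) -> float:
    """Widen bfloat16 bits back to float32 by zero-padding the mantissa."""
    return struct.unpack("<f", struct.pack("<I", b << 16))[0]

# pi round-trips with a small precision loss but an unchanged exponent:
pi_bf16 = bfloat16_bits_to_float32(float32_to_bfloat16_bits(3.14159265))
print(pi_bf16)  # 3.140625
```

Because the exponent width matches float32, casts between the two formats are cheap, which is a big part of why the format caught on for ML training.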



