
I will say one thing: I pity the person (grad student?) who has to do error propagation analysis on a research project using posits. (I'm the original implementer.)



Yeah, I pretty much think that posits only make sense at 32 bits and smaller, and that you want your 64-bit numbers to be closer to float64 (although with the posit semantics for Inf/NaN/-0).
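(For anyone unfamiliar with those semantics, here's a minimal sketch; posit_special is just an illustrative name, and the encodings come from the posit standard rather than anything specific to this thread. Posits have exactly one zero and a single NaR pattern that stands in for IEEE 754's +/-Inf, NaN, and -0.)

    def posit_special(bits, n):
        """Classify the two special bit patterns of an n-bit posit."""
        if bits == 0:
            return "zero"                # the only zero; there is no -0
        if bits == 1 << (n - 1):
            return "NaR"                 # the only non-real; replaces +/-Inf and NaN
        return "real"

    assert posit_special(0x0000, 16) == "zero"
    assert posit_special(0x8000, 16) == "NaR"
    assert posit_special(0x6400, 16) == "real"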


Oh, I actually think posits are only useful for machine learning. I have some unpublished, crudely done research showing that the extended accumulator is only necessary for the Kronecker delta stage of backpropagation (posits trivially convert to higher precision by zero-padding). You can see what I'm talking about in the Stanford video.
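(For anyone who wants to poke at that zero-padding claim, here's a rough sketch, assuming the standard posit encoding with es = 2; decode_posit is an illustrative helper I wrote for this comment, not a real library call. Widening a posit is just appending zero bits, and it works for negative patterns too because two's-complement negation commutes with the left shift.)

    def decode_posit(bits, n, es=2):
        """Decode an n-bit posit bit pattern (two's complement) to a float."""
        if bits == 0:
            return 0.0
        if bits == 1 << (n - 1):
            return float("nan")          # NaR: sign bit set, all other bits zero
        sign = -1.0 if bits >> (n - 1) else 1.0
        if sign < 0:
            bits = (1 << n) - bits       # two's-complement negation
        # Regime: run of identical bits after the sign bit.
        first = (bits >> (n - 2)) & 1
        run, i = 0, n - 2
        while i >= 0 and ((bits >> i) & 1) == first:
            run += 1
            i -= 1
        k = run - 1 if first else -run
        i -= 1                           # skip the regime terminator bit
        # Exponent: up to es bits; bits that run off the end count as zero.
        exp = 0
        for _ in range(es):
            exp <<= 1
            if i >= 0:
                exp |= (bits >> i) & 1
                i -= 1
        # Fraction: whatever bits remain, with an implicit leading 1.
        frac_bits = max(i + 1, 0)
        frac = bits & ((1 << frac_bits) - 1)
        mantissa = 1.0 + frac / (1 << frac_bits)
        return sign * 2.0 ** (k * (1 << es) + exp) * mantissa

    # Widening 16 -> 32 bits is a pure zero-pad; the decoded value is unchanged.
    for p16 in (0x6400, 0x9C00, 0x0001, 0x7FFF):
        assert decode_posit(p16, 16) == decode_posit(p16 << 16, 32)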

Fun fact: John sometimes claims he invented the name, but this is untrue. My old college website talks about building "positronic brains," and it has long been a goal of mine to somehow "retcon" Asimov/TNG's Data into being a real thing. So when this opportunity for some clever wordsmithing arrived, I coined the term in the hope that someone would make a posit-based perceptron, or "positron."



