
> If the JSON[B] is {amount: 12.09} then you have no guarantee that someone picking it up outside of postgres uses float conversion which could lead to 12.0899999. With BSON, amount is a decimal128 and stays that way, fromBytes() and toBytes().

But how is that relevant? With BSON as the transfer format you still have no guarantee that the client won't misinterpret the data it receives by converting it to a 64-bit binary float as its own internal format after deserialization.

All languages I know have libraries that can decode numbers from JSON strings into an appropriate arbitrary-precision decimal type. If the user doesn't use that, why would it be a failure of JSON[B]/PostgreSQL?
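
To make that concrete: a minimal Java sketch using Jackson (one such library; the `{"amount": 12.09}` payload is just illustrative). The `USE_BIG_DECIMAL_FOR_FLOATS` feature tells the parser to keep floating-point literals as `BigDecimal` instead of `double`:

    import java.math.BigDecimal;
    import com.fasterxml.jackson.databind.DeserializationFeature;
    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class JsonDecimal {
        public static void main(String[] args) throws Exception {
            ObjectMapper mapper = new ObjectMapper()
                // Parse JSON floating-point literals as BigDecimal, not double.
                .enable(DeserializationFeature.USE_BIG_DECIMAL_FOR_FLOATS);

            JsonNode node = mapper.readTree("{\"amount\": 12.09}");
            BigDecimal amount = node.get("amount").decimalValue();
            System.out.println(amount); // prints 12.09, exactly
        }
    }

The point being: the lossless path exists, but the client has to opt into it.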




With BSON I am guaranteed that, for example, in Java, when I do `Document d = BSON.fromBytes()`, then `BigDecimal amount = d.getDecimal128("amount")` is non-lossy. `amount` is typed; `double amount = d.getDouble("amount")` is an error. Again, it's not a failure of JSON[B]/postgres. It's what happens when the data carrier (JSON) is just a string with a small set of inferred types. How many times have we run into issues where `{"v": 1.3}` works and `{"v": 1.0}` works but `{"v": 1}` gets turned into an `int` by accident?
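
(`BSON.fromBytes()`/`getDecimal128` above read like pseudocode; with the actual `org.bson` classes from the MongoDB Java driver, a minimal sketch of that non-lossy round trip could look like this:

    import java.math.BigDecimal;
    import org.bson.BsonDecimal128;
    import org.bson.BsonDocument;
    import org.bson.RawBsonDocument;
    import org.bson.codecs.BsonDocumentCodec;
    import org.bson.types.Decimal128;

    public class Decimal128RoundTrip {
        public static void main(String[] args) {
            // Build {amount: 12.09} with amount typed as decimal128.
            BsonDocument doc = new BsonDocument(
                "amount", new BsonDecimal128(Decimal128.parse("12.09")));

            // "toBytes()": encode to the BSON wire format.
            byte[] bytes = new RawBsonDocument(doc, new BsonDocumentCodec())
                .getByteBuffer().array();

            // "fromBytes()": decode; the decimal128 type tag travels with the value.
            RawBsonDocument decoded = new RawBsonDocument(bytes);
            BigDecimal amount = decoded.getDecimal128("amount")
                .getValue().bigDecimalValue();
            System.out.println(amount); // 12.09, no float detour

            // decoded.getDouble("amount") would throw
            // BsonInvalidOperationException: the type is part of the data.
        }
    }

The type assertion failing loudly, instead of silently coercing, is exactly the guarantee JSON can't give you.)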



