Unfortunately, JSON numbers are 64-bit floats, so if you're standards-compliant you have to treat them as such, which gives you 53 bits of precision for integers.
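To make the 53-bit limit concrete, here's a quick sketch (in Python, just because its json module makes the comparison easy; a parser that maps every number to a double behaves like the `float()` calls below):

```python
import json

big = 2**53 + 1                      # 9007199254740993
# binary64 can't tell 2**53 and 2**53 + 1 apart:
assert float(big) == float(big - 1)

# A parser that maps every JSON number to a double would conflate the
# two; Python's json happens to keep integer literals exact:
assert json.loads(str(big)) == big
```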
Also hey, been a while ;)
Edit: I stand corrected, the latest spec (RFC 8259) only formally specifies the textual format, not the semantics of numbers.
However, it does have this to say:
> This specification allows implementations to set limits on the range and precision of numbers accepted. Since software that implements IEEE 754 binary64 (double precision) numbers [IEEE754] is generally available and widely used, good interoperability can be achieved by implementations that expect no more precision or range than these provide, in the sense that implementations will approximate JSON numbers within the expected precision.
In practice, most implementations treat JSON as a subset of JavaScript, which implies that numbers are 64-bit floats.
I'm being pedantic here, but JSON numbers are just sequences of the characters 0-9, `.`, `+`, `-`, `e`, and `E`. Whether to parse those sequences into 64-bit floats or something else is left up to the implementation.
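Python's json module is one parser that makes this explicit: its parse_int and parse_float hooks receive the number's source text, so you can map it to whatever type you like, or just keep the text. A minimal sketch:

```python
import json

# The hooks are handed the raw digit sequence; keeping it as a string
# shows that "what a JSON number is" is a decision the parser makes.
raw = json.loads('[42, 6.02e23]', parse_int=str, parse_float=str)
assert raw == ['42', '6.02e23']
```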
However, what you say is good practice anyway. The spec (RFC 8259) has this note on interoperability:
> This specification allows implementations to set limits on the range and precision of numbers accepted. Since software that implements IEEE 754 binary64 (double precision) numbers [IEEE754] is generally available and widely used, good interoperability can be achieved by implementations that expect no more precision or range than these provide, in the sense that implementations will approximate JSON numbers within the expected precision. A JSON number such as 1E400 or 3.141592653589793238462643383279 may indicate potential interoperability problems, since it suggests that the software that created it expects receiving software to have greater capabilities for numeric magnitude and precision than is widely available.
JSON does not define a precision for numbers. In practice that often means float64 (note that -0 is allowed, while NaN and +/-Inf are not), but it depends on your language, parser configuration, etc.
Many serializers will happily emit higher-precision numbers yet parse them back as float64 by default, so maximally compatible JSON systems should be prepared to handle arbitrary precision.
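For what it's worth, here's a sketch of what "handle arbitrary precision" can look like, using Python's Decimal hook on the RFC's own examples of problem literals:

```python
import json
from decimal import Decimal

pi = "3.141592653589793238462643383279"
# Default parsing rounds to binary64 and loses trailing digits:
assert json.loads(pi) != Decimal(pi)
# A Decimal hook preserves every digit:
assert json.loads(pi, parse_float=Decimal) == Decimal(pi)

# 1E400 overflows a double (Python saturates to inf), but not a Decimal:
assert json.loads("1E400") == float("inf")
assert json.loads("1E400", parse_float=Decimal) == Decimal("1E400")
```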