
BigInt: Arbitrary precision integers in JavaScript - fagnerbrack
https://github.com/tc39/proposal-bigint
======
johnsonjo
Since this is stage 3, this means browsers are beginning to implement it. If
you want to try it, Chrome has had it since version 67 [1].

[1]:
[https://developers.google.com/web/updates/2018/05/bigint](https://developers.google.com/web/updates/2018/05/bigint)

------
coke12
What is the value of JSON.parse(JSON.stringify(BigInt(Number.MAX_SAFE_INTEGER)
+ 2n))?

> Finally, BigInts cannot be serialized to JSON.

Hmm, ok. I guess that's fine. But what about large numbers coming from over
the network? Can we get a BigInt-aware JSON.parse() standardized ASAP?
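
A minimal illustration of the quoted limitation, runnable in any engine with BigInt support:

```javascript
// JSON.stringify refuses BigInt values outright rather than rounding them:
try {
  JSON.stringify({ big: BigInt(Number.MAX_SAFE_INTEGER) + 2n });
} catch (e) {
  console.log(e.name); // "TypeError"
}
```

This also answers the opening question: the stringify step throws before any parsing happens.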

~~~
olliej
No. This is not related to parsing; it's related to the JSON format.

JSON is a stable interchange format, and for that reason we’ve got a huge base
of tooling that uses and emits it. It’s all interoperable because there is
just that one syntax.

So if you say “add a BigInt-aware JSON.parse() API”, you’re saying “I want to
transmit non-JSON strings but claim they’re JSON”.

That breaks:

* old browsers - not supporting BigInts would mean they couldn’t parse any of your “json” data, even if you weren’t using the BigInts. E.g. feature detection wouldn’t work.

* all shipped products that read json, because as with the browsers they could not parse any “json” that contained invalid data.

[edit:

Ok, let's try to do something about the endless downvotes:

Say you do JSON.parse("9007199254740994"), i.e. 2^53 + 2.

Should this parse to a BigInt or a Number? In JavaScript today it will parse
as a Number without losing precision, but it is beyond the range in which
every integer is exactly representable in JS. You can see this by doing
console.log(9007199254740995) and seeing that the output is 9007199254740996.
So what happens if I do the following:

    
    
      JSON.parse("[9007199254740992, 9007199254740993, 9007199254740994, 9007199254740995]")
    

The first and third are exactly representable as a double; the second and
fourth are not. So should this parse as Number, BigInt, Number, BigInt? Or
BigInt, BigInt, BigInt, BigInt? What value should be the trigger for treating
an integer as a BigInt vs. a Number? And what if we're doing

    
    
      JSON.parse("[9007199254740992, 9007199254740993, 9007199254740994, 9007199254740995]").map(x => x + 2)
    

If any of the values get interpreted as a BigInt this will throw. But in
existing browsers it won't.
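
That difference can be demonstrated directly: today's JSON.parse yields Numbers, so the map succeeds (with silent rounding), while a BigInt result would make the same expression throw:

```javascript
// Today: Numbers throughout, so adding the Number 2 is fine
// (9007199254740993 silently rounds to ...992 when parsed):
console.log(JSON.parse("[9007199254740993]").map(x => x + 2)); // [ 9007199254740994 ]

// If the parser produced BigInts instead, x + 2 would mix BigInt and Number:
try {
  9007199254740993n + 2;
} catch (e) {
  console.log(e.name); // "TypeError"
}
```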

]

~~~
deathanatos
> _It’s all interoperable because there is just that one syntax._

And that syntax and specification leaves open the possibility of arbitrary
precision integers. From the JSON RFC[1]:

> _This specification allows implementations to set limits on the range and
> precision of numbers accepted._

Those limits vary by implementation. The subsequent paragraphs warn, of
course, that not all implementations support sending or receiving numbers
outside of certain ranges, and explicitly call out the contiguous integer
range of an IEEE double as a good choice for interoperability.

> _“I want to transmit non-Json strings but claim they’re json “._

Those strings are already JSON.

There are serializers out there that support arbitrary precision integers.
Python, for example, will happily serialize such integers:

    
    
      In [6]: json.dumps(2 ** 512)
      Out[6]: '13407807929942597099574024998205846127479365820592393377723561443721764030073546976801874298166903427690031858186486050853753882811946569946433649006084096'
    

(This will parse in today's browsers, too, but you'll get an imprecise result,
as JavaScript's Number cannot represent that value exactly. Larger numbers
result in Infinity.)
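
The JS side of that round-trip is easy to observe (a sketch; 2 ** 512 is a power of two, so the parsed double happens to be exact in the first case):

```javascript
// Parsing a full-precision decimal string yields the nearest double:
console.log(JSON.parse(String(2n ** 512n)) === 2 ** 512);      // true
// A value one larger cannot survive the round-trip:
console.log(JSON.parse(String(2n ** 512n + 1n)) === 2 ** 512); // true (the +1 is lost)
// Numbers beyond double range parse as Infinity:
console.log(JSON.parse("1e400"));                              // Infinity
```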

[1]:
[https://tools.ietf.org/html/rfc8259#section-6](https://tools.ietf.org/html/rfc8259#section-6)

~~~
coke12
Yup, this is exactly the problem. Currently JSON.parse() will silently round
large integers to the nearest representable double. This behavior made sense
when there was only one numeric type in the language, but BigInt opens the
door for arbitrary precision integers.

~~~
testvox
What is the one language? JSON-serialized data can be consumed in many
different languages, which have very different numeric data types. The JSON
spec even says:

> JSON is agnostic about the semantics of numbers. In any programming
> language, there can be a variety of number types of various capacities and
> complements, fixed or floating, binary or decimal. That can make interchange
> between different programming languages difficult. JSON instead offers only
> the representation of numbers that humans use: a sequence of digits. All
> programming languages know how to make sense of digit sequences even if they
> disagree on internal representations. That is enough to allow interchange.

and later more precisely defines a number in a way that does not restrict its
maximum size.

> A number is a sequence of decimal digits with no superfluous leading zero.
> It may have a preceding minus sign (U+002D). It may have a fractional part
> prefixed by a decimal point (U+002E). It may have an exponent, prefixed by e
> (U+0065) or E (U+0045) and optionally + (U+002B) or – (U+002D). The digits
> are the code points U+0030 through U+0039.

[http://www.ecma-international.org/publications/files/ECMA-
ST...](http://www.ecma-international.org/publications/files/ECMA-
ST/ECMA-404.pdf)

------
TeMPOraL
Not trying to be too snarky, but does that mean that JavaScript is going to be
the first language that gets bigints before getting _actual integers_? How is
that a reasonable sequence of steps in language evolution?

~~~
ThrowMeDown01
> _before getting actual integers_

Javascript has always had "actual integers". As long as you stay within
Number.MIN_SAFE_INTEGER...Number.MAX_SAFE_INTEGER
(-9007199254740991..9007199254740991) the internal representation is an
integer representation.
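
That range is queryable directly:

```javascript
console.log(Number.MAX_SAFE_INTEGER);                    // 9007199254740991
console.log(Number.MIN_SAFE_INTEGER === -(2 ** 53 - 1)); // true
console.log(Number.isSafeInteger(9007199254740991));     // true
console.log(Number.isSafeInteger(9007199254740992));     // false (2 ** 53)
```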

[http://2ality.com/2013/10/safe-integers.html](http://2ality.com/2013/10/safe-
integers.html)

~~~
plopz
To get the operators to actually behave like integer operations, however, you
need to use typed arrays. For example:

    
    
      let a = new Uint32Array(3);
      a[0] = 5;
      a[1] = 2;
      a[2] = a[0] / a[1]; // stores 2; the fractional part is dropped on write
    

------
Schampu
I use them in a Vulkan API for Node.js to handle 64-bit interoperability when
mapping Vulkan memory. Mapping Vulkan memory returns a numeric address to
the memory region. To handle the address I use BigInt, which can then be used
to create an ArrayBuffer as a direct JS-side memory view that you can write
e.g. your texture data into. See [0] for how it's used on the Node side and
[1] for the C++ implementation using the V8 API.

[0] [https://github.com/maierfelix/node-
vulkan/blob/master/exampl...](https://github.com/maierfelix/node-
vulkan/blob/master/examples/cube/buffer.mjs#L1-#L7)

[1] [https://github.com/maierfelix/node-
vulkan/blob/master/genera...](https://github.com/maierfelix/node-
vulkan/blob/master/generated/1.1.85/src/index.cpp#L21-#L34)

------
gcbirzan
So, you can add a string to a BigInt, but not a Number? Their 'explanation'
for not allowing Number and BigInt to be mixed is that you could lose
precision. Not if you return a BigInt.
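
Concretely, the proposal throws on implicit mixing and requires an explicit conversion in one direction or the other:

```javascript
try {
  1n + 1;                    // implicit mix of BigInt and Number: disallowed
} catch (e) {
  console.log(e.name);       // "TypeError"
}
console.log(1n + BigInt(1)); // 2n (convert the Number up; always exact)
console.log(Number(1n) + 1); // 2  (convert the BigInt down; may lose precision)
```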

~~~
maxxxxx
That seems a pretty odd choice in both cases. Adding a Number to a BigInt
should return a BigInt. I am also always wary of languages that allow mixing
of strings and numbers; from my experience this can cause a lot of problems.

~~~
gcbirzan
It's hilarious, because:

    
    
      "1" + 1 === "11"
      "1" - 1 === 0
      "1" + 1n === "11"
      "1" - 1n: Exception

~~~
maxxxxx
Our testers write a lot of scripts in PHP and I have spent countless hours
debugging stuff like this. A special kind of fun is figuring out which
expressions count as TRUE or FALSE. JavaScript seems to have the same, if
slightly different, problems.

------
falcrist
Shouldn't this be called either "arbitrary size integers" or "arbitrary
precision integer arithmetic"?

Integers by definition already have infinite precision...

~~~
rocqua
JavaScript actually uses floats to represent integers, so with very large
integers, you actually lose precision.

This does two things. It allows for arbitrary-size integers (whereas before,
JavaScript integers were only exact up to Number.MAX_SAFE_INTEGER, i.e.
2^53 - 1), and it represents all these integers with perfect precision.
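
The precision contrast is easy to see in an engine with BigInt support:

```javascript
console.log(2 ** 53 + 1 === 2 ** 53);      // true  (the Number +1 is rounded away)
console.log(2n ** 53n + 1n === 2n ** 53n); // false (BigInt keeps every digit)
```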

~~~
falcrist
As someone mostly stuck in C, C++, C#, assembly, and BASIC, this is new to
me.

In those languages, an integer is actually infinite precision with limited
range. A quick Google shows that the "number" primitive in JavaScript is a
double. Are you telling me JavaScript has NO integer type?

Maybe I haven't seen enough languages yet, but that seems pretty crazy to me.

~~~
rocqua
I agree with you on all points. I don't think I've ever written more than 100
lines of JavaScript, and a lot more of the C-like stuff.

Yet, I do see the morbid sense in only using floats.

------
z3t4
Hopefully JavaScript will get a new number system before 2038, the next
Y2K-style problem.

------
qwerty456127
What about BigDecimal?

~~~
Aardwolf
I wonder the same. The readme says:

"The / operator also work as expected with whole numbers. However, since these
are BigInts and not BigDecimals, this operation will round towards 0, which is
to say, it will not return any fractional digits."
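
The rounding behavior described in that quote can be checked directly:

```javascript
console.log(7n / 2n);  // 3n  (fractional part discarded)
console.log(-7n / 2n); // -3n (rounds toward zero, not toward -Infinity)
console.log(7n % 2n);  // 1n  (remainder has the sign of the dividend)
```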

What BigDecimal are they referring to? I don't find any tc39 big decimals,
only third party libraries

~~~
thomasfoster96
There isn’t a BigDecimal proposal anywhere; the authors are just making it
clear that their proposal is limited to integers.

~~~
qwerty456127
> There isn’t a BigDecimal proposal anywhere

Why? I find it so weird there still is no BigDecimal in JavaScript.

~~~
JeremyBanks
It is relatively common for software to need integers larger than 53 bits. It
seems quite uncommon for software to require decimal arithmetic. Do you have a
motivating example?

~~~
qwerty456127
Anything that deals with money. Every e-commerce or fintech app.

