
New Features in ES2019 - sveingjoby
https://javascript.christmas/2019/7
======
pythux
Really liking these new developments! JavaScript is not the horrible language
it used to be anymore (in my very subjective opinion). When I started writing
JavaScript in 2016 after a Python background I was frustrated every day...
Then after some time and learning about which dark corners to avoid, which
tools to use, etc. it became quite an enjoyable experience. Nowadays I mostly
use a mix of JavaScript and TypeScript depending on the projects and it's been
great.

One thing that I am a bit afraid about though is that the language might
become more complex and complicated over time because of backward
compatibility (C++-like?). I saw that they will be introducing a new function
"replaceAll(...)" to avoid the weird semantics of "replace(...)" (stopping
after first occurrence unless a 'global' RegExp is passed as argument). And
that's great, but that's a new function, and maybe 'replace(...)' should have
had this behavior from the start. I hope to be wrong on this one!
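
For reference, a quick sketch of the semantics in question (`replaceAll` was still a proposal at the time):

```javascript
// String.prototype.replace only replaces the first match of a string pattern
'a-b-c'.replace('-', '_');      // 'a_b-c'

// ...unless you pass a global RegExp
'a-b-c'.replace(/-/g, '_');     // 'a_b_c'

// The proposed replaceAll makes the intent explicit
'a-b-c'.replaceAll('-', '_');   // 'a_b_c'
```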

~~~
mantap
The difference between JS and C++ is that C++ is addressing a complex problem
domain whereas JS is addressing a simple one. So JS can put its complexity
budget towards features that improve productivity whereas such features in C++
require huge amounts of complexity. Take for instance anonymous functions in
JS compared with lambdas since C++11, the C++ version needs a complicated
capture syntax, capability for capture by value and reference, type inference,
templating, and more. All that complexity is useful and needs to be there (I
have used most of it in my own code) - in JS anonymous functions are immensely
simpler. That's why JS will never become even 1/2 as complex as C++.

~~~
lightgreen
> C++ version needs a complicated capture syntax

C++ doesn't actually need it. Rust proved that capture everything by reference
or capture everything by move is enough for all practical use cases.

C++ lambdas are another example where the C++ committee chose a complex
uber-universal solution instead of a much simpler one that solves 99% of use
cases.

~~~
masklinn
> C++ doesn't actually need it. Rust proved that capture everything by
> reference or capture everything by move is enough for all practical use
> cases.

That's kinda true but kinda not: precise "capture clauses" are a common design
pattern in Rust[0], showing that the flexibility is extremely useful, and I
think I've seen rumblings about adding them to the language.

I mean technically you only ever need `move` closure, "ref" closures are
already a convenience.

[0]
[http://smallcultfollowing.com/babysteps/blog/2018/04/24/rust...](http://smallcultfollowing.com/babysteps/blog/2018/04/24/rust-pattern-precise-closure-capture-clauses/)

------
IgorPartola
Not ES directly, but you know what I need on an almost daily basis? A JSON
date type. It’s obnoxious to have to pass a string back and forth and parse it
on either end between server and browser.

~~~
BillinghamJ
There's plenty of good reasons one doesn't exist. JSON is not JS-specific. It
is a standard interchange format used by thousands of languages, many of which
have very different date implementations.

If you did have a JSON date, how would you decide what it was? Would it be a
timestamp, or a civil date-time? Would it have timezones? Offsets? Locations?
Would there be a database along with it required to understand it correctly?

Just stick with ISO8601 strings.

~~~
ryanbrunner
A 'date' type could be as simple as syntax indicating that a particular
ISO8601 string IS a date. Right now, there's no reasonable way to infer that
unless you're already aware of the schema of the JSON object you're receiving.

You could make the same argument with numbers - there's no reason you couldn't
just pass strings containing the number - but there's advantages to being able
to distinguish between a string and a number when the schema is unknown.

~~~
BillinghamJ
If you're not aware of the schema, how would you use the value? Same as any
other use of strings - enums, types, etc

------
chrismorgan
[This comment is wrong; see masklinn’s response for my misunderstanding.]

> _Now the function would rather insert an escape character before the
> character code so that the result is still readable and valid UTF-8/UTF-16
> code:_

> _JSON.stringify('\uD83D'); // '"\\ud83d"'_

I’m inclined to consider this a misfeature, unbreaking something that I’m
_glad_ was broken and should have _remained_ broken.

Unpaired surrogates are (to simplify terminology a little) basically invalid
Unicode. On the web platform, the _only_ situation where unpaired surrogates
should be encountered is as a transient state on user input, on platforms that
send non-BMP characters through in two pieces (which is most browsers on
Windows). Beyond that, nothing should _ever_ deal in unpaired surrogates,
because they _will_ make things blow up and your life miserable.

Unpaired surrogates cannot be represented in UTF-8, or in well-formed UTF-16
(and that’s where JavaScript did the annoyingly bad thing that we’re paying
for decades later, going with UTF-16 and not requiring well-formedness). Hence
I’d quibble with the description of the result as “valid UTF-8/UTF-16 code”.
(Sure, the actual JSON byte stream will be valid, but it contains an escaped
string that is not valid.) U+FFFD REPLACEMENT CHARACTER was at least as valid,
arguably more valid.

Various JSON parsers will choke on such strings, as observed in what I’d
consider the canonical JSON spec:
[https://tools.ietf.org/html/rfc7159#section-8.2](https://tools.ietf.org/html/rfc7159#section-8.2).

In the I-JSON restricted subset, which is what I’d say everything should
_actually_ limit itself to, such strings are disallowed and will cause parse
failure:
[https://tools.ietf.org/html/rfc7493#section-2.1](https://tools.ietf.org/html/rfc7493#section-2.1).

~~~
masklinn
You're misreading the article section, although the article is also partially
wrong: the old behaviour was to output the lone surrogate in the JSON
stream[0] (the "�" here is confusing: it is a display artefact of the specific
font, not U+FFFD), the new behaviour is to replace the unpaired surrogate by
_the literal 6-character long escaped representation of the unpaired
surrogate_.

That is, formerly U+D83D would pass through straight as U+D83D (or possibly be
replaced by U+FFFD in some implementations), whereas it is now "encoded" as
the sequence U+005C U+0075 U+0044 U+0038 U+0033 U+0044.

That is, _the entire point_ of the "well-formed stringify" proposal[1] was to
_stop generating broken JSON_ :

> Rather than return unpaired surrogate code points as single UTF-16 code
> units, represent them with JSON escape sequences.

[0] it's possible that some implementations would replace the lone surrogate
by U+FFFD but according to MDN:

> Before this change JSON.stringify would output lone surrogates if the input
> contained any lone surrogates; such strings could not be encoded in valid
> UTF-8 or UTF-16

this is also the behaviour I observe on an old safari, and the one which is
quoted in the tc39 proposal[1]:

> JSON.stringify can return strings including code points that have no
> representation in UTF-8 (specifically, surrogate code points U+D800 through
> U+DFFF). And contrary to the description of JSON.stringify, such strings are
> not "in UTF-16" because "isolated UTF-16 code units in the range
> D800₁₆..DFFF₁₆ are ill-formed" per The Unicode Standard, Version 10.0.0,
> Section 3.4 at definition D91 and excluded from being "in UTF-16" per
> definition D89.

[1] [https://github.com/tc39/proposal-well-formed-stringify](https://github.com/tc39/proposal-well-formed-stringify)
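
In an ES2019 engine this is easy to check: a lone surrogate now comes out as the six-character escape, while a well-formed pair passes through untouched:

```javascript
// Lone surrogate: escaped rather than emitted raw (ES2019 behaviour)
JSON.stringify('\uD83D');         // '"\\ud83d"'

// Well-formed surrogate pair: passes through as-is
JSON.stringify('\uD83D\uDE00');   // '"😀"'
```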

~~~
chrismorgan
Ah, you’re right; thanks for the correction.

The article _does_ contain an error here, for the code point on the page _is_
actually U+FFFD, not U+D83D (which of course is unrepresentable there). I
trusted that (being a little surprised that it’d do the replacement, which is
more what I’d expect of Python, but accepting it all the same) and didn’t pull
out an old browser to confirm what _actually_ happens, which is, as you say,
an unpaired surrogate code point.

The article should probably say '"\ud83d"' instead of '"�"' (even though the
number of backslashes still then requires thought), because yielding U+FFFD
there would be a completely valid (and preferable, in my opinion) solution.

~~~
masklinn
> The article does contain an error here, for the code point on the page is
> actually U+FFFD, not U+D83D (which of course is unrepresentable there).

The MDN page also does that, that's a bit misleading but TFA commits way worse
a sin: it actually states the unpaired codepoint is swapped for the
replacement character:

> In earlier versions, these would be replaced with a special character:

which is just wrong.

> yielding U+FFFD there would be a completely valid (and preferable, in my
> opinion) solution.

I agree. The proposal does not explain whether U+FFFD was considered, I expect
they picked the escaped version to limit or avoid data loss: if you get an
U+FFFD you have no idea what it used to be.

------
ravenstine
> Optional catch binding

Eh... that's great and all, but why not go the whole way and allow me to just
use `try { }` without a catch-block? I'm sure that was part of a conversation
somewhere, and I wonder why they chose not to go that far.

~~~
rictic
I expect it's because of `finally`. You can write a try without a catch today,
provided there's a finally, however the error will continue to propagate in
that case.

    
    
        try { throw new Error('') } // exception is caught
        catch {}
        
        try { throw new Error('') } // uncaught exception
        finally { console.log('finally') }
    

That would be a bit of a footgun, as adding a finally clause that does
something innocuous like logging would result in an uncaught exception!

~~~
ravenstine
That makes sense. I guess `try` means something a bit different in JavaScript
than `rescue` does in Ruby.

It seems like this code in Ruby:

```
begin
  do_something
rescue => e
  handle_error e
ensure
  log_something
end
```

is equivalent to this in JavaScript:

```
try {
  doSomething();
} catch (e) {
  handleError(e);
} finally {
  logSomething();
}
```

I'm not saying that they're 100% equivalent, but I was thinking of the `try`
block as if it's `rescue` when really it has more in common with `begin` in
Ruby in that it's defining a block of code that can be rescued/caught if an
error occurs.

So I guess my confusion comes from this type of block being named `try` rather
than something like `do` or `begin`.

TL;DR I was just thinking about this in the wrong way from what I can tell.

EDIT: Yet I think that my point might still stand in that, if `try` is just a
block that has its own scope, like an if-block or `begin` in Ruby, then it
should be possible to not require either `catch` or `finally` since its own
behavior has little to do with error handling.

For instance, this is possible:

```
let x = 'foo';

try {
  let x = 'bar';
} catch (err) {
  // noop
}

console.log(x); // foo
```

The try-block would be way more useful if it could just define scope without
being tied specifically to error handling. There are lots of circumstances
where defining scope would be handy outside of conditionals and creating
functions for a similar purpose.

I guess what might have made the most sense, if JS could have started over, is
that there'd be no `try` statement and that `do` could be used in its place.

```
const words = ['foo', 'bar', 'baz'];
let i = 0;

do {
  console.log(words[i]);
  i++;
} catch (err) {
  console.error('whoops!');
} while (i < words.length);
```

We can't go back now since JS, for good reasons, is remaining mostly
backwards-compatible. But the ability to do the above without extra gymnastics
would be pretty cool.

~~~
rictic
> The try-block would be way more useful if it could just define scope without
> being tied specifically to error handling. There are lots of circumstances
> where defining scope would be handy outside of conditionals and creating
> functions for a similar purpose.

This is the case today! With `let` and `const`, any {} block creates a scope.
So:

    
    
        let foo = 'outside';
        {
          let foo = 'inside';
          console.log(foo); // logs 'inside'
        }
        console.log(foo); // logs 'outside'

------
jamilbk
I don’t mean to hate on ECMAScript, but is anyone else slightly surprised
these features weren’t in earlier?

Like I wonder how many bugs were introduced because the developer didn’t
realize Array.sort() was unstable.

~~~
haxiomic
Another sort() quirk that catches people out is not realising it uses string
comparison by default

    
    
        [1,2,10].sort()
    

= [1,10,2]
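
Passing a numeric comparator sidesteps the default string comparison:

```javascript
// Default: elements are converted to strings and compared lexicographically
[1, 2, 10].sort();               // [1, 10, 2]

// Numeric comparator gives the expected order
[1, 2, 10].sort((a, b) => a - b); // [1, 2, 10]
```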

~~~
alkonaut
It’s the curse of the optional arguments. Same thing with parsing numbers
where the second argument magically specifies the base.

If the comparison function argument was mandatory in sort() this wouldn’t be a
problem.

~~~
masklinn
> It’s the curse of the optional arguments. Same thing with parsing numbers
> where the second argument magically specifies the base.

Stupid defaults is not "the curse of optional arguments", it's the curse of
stupid defaults.

Most languages have defaults for these two operations yet have proper defaults
rather than stupid ones. Python's list.sort() and int() certainly do, so do
Java's Collections.sort() and Integer.parseInt().

Uniquely awful defaults is a historical (and defining) feature of javascript,
not of having default values, or optional arguments.

~~~
alkonaut
The horror that arises in js is due to a combination of unfortunate things
like function arity, and coercion. The one I was thinking of is the famous

[1,2,10].map(parseInt)

It’s several things that on their own aren’t mad but which taken together
produce a crazy result. 10 as the default for an optional second argument of
parseInt is fine. But map should only use a single-argument closure by
default. So map(parseInt) should be equivalent to map(x => parseInt(x)).
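
Spelled out: map passes the element index as parseInt's second (radix) argument, which is what produces the crazy result.

```javascript
[1, 2, 10].map(parseInt);
// [1, NaN, 2]: evaluates parseInt(1, 0), parseInt(2, 1), parseInt(10, 2)

[1, 2, 10].map(x => parseInt(x));
// [1, 2, 10]: the wrapper discards the index
```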

------
qwtel
fromEntries(), aside from the symmetry with entries() is a really neat
feature.

For example `Object.fromEntries(new URLSearchParams(window.location.search))`
will parse a query string into a nice JS object. This works because the
iterator of the search params class returns tuples.

What is already available today is `new
URLSearchParams(Object.entries(params))` for turning a "params" object into a
query string.

------
haecceity
I wonder what kind of shenanigans you could do with Function.toString(). It'd
be even better if they had Function.toAST().

~~~
Nullabillity
If I remember correctly, AngularJS 1.x would use it for dependency injection.

~~~
nkozyra
How so? Were these eval()ed by Angular?

I don't follow what it would use function strings for determining missing -
presumably - polyfills that you couldn't do in better ways.

~~~
matsemann
In angular 1, if you had a function of the form

    
    
      myApp.controller('MyCtrl', function($myCoolService) {
        //...
      });
    

Angular would see that you are looking for a parameter named myCoolService,
see if it already has it, and then inject it when calling your controller.

This of course broke in various ways with minification when the parameter
names were mangled. So one had the opportunity to use "array syntax",
requesting a dependency by a string. Or use a preprocessor in the build script
adding the array syntax before minifying.

~~~
wging
Did it by any chance pull parameter names out by using a magic-number index
into the function text? If so, that'd be broken by the 'function /* a comment
*/ foo () {}' change in the article (and might be what I remember having seen
a long time ago). Glad I don't have to deal with that sort of thing. Does
Angular 2 do it the same way?

------
0_gravitas
No pipe operator, no TCO, no pattern matching- _sigh_.

I wonder if Java will get pattern matching before JS
[https://medium.com/better-programming/top-5-new-features-exp...](https://medium.com/better-programming/top-5-new-features-expected-in-java-14-82c0d85b295e)

------
galaxyLogic
One thing I would fix about JavaScript is that currently "return" followed by
a newline returns undefined. I would say that only return followed by ';'
should return undefined. Else it should return the value of the expression
following "return", whether on the same line or not, until the next ";".

Not sure how much backwards compatibility this would break, but at some
point we must allow for newer better less error-prone versions of the language
to co-exist in different modules. That is not difficult at all. Consider the
case of "use strict"; which does just such a thing. We could have "use strict
2020" etc.

------
heisenhuegel
Demanding a stable Array.sort is surprising. I find it questionable, as it
implies a performance trade-off. Why not add a new stableSort function?

Trying to retroactively fix bugs in code that did not follow the standard
is not a good idea IMHO.

~~~
masklinn
The "stable Array.sort" proposal was actually made because all major browsers
had switched to a stable sort implementation:
[https://github.com/tc39/ecma262/pull/1340](https://github.com/tc39/ecma262/pull/1340)

It was actually proposed by the v8 / chrome people
([https://v8.dev/features/stable-sort](https://v8.dev/features/stable-sort))
after they'd finally come around to implement one of the oldest v8 feature
requests:
[https://bugs.chromium.org/p/v8/issues/detail?id=90](https://bugs.chromium.org/p/v8/issues/detail?id=90)

And stable does make for a better default than unstable: it offers stronger
guarantees and more reliable behaviour. That's doubly important because by and
large developers work to the implementation not the spec. And it's unlikely
you'll ever change that.

The last one to switch was Chakra (Edge), and that one was _weird_ as it used
a stable sort up to 512 elements, and unstable above.

Funnily enough, Mozilla had originally switched to a stable sort because MSIE
used a stable sort:
[https://bugzilla.mozilla.org/show_bug.cgi?id=224128](https://bugzilla.mozilla.org/show_bug.cgi?id=224128)

------
zzo38computer
These features are good (except I think Function.prototype.toString is a
mistake).

(An alternative to String.prototype.trimStart would be to use
String.prototype.replace, although trimStart would probably be more
efficient.) Some things are still missing: a goto command, the ability to
disable automatic semicolon insertion, macros, enhanced regular expressions,
and a built-in regular expression quotation function (it is easy enough to
implement in JavaScript, but it seems to me the kind of thing that should be
built-in).
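
The replace alternative mentioned above, side by side:

```javascript
'  hi '.trimStart();          // 'hi '
'  hi '.replace(/^\s+/, '');  // 'hi ', the regex equivalent
```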

------
javajosh
Good stuff! Normally I cringe to see new language features, but these are
pretty good (Java after lambdas, even arguably after generics, is pretty
crufty, IMHO)

One thing I'd like to request is that `Object.fromEntries()` simply skip
undefined entries, rather than do...strange things. I wrote a `mapObject`
function that looks like `(a,fn) => Object.fromEntries(Object.entries(a).map(fn))`
and occasionally the fn needs to skip an entry and the easiest way to do this
is just return `undefined`.

~~~
masklinn
> One thing I'd to request is that `Object.fromEntries()` simply skip
> undefined entries, rather than do...strange things.

Not sure what strange things it does. fromEntries simply takes each entry,
sets its first item as key (stringifying if necessary as object keys are
necessarily strings) and the second item as value. Naturally falls from these
that a null or undefined entry will error, and an empty entry will create an
{undefined: undefined} item.

> I wrote an `mapObject` function that looks like `(a,fn) =>
> Object.fromEntries(Object.entries(a).map(fn))` and occasionally the fn needs
> to skip an entry and the easiest way to do this is just return `undefined`.

Use flatMap instead?

~~~
javajosh
Yes, it errors out. And flatMap isn't applicable here. Here's some complete
test code you can try in your browser right now:

```
let p = (a,fn) => Object.fromEntries(Object.entries(a).map(fn))
p({a:1, b:2}, ([k,v]) => v === 2 ? undefined : [k,v])
```

My intent is to skip the entry with `b:2`. However, both map and flatMap error
out. It would be nice if `fromEntries` did what I think is the appropriate
thing, which is just skip undefined and null. Is this not the behavior you'd
expect?

Note that without special treatment in fromEntries, there's _no way_ for the
mapping function to indicate "skip this entry". You have to do it in another
pass, or some other way.

~~~
masklinn
> And flatMap isn't applicable here.

It's absolutely applicable, flatMap can trivially act as a filterMap by
returning an empty array in the "remove this element" case:

    
    
         let p = (a,fn) => Object.fromEntries(Object.entries(a).flatMap(fn))
         p({a:1, b:2}, ([k,v])=> v === 2 ? [] : [[k,v]])
    

there you go.

You can even make the adaptation transparent by wrapping the fn:

    
    
         let p = (a,fn) => Object.fromEntries(Object.entries(a).flatMap((v) => {
             let r = fn(v);
             return r == null ? [] : [r]; 
         }));
         p({a:1, b:2}, ([k,v])=> v === 2 ? undefined : [k,v])
    

> My intent is to skip the entry with `b:2`. However, both map and flatMap
> error out.

Return an empty list from flatmap to remove the entry and a singleton list
containing the new pair to alter it.

> Is this not the behavior you'd expect?

No. I would much rather have the existing function which has a very clear and
straightforward behaviour and is not prone to silently misbehaving on buggy
code.

> Note that without special treatment in fromEntries, there's no way for the
> mapping function to indicate "skip this entry".

I fail to see an issue with that. If you want to remove an entry, remove the
entry.

------
topicseed
I love Javascript and TypeScript, but with all the technologies gravitating
around JS, the setup for a project gets clunkier and clunkier.

But overall, loving the direction JS is taking.

~~~
gatherhunterer
I hear this a lot, but everything can be handled with a single tool: Webpack.
Even a TS project can compile with Webpack and all of its assets, minification
and compression needs can be handled in the same file. While you’re at it you
can set up a development server with source mapping. Just take an afternoon to
learn Webpack and be done with complaining about setup overhead. I set up a
multi-stage build for TS Project References but that’s only necessary if you
want to share code between the front and back ends. Aside from that you should
only need a single build step.

~~~
proxybop
> take an afternoon to learn we pack

If only it was that easy. The weeks I’ve spent hunting down webpack specific
bugs because the plugins don’t always quite work with typescript
transformations and source maps...

~~~
gatherhunterer
If you have spent multiple weeks on a Webpack bug then you are not trying to
fix it. You could have asked Stack Overflow over and over again in a matter of
weeks.

There are many options for source maps due to their performance overhead and
the docs describe them in detail and provide a guide for when to use each one.

------
homero
How are these used on a browser? Do you have to wait for browsers to update?
It's something I don't understand

~~~
gatherhunterer
Every individual feature has a support graph on MDN.

[https://developer.mozilla.org/en-
US/docs/Web/JavaScript/Refe...](https://developer.mozilla.org/en-
US/docs/Web/JavaScript/Reference/Global_Objects/AsyncFunction)

------
recursive
I've wished for a stable sort. But there doesn't seem to be any reliable way
to determine whether the current implementation is stable or not. For most
language features, you can do detection to find out whether you can use it or
not. This one seems to be different.

------
jaequery
i just wish for something like lodash to be baked into js

~~~
irrational
A lot of the newer functionality in JS was obviously influenced by jQuery.
Maybe the same thing will happen with lodash over time.

