Hacker News

It's very common in JavaScript to chain methods, to be able to do things like:

    _(array)
    .compact()
    .map(function(x) {
       return x * x;
    })
    .filter(function(x) {
        return x > 100;
    }).value()
(example from underscore/lodash)

The problem with this is that to add a custom function to this pipeline, you have to extend the library's prototype with your own functions, or re-implement the library.

Extending the prototype is considered bad practice for a lot of reasons: for example, it pollutes code in other modules, and upstream changes can break your code.
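To make the pollution concrete, here's a hypothetical illustration (made-up modules, plain `Array.prototype` for simplicity) of two modules silently fighting over the same name:

```javascript
// Module A defines compact() as "remove all falsy values":
Array.prototype.compact = function () {
  return this.filter(Boolean);
};

// Module B, loaded later, redefines it as "remove nulls only":
Array.prototype.compact = function () {
  return this.filter(function (x) { return x !== null; });
};

// Module A's code now silently gets B's behavior:
var result = [0, 1, null, 2].compact();
// A expected [1, 2], but result is [0, 1, 2]
```

Because every module shares one prototype, whichever extension loads last wins, and nothing warns the losing module.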

If instead of chaining methods we chained functions, we'd be left with this awkward syntax:

    custom(filter(map(compact(array), function(x) {
         return x * x;
    }), function(x) {
         return x > 100;
    }));
It's much cleaner to write it with the proposed pipeline operator:

    compact(array)
    |> map(function(x) {
        return x * x;
    })
    |> filter(function(x) {
        return x > 100;
    })
    |> custom
The linked proposal has a nice example and explanation of this with promises: https://github.com/mindeavor/es-pipeline-operator#sample-usa...
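Until `|>` ships, the same left-to-right shape can be approximated with a tiny `pipe()` helper. This is a hypothetical sketch (the helper and the standalone `compact`/`map`/`filter` functions are not part of underscore/lodash):

```javascript
// pipe(x, f, g) === g(f(x)): thread a value through functions left to right.
function pipe(value) {
  var fns = Array.prototype.slice.call(arguments, 1);
  return fns.reduce(function (acc, fn) { return fn(acc); }, value);
}

// Standalone, curried versions of the steps from the example above:
function compact(xs) { return xs.filter(Boolean); }
function map(fn) { return function (xs) { return xs.map(fn); }; }
function filter(fn) { return function (xs) { return xs.filter(fn); }; }

var result = pipe(
  compact([0, 5, null, 11, 20]),
  map(function (x) { return x * x; }),
  filter(function (x) { return x > 100; })
);
// result: [121, 400]
```

A custom function slots in as just another argument to `pipe()`, no prototype involved.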



> The problem with this is to be able to add a custom function to this pipeline, you have to extend the prototype of the library with your own functions, or re-implement it.

That's exactly what transducers try to solve, and fortunately they exist for JS (http://jlongster.com/Transducers.js--A-JavaScript-Library-fo...): processing of individual elements is separated from the plumbing, so you can add your own custom computation to the chain. Performance is pretty good too (http://jlongster.com/Transducers.js-Round-2-with-Benchmarks), all with standard data structures!

Of course, transducers are useful only when you want to transform data, not when you want to act on it in the chain.
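The core idea fits in a few lines. This is a hand-rolled sketch of the concept, not the Transducers.js API (the names `mapT`, `filterT`, and `appendStep` are made up here): a transducer wraps a reducing step, so element processing is decoupled from the data structure, and a custom stage composes like any other.

```javascript
// A transducer takes a reducing step and returns a new reducing step.
function mapT(fn) {
  return function (step) {
    return function (acc, x) { return step(acc, fn(x)); };
  };
}
function filterT(pred) {
  return function (step) {
    return function (acc, x) { return pred(x) ? step(acc, x) : acc; };
  };
}

// The final step just collects results into an array:
var appendStep = function (acc, x) { acc.push(x); return acc; };

// Compose map -> filter by wrapping steps (outer transducer runs first):
var xform = mapT(function (x) { return x * x; })(
  filterT(function (x) { return x > 100; })(appendStep)
);

var result = [0, 5, null, 11, 20].filter(Boolean).reduce(xform, []);
// result: [121, 400]
```

Adding your own custom stage means writing one more step-wrapping function and composing it in; no library prototype is ever touched.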



