It's pretty clear that Joe Armstrong's respect for the speed of light as a real constraint, and for data locality/data gravity, is starting to pay off in big ways.
I do wonder whether streaming large data chunks over Erlang distribution might be a problem, and whether a secondary data channel (e.g. over UDP or SCTP) might be worth playing with.
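For a sense of what that split could look like, here's a minimal Elixir sketch, assuming a hypothetical setup where distribution carries only small control messages and the bulk bytes go over a plain UDP socket (the node name, port, and :trainer process are made up for illustration; real code would also need fragmentation and retransmission for large chunks, which is part of why SCTP is tempting):

    defmodule DataPlane do
      # Assumed peer address and data-plane port; purely illustrative.
      @peer_host {10, 0, 0, 2}
      @peer_port 9999

      def send_chunk(chunk_id, chunk) when is_binary(chunk) do
        # Control plane: announce the chunk over normal Erlang distribution.
        send({:trainer, :"worker@10.0.0.2"}, {:incoming_chunk, chunk_id, byte_size(chunk)})

        # Data plane: ship the bytes over a separate UDP socket, so large
        # payloads never sit on the distribution connection at all.
        {:ok, sock} = :gen_udp.open(0, [:binary])
        :ok = :gen_udp.send(sock, @peer_host, @peer_port, chunk)
        :gen_udp.close(sock)
      end
    end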
Looking forward to NX transformations that take distributed training to the next level.
> I do wonder whether streaming large data chunks over Erlang distribution might be a problem, and whether a secondary data channel (e.g. over UDP or SCTP) might be worth playing with.
You may want to take a look at the partisan[0] library, written in Erlang. It is basically that: a reimagining of distributed Erlang, except that it can be multiplexed over multiple connections.

[0] https://github.com/lasp-lang/partisan
Yeah, but partisan gives you a ton of stuff you might not need, and the point here is to treat distribution as a control plane, keeping its concerns separate from the data plane. There used to be things to worry about with Erlang distribution in general, irrespective of backend -- head-of-line (HOL) blocking, iirc -- though I think those are resolved now.
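For anyone unfamiliar with the HOL issue: distribution multiplexes all messages between two nodes over a single connection, so a huge payload queued first could stall a tiny message behind it. A toy illustration (node names are made up; since OTP 22, large distribution messages are sent in fragments, which is the fix being alluded to):

    # On node :"a@host", with :"b@host" connected and running a :sink process:
    big = :binary.copy(<<0>>, 500_000_000)   # ~500 MB binary
    send({:sink, :"b@host"}, {:bulk, big})   # hits the dist connection first
    send({:sink, :"b@host"}, :ping)          # small message queued behind it
    # Pre-OTP-22, :ping could not arrive until the bulk payload had been
    # pushed through the connection; with fragmented sending they interleave.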
> It's pretty clear that Joe Armstrong's respect for the speed of light as a real constraint, and for data locality/data gravity, is starting to pay off in big ways.
I'm familiar with Joe Armstrong and Erlang/Elixir, but do you have a particular reference in mind where he was specifically discussing this? Is it one of his papers or talks? Just looking for another interesting thing Joe Armstrong said or thought. :)
I don't have a reference offhand, but I have seen it; it's mostly a vibe. Remember that Joe was a physicist before he was a programmer: the synchronicity problem is pervasive in the design of the platform. Immediate local access to shared data is generally a special-cased situation, reached via an escape hatch with tons of big red warning signs.
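One concrete reading of that, if it helps: the default in Erlang/Elixir is share-nothing message passing, where sent data is copied between process heaps (large binaries aside), and immediate shared access to local data only exists behind opt-in mechanisms like ETS, whose docs carry exactly those warning signs. A small sketch:

    # Default: no shared memory; the payload is copied to pid's heap.
    send(pid, {:data, payload})

    # Escape hatch: a public named ETS table gives any local process
    # immediate reads, at the cost of leaving the message-passing model.
    :ets.new(:cache, [:set, :public, :named_table])
    :ets.insert(:cache, {:weights, payload})
    [{:weights, ^payload}] = :ets.lookup(:cache, :weights)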