A slightly spread-out translation of the function to J is below. The idea is to take lengths 2 2 4 8 16... up to the required length, and then repeatedly interleave the two smallest ones to get a fractal pattern. The interleaved values are differences of white noise, and a final sum undoes these differences to leave a sum of white noise at several different frequencies.
pink =: {{
len =. (1>.i.@>.)&.:(2&^.) 2>.y
diffs =. (2-~/\0,[:(--.)?@$&0)&.> len
+/\ y {. >,@,.&.>/|. diffs
}}
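For readers who don't know J, here is a rough Python/numpy sketch of the idea. This is a Voss-style approximation of the same "sum of white noise at several different frequencies" construction, not a literal port of the J code's interleave/difference trick:

```python
import numpy as np

def pink(n):
    """Approximate pink noise by summing white noise held at
    octave-spaced update rates: octave k contributes noise that only
    changes every 2**k samples."""
    octaves = max(1, int(np.ceil(np.log2(max(2, n)))))
    total = np.zeros(n)
    for k in range(octaves):
        step = 2 ** k
        # ceil(n / step) fresh white-noise values for this octave
        w = np.random.standard_normal(-(-n // step))
        # hold each value for 2**k samples, truncate to length n
        total += np.repeat(w, step)[:n]
    return total
```

Each octave updates half as often as the previous one, which gives the 1/f-ish spectrum without any explicit filtering.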
Adapted from [0], which I was able to simplify somewhat when I ported it from an earlier J version [1]. I think having the whole algorithm there at once was definitely helpful in figuring out what I meant to do and writing it more cleanly.
I don't really see the appeal in using a math-like notation for things like this. The J code doesn't look anything like what I would write down to express this mathematically, and the equivalent python code is nearly as short and much more readable to me.
from matplotlib import pyplot as plt
import numpy as np
plt.plot(np.cumsum(np.random.choice([-1,1], 100)))
The appeal isn't immediately obvious with simple things, but something more complicated like 'plot 100 random walks of length 1000' is basically as easy in J and slightly less easy in numpy. (How long would it take you to adjust the numpy code? If it is less than about 5 seconds, J wins)
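For reference, the numpy version of that adjustment amounts to changing the shape tuple and adding an axis (plotting line shown as a comment, since each column becomes one walk):

```python
import numpy as np

# 100 random walks of length 1000: steps in a (1000, 100) array,
# cumulative sum down axis 0 so each column is one walk
walks = np.cumsum(np.random.choice([-1, 1], size=(1000, 100)), axis=0)
# plt.plot(walks) would then draw one line per column
```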
The cost of doing something extra is much lower in J/APL than other more verbose languages just because of how easy it is to type a few characters and get a result - this combined with a repl and similar things makes it really nice for exploration.
Intuitively it seems like the more complicated the problem, the bigger the advantage for something like numpy, where functions are used for most higher-level abstractions. For example, what if you want to change the logic to model a stopped game, where you now keep each walk at zero once it hits zero, and count the proportion of walks that have stopped after some fixed duration? In numpy I can always fall back to list comprehensions or for loops for something like that. Is it easy to express logic like this in J?
Yes, that example is easy, although you got it wrong: I said 100 random walks of length 1000, which should be (1000, 100), not (100, 1000). However, there are still more 'moving parts' in numpy than in APL/J: you have to modify the tuple, and the axis, and other things. It all adds up when making lots of small iterations.
The modification you mentioned is straightforward in any array language. Here is what you mentioned in APL (I know APL more than J but I'm sure the J would be basically the same):
⊃⍤⍸⍤1⊢0=+\¯1*?100 1000⍴2
This is the number of steps to return to 0 for each of 100 random walks of length 1000. (It is then easy to analyse the frequency/average/etc with only a few more characters)
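For comparison, a numpy rendering of the same computation (variable names are my own): the first step at which each walk returns to zero, the "freeze at zero after absorption" variant asked about above, and the fraction absorbed within the fixed duration:

```python
import numpy as np

rng = np.random.default_rng()
steps = rng.choice([-1, 1], size=(100, 1000))   # 100 walks, 1000 steps each
paths = np.cumsum(steps, axis=1)

hit = paths == 0                                # where each walk touches zero
stopped = hit.any(axis=1)                       # absorbed within 1000 steps?
# 1-based step of first return to zero, or -1 if the walk never returns
first = np.where(stopped, hit.argmax(axis=1) + 1, -1)
# hold each walk at zero from its first return onward
frozen = np.where(np.maximum.accumulate(hit, axis=1), 0, paths)
fraction_stopped = stopped.mean()
```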
Interesting, but I guess I'd have to learn APL or J to compare. Without understanding what those symbols mean, it seems much less like "math" than the numpy version.
⊃ is 'first', ⍸ is where, f⍤g is like function composition, and ⍤1 means apply to each rank 1 (vector), so ⊃⍤⍸⍤1 means 'the first true for each vector'
Having a tiny bit of APL experience, I thought 2 lines was already 1 line too many, before I saw that one of the lines was essentially just an import statement. ;-)
IMHO the APL-family languages are precisely good at things like this, where there's lots of "array processing" going on; on the other hand, typical branchy business logic or other more "mundane" algorithms tend to be a bit harder to express.
Co-dfns is a compiler for a subset of Dyalog APL that targets GPUs. Also supposedly Futhark, a language with a compiler generating GPU code, is a good candidate to target for an APL compiler. An example APL-to-GPU pipeline is APL->apltail->tail2futhark->futhark->GPU as explained in [1].
> One misconception is that languages like J are in the same caliber as Brain F**k and other esoteric or golfing languages which are mostly for recreational programming. Languages like J however were designed to be a replacement to traditional maths notation - an interface for thinking or a tool of thought as their inventor Kenneth E. Iverson called them in his Turing Award lecture.
Does it actually succeed though? The author aims to show how J helps conceptualize a random walk, but from where I stand, it just makes everything look like a regex or something. I don't see what the insight's supposed to be.
Everybody knows you can do a random walk by simulating an array of random 1 and -1 values and doing a cumsum on that array. You can do that in MATLAB or Python or R too, and it's about as short and more readable.
Not sure about J, but I feel APL certainly does, at least for me - the symbol set makes it very easy to sketch out a solution.
Part of that ease is the generality of the various operators: reductions and cumulative prefix scans can be done with any function. For example, ⌊\ is a cumulative min, which Vanessa McHale shows to be useful: <http://blog.vmchale.com/article/numba-why>
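numpy has a similar kind of generality through its ufunc machinery: any binary ufunc gets a cumulative form via `.accumulate`. A small example of the cumulative-min analogue:

```python
import numpy as np

# numpy analogue of APL's cumulative minimum: scan with the minimum ufunc
x = np.array([3, 1, 4, 1, 5, 9, 2, 6])
running_min = np.minimum.accumulate(x)
# running_min is [3, 1, 1, 1, 1, 1, 1, 1]
```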
kspalaiologos also shows APL enabling mathematical understanding on her blog <https://palaiologos.rocks/>
An APL equivalent of the J is +\¯1 1[(?100⍴2)] although others exist as there's no exact equivalent for J's { - see <https://aplwiki.com/wiki/From>
> One misconception is that languages like J are in the same caliber as less practical languages (esoteric languages like brainf*k) that use symbols more than words. The assumption is that these are mostly for recreational programming.
Making some crude assumptions here, I believe you've already invested a lot of time into studying python, or some other well known language. Understanding python or C gives you an idea of hundreds of other languages because they use similar keywords and structures to represent their code. English also helps a great deal in understanding these languages.
Now, looking at this, giving a language like APL or J a chance means that you have to spend time learning their kind of notation. If you spend your time looking at APL without making an effort to understand it, then you cannot be surprised that it looks like regex to you.
J would be much more palatable if it focused on integrating itself into an existing environment rather than doing its own thing.
For example, say I want the digits of 1000!
echo '!1000' | j9 -c gives '_', J's infinity (1000! overflows double-precision floats).
echo '!1000x' | j9 -c gives some of the digits. | rev, gives ...8393650(...)
echo '(9!:37) 0 _ 0 1 \n !1000x' | j9 -c gives all the digits. | rev, gives 0000000(...)
echo '!1000' | ivy works as expected, but has a noticeable startup time.
Of course, J and only J works just fine internally with the second form.
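For comparison, the same computation in plain Python, whose integers are arbitrary-precision by default, so no extended-precision suffix or print-precision setting is needed:

```python
import math

# All the digits of 1000!, no flags or print-precision settings required
digits = str(math.factorial(1000))
# 1000! has 2568 digits and ends in 249 zeros
```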
On the subject of modules: Don't. If your plot program is so great then I want to use it, outside J. If its main feature is interoperability with an operability-hamstrung language, then to quote another commenter, "Pass".
I don't know if this counts as major, but I work on a presentation tool written with J and Godot. It's a "time travelling" REPL recorder built atop a console widget library. The Godot integration is written in rust (so also a start on rust integration). Godot provides the ability to sync audio and arbitrary animations/graphics with the console animations.
I cannot share specific projects. J is particularly good for numeric/algorithmic experimentation. It is interactive/REPL-based. Types and boilerplate are minimal. There is no compilation step and minimal package overhead. This is the language I reach for when step 2 is not going to be "install these packages" or deploying to share code. It is most remarkable for data hacking, as it is trivial to manipulate structures while maintaining performance. I use python, R, and clojure frequently, but the ability to move quickly in J is without parallel. Weaknesses include namespacing and deployment, although I have seen substantial codebases deployed both on desktops and servers. Multi-threading and AVX-512 instructions in _your_ (not a package's) code, from the REPL, are some of what you get with j904.
It sounds like we do similar sorts of things (based on tool list) but each time I poke at an APL like system I back away clear on learning curve and unclear on value.
At core, my job is arithmetic on 3D arrays of approx 10x1000x100,000,000.
The rub is that for every LOC written manipulating those structures, I've got 100 LOCs doing IO (broadly defined) and then 1,000-10,000 doing some form of ETL, QC, or normalization (i.e. finding and validating the correctness of the magic numbers that go in the cells of the big array).
Do you think J/APL would be of any use to me and if so where in your similar projects’ life cycle does it crop up?
Possibly so. We do simple IO primarily from parquet or csv data, and I am working on the Apache Arrow/Flight package currently. Our data sets typically fit in RAM but either if you have enough RAM or a file amenable to memory mapping, you shouldn’t have a problem. J has a memory mapping utility. Numerical data is straightforward. We do ETL, QC, and normalization in J. Avoid transposes where you can, but reshapes are trivial and done with metadata, so fast. Overall what you describe sounds like a fun project to try in J.
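The reshape-vs-transpose point can be illustrated in numpy terms too (numpy used here just for familiarity): a reshape is metadata-only and shares the original buffer, while materializing a transposed layout forces a copy.

```python
import numpy as np

a = np.arange(12)
b = a.reshape(3, 4)          # metadata-only view: no data movement
t = b.T                      # transpose: still a view, but non-contiguous
c = np.ascontiguousarray(t)  # forcing a contiguous layout copies the data
```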
I suspect the J package Jd is probably the most non-trivial public codebase. I don’t love the coding style (functions are long and scripted) and it doesn’t make use of newer lambda functions (“direct definitions”) which are easier to read. https://github.com/jsoftware/data_jd
I remember looking up how people solved the problems on Project Euler, and there would always be a J answer to the first problems taking less than a few dozen characters.
At the time my own answers in C would take dozens or even hundreds of lines. I didn't know J existed and thought people posting those absurdly small solutions were just trolling or running a scam.
Man was I left dumbfounded when I learned this thing actually runs and produces the correct answers.
Every few years I try again to pick up J, however I quickly get frustrated as the "ASCII noise" means you have to memorize an awful lot to be able to read real J. In contrast, I can generally read code in any C-like or Pascal-like language as there's enough familiar convention that I can patch in the rest.
For the lesser-gifted like myself, I wonder if one couldn't attain the power of J without the cryptic syntax. Haskell, for example, has a generally very easily understood syntax for function application: if "add a b" invokes a binary add function, then "add a" is a unary function that adds a, and the syntax allows for infix expressions, like "a `add` b", "(a `add`)", and (`add` b). (The latter are more useful for non-alphanumerically named functions, like (++ [42]).) I wonder if J/APL couldn't use a similar strategy to trade a bit of terseness for something much more readable.
I don't understand the fetish for languages like this. Their notations and constructs aren't very high fidelity representations of the way most people who use mathematics think when working on a math problem. Maybe for a (relatively tiny) subset of such individuals, but not generally. The ability to parse and understand the code in these languages has nothing to do with how intelligent one is (outside outliers, such as those with low IQs).
You seem to take it as a given that the language needs to work the way you already think? The people enthusiastic about array languages have put some time (but not as much as you might expect) into learning this different way of thinking about problems and find it pretty neat. Okay, the fetishization of a language that looks really crazy is... probably the main reason it ever shows up on HN. But the different way of thinking is pretty neat!
I understood "lesser-gifted" as just referring to facility with symbols and memorization, so I feel like you're reading a bit too much into that comment.
Start using J as a desk calculator. One can just keep a J window open and run tiny expressions every once in a while for various needs.
It probably won't effectively teach the more complex features; you might need a bigger project for that. But it will add to familiarity. Where other languages have lots of libraries, J has a language where many things are easy to assemble from a few primitives.
Technically that’s not addressing my point, rather you are suggesting how to learn J, but sure I’ve tried this in the past and I’ll try it again perhaps with a cheatsheet.
[0] https://github.com/mlochbaum/BQNoise/blob/master/tracker.bqn...
[1] https://github.com/mlochbaum/JSound/blob/master/makedrums.ij...