
J for C Programmers - sea6ear
http://www.jsoftware.com/help/jforc/contents.htm
======
rebootthesystem
I'm sorry, as someone who used APL professionally for about ten years, J
simply fills me with an urge to defecate. It is an absolute abomination. It
left behind the power of notation as a tool for thought and turned the concept
of APL into a pile of gibberish on a page.

[http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pd...](http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pdf)
Where are APL and J? For the most part, dead. Sure there are people using
them. If someone like me who used the language almost exclusively for ten
years hasn't touched it for two decades I think it is safe to say: Great
language to learn. Powerful and fantastic concepts. But, no thanks. Happy with
C, C++ and Python.

The failure to open source a good APL didn't help either.

If J-Software wants to make a last valiant effort to bring this back my
opinion is simple: Read the paper linked above and go back to the roots.
Extend the language to make it object oriented and create interfaces to C, C++
and Python. That's a start. I could write a paper on how to improve APL. It
has to start with a commitment to moving software engineering into the realm
of communicating through notation rather than cryptic text.

~~~
FreezerburnV
What's interesting to me is how, as I've started to really look into things
like J or K, I've begun to appreciate the whole "cryptic text" side of
programming while rebelling against the more verbose OOP/whitespace delimited
code. Sure, initially it might look like some C code started printing out
random data beyond the end of an array, but as I've started to study it, it
really has started to make more sense to me to use this highly condensed form
of programming. I've even begun to toy with the idea the original creator of K
uses, wherein he hates scrolling, so all his code is organized into a few
small files so that he never has to scroll any of them. One benefit of this
kind of coding is that you can easily see and reference everything that is
used, since your eyes only have to move a small distance and you don't have to
scroll 500 lines to see the definition of whatever you're trying to look up. I
also feel like it promotes simplicity: because the style demands dense, terse
code, you really need to make sure what you're writing is the simplest thing
that fits into a small amount of text.

Anyway, I'm not saying you have to personally like something such as J. If
you're productive in C/C++ and Python, by all means create things in those
languages. But please don't call J an absolute abomination, as I believe it
merely calls coders with a different mindset/aesthetic desire to code in it.

~~~
tailrecursion
After reading a lot of K code I had the same experience, where word-based
programs now read like someone from the Slow Talkers of America.

Learning APL, I was skeptical of its symbology. But by avoiding (verbal)
mnemonics, each symbol becomes its own thing: it has a very specific meaning
that is unique to APL. I can see the value in manipulating those very
precisely understood symbols once a person knows the language cold.

J however remains above my pay grade.

~~~
FreezerburnV
Out of curiosity, what are you doing that you end up reading a lot of K code?

------
tailrecursion
I've been struggling to learn J for a few weeks, and there are several things
that make it difficult.

1) The lack of spaces in the majority of code makes it difficult to lex, let
alone parse, meaning I can't tell what the operators are. When people use
spaces it makes a difference.

2) Every operator has at least two meanings, and some have three. That's more
than 100 operators to memorize, and 50 of them share the same spelling as the
other 50.

3) The parsing rules as described strike me as extraordinarily complicated.

The ideas in APL are worth learning. For example striping objects and using
grade up to make a concordance or annotate any other data structure. Another
example is having the primitives auto-sort collections when searching to get
n log n complexity. Almost any language can use these ideas. If Python is one
step up from Standard ML because of dicts and out-of-the-box primitives, then
APL is one step up from Python.

If anyone who knows APL well is around, can I ask some simple questions?

1) Do user functions and if statements vectorize? Meaning, if I write a
recursive function to implement Euclid's GCD and then pass it two arrays, will
it give an array of pairwise GCDs?

gcd(a,b) = if b=0 then a else gcd(b,a mod b)

where = and mod are both vectorized APL primitives.
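Not an APL answer, but the behaviour being asked about can be sketched in NumPy (function name is my own; NumPy also ships `np.gcd` as a ufunc that does this directly):

```python
import numpy as np

def pairwise_gcd(a, b):
    """Elementwise Euclid's GCD: repeat one Euclid step until every b is 0."""
    a = np.array(a)
    b = np.array(b)
    while np.any(b):
        mask = b != 0
        # apply (a, b) -> (b, a mod b) only where b is still nonzero
        a[mask], b[mask] = b[mask], a[mask] % b[mask]
    return a

pairwise_gcd([12, 18], [8, 27])  # array([4, 9])
```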

2) Why do I see so few user functions being used? I would expect application
code to consist mostly of specific functions but instead I see line after line
of unbroken primitive operators.

(EDIT: fixed formatting.)

~~~
kd0amg
_Do user functions and if statements vectorize?_

In J, user-written verbs have rank _ _ _ (infinite for both unary and binary
cases) by default, but you can rerank them with the " operator just like any
built-in verb.

 _Why do I see so few user functions being used? I would expect application
code to consist mostly of specific functions but instead I see line after line
of unbroken primitive operators._

Lots of purpose-specific functions can be built using composition forms on
built-in verbs. The resistance to naming sub-pieces of these compositions is
largely a cultural thing but still a point of dispute within the community --
some are very much in favor of giving descriptive names to intermediate
results.

~~~
tailrecursion
I imagine that making user functions would require leaving the one-symbol-one-
function spelling system that J programmers are accustomed to, and that
spelling system is part of the appeal. Also, if the idiomatic form with
primitives is shorter than the name, it's tough to consider the name an
improvement. :)

Thanks for your response.

~~~
kd0amg
Yeah, some things are so common and small as to not merit their own name
except in a tutorial. It might be fun to try using those to construct really
garden-pathy code. Stick the fork `+/%#` (average when used on its own)
somewhere in a train, but use the dyadic case, or position it so that parsing
splits it up.
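For readers who don't know J: a monadic fork (f g h) applied to y computes g(f(y), h(y)), which is why +/ % # (sum divided by tally) is the mean. A rough Python model of that combining form (names are my own):

```python
from operator import truediv

def fork(f, g, h):
    """Model of a J monadic fork: (f g h) y  ->  g(f(y), h(y))."""
    return lambda y: g(f(y), h(y))

# J's  +/ % #  -- sum divided by count, i.e. the average
average = fork(sum, truediv, len)
average([1, 2, 3, 4])  # 2.5
```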

~~~
klibertp
I found that it quickly becomes a mess with dozens of parens when you try to
do this. The parsing rules of J are complex, and the same sequence of symbols
can be parsed in a couple of different ways depending on the context - it's
good to keep expressions very short to avoid this.

------
plaguuuuuu
This is what Fibonacci looks like, apparently

    
    
        f1a=: 3 : 0
         {. y f1a 0 1x
        :
         if. *x do. (x-1) f1a +/\|.y else. y end.
        )
     
        f1b=: {.@($:&0 1x) : ((<:@[ $: +/\@|.@])^:(*@[))
    

I think I have PTSD already just from reading this stuff.

~~~
gd1
Hmm, that is pretty ugly compared to its APL derived cousins in Q/K/KDB...

    
    
      /first n+2 numbers of fibonacci sequence
    
      fibonacci:{{x,sum -2#x}/[x;0 1]};
    

And then...

    
    
      q) fibonacci[10]
    
      0 1 1 2 3 5 8 13 21 34 55 89
    

I haven't seen much J before, is that really the best implementation?
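For comparison, a direct Python transcription of the K version above (my naming):

```python
def fibonacci(n):
    """Like {x,sum -2#x}/[x;0 1]: start from [0, 1] and, n times,
    append the sum of the last two elements."""
    xs = [0, 1]
    for _ in range(n):
        xs.append(sum(xs[-2:]))
    return xs

fibonacci(10)  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89]
```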

~~~
klibertp
There are some other example implementations here:

[http://www.jsoftware.com/jwiki/Essays/Fibonacci%20Sequence](http://www.jsoftware.com/jwiki/Essays/Fibonacci%20Sequence)

This:

    f0b=: (-&2 +&$: -&1) ^: (1&<) M.

is probably the most elegant solution. It's recursive ($: stands for "self-
reference"), memoized, and it works:

    f0b&.> i.20

    │0│1│1│2│3│5│8│13│21│34│55│89│144│233│377│610│987│1597│2584│4181│
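The moving parts of that J definition have familiar Python analogues, if that helps decode it: self-referential recursion guarded by 1 < n, with lru_cache loosely standing in for J's M. memoization adverb:

```python
from functools import lru_cache

@lru_cache(maxsize=None)   # loosely plays the role of J's M. (memoize)
def fib(n):
    # mirrors (-&2 +&$: -&1) ^: (1&<): recurse only while 1 < n
    return n if n < 2 else fib(n - 2) + fib(n - 1)

[fib(i) for i in range(20)]
# [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, 610, 987, 1597, 2584, 4181]
```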

------
orbifold
I think the key idea behind APL and J is that they take one fundamental data
structure, namely arrays of arbitrary rank containing characters, numbers or
boxes, and define a number of operations on them, each of which has a natural
mathematical definition and satisfies a number of laws. For example APL has at
its core combinators ι ρ φ ⊤ , . \ / and a number of binary and unary
operators.

To give an example, / is like fold in a functional programming language, so +/
would be foldr (+) 0 in Haskell, where foldr (+) 0 :: (Foldable t, Num a) => t
a -> a. However, you would have a hard time implementing J-like arbitrary-rank
arrays efficiently in Haskell while supporting all the operations it does.
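Concretely (Python's reduce used here only as neutral notation):

```python
from functools import reduce
from operator import add

# J's  +/ 1 2 3 4  folds + over the array, like foldr (+) 0 in Haskell
reduce(add, [1, 2, 3, 4])  # 10
```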

The philosophy of J and K is to isolate the essential operations and laws the
data structure you are trying to manipulate has and to implement an
interpreter for those operations that is as efficient as possible. This is in
contrast to a conventional general purpose language that tries to implement a
sufficiently smart compiler and primitives to create new data types from a set
of primitive ones. The problem is that it is quite hard and sometimes
impossible to teach the compiler retroactively about all the nice properties
your newly defined data structure has.

I believe several such domain specific interpreters/compilers could be
"stitched together" into a general purpose language, maybe integrated by a
system specialised in symbolic computation. Instead of compilation to machine
code the top level system would compile down to the base operations for each
data structure and simplify according to the laws the base operations satisfy.
Haskell and many other languages do that by compiling down to a Core language
and then applying a huge number of simplification passes on that intermediate
language.

------
DennisP
Now I've gone and installed J and started playing around. I'm doomed.

Just the most basic stuff is pretty nifty:
[http://www.jsoftware.com/help/learning/01.htm](http://www.jsoftware.com/help/learning/01.htm)

------
barbudorojo
Perhaps the hope with J was that symbols could create a grammar for concepts.
In maths we use intuition to define an integral as a certain kind of
summation, a progressive difference as a certain kind of derivative, and so
on. I was thinking that perhaps J symbols could create some form of grammar
that suggest new concepts from primitive ones. But the language doesn't
provide me with that kind of thinking; in the end it is simply a way of
writing code that is cryptic and difficult to understand. The idea of using a
language with automatic vectorization is orthogonal to the syntax of J.

Haskell is different: concepts like monads and categories allow you to obtain
a higher-order view of your ideas, to frame them in a different context. That
is an experience that expands your concept of programming. I don't experience
that feeling with J. I paid the price to learn the vocabulary and grammar, but
I don't think the language is worth the effort.

~~~
codygman
Thanks for sharing your experience!

------
yummyfajitas
A question. I've played with J a bit, but I haven't had the "burst of
enlightenment" that some suggest will come (ala lisp/haskell/etc).

My hypothesis based on a few conversations: the "burst of enlightenment" is
treating problems like "applied math" rather than computer science. I.e.,
"first represent as a vector, then do linear algebra, iterate this process
until < epsilon". Since I already do this in Python/Julia/Mathematica, J just
feels like a variant with esoteric syntax.

Possibly my problem is that I'm simply programming Python in J. Can anyone
with deeper knowledge of J confirm/deny this?

~~~
avmich
Can you rephrase your question? What do you want to confirm or deny?

~~~
yummyfajitas
Is the "enlightenment" one gets from using J based on vectorization and linear
algebra as everyday tools (much like numpy) or something else?

~~~
avmich
Linear algebra - I wouldn't say so. Vectorization - yes, to a certain extent;
whenever you'd have a loop, you have to remind yourself that it's implicit -
and that forces you to think about the collection as a whole instead of a
single element of the collection. Sometimes there is apparently no collection
to begin with. That, in turn, makes you think why - is it actually correct, do
you have a good design?

In J it's much more natural to load a whole dataset into memory and then work
with it as a whole. To make it easier, you're strongly nudged to design your
data structure simply and uniformly. Thinking about clarity of data structures
has many advantages - e.g., clearly seeing the algorithm as a whole, being
able to pinpoint (and question) exceptions, optimize on the high(est) level...

That doesn't mean J can't do sequential processing - but if you choose that,
you want to understand why: is it the nature of the problem or is it a
(questionable) implementation detail?

Some start learning J by using it as a calculator. Indeed, what could be
easier in the beginning - typing 2 + 2 <Enter> produces the expected result,
same for 3 * 4 and 5 - 1, then you have the built-in exponent (^3 produces
20.0855), logarithm (^. 20.0855 - kinda easy to remember, an inverse function),
factorial (!6 makes 720)... Gradually it becomes less conventional - the order
of operations is always right-to-left (as in sqrt(sqrt(sqrt(2))), which would
be %: %: %: 2), negative numbers look like _1, _2, _3, sine and cosine are in
the family of o. functions - 1 o. 1.5 gives 0.997495, 2 o. 1.5 gives
0.0707372 - and
you're also allowed something like 5 + 1 2 3 which returns 6 7 8. A few more
iterations - and you can use moderately complex expressions with infix
notation, typing quite economically. Then vectorization gradually grows in
more complex expressions...
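The specific values quoted above check out; in Python terms (the math module standing in for the J primitives):

```python
import math

math.exp(3)        # J: ^ 3        -> 20.0855...
math.log(20.0855)  # J: ^. 20.0855 -> ~3 (natural log, the inverse of ^)
math.factorial(6)  # J: ! 6        -> 720
math.sin(1.5)      # J: 1 o. 1.5   -> 0.997495...
math.cos(1.5)      # J: 2 o. 1.5   -> 0.0707372...
```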

~~~
yummyfajitas
So concretely, consider the following problem. I'll tell you the python way,
maybe you could tell me the J way. I want to simulate a random walk, run a
statistical test on it at each time, then find the first time when the test
returns positive.

In numpy:

    
    
        from numpy import cumsum, where
        from scipy import special
        from scipy.stats import bernoulli
    
        data = cumsum(bernoulli(p).rvs(1024*16))
        test_statistic = special.xlogy(data, p) * ... other stuff...
        first_pos = where(test_statistic > b)[0][0]
    

bernoulli(p).rvs(1024*16) generates a random array of the form
[0,1,0,0,1,0,1], and where(...) returns an array of places where the condition
is true.

If we translated this code to J, would it be idiomatic?

(Incidentally it does happen to be wildly inefficient - a simple for loop (in
C) with no arrays is about 100x faster.)

------
jimrandomh
I am convinced that the J programming language is an elaborate hoax. I just
don't understand how anyone is falling for it.

~~~
zimpenfish
I feel the same way about Haskell and Monads.

~~~
WolfeReader
It's true: everything you don't already intuitively understand is just a
prank, and everyone who claims to get it is either a trickster or a mark.
Isn't it fortunate that you already know everything worth knowing!

How are your Blub projects going these days?

~~~
zimpenfish
That's ... quite the interpretation and projection from one throwaway line.

------
ryan-c
As someone who likes Perl, this frightens me. It seems like an esolang; is it
really intended for productivity?

~~~
mhd
When I look at APL, it seems a bit clearer where this all is coming from.
After all, we're perfectly fine with using an asterisk as infix notation for
multiplication, even if all the rest is done with functions, methods etc..
This isn't just the case because we've learned mathematical notation in school
and thus "x <asterisk> y" seems natural enough (as opposed to e.g.
multiply(x,y)), but also because we learned to accept the "asterisk" as a
substitute for the middle position dot or the cross. The same is true for
division or especially weird stuff like logical-or...

J works along the same lines: First, you had to accept a "widened" set of
mathematical operators with arrays as the core. Which is APL with its funky
fonts and keyboards. And J then takes away your more readable, but
inconvenient to type notation and replaces it with ASCII. It's a bit like
learning calculus with lisp as your first notation...

That's the syntactical hurdle. Then you've also got to get used to the array-
centric programming model. Just like stacks for forth, functions for
functional programming or lists and recursion for lisp.

As opposed to lisp, I think the syntax is the bigger issue here for a change.
You don't just have to change the order in which you read things and ignore
some parens, you've got to get over some very deeply ingrained habits, not
just from other programming languages, but from normal writing itself. No
scanning, neither horizontally nor vertically. The closest analogy for me were
regexes, where you can't just skip ahead, either.

Take that with a grain of salt, I'm just trying to get into it myself. Two
things that I might want to try out to really _grok_ it are programming in
that style in other languages (like the infamous Whitney C example) to adjust
my reading style without having to adjust to other paradigms at the same time,
and to edit APL with _ed_ , so that focusing on a single line becomes
potentially easier.

edit: replaced asterisk with, well, "asterisk". How does one escape that in
HNews? backslash didn't work.

------
NaNaN
I thought it was a J library for C when I saw this title. Anyway, I think
APL/J/K is the real geek code (and its syntax will be forgotten sooner than
Perl syntax if you don't use it often. ;)

------
toolslive
Greg Wilson (from the 'Making Software' book) has been claiming the cost of
building software is rather constant, regardless of the programming language. The
formula seems to be "one can build and maintain 10 lines of code per hour".

If this holds for J, K, APL we should all be doing that.

------
amelius
> Writing code in a scalar language makes you rather like a general who gives
> orders to his troops by visiting each one and whispering in his ear.

Well, actually, I use for-loops for that, thank you :)

~~~
klibertp
...which is exactly the same as "visiting each one and whispering in his ear"!
With a for loop you still need to know how many "troops" you have and then go
and do something with every one of them, one by one.

"foreach"-style loops with iterators in many languages eliminate the need for
knowing exactly how many elements you have, but you _still_ have code for the
explicit "take the next element" operation.

Vectorized operations are the next logical step, where the code for iteration
("visiting each one") is completely hidden. So instead of explicit for-loop,
or equally explicit map function, you just apply the operation directly to the
list and get a list as a result. This is nice, because it eliminates A LOT of
syntactic and semantic noise from the code which mainly works with
collections.

Such vectorization is available in many languages as a library - for example
in Python via NumPy, where you can say "numpy.array([9,6,7]) + 4" and get
"array([13, 10, 11])" as a result. In J and APL support for this is built in.

~~~
amelius
You are implying that if there are N troops, then I need to do O(N) amount of
work to give them instructions.

Not so.

I only need to write down 2 lines of code: one line for the for-loop, and one
line for the actual instruction(s).

Personally, I prefer explicitness instead of increasing the amount of
overloaded meaning of existing operators.

~~~
klibertp
> Personally, I prefer explicitness

Of course, that's one way to look at it, and a valid one. Most of the time I
do too, but I can accept some degree of implicit behaviour if it's consistent
and (conceptually) simple enough.

> the amount of overloaded meaning of existing operators

But this is not true with regard to J. In J and other array-based languages
the number of overloaded meanings for operators is actually smaller than in "normal"
languages. That's because in J all the operators work on arrays and only on
arrays. No operator ever deals with things other than arrays, and in basic J
there's just one thing other than array anyway (a box).

In Python this:

2 + 2

and this:

[1, 2, 3] + [1, 2, 3]

are two different operators, but in J (ignore syntactic differences) this:

2 + 2

and this:

1 2 3 + 1 2 3 NB. resulting in 2 4 6

are both applications of the same operator.
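The Python half of that claim is easy to check, and NumPy arrays give the J-style behaviour:

```python
import numpy as np

# Python's + on lists is concatenation -- a different operator from numeric +
[1, 2, 3] + [1, 2, 3]                       # [1, 2, 3, 1, 2, 3]

# NumPy (like J) applies the same elementwise + to scalars and arrays alike
np.array([1, 2, 3]) + np.array([1, 2, 3])   # array([2, 4, 6])
```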

There is a certain appeal to languages built on "turtles all the way down"
principle - one of the benefits is that they tend to be compact, simple and
consistent. On the other hand, when you're not dealing with turtles, they tend
to be irritating at best and completely unusable at worst.

Anyway, I'm not trying to convince you to adopt J right now and start using it
for writing web apps, I just want to say there are some interesting concepts
in J which, while being unfamiliar, are not necessarily worse than the things
we're used to. And they're not useless either: a week of learning J made my
NumPy code much better, for example.

------
AnimalMuppet
Interestingly, we just had an article on HN that complained about the amount
of mental RAM that C++ takes nowadays. I can't imagine that J takes less.

------
inquist
I find this hilarious.

~~~
klibertp
What exactly and why?

