
Don't Learn to Code, Learn to Program – But Come Back in 10 Years - dirtyvagabond
http://johnkurkowski.com/posts/dont-learn-to-code-learn-to-program-but-come-back-in-10-years/
======
jimbokun
My paraphrase of this article:

"All of the blood sweat and tears poured into making programming more
productive over the years just isn't good enough. No, I haven't done anything,
personally, to improve the situation. My plan is to continue bitching about
how things aren't good enough until...well, indefinitely, because bitching
about other people's efforts is much easier than doing the hard work of
solving a challenging problem myself."

Is that too harsh? I don't see a single concrete, actionable proposal in
there, let alone anything the author has personally done to fix the sorry
state of programming.

I also think it's a huge disservice for him to dissuade his friends from
programming until some state of nirvana is reached in programming tools. There
are some hurdles to overcome, but a little bit of programming skill can
greatly amplify the reach, scope and influence of non-programming talents.

~~~
nevir
I read it as: "programming could be so much better; I have a glimpse of what
it could be like, but don't know how to get there."

~~~
joe_the_user
I like your reading better. I would have liked the blog better if it had had
more of the feeling of humbleness that your words have.

The blog felt like it was saying "shut up till you do something better" where
it should have said "Imagine how cool it would be to do something better".

~~~
mattschmulen
A humble approach is always nice; it's always challenging to give an earnest
opinion (after all, that's why I read your content, for your opinions) and be
openly modest. Your (the author's) terminology differentiating programmers
from coders is different from my own naming. My wording is often around
application development vs. systems programming: a systems programmer writes
software for humans to control machines (databases, OSes, tools), whereas an
app developer is focused on software (computers) that controls humans (apps
that constructively shape human actions). I liked the article, good post.

------
zedshaw
This is an entirely pointless false dichotomy. "Coding" and "Programming" are
just synonyms for the same damn thing: make a Turing machine do stuff. I have
no idea why people are suddenly trying to make a false distinction when there
was none for decades and decades, but frankly it's stupid. My only thought is
there's a bunch of butt-sore guys who are upset that learning to make a
computer do stuff is nothing special and are now trying to find a new way to
differentiate themselves with the moniker "Programmer", looking to shit on
anyone who's "Just A Coder".

So I propose a new term. Turingulate: verb. Making a Turing machine do stuff.
Typing Python into a file and running it with python? Turingulated! Running a
Pd graph to create a fuzz pedal? Turingulating! Using Max/MSP? Turingulating!
Programming because you're so special? Turingulate. Coding to .. uh code some
... uh code... which isn't programming? Turingulator!

So stupid.

~~~
sjs
Colloquially they are synonyms. Especially amongst people who don't make
software. It's not derogatory and not worth getting bent out of shape about.

BUT ... (Bet you saw this coming)

Coders turn detailed specs into code. No real thought required.

Programmers, or software developers, design systems and code them themselves.

That's the distinction I make anyway. I've seen coders at some workplaces.
It's a dreary job and I would never do it.

~~~
collyw
I have seen similar articles with the word "programmer" used in place of
"coder" and "developer" used in place of "programmer". I don't think there is
any hard and fast rules as to what makes a coder / programer / developer /
software engineer.

I think that is part of the problem. The outside world sees these as equal.
While after 10 years in the industry I can see the difference between someone
who can write a script that will get the job done (for the time being), and
someone who can design / write decent code that you expect to be working a
year or two later.

------
j_baker
Hold on a second. OOP, markup, APIs, and HTML/CSS/JS are snake oil, while
visual programming isn't? Say what you will about the former technologies, but
they're being used for productive purposes. Visual programming is vaporware
that someone tries to build every 5 years or so to no effect. Functional
programming has much the same challenge, but it's getting integrated into
mainstream languages. Otherwise, it would likely still be a research toy as
well.
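
For what it's worth, the integration j_baker mentions is easy to see in plain Python today; a minimal sketch (the function names here are my own, not from the thread):

```python
from functools import reduce

# Functional style inside a mainstream imperative language:
# first-class functions, closures, and map/filter/reduce.
def compose(f, g):
    """Return the composition x -> f(g(x)); functions are plain values."""
    return lambda x: f(g(x))

nums = [1, 2, 3, 4, 5]
evens_squared = [n * n for n in nums if n % 2 == 0]  # a map+filter in one line
total = reduce(lambda acc, n: acc + n, evens_squared, 0)

inc_then_double = compose(lambda x: 2 * x, lambda x: x + 1)
```

None of this required leaving a "mainstream" language, which is the point: the research ideas got absorbed rather than staying toys.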

~~~
Jtsummers
[http://en.wikipedia.org/wiki/LabVIEW](http://en.wikipedia.org/wiki/LabVIEW)

I believe that's an example of visual/graphical programming; it's fairly
mainstream in some areas and not vaporware.

Re: Rest of comment - Filtered site, I'll read it at home.

~~~
Scramblejams
I used LabVIEW in college and a couple of other visual programming schemes in
my work that have since disappeared, and they all had the same drawbacks: If
your needs fit tidily within the assumptions made by the designer, they can be
very productive indeed. Once you step a little bit outside of that, they turn
into a tortuous mess. Can't even count the number of times I wished I could
just write a few lines of code instead of trying to figure out a byzantine way
to hook up enough icons that I got my desired end result.

~~~
Sheepshow
I've always been told by LabVIEW advocates that "you can of course pull up a
text editor and write code." The language and API are certainly a different
matter, being second-class citizens and presumably an afterthought.

------
john_b
> _" Programming is a notion to extend human capability, by offloading
> humanly-infeasible work onto a machine. It is the promise of an amplified
> knowledge worker. This would be worthy to learn."_

With this (correct) notion the author continues until he decides that

> _" Meanwhile there are few souls looking to evolve text code into something
> more humanly intuitive. For example something you can touch, you don’t have
> to read, or that tells you what the computer is thinking, so you don’t have
> to think like a computer yourself."_

I can't help but feel that what the author really wants is some futuristic
kind of AI assistant. Decades of programming progress have resulted in
numerous abstractions piled on top of one another to minimize the amount of
"thinking like a computer" that you have to do. But all abstractions are
leaky, and when you encounter those leaks you will inevitably have to be able
to think like the machine you are trying to communicate with.

Why is it acceptable to say that effective communication with _people_ depends
on being able to think like the person you're talking to, but demand that
communication with a far more different and limited _machine_ should never
require you to compromise on your preferred thought patterns? This is the
fundamental fallacy of the natural language programming advocates. They want
to use another entity to do things for them but demand all communication with
that entity be in their preferred language, on their own terms.

~~~
dllthomas
Moreover, I'd just jump in here to point out that effective communication with
_people_ quite frequently involves "text code".

~~~
rix0r
Is the entire premise of Neuro-Linguistic Programming (NLP) not that you can
influence people's thought patterns by using language as code, as
instructions?

~~~
dllthomas
That's basically my impression, but my understanding is that NLP is not very
strongly supported so I'd be leery about using theories from there as if they
provided much evidence about the world.

I don't think you need to get so vague though - if you want to communicate a
task to a human, you very explicitly use language most of the time
(particularly for anything complex). They may not follow your instructions for
whatever reason, but the _act of communication_ is done with text.

------
overgard
Just because something is hard doesn't mean it's broken. Nobody would say
that, say, advanced mathematics is broken because the layman doesn't get it.

My bullshit detector notes that he talks very generally about "knowledge
workers leveraging machines". The problem with the phrase "knowledge worker"
is it often means "person who sends lots of emails". You don't need to be a
programmer to send lots of emails.

Real "knowledge workers", like, the kind of people that actually are working
to extend human knowledge and need computers to do it, largely do know how to
program to some degree (IE, scientists and mathematicians and so on). A lot of
it isn't "professional" grade, but it works well enough for them and it keeps
getting better.

Also a tangent: but I'm sick of the whole "coding is literacy" nonsense. Not
everyone needs to code. Literacy unlocks centuries worth of knowledge in any
domain you want, from contemporary thinkers to people who are long dead.
Coding lets you tell a computer how to do specialized things. It's not even
remotely comparable.

~~~
wwweston
> Nobody would say that, say, advanced mathematics is broken because the
> layman doesn't get it.

We _do_ make math more tractable and accessible over time with various
reformulations and new abstractions and conceptual tools, though. You're
probably using Arabic numerals when you do arithmetic, and there are good
reasons for that.

(Though heaven knows, if I'm likely to run across an abacist anywhere in this
sort of discussion, it's probably here. :)

~~~
KC8ZKF
And computers have been made more tractable and accessible over time with
reformulations and abstractions. And at a much faster pace.

People who think that programming is not dramatically easier today than ten
years ago just have a short memory. Or they're 25 years old...

~~~
collyw
I wouldn't say it is easier (or harder); I would say it is different. Before,
you had to remember lots of tricks to fit as much information into as little
memory as possible. Today we have lots more stuff to remember. Operating
systems are way more complex.

At the end of the day, most people are programming at a far higher level, so
things are more productive. I have produced a reasonably sophisticated
database / web front end for my work, in Django, as a one-man team. Ten years
ago I believe it would have taken far more man-hours to achieve the same
thing. I don't have to remember tricks to compress data, as disk space is
cheap now. I do have to know reasonably advanced SQL, and remember a lot about
how Django works.
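
As a rough illustration of the "far higher level" point (using stdlib sqlite3 rather than Django, with a made-up schema), a relational store plus a join query now fits in a dozen lines:

```python
import sqlite3

# Sketch of programming "at a far higher level": an in-memory relational
# database, schema, and aggregate join query in a dozen lines of stdlib
# Python. (Table and column names are invented for illustration.)
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sample (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE result (sample_id INTEGER, value REAL);
    INSERT INTO sample VALUES (1, 'A'), (2, 'B');
    INSERT INTO result VALUES (1, 0.5), (1, 1.5), (2, 2.0);
""")
rows = conn.execute("""
    SELECT s.name, AVG(r.value)
    FROM sample s JOIN result r ON r.sample_id = s.id
    GROUP BY s.name ORDER BY s.name
""").fetchall()
```

A generation earlier, the same job would have meant hand-rolled file formats and manual bookkeeping; the "stuff to remember" has moved up a level, not disappeared.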

------
joesmo
I don't think we have to worry about "everyone" or even a small percentage
thereof learning how to code. They won't. It's not that they can't, but it
takes more than a couple of hours writing "Hello World" to do anything
substantial and most people are not willing to put in the time, day after day,
to learn anything substantial.

That being said, I can't imagine anything more human and more powerful than
text. If I want to give instructions to a _human_, the best way I know of is
to write them down as clearly as possible. If this works with a complex
person, why would it not work with an incredibly simple (from an abstract
point of view, not implementation) microprocessor? After all, all processors
have an extremely limited set of instructions (even CISC ones).

This desire for something that is not text seems misplaced. After all, text is
(IMO) certainly better than pairs of numbers indicating operation and data.
_Writing is one of the greatest human achievements._ To dismiss it in favor of
touch or something graphically visual is to disregard its power and
superiority to both touch and visual systems of communication.
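
The contrast with "pairs of numbers indicating operation and data" can be made concrete with a toy machine (the opcodes and mnemonics below are invented for this sketch):

```python
# A toy machine whose program is literally pairs of numbers: (operation,
# data). The same program spelled out as text shows why text won.
def run(program):
    acc = 0  # a single accumulator register
    for op, arg in program:
        if op == 1:    # LOAD arg
            acc = arg
        elif op == 2:  # ADD arg
            acc += arg
        elif op == 3:  # MUL arg
            acc *= arg
    return acc

numeric = [(1, 2), (2, 3), (3, 4)]               # what does this compute?
MNEMONIC = {"LOAD": 1, "ADD": 2, "MUL": 3}
textual = [("LOAD", 2), ("ADD", 3), ("MUL", 4)]  # (2 + 3) * 4, at a glance
as_pairs = [(MNEMONIC[name], arg) for name, arg in textual]
```

Both forms drive the machine identically; only the textual one is readable, which is the whole argument for writing.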

When designing a system, being able to give it commands in English (or any
other language) would be vastly superior to anything visual or touch-based
that I can think of, or that the article specifies (or rather doesn't). To
take the whole system of a language like English, with its complex
idea-expression engine and its vast array of words (1 million++), and discard
it is ridiculous. Now, we are not currently programming in English, or
anything even remotely close (though many languages' grammar follows from
natural language somewhat), but I do believe this is, and has been for
decades, a goal that may one day be achievable. This is desirable, as humans
are naturally inclined towards language, and those who practice it can be
incredibly good communicators (programmers of reality).

~~~
VLM
"It's not that they can't"

No, they really can't. Garbage in, garbage out. You can't implement an
algorithm or follow a protocol unless you understand it, and the world is full
of people who literally don't understand how a thermostat works, how to drive
safely, or how to implement the NRA's three basic rules of gun safety.

Note that our shared enjoyment of text as per your 3rd paragraph is only
politically correct WRT programming... if you dare to suggest text might also
be the superior user interface, the dogpile will begin. Which is sad. Closed
minds are always weaker than open minds...

Let's play a game: I'm thinking of a well designed, expressive, long-lived
programming language developed by a linguist... It's dumped on here at HN for
no logical reason, just the usual womens fashion fad explanations. I suspect
this would happen to any "English-like" future programming language.

~~~
GFK_of_xmaspast
"just the usual womens fashion fad explanations"

Very classy, dude.

~~~
boyaka
Men can be fashionable too, it's just typically not as fun for us. I would say
most of the time the only reason men are interested in fashion is to become
more attractive to women.

Edit: I realize this kind of generalization probably offends people. I'm
sorry, I just believe that men and women are fundamentally different from each
other. Here's my google search findings on the topic:

[http://www.lloydianaspects.co.uk/evolve/fashion.html](http://www.lloydianaspects.co.uk/evolve/fashion.html)

------
ef4
The author is frustrated by a real and frustrating problem, but he's missing
why things are this way.

The "triumvirate of HTML/CSS/Javascript" isn't just a technology. It's a
_social consensus_. It's not actually "designed" in a meaningful sense -- it's
evolved.

And things pretty much need to be this way, unless you want to be an island.
Even if you have the ambition and the resources to throw it all away and "do
it right" from scratch, by the time you finish the world will have moved on
and your perfect gem will be obsolete. Real technology is always iterative.

The same constraints apply to programming languages in general, because they
truly are _languages_ that human beings use for thinking. Languages and their
associated cultures can only evolve so fast. It's not really a technical
issue, it's a social issue.

The author's assertion that our "arcane" interfaces persist out of a kind of
perverse pride just doesn't ring true to my experience. Programmers cry tears
of joy when they get to throw away something old and arcane because they don't
need it anymore. And his assumption that visual programming will necessarily
be better than textual programming is completely unproven.

Anyone who could actually make programming easy would reap vast wealth. The
incentive is there. If it's really a no-brainer to build visual programming
tools, well, where are they? There must be something harder about it than you
think if we're still waiting.

~~~
analog31
An interesting thought about evolution is that text-based programming
languages are exquisitely evolve-able, because the technology and effort to
evolve a language or create a new one aren't too far out of our reach. If you
don't care too much about efficiency, you can create a new language from an
old one, and pass it around for others to try out.
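
That evolvability claim can be sketched in a few lines: a tiny postfix calculator (a language invented for this example) where "evolving" the language means adding one dict entry, with no GUI, menus, or icons to redraw:

```python
# A minimal text language: a postfix calculator over whitespace-separated
# tokens. Growing the language is one line of ordinary code.
OPS = {
    "+": lambda a, b: a + b,
    "*": lambda a, b: a * b,
}

def evaluate(source):
    """Evaluate a postfix program, e.g. '2 3 + 4 *' -> (2 + 3) * 4."""
    stack = []
    for token in source.split():
        if token in OPS:
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[token](a, b))
        else:
            stack.append(float(token))
    return stack.pop()

# "Evolving" the language: one new entry, and every existing program
# and tool (editors, diff, version control) keeps working.
OPS["-"] = lambda a, b: a - b
```

Doing the equivalent in a graphical language means designing a new icon, its connection rules, and its palette entry, which hints at why text languages proliferate and graphical ones don't.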

This may be why there are hundreds of text-based languages, but I can think of
only one graphical language -- LabVIEW -- in widespread use. LabVIEW requires
a monumental organization to maintain. Adding a structural feature to the
language requires changing the entire GUI, menus, and so forth.

I suspect that if LabVIEW came out with a text based option for dataflow
programming with support for their massive libraries, folks would abandon the
graphical interface.

Maybe graphical isn't always better. Maybe text is really the best way to
express programs after all. We could wait for programming to become
"intuitive," or learn to develop our intuitions.

An anecdote about graphical systems. When I read tutorials for Windows, it's
usually a lot of text interspersed with pictures of windows and dialogs. Even
videos. Walking someone through a process over the phone is excruciating.

The same process using Linux: Open the terminal and enter this text.

In fact, Windows tutorials are starting to use: press the Start button and
enter some text. I find the Device Manager by entering "device manager" into
the Start menu, not by following an "intuitive" GUI.

------
bestrapperalive
"A relief from the unintuitive, unscalable, unfounded snake oils like OOP,
markup, APIs, or the triumvirate of HTML/CSS/JS. As if these technologies are
the best we can do."

At first I was mad. Then I read the above quote and everything clicked into
place. This article is hilarious and I love every shrill, indignant,
purposefully absurd word of it.

------
angersock
Probably the biggest bit:

"Can’t I just make a website?"

That type of question is where a lot of things go wrong. Newbies ask it
because they don't have the experience to break it into smaller parts;
experienced people get tripped up because they can't see past "well, what type
of website, doing what?" -- all the while the real question is "what would you
use it for?".

It's a bad question, because the asker doesn't know enough to ask a useful
one. Our job, then, is to help them learn how to ask the right question.

------
kamaal
There is no point in teaching people something they don't like doing. It's a
bit like forcing painting or sketching on people who just don't like doing it.
A lot of schools do it, and kids who don't like painting or sketching do it
really badly and become objects of ridicule.

Programming has a meta-science: it's called 'making', 'tinkering', 'hacking'
or whatever you want to call it. If people don't have a taste for it, teaching
them sorting algorithms is as boring as learning math. Ordinary people are one
thing, but even most people who join our industry as programmers discover they
just want to do something else and go down the managerial stream.

Programming isn't for everyone, just like building houses, cars, dams, or an
electric grid isn't for everyone. In all those disciplines, common patterns of
problem solving and general science are at play. Chances are if you like
chasing a difficult bug and fixing it, you will also like repairing a car
engine, or fixing your TV.

I have an elder cousin who runs a shoe shop for a living; he tried very hard
to learn programming during the dot-com boom era. He gave up in about six
months. He realized that learning something only because everyone else is
doing it just doesn't work out, and the problem isn't just programming. If you
don't like solving problems and are not good at problem identification and
opportunity identification, it gets really, really boring at some point. He
likes to sell things and is exceptionally good at that. Building things is
just not for him. And there is nothing wrong with that.

The biggest problem with 'teach everyone to code' is that you have to first
ask 'Do they want to learn coding?'.

------
opnitro
The issue with every visual programming language I have used is that while it
is easier for beginners to learn, typing is almost always faster once you
start to understand programming.

------
cliveowen
I don't think programming should be mainstream; I think software should be
simple and intuitive enough to let anyone get work done without having to know
how to write code.

------
xradionut
I agree with the author in regards to coding web sites. It's not enough to
know just a few languages; there's a whole frickin' vocabulary of crap piled
on layers of services and frameworks, frosted by the JS flavor of the week.

Or just configuration. Everybody wants to reinvent Make and they all do it
wrong...

------
clavalle
There is a reason we speak and write words to communicate complex ideas rather
than hold up pictures. Words are powerful and flexible. They are not going
away.

------
mathattack
When I saw the title, I thought it was "Come back after you've practiced for
10 years", which doesn't help much.

I used to think the solution was better interfaces, or making programming more
accessible to the lay person. I get less sure of this over time. It seems like
what we've gained in language ease of use (Python versus C) we've given back
in environmental complexity. The net is better productivity for programmers,
and more people crossing the divide (CS programs growing by leaps and bounds)
but we're far from being there for the Everyperson. I don't see this changing
in 10 years without increasing the rigor and logic requirements of our entire
educational system.

------
jaybill
The author of the article wrote it using words. He didn't make a video or draw
a picture, because words were the best way for him to communicate with other
humans. For some reason, he wants a different way to communicate with
machines, one that is inherently not the way humans communicate with each
other.

I also agree with what others have said, that this is nothing but a bitch fest
with nothing actionable other than we ought to be creating visual programming
tools no one wants or needs instead of doing actual work. Sure, buddy. I'll
get right on that.

------
rdudek
Man, people are so overly sensitive... it's almost like someone is trying to
promote job security.

------
zenbowman
I don't think the author understands programming if he sees object-oriented
programming as being in the same bucket as HTML/CSS/JS. OOP is a way to
deconstruct problems, just like functional programming, and their beginnings
are based on very similar principles.

Any language that fully accepts the object-oriented paradigm lends itself very
naturally to functional programming, and vice-versa.

~~~
nbouscal
Uh. What? Fully accepting the object-oriented paradigm does not make a
language any more likely to properly isolate state and side effects[1], any
more likely to have immutability as a default assumption, or any more likely
in any other way to encourage referential transparency. As referential
transparency is the essence of functional programming, this makes your claim
very confusing.

[1] Following OOP _design patterns_ makes a given program more likely to
properly isolate state, but this is by no means a virtue of OOP _languages_.

------
reyan
I think the author should read Fred Brooks's "No Silver Bullet". Many of his
arguments are still valid after 28 years. "There is no single development, in
either technology or in management technique, that by itself promises even one
order-of-magnitude improvement in productivity, in reliability, in
simplicity."

~~~
slacka
"No Silver Bullet" reminds me of the physicists at turn of century claiming
that Newtonian physics held all of the answers. He blames the issues in
software engineering on bad programmers, instead of questioning his field. His
narrow minded, defeatist arguments fail to recognize the full potential of
computer science. I don't buy his argument that Java or JavaScript have
eliminated most of the accidental complexity in programming. It's like Von
Neumann claiming, “I don’t see why anybody would need anything other than
machine code.

Computer Science is still in its infancy. We haven't reached the full
potential of Von Neumann architecture, let alone the dozens of non-Von Neumann
systems that have been largely ignored by academia. Recent advances in
neuroscience may open up a whole new model for information processing, such as
IBM's SyNAPSE project.[1] Have you watched Bret Victor's, "The Future of
Programming"?[2] He does a wonderful job of countering many of Fred Brooks's
points.

[1] [https://www.research.ibm.com/cognitive-computing/neurosynaptic-chips.shtml](https://www.research.ibm.com/cognitive-computing/neurosynaptic-chips.shtml)

[2] [http://worrydream.com/dbx/](http://worrydream.com/dbx/)

~~~
reyan
Yes, I have watched Bret Victor's talk. Narrow-minded or not, Brooks's
predictions held true for more than a decade after the time of writing. I'd
like to read your counter-arguments.

I'd say architecture and advances in neuroscience are tangential here. I think
the core of the problem is still somewhere else.

* sidenote: Lisp machines weren't completely _ignored_ by academia, at least.

------
Aqueous
I think the reason visual programming hasn't taken off is that there is a very
real difference between what you can express with language and what you can
express visually. Images don't really contain determinate, precise statements
or propositions the same way that the written or spoken word does. In
contrast, language can express any thought, no matter how recursive or
abstract, effortlessly. For the narrow domain of programming -- not
typesetting, not rendering and manipulating images -- the text editor happens
to be an extremely powerful tool that allows us to tell the computer exactly
what we want it to do. It hasn't been replaced because it hasn't needed to be
-- it is simply the most powerful tool in which to write language.

The other reason visual programming hasn't taken off is that programmers
aren't visually oriented, right-brained, intuitive people. Exceptions to this
rule abound, but I am certainly not one of them, and neither are many
programmers I've known. They've tended to be left-brained, linear, linguistic,
rational-over-intuitive sorts of people -- which is why we pick up computer
languages so quickly, and it's also why we like typing code into a text editor
instead of drawing our programs on screen. I know that functionality isn't
lateralized so easily and this is an oversimplification -- but there does seem
to be a dispositional difference between those who are attracted to
programming and those who aren't.

But this'll probably change, as everything does. As the author notes, FP and
the lambda calculus have been around since the beginning of computer science
as a field, and they are just now starting to get mainstream traction. So,
maybe, in about 20 years visual programming tools will start to be mature
enough to be applied to any arbitrary problem, and a powerful visual language
will develop that attracts enough visually-oriented people to change or
revolutionize the field. But we can't put the cart before the horse -- the
tooling needs to mature enough to cause this network effect to occur.

~~~
collyw
I would say I am pretty visual. I like to draw ER diagrams of database designs
so I can see what's going on. I usually have a visual representation of the
tables in my head, with positions, when I am thinking of queries. The tables
stay in the same positions (probably based on the ER diagram I drew ages ago).

------
crackerz
We currently have a decentralized mess of competing products. Many of them
overlap and do similar things, but they all feed off of each other to improve.
It is this competitive environment that improves the products and libraries we
use. Anybody can identify a flaw in an existing system and build something to
address it.

What this article proposes is a monolithic culture, where everything is
centralized and controlled by a few entities. We tried this in the
IBM/Microsoft days and it doesn't work. We end up with a single poorly
documented tool that offers limited functionality.

The world of programming is more accessible today than ever before. If you are
a novice, there are tools for you. If you are an expert, there are tools for
you. If you are transitioning from novice to expert, there are tools for you!

------
chris_mahan
I left this comment at the blog:

Perhaps coding will look like Minecraft, where you dig physical paths for the
water, for the chickens, and hunt down bugs with bows and arrows. Modern
tablets with no keyboards run Minecraft just fine (my 8-year-old and all his
friends are completely addicted to it). Also, they can join LAN games and work
in the same virtual world. This may allow multiuser programming, no? Weird
thoughts. They won't call it programming either; they'll call it something
like flowcraft or some such.

~~~
jmaclabs
I think this type of concept is what the author is driving at. We shouldn't be
manipulating blocks of words; we should be manipulating the things those
blocks of words create, combining them into even bigger things.

------
phantom_oracle
Visual-based programming is definitely something that can help more people
solve problems, but it gets complex to manage too (or so we've been told by
blogs and others).

Perhaps something like visual-based programming would suit a different mindset
altogether.

Truth be told though, programmers have managed to solve the website problem
with all the new site-builders out there. They all seem to be "drag and drop",
which should have been addressed long ago as a standard for at least the
frontend.

~~~
smoyer
The last (natively compiled) program I wrote to run on the Windows OS was done
using an almost completely visual programming tool called Prograph [1]. It
worked well, and the small cards containing the code made sure you didn't put
too much logic in one picture (much like the character limit Forth imposes on
its screens). The system lives on as Marten [2], but it only supports Mac OS X
now.

While there are some things I like about visual programming, there are also
many issues - perhaps the biggest is that it lulls the developer into
believing they don't need to understand the underlying computer. I think a
similar analogy is how a newbie RoR developer can create a CRUD application
without ever looking at or understanding the scaffold-generated code.

I've thought about this a lot (in fact, years before I found Prograph I tried
to develop a system I called FloPro, which used standard Fortran flowcharts on
the front end and generated C (to compile on your local platform) on the
backend), but I haven't seen a solution to these problems in any visual
programming system.

[1]
[http://en.wikipedia.org/wiki/Prograph](http://en.wikipedia.org/wiki/Prograph)
[2]
[http://www.andescotia.com/products/marten/](http://www.andescotia.com/products/marten/)

~~~
phantom_oracle
That's quite an interesting product you have there.

Were you only able to build it by writing code itself?

~~~
andescotia
At the risk of some criticism, I would like to make some visual language
comments. I am the author of a visual language programming environment called
Marten which supports the Prograph visual language. I use this IDE everyday to
write commercial-grade software for clients. Marten is in fact written in
Prograph using Marten. I have created software in many other languages such as
FORTRAN, RATFOR, C, C++, C#, Objective-C, Java, Perl, Python, along with i386
and PPC assembly, so I am very familiar with the difference between
programming in a text-based language and visual programming. I found that
visual programming is so superior to using text-based languages that I wrote
my own IDE to ensure that I could continue to do so. Consequently, Marten
stands as an example that not only is visual programming not "vaporware", it
can be a valuable and powerful addition to a developer's toolkit.

------
scdoshi
This blog post would have been way better without the ranting style of
writing.

It doesn't even get to the point (that programming does not have to be text-
based) until halfway into the article.

The title and the objection to other people learning to code are at most
tangentially related to that point. Just because there could be better ways to
give instructions to a machine doesn't mean other people should wait to learn
how to instruct machines until those methods are developed.

------
Edmond
For anyone interested in the prospect of visual programming for web
development, please join our newsletter at crudzilla.com ... we are coming out
with an update that will attempt to fit a wordpress-like plugin system into an
IDE. We think that'll be a good start for a more visually oriented development
environment.

------
wellboy
You can only learn how to program if you have learned how to code. So learn
how to code :)

~~~
collyw
Learn to code, but don't stop there. Aspire to write good quality stuff.

------
hajderr
Focusing on solving a problem instead of the tool is what motivates me to try
new frameworks and actually code.

I agree that the focus has unfortunately shifted way too much toward specific
technologies.

------
10098
Again with this visual programming bullshit

------
networkjester
Eye-opening at certain points, dragged on at others. Either way I'm more
excited to code. I mean, program. :)

------
NAFV_P
I'm neither a coder nor a programmer; I'm a _code-monkey_ and proud of it.

------
bliti
Don't learn how to paint, learn how to create art. How is one without the
other?

~~~
xerophtye
"The current state of painting sucks. You have to figure out which colour
mixes with which colour to become what. You have to learn how to hold the
brush and what different brushes do. You have to learn different strokes. It
shouldn't be like that at all, you shouldn't have to LEARN all those arcane
things at all. People have been doing it for CENTURIES and they are unwilling
to change due to some perverse pride in all of this. What should be, is that
you think of a pretty picture and tada! It appears on Canvas. Without you
going through the process of actually making it"

That is the author's viewpoint.

~~~
cookingrobot
Nobody seems to be giving the author the benefit of the doubt - so I'll play
devil's advocate against your analogy.

"The current state of painting as a way to capture images sucks. It takes
years of practice, and even then the likeness created is only ok unless you're
a true master. There's this area of research around lenses and films that
professional painters seem to scoff at - but I really think we could be doing
profoundly better if we can get it working."

~~~
xerophtye
Hahah, funny thing is I had these thoughts as I typed my last reply.

Btw it's rather interesting how the same cycle begins in photography where old
school photographers scoff at post-capture editing via photoshop and say that
it takes away from the beauty of the art. So even if we get visual
programming, maybe things won't change...

But still, I gotta ask, how can you learn programming without learning HOW to
program? The HOW is where code comes in. Yes, we'd all love the magic of
making the computer do as you wish with a point of a finger. But until then,
why not enjoy the process of taking a problem, thinking of a solution and
figuring out how to explain that to the computer?

------
thesimpsons1022
Is it just me, or is this guy completely wrong? In my opinion, languages
aren't that confusing or unwieldy; the hard part about programming is the
logic, not the languages.

------
elnate
Is it satire?

~~~
recursify
It has that feel, doesn't it?

------
jmaclabs
It seems that folks here are reacting to the angst of the author as well as
the literal meaning of the language used to make his point instead of
contemplating what the author is trying to actually say.

Otherwise, why get hung up on definitions of "coding" vs "programming" or
concepts such as "text" vs "visual"?

I think this post is an expression of the frustration that we have all reached
at some point (if not many points) along the way while using any language or
solution or architecture: this sucks and there's got to be a better way to get
shit done!

How many times have you been working on project "foo", ran into a problem and
employed solution "bar" (or better yet, created solution "baz" on your own) in
order to solve your problem...only to find yourself derailed into a swamp of
limitations with "bar" (that everyone forgot to mention in that awesome blog
post you found)? Or, how many times have you lied to yourself and said "I'll
improve upon the limitations of 'baz' in the v2 release" of your home grown
masterpiece only to never return because your actual goal is to finish the
"foo" project?

The problem might be the tools or the user or the application or even the
philosophy...or it might be a combination of that list plus some I haven't
mentioned. Perhaps these problems have already been solved and the lessons
have been lost? Or, perhaps more analysis is needed to come up with a new way
around this issue entirely? I certainly don't have the answer right now but
this article made me take a step backward to survey the landscape (or the
wake?) of the most recent 10 years of "disruptive" (read, self-serving)
technology and I feel an undeniable sense of dissatisfaction with it all.

Maybe that desire for something better just means I'm a programmer and not a
coder? I don't know but I won't avert my eyes from the problem no matter how
obvious it is to everyone simply criticizing the article. Awareness is the
first step to understanding the problem you are trying to solve and I see
people recoiling from that. Denial is a sign that a belief (or end of thought)
is overruling the critical thinking process, and to me that's acceptance of the
status quo (not just dangerous but stupid too).

I did get a good chuckle at the metaphor provided by xerophtye's comment "you
think of a pretty picture and tada! It appears on canvas!" The truth is always
funny so perhaps this comment encompasses the entire article but I think it
oversimplifies what the article attempts to drive toward. Sure, comprehension
of the building blocks (or, the colors) allows you to build bigger and better
buildings (or paintings), but, if that were truly the case, why aren't things
improving? Wouldn't all of the so called experts chiming in have already built
bigger and better buildings? And if so, what are they? NoSQL? Twitter
Bootstrap? Ember or AngularJS? Bitcoins? Animated CSS? All I see are copies of
the original idea done "my way" which is "better".

Amateurs borrow, professionals steal. And that is the process improvement
methodology I see employed today.

If everyone thinks they know their craft so well, inside and out (and, for
safety, let's say I don't), how might you take that knowledge (and a step
back) and change/improve upon how things are accomplished today?

I think that's the core of this article.

To put it another way, if the acquired knowledge to accomplish tasks via
"programming" is only useful and self-serving within the realm of the
library/language being used, how is that moving the entire ball forward?

I think the solution implied here is that a re-examination of history and more
experimentation with different architectures or applications of the existing
building blocks might get us over this hump we're facing.

------
asdasf
Why 10 years? Given that we've known what a mess the software development
world is for 50 years and nothing has improved, what makes you think anything
will be better 10 years from now?

~~~
tbirdz
Might be a reference to the famous Norvig essay: "Teach Yourself Programming
in Ten Years"?

~~~
john-kurkowski
No, I'm not that clever. It seemed like a reasonable time to check back in and
see if the revolution happened.

------
benched
Anytime anyone tries to spin any of the following - coding, programming,
software engineering, software development - as distinct things, and expects
me to acknowledge their special definitions that they probably made up, I kind
of lose interest.

