
The Future of Software Is No Code - batemanSMA
https://www.inc.com/greg-satell/how-no-code-platforms-are-disrupting-software.html
======
Verdex_3
They've been trying the "let end users code in an impoverished pseudo
programming language" approach for a long time, and at best it only ever takes
over a small niche. And they still need real programming languages and software
engineers to help them overcome their problems when they try to scale, extend,
or encounter a problem that's too complicated.

There are two things here. The first is that sometimes programming languages
are harder than they have to be; the other is that sometimes problems are just
hard.

1) Yeah, C++ is really complicated, and you could probably replace it with a
language that was less complicated, but if you're trying to do something in
the C++ problem space then you're still going to require a lot of difficult
thought with respect to your problem, your solution, and the machine it's
implemented on. You're not going to get away from writing driver code with a
no code solution.

2) If you want to do arbitrary things, then eventually you'll need something
that's Turing complete (or really close to it if you're going to talk about
things like Idris or Agda). A no code solution might be able to handle things
as long as you follow a very carefully constructed path, but eventually your
users are going to have reasonable requests that no code solutions won't be
able to solve because if they could they would just be programming languages
unto themselves.

If you're telling me that eventually CRUD apps will be mostly automated away,
then I believe you. If you're telling me that eventually software engineers
won't be needed then I'll see you in line at the unemployment office, because
the last two jobs to be made obsolete are software engineers and YouTube
personalities (and last I checked they're working on automating YouTube
personalities).

~~~
tylfin
I just love the dichotomy on HN where you get one article: “The Future of
Software Is No Code” and another “Why it took a long time to build the tiny
link preview on Wikipedia,” these two are seemingly orthogonal in my mind.

~~~
fh973
I think orthogonal (at least from its usage in math-related contexts) means
the two don't have anything to do with each other.

~~~
taoistextremist
Nah, that would be skew. Orthogonal crosses but goes in a different direction.

~~~
cliffy
Orthogonal is frequently used to mean two (or more) things are independent of
one another. I have _never_ heard skew used in that way.

~~~
taoistextremist
Yeah, but they were saying in math related contexts, in which orthogonal is
almost equivalent to perpendicular.

~~~
vishnugupta
I believe even in the context of mathematics orthogonal means independence.
I.e., one can move along the x or y axis without the y or x value changing.
However, if you move along any line that's not parallel to the x or y axis
then both x _and_ y change simultaneously. That's how I interpret
orthogonality in the context of math, anyway.

~~~
Maybestring
>However if you move along any line that's not parallel to x or y axis then
both x and y value change simultaneously.

A third orthogonal axis is neither parallel to x and y, nor do x and y vary as
you move along it.

Replace parallel with linearly independent, and you are spot on.
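The distinction is easy to check numerically. A throwaway Python sketch (the
vectors are made up for illustration): orthogonal vectors have a zero dot
product, while merely linearly independent ones need not.

```python
def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

# Orthogonal: the dot product is zero, so moving along one axis
# leaves the coordinate on the other unchanged.
x = [1, 0, 0]
y = [0, 1, 0]
print(dot(x, y))  # 0

# Linearly independent but NOT orthogonal: neither vector is a
# multiple of the other, yet the dot product is nonzero.
u = [1, 0, 0]
w = [1, 1, 0]
print(dot(u, w))  # 1
```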

~~~
romwell
No need to replace anything; the implicit assumption is a 2D space.

~~~
Maybestring
Then you can only talk about two concepts. I would like to be able to say that
intelligence, morality, and obesity are mutually orthogonal.

If your model is 2D, then you are requiring a relationship between at least
two of them.

~~~
romwell
The _example_ was in 2D. The reader is able to generalize from there, we hope,
no? :)

~~~
Maybestring
As long as you don't rely on concepts that don't generalize to higher
dimensions. To the extent that you use concepts that don't generalize in your
example, you are using a poor example.

------
reificator
Round and round we go. Textual code looks like the barrier when you start
learning, but it's a boon when you've gotten past it. The big barriers to
quality software are not typing or syntax, they're just the most visible from
the outside.

Despite the goal being to "let anyone program", these tools rarely end up as
more than toys. Why is that?

It's because learning to program is not a matter of learning syntax. If that
was the barrier we'd have just put more effort into NLP and been done with it.

But traditional spoken/written languages are inadequate for the precision
needed in programming, so we use more precise alternatives. The language is a
tool that makes it easier to program, despite looking more intimidating from
the outside.

That said, when you limit your domain you can make quality "codeless" tools.
Shader Blueprints in Unreal come to mind as a good example. But a lot of the
value they provide is in seeing the previews at every step because it's a very
visual thing that you're making. And if you're hitting performance barriers,
you'll probably need to rewrite it in text anyway.

Just because programming languages look scary from the outside does not mean
that removing them will make programming _easier_. Most of the time it will
make programming harder.

~~~
commandlinefan
> Textual code looks like the barrier when you start learning

The fact that, hundreds of years after Newton, mathematicians still use
symbols to convey concepts strengthens your argument here.

~~~
dragandj
Not only that, but writers around the world still use text to express their
thoughts. And, 5000 years ago, they _started_ with drawings. But, smart as
they are, they realized it sucks, and that for anything beyond simple
concepts, text is da best.

~~~
noxecanexx
I second this. Programming languages were made to be precise enough for
computers to understand and ambiguous enough for general computing.

------
sametmax
It's like saying the future of books is no writing.

But what's hard about writing a book is not putting words on paper. It's
organizing thoughts and expressing them in a way that conveys what you wish to
share.

Same with software.

As a freelancer, I spend a lot of time with my clients speaking with them,
watching them work, reading their doc, understanding their culture, doodling
on paper.

Because my job is first to extract what they need from them.

They are incapable of doing that. I have yet to meet a client who comes to me
with half of the information I need to build their software by themselves.

So let's say we figure out a way for people to make software, to express the
rich and complex thought and workflow their tasks require, yet with an easy
process. It seems dubious given our past history, but I'll indulge the
author.

Now what?

What would my customer build with that ? They don't even know precisely what
they want, let alone what they need.

And then let's say that Jesus descends from the sky, a miracle occurs, they
figure it out, and they create the product with the unicorn No Code magic
tool.

They have no idea how to deal with the changes they are going to need to make.
How to adapt to politics. How to make it so that it can evolve. What technical
decisions will affect their cost in the future. After all, it's not their job.

The code is not really the hard part of the job.

We keep the code because it fits the problem quite well and makes it easy to
solve once we have the rest figured out.

~~~
jazzyk
"I'll go talk to the stakeholders to find out their requirements... in the
meantime, you guys start coding."

[http://www.modernanalyst.com/Resources/BusinessAnalystHumor/...](http://www.modernanalyst.com/Resources/BusinessAnalystHumor/tabid/218/ID/2798/While_I_get_the_business_requirements.aspx)

------
KeitIG
Made me think of this comic by commitstrip [1]

\- Some day, we won't even need coders any more. We'll be able to just write
the specification and the program will write itself.

\- Oh wow, you're right! We'll be able to write a comprehensive and precise
spec and bam, we won't need programmers any more.

\- Exactly

\- And do you know the industry term for a project specification that is
comprehensive and precise enough to generate a program?

\- Uh... no...

\- Code, it's called code.

In the end, we'll just move the abstraction layer to another level, that's it.

[1] [http://www.commitstrip.com/en/2016/08/25/a-very-
comprehensiv...](http://www.commitstrip.com/en/2016/08/25/a-very-
comprehensive-and-precise-spec/)

~~~
TeMPOraL
Yup. One of the most important things I've learned in my programming career is
that _reality is damn more complicated than it seems_. At all levels - both
physical and abstract.

The real problem is that humans imagine things and communicate them at _very_
high abstraction levels, without filling in the levels below. Much like a
children's drawing of a car is nowhere near detailed enough to serve as a
blueprint for building one, our usual descriptions of stuff don't contain
necessary details _at all_. If you want to build a real thing, all the work to
fill in the lower levels of detail needs to be done. You can't magick it away.
Today, that work is done 10% by the people writing specifications and 90% by
programmers figuring things out. In the future it might be done by a computer
- but that computer would necessarily need to be a human-level artificial
intelligence, doing all the work programmers do to go from the way we
communicate to a working program.

~~~
vbezhenar
You don't fill in all the lower levels of detail. You're likely not caring
about the generated machine code, you're likely not caring about the details of
memory deallocation, you're likely not caring about Unicode quirks when you're
comparing strings, you're likely not even aware which algorithm is used when
you're multiplying two BigDecimals, you likely don't care which characters
are allowed in HTTP header names, and so on and so on. A lot of implementation
details are already implemented. There's no reason that programs won't be
abstracted further away. Sure, it still is programming, but with fewer
unnecessary details (so more people can do it).

~~~
wvenable
You're thinking of the lower-level details of the computer but the issue is
the low-level details of the problem you're trying to solve. The amount of
detail necessary to solve even the most basic business problem is far more
than the average client can ever describe on their own.

All the computer abstractions help solve computer problems and in turn make it
easier to implement solutions, but they don't make the business problems any
smaller.

------
codeulike
The Future of Selling Stuff to Enterprisey Places is Showing Them Slick No
Code Interfaces and Persuading Them It Will Allow Ordinary Users To Get Things
Done

 _(but there will be severe limitations and in practice the users will end up
logging calls to IT support)_

~~~
Pigo
Hey, I remember my boss telling me about the wonders of Lightswitch.
Silverlight was going to revolutionize things for "power-users".

~~~
goatlover
I remember my boss telling me that Flash would replace html.

~~~
Pigo
Oh man, that hurts to read. That's some wishful thinking there. I'm currently
helping a division of the Air Force replace a legacy flash application that
they've spent years trying to keep going.

------
cortesoft
This is not ‘no code’, any more than someone programming in Python is doing ‘no
code’ just because they aren’t programming in assembly.

A visual coding language is just providing a different abstraction, but it is
still in the same space as other abstractions like a high level language such
as Python.

Now, it might be hiding more behind the abstraction than a traditional high-
level language, but that is a difference in degree and not kind. It does,
however, mean that there are going to be more things that can’t be expressed
in the visual language because the abstraction is too broad to be broken down
enough. This means that a lot of the tasks that need to be done can’t use the
abstraction and need a different programming language.

~~~
AnimalMuppet
The way we make progress is by hiding more and more things under better and
better abstractions. So this is the way forward - _if_ the abstractions are
better.

But there are two problems here. The first is that you can make better
abstractions in text-based languages, too. In fact, it's easier to do there.
So visual coding languages are always easily caught up with by text-based
languages. They may be easier for the layman to use, but they don't offer
anything _extra_ to the professional.

The second problem is that abstractions always have limits. Good (useful)
languages give you a way to escape from the abstractions when you need to.
Evil languages sucker you in and then trap you in a too-limited abstraction
from which you can't escape. (It's not just visual languages, either. The
original Pascal, before the Turbo extensions, had exactly this problem.)

~~~
DonaldFisk
> The first is that you can make better abstractions in text-based languages,
> too.

Not always:

The most general data structure is a directed graph. Is a formal textual
description as good as shapes connected by arrows?

MIMD parallelism is straightforward with dataflow. (No worries about
synchronization: processes wait until they've received all their inputs before
running.) Again, shapes connected by arrows partially ordered in two
dimensions is a more natural representation than text.

Finite state machines. Again, shapes connected by arrows is clearer.
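For comparison, the textual side of that last example is compact too. A
minimal table-driven FSM sketch in Python, with made-up states and inputs
(the classic turnstile):

```python
# A finite state machine as a transition table - the textual
# counterpart of "shapes connected by arrows".
transitions = {
    ("locked", "coin"): "unlocked",
    ("locked", "push"): "locked",
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
}

def run(state, inputs):
    """Feed a sequence of input symbols through the machine."""
    for symbol in inputs:
        state = transitions[(state, symbol)]
    return state

print(run("locked", ["coin", "push"]))  # locked
```

Whether this table or the equivalent state diagram is clearer is exactly the
judgment call under discussion.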

~~~
geraldbauer
>> The first is that you can make better abstractions in text-based languages,
too.

>> Not always

Why are you writing and reading in text / code? Surprise, surprise - the
English language is text / code. Please, to support your argument, send in
some pictures or models or videos etc.

~~~
DonaldFisk
I've been working on
[http://web.onetel.com/~hibou/fmj/FMJ.html](http://web.onetel.com/~hibou/fmj/FMJ.html)

It's ongoing work, and visible progress has been slow over the past year or
so. Recently I've been refactoring the code and improving the type system.
One thing worth noting is that representing functions as graphs makes
type inference more straightforward.

English and other languages are text because our ears hear a single stream of
speech and written languages reflect that. Historically, computer programs are
single streams of instructions because computers have until recently had
processors only capable of doing one thing at a time, and we've usually
programmed with text (GRAIL and DRAKON being exceptions) because that's all
most terminals could display.

But for finding your way around a city by subway, representing an electric
circuit, an organic molecule, or an elementary particle interaction, we use
diagrams.

------
jcl
As far as I can tell, this is an advertisement for QuickBase.

I can't tell from this article or their homepage how they are replacing code,
but it sounds like they're trying to fill the niche of Microsoft Excel and
Access, letting a semi-technical user specify limited queries and behavior.

Letting more users access the data they need is a laudable goal, but of course
there will always be some need for the kind of precisely specified behavior
that you get from code. "The Future of _Some_ Software Is No Code" might be a
more appropriate title.

~~~
himom
Yup. The articles are making the rounds again. Here are some older ones:

Apr 21 [https://www.inc.com/greg-satell/how-no-code-platforms-are-
di...](https://www.inc.com/greg-satell/how-no-code-platforms-are-disrupting-
software.html)

Apr 11 [https://www.cio.com/article/3268508/leadership-
management/en...](https://www.cio.com/article/3268508/leadership-
management/enhanced-it-productivity-through-the-use-of-low-code-no-code-
software-development.html)

Mar 6 [https://www.zdnet.com/article/do-you-really-need-
developers-...](https://www.zdnet.com/article/do-you-really-need-developers-
to-build-applications/)

Feb 27 [http://www.tucsonnewsnow.com/story/37601734/quick-base-
power...](http://www.tucsonnewsnow.com/story/37601734/quick-base-powers-large-
organizations-with-new-enterprise-capabilities-for-easy-adoption-and-wide-
deployment)

------
tincholio
No it's not. For some reason, people with no SW background tend to
underestimate the complexity of seemingly simple business rules and their
interactions.

~~~
TeMPOraL
Exactly. And that's my primary argument when I tell people they should learn
some basic programming - because it can quickly make you realize just how
imprecise day-to-day human thinking and communication is.

------
websitescenes
“Rather than having to pay upfront, even the smallest startup could access
technology that rivalled what a large enterprise had available to them.”

This assumption seems wrong to me and I’ll even suggest the exact opposite.
Cloud software has traditionally been cumbersome since it's a one-size-fits-all
approach. With the advent of the cloud, people were willing to sacrifice
functionality for convenience. Now that everything is in the cloud this is not
true anymore and we’re starting to see pushback at all levels. Almost every
company has a different workflow that only custom software can serve best.
This is why companies with more resources will win at the software game;
assuming they know how to play. Molding software to your workflow is always
better than changing your workflow to accommodate software.

------
geraldbauer
How about the future of world literature is no text / no alphabet?

Or how about the future of mathematics is no numbers?

Or how about the future of civilization is no books?

Or how about the future of marketing is no bullshit?

And so on. Source: [https://github.com/bigkorupto/awesome-
nocode](https://github.com/bigkorupto/awesome-nocode)

------
samfisher83
There is this product called LabVIEW that does visual programming. I am not a
huge fan because it's just easier for me personally to write C code. Does
everyone just want picture books? Plenty of people like normal books. I think
code is just more concise to understand. There are probably plenty of people
who feel the same way.

~~~
m-watson
Oh man do I hate LabVIEW. I used everything else when possible, including C,
Python with C bindings, and even MATLAB compiled into C to give direct
equipment directions. But that being said, it is just my (and it seems like
your) preference. I had several peers who were significantly better at getting
the whole lab collection, processing, and output process up and running
quickly and efficiently and they did not know how to code. I think there is a
space for "no code" tools. Of course there will be someone behind the scenes
working with code, but visual tools help lower that barrier for individuals
with skills other than coding.

------
emodendroket
> Much of what they used to set up in Excel spreadsheets or checklists on
> clipboards they can now do in cloud-based mobile applications.

> "A big part of the benefit to no-code or low-code platforms is that they let
> you access elements of a development environment visually rather than
> actually writing the code yourself. That accelerates development and
> improves quality at the same time," Marshall Worster, a senior director of
> solution architecture at Mendix, told me.

So this is today's equivalent to an Access DB or spreadsheet designed by a
non-technical user which grows completely out of control, and then a
programmer is eventually brought in to clean it all up with much more
difficulty than if he had designed it in the first place.

These tools have their place but there are serious drawbacks to them.

------
jusonchan81
The author of the article is forgetting a ton of things that software building
needs. If that janitorial service needs to scale and use data to drive
optimizations and improvements, I doubt QuickBase can do all that. Another way
to put it: can you build Google Maps with QuickBase?

So it’s not really the future. The future that the author suggests is already
here. We have form builders, web site builders, accounting tools that can be
customized, automated billing and scheduling tools etc etc. So yes, most
business can subscribe to these tools and avoid writing code. However someone
still has to write code to make that happen

------
jtwaleson
I work for one of these companies (Mendix) and no surprise, I strongly believe
that no-code/low-code tools are here to stay.

What will they replace? Certainly a lot of CRUD programming. That's the
absolute sweet spot for now. But make no mistake, highly performant
enterprise-grade systems with 10,000s of concurrent users are also being
built.

I personally don't think it's the visual programming that makes the
difference. Instead, the magic is the very tight integration between forms,
logic, and the data model. This allows rapid application development (not
prototyping!) of web/mobile applications. Hardcore developers become many
times more efficient and "citizen developers" can still contribute things such
as business logic without knowing all the low-level details. All the
boilerplate BS of modern web apps? Gone. Instead you just have an IDE with a
"Run" button.

An example: remove an attribute from an entity in the data model? On next
deployment, the system will remove the column from the database automatically.
Also, the IDE will show errors (within a second) in all the forms where the
attribute is still being used, so you know what to change. This gets you in a
state of flow very quickly. I know many developers that would hate switching
back to Java/.NET because they can't get nearly as much done.

The tight integration is possible because the syntax of visual programming is
simpler/more limited (and very strongly typed). This allows the compiler/IDE
to reason about what the "code" is doing.

------
zwieback
I had a deja-vu reading this article, sounds just like the things that were
promised in the 80s, then the 90s and probably other times I missed. There are
periodic tides of centralization and decentralization washing over the IT
world.

I do feel things are different this time around, though. SW distribution
really has been significantly improved and there are much better ways of doing
client/server applications now.

As to graphical app development UIs taking over programming - I have my
doubts.

~~~
maze-le
I also had a deja-vu moment. When I began to study CS, one of our professors
really tried to get us into electrical engineering instead of CS. His basic
argument was that in 10 years programmers would basically be unemployed or on
the verge of it, because even _nowadays_ there are visual code generators that
can create better code than most programmers could. That was 2004, and I think
he was referring to LabVIEW.

A lot of things have happened since then, but LabVIEW is still very much a
niche application for a very specific domain. And code generators didn't
magically abstract away integration, requirements optimization, scaling, and
architectural decision making.

------
Adamantcheese
Oh man, I can't wait to write my new tech in LEGO Mindstorms EV3 block code,
or Scratch, or Simulink, or literally just Visio drawings with a bunch of
scripts on top. It's a massive, horrendous, clunky pile of inextensible garbage
80% of the time. The only reason it's ever been wanted was so managers could
visually see what you're doing, and quite frankly, as the person writing it, I
should be explaining it rather than them looking at it.

------
chasing
The code is the easy part.

The understanding of how systems and resources fit together to accomplish
goals -- that's the hard part.

------
keithnz
I think a lot of people may be missing the point.

No Code is not about replacing all forms of software development. It's just
that the vast majority of solutions can be done via no code platforms.

Simplistically, WordPress is a no/low code platform that allows people to
target all kinds of solutions. Complex issue trackers allow all kinds of
workflows to be mapped. Our entire issue / build / deployment process
coordinates a number of services together with very, very little code. But I
feel confident I could model any number of build processes because the
platforms are super flexible/powerful.

I've been watching a friend of mine who is running a vegan food box business (
[https://thekaibox.co.nz/](https://thekaibox.co.nz/) ) put together the whole
thing without code. It's not suuper smooth, but the whole webshop / purchasing
/ logistics / shipping / printing / people management etc is done with "No
Code". But it's quite impressive what he can achieve, albeit with some manual
steps, by connecting around 10 different pieces of software to do what he
wants. I've been working with him as it always seems like he may need some
custom software to make things work better. But we often brainstorm a way to
do it with existing tools. The great thing is, he keeps adapting the system to
match his evolving understanding of his business straight away. This makes
things super agile.

As platforms get better at connecting with each other it will allow more and
more no code / low code solutions to be developed. I think it will be
interesting to see which tools accidentally become "no code" platforms
because they build a base level of flexibility into them.

------
organsnyder
If you're trying to replace the software developers that take fully-fleshed-
out work items (including a complete pseudocode description of business logic,
edge cases, failure modes, acceptable data ranges...), then you might have
some success. But then you've only managed to replace the lowliest of
developers.

Even in my experience as an enterprise developer, I've yet to come across
anyone that didn't bring some amount of independent thought to the process.
And, due to the nature of our work, almost all of us learn to question
assumptions, hone definitions of behavior, ponder possible failure scenarios,
and just in general ask the "what if?" questions that will bite us if left
undiscovered. Our ability to translate that into a standard machine-readable
language is really a small fraction of the value we contribute. Yes, it is a
barrier to entry—and it's seen as akin to voodoo by some non-technical
types—but replacing it with something with a "friendlier" interface doesn't
shrink the problem domain in the slightest.

------
lucozade
I'm in two minds here.

On one hand, it's a bit of a shame that the article focuses on "no code". What
the linked applications all seem to do is a pretty standard package of
workflow tooling, report generation and data source connectivity. If any of
them does a better job than the existing ones, then good on them. In my
experience, good tooling here is quite hard to come by and could be very
useful/valuable.

On the other hand, on the odd occasion that I've had users buy a tool
believing they're doing my job, it's ended up net positive for me. When it
inevitably goes south, the reminder that what I do is actually harder than it
looks has been beneficial. Obviously, I always get in writing that I wouldn't
ever have to support their efforts at finger painting.

------
UseStrict
Oh great, another article claiming that cloud and APIs will make code go away.
What do these people think drives APIs and provisions infrastructure in the
cloud? We may get better abstractions for some forms of data, but there will
always be a demand for code; even when we get hyper-intelligent AIs, they will
still need to be able to generate code.

At $WORK, management attempted to jump on the no-code train, forcing us to use
Salesforce and OutSystems to attempt to write complex business systems. What
we got was an unverifiable, totally vendor locked, poorly performing system
that cost hundreds of thousands of dollars. It was an exercise in frustration.

~~~
orclev
Any system sufficiently complex to replace programming is indistinguishable
from a poorly written and designed programming language. It's the same broken
thought process that leads to business rules engines which themselves always
end up being maintained by a specialized programming team. You can't magic
away the skills and requirements that necessitate a programmer; you can merely
make the programmer's job easier or harder.

~~~
borplk
orclev law :)

------
yoz-y
Only using visual development hurts traceability and verifiability, though.
Reviewing "code" is hard if there is none.

For example I have yet to see a good way to visually diff graphs.

~~~
yorwba
Text diffs are usually shown by displaying added parts in green and removed
parts in red while including surrounding context in a neutral color.

You can do exactly the same thing for graphs.
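As a rough illustration, if a graph is represented as node and edge sets, the
added and removed parts fall out of plain set arithmetic. A minimal Python
sketch (function and field names are made up):

```python
def graph_diff(old, new):
    """Diff two graphs given as (nodes, edges) pairs of sets.

    Returns added/removed nodes and edges - the graph analogue of a
    text diff's green and red lines. Real tooling would also lay the
    result out visually with the unchanged context around it.
    """
    old_nodes, old_edges = old
    new_nodes, new_edges = new
    return {
        "added_nodes": new_nodes - old_nodes,
        "removed_nodes": old_nodes - new_nodes,
        "added_edges": new_edges - old_edges,
        "removed_edges": old_edges - new_edges,
    }

before = ({"a", "b", "c"}, {("a", "b"), ("b", "c")})
after = ({"a", "b", "d"}, {("a", "b"), ("b", "d")})
print(graph_diff(before, after))
```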

~~~
yoz-y
You can, but I have not seen tools that actually do. My issue is that I
develop a graphical node-based programming language and I'd like to be able to
see diffs. I thought that I could convert our format into some other form and
use a diff tool for that format.

~~~
yorwba
Right, the problem with developing a completely new programming language is
that you need to also create new support tooling for all the ways in which it
is different. That's one of the reasons most programming languages are so
similar: it's just less work that way.

If you use text as a storage medium, you can reuse diff, grep, sed and other
text-based tools. They won't be perfectly adapted to the language semantics,
but good enough. Debugging with gdb is a similar situation: if you include
DWARF in compiled binaries it works, otherwise your language needs its own
debugger.

I think the best thing you could do is to convert your graph into some common
format like dot and create a generic diff tool for that format to make it
easier for the next person with similar needs.

~~~
yoz-y
We are using XML underneath; the problem is that since the graph can be
cyclic, the nodes and links are stored in separate lists, which makes a
standard diff quite a poor tool. As you say, converting to dot and making a
tool for
that is probably the way to go. I was just hoping that I could do just the
first step. I was thinking about SVG as well, but in that case the diff would
have to be very specifically tailored and not much of a reusable brick.
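A sketch of that first step, assuming the graph is already extracted into node
and edge collections: sorting everything before emitting dot makes the output
deterministic, so even a plain line-oriented diff becomes usable for cyclic
graphs (the function name and example graph are made up):

```python
def to_canonical_dot(nodes, edges, name="g"):
    """Serialize a (possibly cyclic) graph to Graphviz dot text with a
    deterministic ordering, so plain text diff gives stable results.

    nodes: iterable of node ids; edges: iterable of (src, dst) pairs.
    """
    lines = [f"digraph {name} {{"]
    # Sorting decouples the output order from insertion or traversal
    # order - the property that makes line diffs meaningful.
    for n in sorted(nodes):
        lines.append(f"  {n};")
    for src, dst in sorted(edges):
        lines.append(f"  {src} -> {dst};")
    lines.append("}")
    return "\n".join(lines)

# Cycles are fine: order depends only on sorting, not on traversal.
print(to_canonical_dot({"b", "a"}, {("a", "b"), ("b", "a")}))
```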

------
everdev
This reads like a promo piece for these companies. It's been tried many times
and hasn't worked.

As a developer, I'd love anything that makes my job easier, but that usually
is more control and explicitness. I think "always bet on text" is a more
accurate future:
[https://graydon2.dreamwidth.org/193447.html](https://graydon2.dreamwidth.org/193447.html)

~~~
TeMPOraL
I strongly agree. Text is just much more efficient in so many ways.

A tangent: this is related to why so many people - myself included - love the
paradigm of Emacs as an OS, i.e. moving as much of their workflow and non-work
tasks into the editor as possible. It's because of the power of the
abstractions around text that Emacs provides - severely underexplored
elsewhere. When everything is text - including the UI - but the text isn't
transient, as it typically is in Unix, but at rest, as in text editors, you
get to employ the same set of editing, navigation and search commands to do
vastly different tasks. Combine that with how many things can be represented
as text, and you get a very powerful tool.

------
bertil
I’ve worked for a company that had a significant need for developers (18%
growth per week sustained over three years). Most of the needs were not
addressed by the product team because no one had time to notice them. So, some
of the guys decided to use no-code tools to set up processes: Google Docs,
Zapier, and IFTTT were popular, connected haphazardly to the reporting tools --
up to a point where half of the process complexity was hidden in those areas
that the CTO had no idea existed.

Was it good, scalable, usable? Not really; it was a heart-stopping,
inconsistent mess, but it held up while the company was hiring. It gave the
newly hired product managers patterns to follow, experience in which processes
to use, and impact estimates of what needed to be prioritised.

I’ll readily agree that the circumstances were exceptional, and the
communication with the product teams could have been more helpful, but the no-
code approach provided prototyping. I can imagine a tech lead pushing back on
some requests by asking project managers to implement some options with no-
code tools and come back with more insights, before committing developer
time.

------
archi42
I recall we were shown something like this a while ago in an embedded systems
class. Engineers can basically build their engine/flight/you-name-it control
software from boxes they click together. Some are inputs (sensors), some are
outputs (actuators), others mash those together (tasks) - so you get a graph-
based model of your system, and a compiler builds structured C code from it. I
was told - so this is just hearsay! - that it's not uncommon for the engineers
working with these tools to be well-qualified domain experts, but rather poor
programmers.

What I want to say is: this no-code stuff the article presents as cool and
agile and all has already existed for nearly 20 years - and that's only one
tool I accidentally happen to know of. The author just adds "agile", "cloud"
and "disruptive" to yesterday's idea/technology.

(I should disrupt myself now from the thought of how a WYSIWYG web editor is a
cloud-based tool using a no-code way to generate HTML from your text; in most
cases that probably even qualifies as agile...)

------
bitwize
I liked this meme better the first time around -- when it was called CASE
tools.

Or how about the second, when it was called UML code generation?

The computer magazines of the 80s and 90s are littered with advertisements
from fly-by-night companies promising: "Develop this app with ZERO lines of
code!"

The problem with codeless development environments is -- okay, instead of code
I'm going to represent programs with something else, say, a symbolic language
based on flowchart symbols. The problem is, once my flowchart symbols become
concrete enough to be executable by computer, they also become isomorphic to
constructs in some programming language. So you haven't really gained anything
in expressive power, but you have:

* possibly made things more comprehensible for visual thinkers (at the expense of verbal thinkers). Note that men skew visual and women skew verbal, so this may be considered sex discrimination.

* forced the developers to engage in far more rat wrestling

* required as a hard dependency a particular development environment just to get the "code" up on the screen, let alone interpret or change it, thus committing the mortal sin of vendor lock-in and, in QuickBase's case, locking the application into a cloud platform which may go away when the company liquidates

* made the program more difficult to version-control, diff, or perform large-scale edits on (outside those explicitly supported by the tool)

* turned complex programs into visual clutter, making them far more difficult to manage

* made it difficult or impossible to do things which violate the tool's assumptions about the program structure or function

And there _will_ be assumptions, because the way to gain more expressive
succinctness than is exposed by a typical programming language, and hence to
build the case for using such a visual tool in the first place, is to assume
like crazy. Assume that there is a database, that the application will appear
as various forms and reports, that programmers will never need or use certain
constructs/patterns, etc.
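
The isomorphism point can be made concrete with a toy sketch (the node format
here is invented purely for illustration): once flowchart shapes are precise
enough to execute, each shape maps one-to-one onto an ordinary language
construct, so nothing is gained in expressive power.

```python
def flowchart_to_source(nodes):
    """Translate process boxes and decision diamonds into equivalent Python."""
    out = []
    for shape, payload in nodes:
        if shape == "box":            # process box -> plain statement
            out.append(payload)
        elif shape == "diamond":      # decision diamond -> if/else
            cond, then_stmt, else_stmt = payload
            out.append(f"if {cond}:")
            out.append(f"    {then_stmt}")
            out.append("else:")
            out.append(f"    {else_stmt}")
    return "\n".join(out)

# A two-shape "diagram": a process box followed by a decision diamond.
chart = [
    ("box", "total = 0"),
    ("diamond", ("total >= 100", "rate = 0.1", "rate = 0.0")),
]

print(flowchart_to_source(chart))
```

The translation is mechanical in both directions, which is exactly the
isomorphism: the diagram is the program, just drawn instead of typed.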

So invariably these attempts to take the code out of software development
fail, and get relegated to niches akin to the skeevy back pages of _PC
Magazine_ right next to the porn advertisements. It seems to be happening
again; last I heard, QuickBase wasn't doing too well.

------
himom
FileMaker Pro

Access

Rational Rose

Logic Works ERwin

App Maker

PowerApps

OutSystems

VisionX

Mendix

QuickBase

... and a zillion others.

There’s nuance in a spectrum between non-technical end-users developing simple
workflows and forms, and trying to run an entire business on it with business
rules, interoperability, standards, etc.

~~~
jkoudys
I worked at IBM back in Rational Rose's heyday (though not on the Rational
team, thankfully). The sheer amount of configuration you'd inevitably end up
doing was orders of magnitude greater than the actual thing you wanted to
write.

------
oblib
Learning to use APIs for open source libraries is really where the power to
create quality custom apps quickly and easily resides; a GUI laid on top of
those is a very limiting factor.

I made my first app with one called "The Apple Media Tool" and another with
its successor, "iShell". Using those did help me understand how code worked
when I was first learning, but they also exposed the limitations of taking
that route, and they're huge.

If I had an app that provided a GUI for all the features I might want or need
now it would be bloatware that takes a very long time to learn how to use and
I would still be limited to what it included.

The rub is, you don't always know what you need until you need it when
building an app.

Learning how to use available APIs on the other hand has no such limits and
allows you to drop them in and learn how to use them on a "Need to Know"
basis, and to contribute to the projects that provide them.

That's really a pretty great way to build software.

------
InclinedPlane
Always wrong, always.

Pick some problem space. Now imagine specifying some tool to work in that
problem space. Now imagine all of the unique cases and complex interactions
that need to be handled and how the specification needs to take into account
all of them. You can't simply rely on "obvious by context" solutions; the
specification needs to embody that context, and AI won't save you. What you
have in the end is a very complex machine: it is code. You can move up and down the
abstraction levels from machine/assembly code up to python or SQL up to even
higher levels, but it's all just coding. I contend that job will never go
away. It may approach problems at progressively higher levels of abstraction
but that just makes it more valuable, not less. Today coders are able to
automate things that in the 1960s would have seemed like magic, are coders
less valuable and in demand today or more?

------
jpswade
It depends how you define "code". If "code" is a system of rules to convert
information from one form to another, then software will always need code.

This article talks about "agile", but creating software has never been about
code. It's always been about delivering value. Code or no code, creating
software and delivering value is no trivial task.

I believe we will always need to define the conditions and boundaries that
computers need to work within.

What we need to get better at is defining and communicating what those
conditions and boundaries need to be.

I think one great example of that is Behaviour-Driven Development (BDD) and
Gherkin. Gherkin is code written in a business-readable, domain-specific
language.
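
As a sketch of why Gherkin is still code (the step texts and the tiny registry
here are invented; real tools like Cucumber or behave work roughly along these
lines): each business-readable step only runs because a regex binds it to an
ordinary function.

```python
import re

STEPS = []

def step(pattern):
    """Decorator that registers a step function under a regex pattern."""
    def register(fn):
        STEPS.append((re.compile(pattern), fn))
        return fn
    return register

@step(r"the cart contains (\d+) items")
def set_cart(ctx, n):
    ctx["items"] = int(n)

@step(r"checkout is requested")
def checkout(ctx):
    ctx["total"] = ctx["items"] * 5  # toy pricing: 5 per item

def run_scenario(lines, ctx):
    """Dispatch each readable scenario line to its bound function."""
    for line in lines:
        for pattern, fn in STEPS:
            m = pattern.search(line)
            if m:
                fn(ctx, *m.groups())
                break

scenario = [
    "Given the cart contains 3 items",
    "When checkout is requested",
]
ctx = {}
run_scenario(scenario, ctx)
print(ctx["total"])
```

The scenario reads like business English, but every step is a fixed schema:
change the wording and nothing runs - which is exactly what makes it code.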

Programming computers may change and complex computer code will become
abstracted away further, but no matter how you look at it, programming
computers will never totally disappear.

------
tabtab
This has been an age-old promise, but I see two primary bottlenecks that keep
surfacing.

First, non-IT experts don't understand the trade-offs involved in their
decisions. They may make a barely "good-enough" decision to get started via
trial-and-error, but it often doesn't take into account a lot of implications
because they don't know what questions to ask themselves and others.

The second is that existing data is not very malleable. Once production data is
created, it sets conventions in stone that are not easy to change. You cannot
change your data structures and relationships without a lot of manual re-
translation of the data itself. Therefore, early decisions make a big impact,
bringing us back to the first bottleneck.

A better use of resources is perhaps to make IT professionals faster rather
than try to make non-IT people into instant IT professionals.

------
js8
This made me smile...

So legacy applications are going away? Like the one that I maintain at work,
written in (mainframe) SAS, which is a so-called 4th-generation programming
language, invented with the promise of users just "telling" computers what to
do in a "natural" language.

Don't get me wrong, I like the promise. It's what Facebook is doing with Haxl,
which is a domain-specific language for content filtering, written in Haskell.

Unfortunately, the needs of users will often grow, and the language will have
to become Turing-complete (like, for example, how HTML/CSS was extended with
JavaScript), if it wasn't already, and it will also have to become extensible
by user routines written in a mainstream language. And then you get messy,
"legacy" (as they are called after a few years) systems.

------
DonnyV
The bulk of programming is dealing with all these layers we've come up with to
make it easier for humans to send commands to machines and make them do what
we want them to. Honestly, a decent A.I. system could replace most coding. I
don't think it would need to be human-level, but it would at least need to
learn from watching people do tasks. Yes, most apps are CRUD apps, but a lot
are CRUD apps with events that need to happen at certain times to maybe give
feedback to users or populate another part of the app. This is where good UI +
UX comes in. That is not an easy thing to do once you break out of the canned
templates.

I started this post thinking a decent A.I. could do this but now I'm not so
sure. ;-)

------
GuB-42
There is no such thing as "no code". You can hide it behind pretty interfaces
but in the end, you are just using images instead of words.

And with images, people think of children's books - easy, right? In reality,
as soon as you introduce a tiny bit of complexity, it becomes closer to
Chinese. In Chinese, you can still see the little drawings, but they're just
the background of a highly complex language.

Whether you code in an English-like text based language or a proto-Chinese-
like picture based language doesn't change the fact that you are coding. Most
people end up preferring the "text" approach that is more adapted to
computers, and ultimately easier.

~~~
emodendroket
At least 80% of Chinese characters are picto-phonetic -- meaning that you have
a radical that only gives you a vague category for the lexical item and
another part that gives you a hint of how to pronounce it (or would have when
the character was devised, anyway; there's substantial drift). To think about
this in terms of English, imagine the word "I" were written with a stick
figure and a picture of an eye (except for most characters, the meaning is a
little more obscure than this). Only a handful are really like pictures.

------
vijaysamanuri
I personally feel no-code/low-code platforms are here to stay and will be the
future.

The immediate advantage of these platforms is the readily available
development environment. The visual interface helps you improve quality and
accelerates development, which allows Rapid Application Development (RAD).

These platforms allow you to move towards app modernization in a flexible and
incremental approach, without disruptions.

WaveMaker ([https://www.wavemaker.com](https://www.wavemaker.com)) is yet
another RAD platform, which also offers continuous deployment.

------
geraldbauer
No Code Trivia Quiz:

Q: What's the name for someone who can't read & write?

\- (A) Chief Alphabet Officer

\- (B) Illiterate Know-All / Know-Nothing

\- (C) No Alphabet Creative

\- (D) Other - Please Tell

Q: What's the name for someone who can't read & write code?

\- (A) Chief Digital Officer

\- (B) Certified No Code Business Architect®

\- (C) Quick Base / Mendix / Zudy / Pega / <Your No Code Corp. Name Here>
Marketing Bullshitter

\- (D) Other - Please Tell

Source: [https://github.com/bigkorupto/awesome-nocode#no-code-
trivia-...](https://github.com/bigkorupto/awesome-nocode#no-code-trivia-quiz)

~~~
troels
Wow. I read that page, believing it to be satire, but then clicked through to
pega.com - [https://www.pega.com/technology/software-writes-your-
softwar...](https://www.pega.com/technology/software-writes-your-software).
And I'm still not sure if this is an elaborate in-joke.

~~~
DonaldFisk
There have been earlier attempts at this, e.g. The Last One:
[https://en.wikipedia.org/wiki/The_Last_One_(software)](https://en.wikipedia.org/wiki/The_Last_One_\(software\))
They worked only to the extent that the generated programs were foreseeable by
the software designers.

> You don’t code software in Pega. You design it.

This has to be from a mindset that designing and coding are different jobs
done by different people:
[http://wiki.c2.com/?ArchitectsDontCode](http://wiki.c2.com/?ArchitectsDontCode)

~~~
troels
Yes, but the copy on the page was the part that tripped me up the most.

------
meesterdude
Lots of good comments in here. I will chime in with the observation that
we've made a lot of progress. Being a programmer 30 years ago was totally
different from today. The tooling is better, more approachable, more abstract
and less technical.

Will we ever be without code? Doubtful. Things will evolve and improve, but
the only visions of no longer having code are sold in the sales pitches of
products.

------
sgt101
There exists a tool called "Blue Prism".

I have an existence proof of an enterprise that was/is sold this tool on the
basis that there is no code and the users will use the visual editor to create
the code themselves.

Behold : [https://www.indeed.co.uk/Blue-Prism-
jobs](https://www.indeed.co.uk/Blue-Prism-jobs)

If it's so easy, why are people being paid £500 a day?

------
bmurphy1976
I've been trying to automate myself out of business for 30 years now. So far,
I have failed miserably at it and am busier than ever.

------
jrochkind1
People have been saying this for at _least_ 30 years.

[https://www.washingtonpost.com/archive/business/1990/12/31/a...](https://www.washingtonpost.com/archive/business/1990/12/31/apple-
upgrades-role-for-hypercard-programming-
package/fb428369-795f-4352-8003-ca08543ba1d9/)

------
geraldbauer
FYI: I've started a collection about awesome no code [1] - Why code? All about
the Chief Digital Officer's (CDO) clicky-clicky-clicky dream future. No text.
No numbers. No code.

[1] [https://github.com/bigkorupto/awesome-
nocode](https://github.com/bigkorupto/awesome-nocode)

------
gdltec
I do believe that the majority of coding will be replaced by AI or some
automated process requiring some inputs and expected results to "code" an
application. However, this article reads like an advertisement, and it fails
to go deeper into the topic at hand, unfortunately.

------
Jach
"Linux supports the notion of a command line or a shell for the same reason
that only children read books with only pictures in them. Language, be it
English or something else, is the only tool flexible enough to accomplish a
sufficiently broad range of tasks."

\-- Bill Garrett

------
dman
Whenever I hear of this I think of "The future of math is no algebra, only
geometry".

------
pcunite
"640K is more memory than anyone will ever need"

------
wintorez
No it's not. For the simple reason that Artificial Intelligence can never ever
cover for Natural Stupidity.

------
jordache
When I think of no code / low code, I shudder a bit as I think about
ServiceNow.

------
RickJWagner
Yeah. I remember when 4GLs were going to replace COBOL, too.

------
salawat
Code isn't going anywhere.

As long as computational operations exist, there will need to be a language to
represent those operations. As long as that exists, there will be more
languages to streamline that, and so on and so forth.

Computers are not magic. They do not think. Even if they did, they would be
constrained by the same fundamental linguistic bottlenecks that we human
beings are.

To illustrate this point: I will concede that no code solutions are possible
when every human being achieves Buddhist Enlightenment. This would be the
human equivalent of a no code solution. No one communicating, or orchestrating
external experiences to create an epiphany. Just an internal realignment of
values.

Another illustration: Convince all of your neighbors to agree with you without
communicating to them in any way.

Coding is a fundamental axiom of information transfer. If you aren't doing it
in some fashion, you aren't DOING anything. If you are doing it, then you are
constrained by your encoding scheme. The thing that trips up most users is
that they aren't consciously aware of their own encoding schemas, and are
typically resistant to adopting someone else's. In the article's case, they
are asserting that "visual" programming is not code.

Which is dead wrong.

It just means the new code is your visual motif or paradigm. (E.g. A box means
X, a line means Y.)

Underlying that visual implementation is...nothing else but more code!

You will never compute without code. You will never COMMUNICATE without code.

This entire article is just a visual representation of a complete vacuum of
common sense.

Note the irony: My response is an example of resistance to adopting the
author's encoding scheme. It is likely that his point is not communicable to
me in its current phrasing because we have fundamentally different experiences
and knowledge structures that get decoded from the same visual representation
(the article). Or it is being clearly communicated, and it's just BS.

(E.g. when I see the phrase "no-code solution", it translates to me as a
hypothetical solution that doesn't require communication of information
according to a fixed set of schemas to orchestrate desired computing behavior.
This is dead on arrival. If we are communicating, coding happens.)

Either the author is mistaken, or I am. The only way to know for sure is for
he and I to COMMUNICATE, establish a common coding schema between us, a
"common sense" if you will, and to clear things up so we can both confidently
say that we are saying the same thing about the same thing.

TL;DR:

No-code happens when world peace does.

------
s2g
Sweet, I barely write code at work now. So I should be set up well for this.

