
Shipping Culture Is Hurting Us - uam
http://bitbashing.io/2015/02/16/shipping-culture.html
======
didgeoridoo
The author talks about languages and technologies (JS, Mongo), but he's really
getting at something deeper. The real danger of the "ship it" culture is that
things that can't be "shipped" right away — things that require solving really
hard problems — tend to fall off our collective radar because there is just SO
MUCH cool and (relatively) easy stuff to do right now. PG has a great term for
this: "schlep blindness":
[http://paulgraham.com/schlep.html](http://paulgraham.com/schlep.html)

~~~
digi_owl
Makes me wonder if this is why I find the whole "devops" concept to be
raising my hackles.

~~~
dkarapetyan
How about "full-stack developer"?

~~~
makeitsuckless
That's the one. That's basically shorthand for "you don't need to do anything
really well, you just need to do everything barely good enough to ship it
now".

~~~
CmonDev
The sad bit is that many full-stack developers are actually better than
specialized developers in their respective skills. (unless of course they are
JS/Node.js types)

~~~
_pmf_
> The sad bit is that many full-stack developers are actually better than
> specialized developers in their respective skills.

But those do not call themselves full-stack developers.

------
saidajigumi
This whole article makes me want to scream, if only because the author seems
to have never heard of the concept of Path Dependence[1]. I could utter a
similar rant, on how terrible it is that we're stuck with awful legacy dumb AC
lightbulb sockets everywhere, and wouldn't it be nice if the entropy fairy
just waved a magic wand and we all had 'net-connected DC smart sockets for our
wonderful Future Bulbs.

But that kind of talk utterly dismisses the important reality that 1) we don't
have a magic wand and 2) we have to deal with things like path dependence and
network effects as phenomena. Example: if you chart out historical network
bandwidth, you'll get a Moore's-law-esque curve, but with a significant step
function depending on which network aspect you study. Why? Path dependence and
(literal!) network effects: we don't see the benefits of network bandwidth
improvements until enough hardware has been upgraded to see an end-to-end
improvement.

[1]
[https://en.wikipedia.org/wiki/Path_dependence](https://en.wikipedia.org/wiki/Path_dependence)

~~~
soup10
JavaScript has become the de facto browser programming language, even though
it's objectively awful for complex software. The only reason it's achieved
its high status is that browser developers refused to cooperate and
develop something better. Microsoft and Apple view software lock-in as a
competitive advantage and have actively undermined technologies they thought
were threatening (Java and Flash amongst others). The explosion of HTML5 and
JavaScript was more of a 'nature will find a way' type development and can
almost entirely be credited to Google and various web companies pushing it and
making it better.

It's still a very flawed technology stack for complex web apps, though, and
it's very reasonable to suggest that we could do a lot better.

~~~
nulltype
If you think Javascript is bad for complex software, you should try C++.

~~~
pjmlp
It is not that bad without the C parts.

------
angersock
So, there's a great argument to be made about the negative impacts of
"Shipping Culture".

One could argue that it encourages underdesigning of ultimately complex
systems. One could argue that it encourages excessive code and infrastructure
reuse, to the point that even trivial projects pull in more than is needed and
consume more resources than even remotely necessary. One could even argue that
it creates in customers an unreasonable expectation about ongoing fixes to a
project, about ongoing support contracts and culture, and generally
discourages creating artifacts that later generations can use in favor of
ongoing maintenance performed by morlocks.

Sadly, the author makes none of those arguments, and instead gripes about
hackathons, about Javascript (badly), and about MongoDB (which is fun, but
doesn't go anywhere).

Author bemoans "ship it now" culture, bemoans shipping things before they're
technically good--and then has the gall to use a Netscape Navigator screenshot!
Does anyone else find irony in that? Netscape was never known for good quality
code or sane implementations--go read old kvetching by Brendan Eich or Jamie
Zawinski. You know what they _did_ do, though? Fucking shipped.

Author complains about NaN--which is a goddamn _standard_ in IEEE 754, which
is exactly how C++ (their pet language) handles things, and which is
completely reasonable behavior for any JS implementation. Not having an
integer type? Newsflash kid: JS supports bit-accurate integers up to Int32s.
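
A minimal sketch of that claim (my own example, not from the thread): JS's
bitwise operators coerce their operands to signed 32-bit integers, so values
are bit-accurate within the Int32 range and wrap around outside it.

```javascript
var maxInt32 = 2147483647;        // 2^31 - 1, the largest Int32
console.log(maxInt32 | 0);        // 2147483647: representable exactly
console.log((maxInt32 + 1) | 0);  // -2147483648: wraps like an Int32 would
console.log(3.9 | 0);             // 3: truncation toward zero
```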

Author claims systems stopped using cooperative multitasking two decades ago.
Author has presumably never written code for hard-real-time embedded systems.

Author goes on to complain about ancient standards. I have an ancient 120v
60-hertz AC circuit running my house. I have an ancient system of measurement
for the beer on my desk. Funny thing about ancient systems: _if they're still
around, it's because they've perhaps solved the problem well enough to stay
around_.

"Shipping Culture", as defined by the author, isn't what's hurting us: it's
the influx of jackasses with loud mouths and no appreciation for history,
business, or engineering loudly proclaiming that perhaps the most productive
decade in software engineering is somehow wrong.

~~~
username223
Yeah, the title and the article have very little to do with each other.
Complaining about NaN semantics is especially irrelevant.

OTOH, it's funny you bring up Netscape. Yes, they shipped. Back in the
Netscape 3-4 days of the Browser Wars, every new version was a crap shoot. It
might un-break some page you wanted to use, but it might also be drastically
slower, crash a page that you needed, lose your preferences, or whatever.

But the cool thing is that you could choose when or if to update! You could
install the new version alongside the old one, try it out for awhile, and junk
it if it was worse overall. If most people thought Netscape was getting worse,
then most people would stick to the old, working version. Nowadays browsers
follow the same "random walk" development model, but shove the latest version
down most users' throats whenever they feel like it, and make it hard or
impossible to revert the latest breakage. "Ship crap" combined with "software
force-feeding" is good for no one but lazy coders.

~~~
angersock
So, on the one hand, I agree very much that having the options available was
pretty cool.

On the other hand, having seen this repeated time and time again in the
enterprise world, I don't think users should be given the option of not
updating, especially for services that they aren't hosting themselves. All
they end up doing is creating a support burden and being dissatisfied.

Our users are increasingly ignorant about the systems that they use--I think
that precludes them from having final input about how said systems are
implemented and deployed.

~~~
username223
Since I don't support anything "enterprise," I'm probably a bit biased. Still,
"creating a support burden" basically means "making more work for the
developer," while "updating" almost always means "making more work for the
user." I happen to use Emacs, and even though it's very slow-moving and
careful about backward compatibility, I always put aside some free time for
major updates, because they always break my setup somehow. And I'm lucky
compared to the average software user: I'm a coder, so I can usually work
around the breakage without too much trouble. Regularly making work for people
without this option is inhumane.

Regarding ignorance, you know far more about how the systems work, but they
probably know far more about how they use them.

I still think the auto-update treadmill is a symptom of developer arrogance,
laziness, and callousness, but maybe I'm just old before my time.

------
DiThi
> Instead we get some bizarro-world where the type of NaN (“Not a
> Number”) is number, where NaN !== NaN*, and a chart like this exists for
> something as simple as comparing two values.

That's the definition of NaN in virtually any programming language with
floating point numbers. And the comparison table makes sense with the rule
"when types are incompatible, both are cast to strings". Just use ===
instead of ==, which is only really useful for comparing against
null/undefined.
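
A quick console sketch of both points (my own example, using only
long-standing JS built-ins):

```javascript
console.log(typeof NaN);   // "number": NaN is an IEEE 754 float value
console.log(NaN === NaN);  // false, exactly as IEEE 754 requires
console.log(isNaN(NaN));   // true: the test that actually works
console.log(0 == "0");     // true: == coerces the operands first
console.log(0 === "0");    // false: === compares type and value
```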

~~~
kazinator
The stupid thing about IEEE NaN is that it's not equal to itself! If variable
x holds a NaN, then (x == x) is false.

This violates:
[http://en.wikipedia.org/wiki/Law_of_identity](http://en.wikipedia.org/wiki/Law_of_identity)

If (x == x) tests false, then it asserts that x is not itself, which is
logically preposterous.

ANSI Common Lisp has a bit of this problem in it too, but it's not _required_;
it is there for some weird historic implementations. That is to say, if x
holds a number like 1, then (eq x x) is not required to yield t. (But in sane
implementations it does yield t; and it yields t even if x is a bignum,
because (eq x x) is given the same object as two arguments. Two separately
computed bignums of equal value will likely, of course, not be eq.)

How this can be explained is that eq tests "implementation identity", and
somehow different instances of a number are treated as different
implementations. Argument passing is by value, and the two reductions of the
expression x in (eq x x) to a value somehow produce a different implementation
of the value.

This rationale is unrelated to IEEE NaN-s, though.

~~~
pwnna
Don't quote me on this, but I recall the rationale for NaN is that NaN is
typically the result of a division by 0.

Divisions by 0 can be thought of as infinity (for the sake of this
explanation, but mathematicians will cringe), but it is not any _particular_
infinity, in the sense that x / 0 does not necessarily have to equal y / 0.
By that definition, the result of a division by 0, NaN, must not equal
itself.

You can use `isNaN`, tho.

~~~
kazinator
I understand the point perfectly. However, if I have a NaN which is captured
in a lexical variable (perhaps the result of a division by zero, as you note)
then in fact I do have a particular infinity: whatever object is inside that
darned variable! If I do another division by zero, then sure, hit me with a
different NaN which doesn't compare equal to the first one I got. But don't
make my variable not equal to _itself_.

~~~
cliffbean
Normal division by zero gives you Infinity. To get NaN, you have to do
something as numerically confounding as divide zero by zero, which isn't any
infinity, because the numerator is zero, and which isn't zero or any finite
number, because the denominator is zero.
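
That distinction is easy to check directly (my own example):

```javascript
console.log(1 / 0);                // Infinity
console.log(-1 / 0);               // -Infinity
console.log(0 / 0);                // NaN: no finite or infinite value fits
console.log(Infinity - Infinity);  // NaN for the same reason
```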

------
kazinator
> _Systems stopped using cooperative multitasking at least 20 years ago
> because it sucked compared to the alternative of automatic, preemptive
> multitasking. And yet Node.js harks back to those dark days with its
> callback-based concurrency, all running in a single thread._

Uh, no. For instance, Unix kernels have been traditionally cooperative---at
least when executing kernel code. That is to say, user space can be preempted
but not the execution of kernel code. The introduction of SMP brings
concurrency into the kernel, though, and with that, preemption can follow.
Linux development followed this path.

Still today, you don't have to turn on CONFIG_PREEMPT when building your
kernel. If you don't have SMP either, then you have a cooperative kernel: one
task is in there at a time, and it has the CPU until it voluntarily calls into
the scheduler.

Cooperative tasking has the enormous advantage that it makes a whackload of
potential race conditions go away, and that could be a possible reason why
Node.js is the way it is.
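
A sketch of how that plays out in Node (my own example, not from the thread):
each callback runs to completion on the single thread, so a read-modify-write
on shared state can't interleave with another callback.

```javascript
// Each callback in Node's event loop runs to completion before the next
// one starts, so this read-modify-write needs no lock.
var counter = 0;

function handleRequest(done) {
  var seen = counter;  // read...
  counter = seen + 1;  // ...modify-write, atomic w.r.t. other callbacks
  setImmediate(done);  // defer the "response", as a real handler would
}

var pending = 1000;
for (var i = 0; i < 1000; i++) {
  handleRequest(function () {
    if (--pending === 0) {
      console.log(counter); // 1000: no lost updates, and no mutex in sight
    }
  });
}
```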

Unix might not have been successful had Ken Thompson decided to make the
kernel preemptive, and then spent 1973-1987 debugging it. :)

~~~
pjmlp
>Unix might not have been successful had Ken Thompson decided to make the
kernel preemptive, and then spent 1973-1987 debugging it. :)

Sadly that didn't happen. UNIX was adopted by 80's hipsters into their
startups and then by the industry at large, thus spreading C into the
industry.

Now we have patch <choose day of week>.

~~~
cben
I'll bite. When I learnt C I found it charmingly simple and it took me years
to realize how deeply broken some aspects are, especially the separate
compilation circus — programmers working hard to help the compiler and not
getting any modularity benefits in return.

So suppose C didn't happen. Do you expect the industry at large would have
adopted a great language instead? Or some random product of history, halfway
between PL/I++ and JavaScript?

~~~
kazinator
Enforced modularity in languages has downsides, like basically killing the
flexibility in system construction. Not every image produced is an application
program running over an operating system.

------
mpweiher
Yes. If your problems are essentially trivial and you are just trying to gauge
customer moods, shipping and shipping again quickly is the way to go...and
thinking about the problem will probably not help.

However, while quick iterations converge rapidly on a local maximum, they
really, really suck at getting you out of that local maximum.

I notice this with trying to create a new and different programming language
([http://objective.st](http://objective.st)): for almost all the problems I am
tackling, there are quick and obvious answers...that get me stuck in the same
mess we are already in. So I've found it necessary to deliberately delay
implementing stuff, going _slower_ than I could, making sure I leave time
for my not-quite-conscious thought processes to work out the problem and
present the results during a relaxing shower.

------
cageface
You see the same thing in natural organisms. Vestigial adaptations and odd or
even awkward designs abound, yet they are functional. Some people take "Worse
is Better" to the extreme of "Worst is Best" but there is some wisdom in the
idea that it's better to get an adequate solution into play early than to wait
for a perfect, clean-room design. The important thing is to set up a
feedback loop that continually refines your solution.

For an inspiring take on the power of chaos in evolution, check out Errol
Morris's film "Fast, Cheap & Out of Control":
[http://en.wikipedia.org/wiki/Fast,_Cheap_%26_Out_of_Control](http://en.wikipedia.org/wiki/Fast,_Cheap_%26_Out_of_Control)

------
yason
Shipping doesn't always need to mean shipping to the public.

I write complex software too, but I "ship" to myself early and often. I start
by trying to solve a hard problem as quickly and as simply as possible in a
naive way so that I can experiment more freely. This gives me more insight
into the problem itself and into the solution space. The solution often
changes -- sometimes it's the problem that changes.

If I were working with a customer or a startup I would definitely "ship" early
and often, too, but I wouldn't give much weight to these "releases". They're
just something to play with and to cut the shape for the problem.

------
ChuckMcM
The problem that I have with this thesis (that shipping culture is hurting us)
is the results people have. You can certainly argue that a better result could
be achieved by taking more care with your design, but it is unclear that in
doing so you would have achieved something more "valuable" than you did by
shipping and iterating.

I do agree with Gary Bernhardt that infrastructure is not getting the
attention it deserves, but as we saw with GPG it isn't something that is
easily funded. Sun paid an engineer who did mostly nothing but maintain xterm
(and the Sun tools equivalent) for several years. Where are you going to find
a sponsor for a new terminal? Current terminal code could be improved of
course, but in so doing would you get better code? Better systems?

~~~
spitfire
The thing is, the "think carefully and create deliberately" approach has been
tried so very few times. But when we have tried it we've gotten Lisp machines,
ACID RDBMSs, strong typing, and many other important and very useful tools.
I'd say it's worth a shot more often.

We know we're in a world of hurt now, but the argument against it is "oww.
Devil we don't know!".

Math is hard, let's go shopping.

~~~
nostrademons
The "think carefully and create deliberately" approach has given us a lot more
than that - there's also Smalltalk, Xanadu, Eros/E, NextStep, BeOS,
microkernels, Plan9, Dylan, and General Magic.

A look at that list is pretty instructive for _why_ more people don't take the
"think carefully and create deliberately" approach: by and large, the creators
of those projects failed to profit from their inventions. In many cases, they
wasted years of their life slaving away in pursuit of perfection, and the
market didn't care. If you study any of the systems I mentioned, you'll find
some incredibly elegant and beautiful CS concepts, ones I wish I could use for
everyday programming all the time. But the problem is that none of these
innovations exist in a vacuum, and in the time it took to perfect the product,
the market passed them by and the world changed in a way that made them no
longer relevant.

------
narrator
People use mongo and node because they want to get a site up really really
fast and see if it gets traction. When they actually get somewhere they'll
rewrite it in a better language.

If you want to set yourself up with really slick tooling and a great language
you can code in Scala with Intellij. You can even avoid touching any ugly
dynamically typed stuff by coding your JS in scala.js. If you're doing it
better than the other guys then great, you can code rings around them. Maybe
it's not worth all the extra complexity when just starting out though.

When you get to big corp size code bases static languages are more common.
That's because to be able to navigate and maintain that big of a code base, a
good IDE, a static typing compiler and refactoring tools are a huge help.

~~~
falcolas
> When they actually get somewhere they'll rewrite it in a better language.

Anecdata for you: not necessarily. I'm working for a company that has
"actually gotten somewhere", and our development team is still writing in Node
with Mongo.

Of course, there is a reason for it: the front-end development team has spare
cycles, and the backend team does not. Ergo, Node!

~~~
alxndr
I'm inclined to repurpose a cliche, and say that now your backend team has two
problems!

------
kazinator
This is not just the last several years.

It comes from a slogan that was popularized by Eric S. Raymond some 18 years
ago:

[http://en.wikipedia.org/wiki/Release_early,_release_often](http://en.wikipedia.org/wiki/Release_early,_release_often)

~~~
irickt
Or Guy Kawasaki, circa 2000

>>> Don't worry, be crappy. Revolutionary means you ship and then test... Lots
of things made the first Mac in 1984 a piece of crap - but it was a
revolutionary piece of crap.

------
halayli
It's a spectrum. Trying to ship perfect software from the start will
probably end up like the OS/360 project. Then Thompson and Ritchie came along
and designed what we now know as Unix: an OS that wasn't complete and fully
featured like OS/360 was intended to be, but it worked!

Making the wrong choices is inevitable, but correcting them is part of the
software life cycle.

We aren't working with concrete and building bridges here. If the Tower of
Pisa had been built like software, we could still fix it.

So yes, ship when you can as long as it works and delivers what it promises.
It doesn't have to be perfect.

~~~
hyc_symas
Delays getting OS/360 were a blessing, really - it prompted many universities
to develop their own mainframe OSs, thus helping OS research flourish around
the world.

------
mrgriscom
Reminded me of this essay:
[http://www.jwz.org/blog/2012/08/a-generation-lost-in-the-bazaar/](http://www.jwz.org/blog/2012/08/a-generation-lost-in-the-bazaar/)

I don't think it's fair to attribute the issues he brings up to "shipping
culture". Rather, the causes are two-fold:

1) Lack of appreciation for history, being doomed to repeat it, all that

2) Making anything above the level of 'just not-terrible enough' without a
benevolent dictator is _hard_.

------
RangerScience
The author's bit about how we forget about the problems and solutions of our
forefathers - that resonates with me.

We do seem to keep re-inventing the wheel, and while part of that is out of
the joy of creation, I think a lot of it is that we're not writing down the
problems, and we're not teaching the problems. We're only paying attention to
solutions, and that's limiting.

Code is a solution to a problem, but it's not always apparent what that
problem /was/ when you just look at it. So NodeJS (according to the author)
is doing something that was thought of as a good idea at first, and then
people learned why it wasn't, and /nobody wrote that down/. Or, at least, when
they wrote it down, nobody taught it.

Solutions are /answers/, problems are /questions/. Here's a beautiful
illustration of this dichotomy:
[http://dead-logic.blogspot.com/2012/09/a-collection-of-questions.html](http://dead-logic.blogspot.com/2012/09/a-collection-of-questions.html)

This is part of why the TDD movement is a step in the right direction - you
write down the problem you're going to solve, and later, someone can come read
it. And maybe teach it.

(Not saying TDD has gotten "there", but it's in the right direction)

------
annnnd
While I agree with the facts in the article, my own conclusions are quite the
opposite (feel free to downvote / flame, but please read the whole response
first).

I guess it depends on values - for me, what matters is the value added for the
customer. Iterating quickly (often) yields better results in this area because
it allows you to gauge customer expectations early in the development process.
I have often encountered die-hard engineers who want a 100% specification
upfront... In my experience, the world doesn't work this way. It would be nice
if it did, but it doesn't.

And yes, I am guilty. I am using MongoDB AND JavaScript (but not Node.js - not
that I have anything against it, just never had the need for it). I don't use
these technologies because they are "cool", but because they solve specific
problems in an efficient way. Which is probably why hackathon devs used them
too. And yes, I would appreciate schemas in MongoDB and types in JS, but I can
live without them. Life is made of tradeoffs. Does it really matter if the
latest Fart App (tm) builds on a transaction-safe DB and uses a strongly typed
language?

So, are we seeing a rise of "shipping culture"? Yes. Does it change how we
work? Yes. Is it in some ways worse? Yes. Is it hurting us? No.

------
andrewstuart2
Every time I see someone quote that JavaScript was designed in ten days I
cringe.

JavaScript was designed in 1995. It was _standardized_ as ECMAScript in 1997.
Ever since then, it's been under active development by a thriving community of
engineers pushing for better standards.

It's been a bit longer than ten days.

~~~
moe
_Ever since then, it's been under active development_

What have been some significant changes to JS since it was thrown together by
Netscape?

It seems to be largely the same trainwreck today that it was 15 years ago.

~~~
andrewstuart2
[http://en.wikipedia.org/wiki/ECMAScript#Versions](http://en.wikipedia.org/wiki/ECMAScript#Versions)

[http://kangax.github.io/compat-table/es5/](http://kangax.github.io/compat-table/es5/)
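
To make the point concrete (my own example, not from the links above): ES5
(2009) alone added strict mode, native JSON, Object.keys, and the functional
array methods, none of which were in the language Netscape shipped.

```javascript
"use strict"; // ES5: strict mode turns several silent errors into thrown ones

// ES5 array extras, object introspection, and native JSON:
var doubled = [1, 2, 3].map(function (n) { return n * 2; });
console.log(doubled);                      // [2, 4, 6]
console.log(Object.keys({ a: 1, b: 2 }));  // ["a", "b"]
console.log(JSON.parse('{"x": 1}').x);     // 1
```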

------
danschuller
An interesting read, but poor tools can't be attributed only to shipping
culture. A more important reason is that the tools are good enough (only just,
but that's enough), and at that point priorities change. And once momentum
becomes big it's hard to change direction (i.e. JavaScript), so it's hard to
revisit the fundamentals.

Better tools are coming, such as Light Table / Bret Victor's talk:
[https://vimeo.com/71278954](https://vimeo.com/71278954) . But it's not clear
when they'll arrive! Will we still be developing web apps in JavaScript after
another decade? It feels like we can do better.

------
dustingetz
Seth Godin wrote that 5 years ago. I think (hope) the economics are shifting:
in the last 5 years a lot of the omgcats apps (market opportunities that can
be captured with lousy software) have already been written, and I think it may
not be so easy for companies started today. However, with the pace of
innovation in the last 5 years (basically the coming of age of functional
programming), it's a lot easier for a single person to write high-quality
software.

Anyway, common wisdom is always going to lag behind today's actual best
practices.

------
d_b42
All things in moderation..

To me, "ship it" as a philosophy means the most important thing we do as
developers is deliver working code to customers.

On my team, when we encounter a concept that is just too complicated to ship
right away, we try to branch-by-abstraction and keep shipping changes, even if
the feature isn't "ready" yet.

To me, the marginal returns on trying to be right diminish more rapidly than
the returns on making it easy to fix stuff when I'm wrong. Maybe I'm just
not as smart as everyone else here, but it seems to work.

~~~
Sir_Substance
It's more of a business necessity than a philosophical point of view.

The history of software development is littered with companies that failed
because they spent so long polishing the beautiful crystal tower they were
developing that someone released a concrete one instead and everyone bought
that, and then no one needed a crystal one any more.

Unfortunately, it can be really hard to find out about these companies because
of how hard they've been trodden into the ground by their successors, but
Xerox and Netscape would be two high profile examples.

Shipping constantly may not be the best thing for technology, it may not be
the best thing for quality and it may not, at times, even be the best thing
for your users. But if you don't, someone else will.

------
wodenokoto
While it is great to use hyperlinks to point readers to sources, they
shouldn't become a substitute for actual words.

------
igl
The problem is that the author is assuming that 90% of the software we write
to make a living is set in stone and supposed to run forever. Almost
everything we do is disposable and replaced within the same decade.

So he does not like Javascript and Nosql. Is it hard to find a JavaEE job? I
think not.

~~~
hyc_symas
I think the real problem is the disposability attitude you just described.
It's prevalent in our real world consumerist economy and it's killing us there
too.

Building software that's to be thrown away is a waste of mental resources and
physical resources. Society and civilization advances on top of our lasting
creations, not the ephemeral ones. Reinventing the wheel doesn't advance the
state of the art. You want to build one good set of tools that will last a
long time, so you can stop thinking about them and be free to tackle the next
truly new challenge. Doing anything else is just a waste of life.

~~~
igl
wat? How can you even compare the "problems" of building commercial software
to the problems of consumerism in capitalism?

The standards applied here are whack in the scope of what JS and Mongo are
used for.

Also: you only get better by iterating.

------
chvid
I wonder if "hurtful shipping culture" has a parallel in the blogging world?

------
logicallee
Honest question: do you guys want a piece of crap from me that illustrates a
new application type (the way bittorrent or bitcoin or napster or wikipedia
was new)? Or should I wait?

I don't have any resources to put into this but can release a piece of crap
myself. (I don't personally program professionally.) Honest question -
discuss.

Would you like a piece of crap - or for me to wait? (Nobody more competent is
going to just code this for me, at least not until the piece of crap exists
and has traction.) I don't really envision other options but am open to them.
What should I do? Get it right (not happening) or get it out?

------
erikb
I actually watched the screencast this blog post is about. [1] And I would
like to say something about that here, because the questions from the
screencast were also raised in the blog post.

The question is this: "Why do people not replace VT100-style terminals?" There
are two reasons. tl;dr: terminals exist for reasons other than programming in
them, and people actually do reinvent how you can program on a daily basis.
See more in-depth arguments below.

The first thing is that people actually still need the old terminal stuff.
There are loads of old computers you want to communicate with (just think of
your pa trying to relive old times by trying to get the 80's game console to
work). And there are also a lot of technologies that really need something
that stupid, e.g., if you develop your own embedded system you might actually
communicate with it using these VT100 commands. So yeah. Wow. A Terminal
(emulator) doesn't have the task to show your text editor. You can start GVim
if you just want to show your editor. They have the task to communicate. They
can communicate with the system you are running them on or you can use them to
communicate with another terminal. If you want to replace them, you have to
replace the software in your pa's console (and probably some hardware), you
have to find a new way to develop fresh, small computers, you have to find a
new way SSH works, you have to find a new way to show your text editor. It's
not impossible, but it's probably so hard that nobody would like to spend
their whole life (work time, spare time, youth to death) doing it. It's not
worth that much. Summary: It's not worth it, if you consider the whole
picture.

And second argument: People actually do, if you just think about use cases
like coding, gaming, etc. A modern game doesn't run in a text shell as e.g.
Nethack does. It runs in a graphical shell and is represented to you in 3D,
e.g., GTA 5. Also there are many people who use IDEs. Unix+Bash+text editor was
actually an IDE. Eclipse etc are a new way of thinking about the editing task
with helpers like compilers, static analysers, debuggers, performance
analysers, unit test runners, etc. There are even people who reinvent the
programming wheel from another point of view, e.g., have a look at NoFlo. The
reason the other stuff is not dying is that it's useful for other reasons.
That doesn't mean you have to still use it for programming (though some
people, like me, choose to). Summary: people do work on finding modern ways to
program.

[1] [https://www.destroyallsoftware.com/talks/a-whole-new-world](https://www.destroyallsoftware.com/talks/a-whole-new-world)

------
EGreg
I would say Ship Often and A/B test. And before that, get a really big test
pool!

The first part of success is making it safe to fail.

------
whiskeySix
The first two comments in the linked article are gold.

------
jtth
im from tumblr and what is this

------
foombarder
Oh yeah, right - I forgot that we write code for the sake of engineering - why
would you wanna get stuck in the shipping loop? ...

------
McUsr
I haven't tried MongoDB, but Node.js takes up a lot of resources in a
browser. Node.js apps feel a bit like the widest book strategy in the
bookstore: you have to shove other stuff out of your browser in order to use
stuff that uses Node.js, and I see that as a rather arrogant approach to
delivering software, especially when the functionality of Node.js apps doesn't
really dictate the necessity.

As for JavaScript, there are worse languages out there, and the debuggers are
good enough now, so it can be perceived as a "normal" language.

~~~
computmaxer
> but Node.js takes up a lot of resources in a browser

You must be confused. Node.js is used to run JavaScript on the server, not in
a browser (the client).

~~~
McUsr
That may be so, but when I run something built with Node.js in my browser, I
may have to close other things. So wherever it runs, it really uses too many
resources, and therefore I find it a bad idea from the consumer perspective.
It shouldn't be the case that somebody else dictates the contents of your
browser.

~~~
BSousa
Are you sure you know what node.js is?

~~~
McUsr
The short definition would be: "node.js is something that eats up my
browser's latency", something along those lines, and as far as I am concerned,
that is everything I have to know about it. Node.js is a no-go for me.

~~~
pavlov
Can you name some of those Node.js apps that have been causing problems for
you?

~~~
McUsr
PopcornTime is performing badly on my machine; I actually have to shut down
the browser to use it. I usually have 50+ tabs open in it that I use.

~~~
imjustsaying
Last time I ran Popcorn time, it didn't run in the browser. Has that changed?

