
Negative focus - Agres
http://nrkbeta.no/2014/07/02/negative-focus/
======
normloman
No. You're too damn positive.

"We could solve this problem if we just had some idealistic 20 somethings form
a start-up about it."

"This industry sucks, but someone will have the courage to 'disrupt' it."

"When I die, I want my body frozen so I can be regenerated after the
singularity."

"My start up is doing great. I'm 'crushing it.' All the time."

"I'm not overworked. I just require far less sleep than the average person.
Say 2 hours a night."

Get real folks.

~~~
angersock
Most damning bit:

 _This does not add up. Haven’t you noticed that we have taken over the
world?_

No, software has exploded into previously untouched fertile ground--in another
few decades, it'll be as boring and constrained as anything else, probably.

Other engineering disciplines are right to mock and laugh and be somewhat
terrified of software folks, because _they can't engineer for shit_.

One of the biggest things that, say, the ASME does is to help standardize
practices in its field. The mark of a mature engineering branch is that it has
guidelines that are applicable to most use cases, and agreed-upon tools that
work for most things.

Software is nothing like this, for better or worse--we still haven't
standardized on a set of programming languages, we still haven't standardized
on a way of documenting and planning projects, we still haven't even settled
on the most basic of issues of training.

And you know what? We openly mock the people that have tried that--think about
the criticisms of "architecture astronauts" and UML and XML and waterfall and
testing and all the rest.

We don't _deserve_ to be called engineers...we're simply lucky enough that
people are willing to pay such large sums for such hastily-built things.

------
hyperliner
Here is a conversation that could happen building a bridge if the builders
were like some developers (aka, not engineers):

- "hey, let's figure out the physics of this." -------> "No, let's build a
small bridge here with these Legos. I bet they work"

- "ok, let's trace the design key points." -------> "What for? Let's just
start right here, aim more or less over there, and that's it"

- "Ok, time to go rent some excavators." -------> "But why? It takes too
much time. I can build an excavator here. Look. There. I just built an
excavator"

- "Look, they sent us a blue excavator." -------> "No no no. Excavators must
be yellow. Don't like it. Return it."

- "Ok, we need to hire extra workers." -------> "But wait, do they have
experience with blue excavators? Yellow only? That's lame. Keep looking"

- "Ok, let's put some signage so people know what to do" -------> "No, let's
just see where they go and we will extend the bridge there!"

... and the best one

- "Well, we need more resources to finish this project" -------> "Nah, let's
just open source it and forget about it!"

 _sigh_

~~~
tjr
On the other hand, a lot of software _is_ built like bridges. Avionics
software (a hefty segment of developers who don't seem to publicly blog much)
is built starting with formal requirements documents, and includes formal
verification against those requirements, quality audits, signoffs by senior
staff and FAA representatives, etc.

Software _can_ be built like bridges. But most software doesn't really need
that level of attention, so it isn't built like bridges. In a commercial
environment, the benefits of the added overhead would have to justify the
expense. I would roughly estimate that avionics-style overhead would quadruple
the budget and schedule of your typical web or iOS software project. Is it
worth it?

Maybe it'd be worth just being a little more strict, even if not four times as
strict. Someone has to make those decisions. But it's certainly possible to
take software development more seriously.

~~~
hyperliner
Yes, and that is the nature of the latest innovations that are paying off. In
other words, who cares if the flappy bird was supposed to go through the
opening, but the software miscalculated and the birdie died?

This line of argument supports the point that developer != engineer.

------
csbrooks
Ranting gets attention, so people rant.

I've noticed people will use some library or platform or language, and at
first they'll love it. Then two or three years on, all they can see are the
places it falls short in the domain they're working in. I've personally
experienced a disconnect when I'm still in the honeymoon phase, and the
speaker has progressed to the "I-hate-this-it's-terrible-throw-it-out" stage.

~~~
sanderjd
I think this is just because of the classic leaky abstraction problem. At
first a new abstraction seems amazing, but after using it long enough you've
seen enough of its leaks to become disenchanted.
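To make that concrete with a small example of my own (not sanderjd's):
floating-point numbers abstract over the reals, and the leak only shows up
once you've leaned on the abstraction for a while.

```python
import math

# The abstraction: floats behave like real numbers.
# The leak: binary fractions cannot represent 0.1 exactly.
total = 0.1 + 0.2
print(total)         # 0.30000000000000004, not 0.3
print(total == 0.3)  # False

# Once you've hit the leak, you program against the machinery underneath,
# e.g. tolerance-based comparison instead of exact equality.
print(math.isclose(total, 0.3))  # True
```

The honeymoon phase is the first two lines; disenchantment is when every
comparison in your codebase has grown a tolerance parameter.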

~~~
csbrooks
Interesting. I'll have to think about that some more.

I've been assuming it's because of human nature: when something's new and
shiny, we see all the cool new things it will do for us. After being around it
for long enough, we tend to only see the flaws, and forget all the benefits.

~~~
endersshadow
It could also be a combination of the two--they're not exactly mutually
exclusive.

------
jiggy2011
"We fail orders of magnitude more than any other engineering discipline."

Our projects also have a lower barrier to entry than other disciplines. You're
not going to invest billions developing a new car or building a bridge unless
you're pretty sure you can pull it off.

------
forgottenpass
I see this sort of negativity as an inevitable result of forced
positivity/neutrality in other contexts.

Conscious decisions to remain "professional," as well as unconscious
limitations more deeply engrained in people all contribute to a bottling up of
emotions.

There are a few outlets available in a work context where actual negativity is
acceptable. Note that the negativity singled out for this article was either
self-deprecating of a group the speaker is a member of, or aimed at inanimate
tools.

Plugging these outlets isn't going to fix anything. Instead of treating the
symptoms, what can be done so that delivering or hearing a rant isn't a huge
collective catharsis?

------
kabdib
We've been building bridges, roads and other stuff for thousands of years.

We've been building software for maybe 70 years.

I'd give it a little time.

~~~
informatimago
[http://en.wikipedia.org/wiki/List_of_bridge_failures](http://en.wikipedia.org/wiki/List_of_bridge_failures)

~~~
arethuza
Oldest bridge still in use:

[http://en.wikipedia.org/wiki/Arkadiko_Bridge](http://en.wikipedia.org/wiki/Arkadiko_Bridge)

It's from Mycenaean times! I.e., Agamemnon, the fall of Troy, etc.

Mind you, I've been inside much, much older buildings (from 3000BC rather than
1100BC):

[http://en.wikipedia.org/wiki/Maeshowe](http://en.wikipedia.org/wiki/Maeshowe)

------
ejk314
I hate to resort to this cliche... but our harsh self-view might just be
because our industry is so young. We have higher project failure rates than
other 'engineering' disciplines; but we're still trying to figure out HOW to
build things. Half of the time, it's like giving an engineering crew a new,
barely tested material and telling them to build a skyscraper with it.

The way I see it, our negativity is actually quite hopeful. We know we can do
better. We know we WILL do better. So we are hard on ourselves because we see
how far we have to go and what it will take to get there.

------
jasonpriestley
The author suggests that the enormous economic success of software shows that
the methods used are not "crap", but I don't think this is valid. In fact,
it's somewhat the opposite: the strong economic growth of the software world
(and the exponentially improving performance and reach of the underlying
hardware) allows low-quality software to proliferate.

The economic argument for quality engineering is, "it may cost more upfront,
but will deliver a more reliable product with lower total costs." But this
argument is invalid in the current software world, where network effects and
first-mover advantage are enormous, and a successful company can grow
explosively. No one would follow the strategy of "build a quick bridge out of
plywood, then when we have 10 million people using it and it starts falling
down every other day, we'll have enough money to hire some people to build a
real bridge." But that's what Twitter did.

As software developers, a lot of our job is just putting up with the crap that
the software world has been built on, because the people in charge have made
the rational calculation that it's better to hire ten times more engineers and
expand to Asia, expand to mobile, expand up market and down market and into
different markets a little sooner.

I agree entirely though, that it's counterproductive for programmers to blame
ourselves or to seek a technical fix to what is basically an economic and
organizational problem.

------
FranOntanaya
There's a lot of money in making developers and tech businesses feel
dissatisfied with whatever they have, regardless of whatever they need. I
think one of the things we haven't really tackled that much is actual
dishonesty in the industry, as in, how much of the fire and brimstone is
really, really, just an attempt to sell you x or promote the one niche someone
is comfortable in. Maybe we don't want to think of software as something
higher than glorified laundry detergent, even though the same market rules
apply to it.

Then, the people that actually are doing the right stuff are probably already
busy doing the right stuff, and we rarely hear them -- or they just don't
bother, seeing it'll be drowned in the noise anyway.

------
sosuke
I always thought the negativity was part of sales tactic. We want to sell you
a new method for doing what you've always done. Insult your way of doing it,
say there is a better way, and that way is our way.

------
lazyjones
Only a small percentage of developers/engineers partake in such ranting,
evangelism, and conferences.

Those who do are typically "blessed" with various personality traits that may
seem to encourage this negativity, arguing, "religiousness".

They are not representative of the state of the industry, and I suspect that
it's very similar in many other industries (perhaps experimental physicists,
biologists, or archaeologists could share some insights?).

------
zackmorris
As a borderline ranter/troll myself, I recognize the sentiment that there is
too much negativity in the developer sphere. But, the difference between what
I say and what the typical mouth breathers spout on, say, political forums, is
that I can point to specific examples and evidence of why the status quo
sucks.

What the people with a positive slant on all this seem to miss is that it’s
not just any one area of computer science that is facing difficulties, it’s
all of it. For any technology or methodology, I immediately see flaws that
range from merely annoying (semicolons in C, colons in Python) to downright
devastating (shared mutable state between processes, the inability to
statically analyze imperative code). Very little software has been written
that is provably bug-free, and it came at great cost. Some of the control code
that NASA has written comes to mind, or perhaps some of the early work in
lisp, maybe a few other things like Excel that attempt to avoid programming
altogether. Most everything else succumbs to an inability to scale, because
code complexity grows exponentially while performance grows linearly. So code
that works is generally small and modular. Unfortunately, when we try to tie
everything together under umbrella frameworks like UNIX, something is lost and
we’ve never solved that in a mainstream way, at least not to my satisfaction.

The great successes today have come in spite of existing technology, not
because of it. I very much wish that the world would slow down for a moment,
take a step back, and really assess the numerous ways that computing as we
know it today has failed to deliver on the promises made by pioneers 50 years
ago. It’s like the recent post on HN of how concurrency is the new memory
management. While that’s true, it merely reveals the tip of the iceberg, not
just in the fact that most languages fall down in the face of concurrency, but
that even hardware has been unable to keep pace with the newest trends. It
doesn’t even scratch the surface of the wetware issue, that generations of
programmers have been raised on orthodoxy that can’t take us to the next
level. I know that something is terribly wrong when I point out a problem and
am met with such hostility, disguised as positivity. To me, this can be as
damaging as excessive criticism. But it’s shielded behind the enormous
profitability of the tech sector, where people would rather jump through hoops
and make money than solve problems once and for all for society and increase
everyone’s wealth.

~~~
angersock
Most of your points are wrong.

"inability to statically analyze imperative code", a.k.a, _the halting
problem_. And we've still done a lot here--consider PVS-Studio or similar
tools.
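(A concrete illustration of the limit, not from the thread: a hypothetical
`collatz_steps` loop whose termination for all inputs is exactly the open
Collatz conjecture — no current analyzer can prove this one out, even though
every run anyone has tried halts.)

```python
def collatz_steps(n: int) -> int:
    """Count iterations of the Collatz map until n reaches 1.

    Empirically this terminates for every n anyone has tested, but proving
    it terminates for ALL n >= 1 is the open Collatz conjecture -- a concrete
    reminder that fully general static analysis of imperative loops runs
    into the halting problem.
    """
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(6))   # 8  (6 -> 3 -> 10 -> 5 -> 16 -> 8 -> 4 -> 2 -> 1)
print(collatz_steps(27))  # 111
```

Tools like PVS-Studio sidestep this by checking decidable properties and
heuristics, not total correctness — which is exactly the "a lot, but not
everything" point.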

Tying stuff together in *nix works fine, and has worked fine for decades.

Hardware keeps pace just fine with the newest trends, and in fact _enables_
programmers to try things they couldn't have before.

"Wetware" issue? What're you on about?

~~~
zackmorris
Sorry, I don’t mean to diminish what you are saying because it’s valid too. I
just disagree. The problem with imperative code vs functional code is that
it's difficult to analyze all of the states without just running it (I realize
there's a lot more nuance than that). I personally don't think that any of the
major players like C++, java or even Go can be saved because there are just
too many side effects. But, in fairness no language can be guaranteed to work
100% properly once it has access to external resources (unless the inputs and
outputs can be guaranteed to be within a known set). The other big players
like Haskell/Clojure/Scala can’t be saved either, because they can do anything
imperative code can do (you could basically write a C compiler in a functional
language and make it unpredictable under some condition). Worse than that,
functional languages are basically unreadable without strong backgrounds in
the theory. As has been said many times, there is no silver bullet.

But, that doesn’t mean we shouldn’t try. I think for starters, a reasonably
safe language that is binary compatible with C could begin replacing some of
the aging code under the right circumstances. I think we are long overdue for
a tool that can exercise a binary over some number of inputs and outputs and
come up with the underlying code. Unfortunately it would also copy errors,
which is kind of bizarre to think about. We sort of have decompilers now but
they generally result in an unreadable mess because they have no notion of the
kinds of considerations humans make when they write code. I would prefer a
decompiler that takes a typical toolset (linked lists, binary trees, etc) and
arranges those metaphors in a way that generates the same code even if it’s a
little less efficient. That kind of effort obviously needs a moonshot from
universities/government/philanthropists and I don’t see it happening any time
soon. It’s on par with strong AI. And more importantly, it’s probably not
profitable. But if we had it, we could take the behavior driven design that
business people or Star Trek characters reason in and generate a program that
performs that behavior. Genetic programming comes close here but current
hardware isn’t suited to it because we aren’t used to thinking in terms of
search spaces 10 or 20 orders of magnitude larger than what we can hold in our
heads.

I remember the first time I saw UNIX, my very first thought was “this can’t
possibly be how computers work”. It only seems to work now because we’ve been
immersed in it so long, but its fatal flaw is illiteracy. 99% of the
population isn’t educated enough to use it effectively. I remember thinking
the same thing when I saw HTML. The main reason the web took off is that it
could handle plain text, not because of some magic with parsing tags. All
people wanted to do was post MS Word files, and it failed at that (among
numerous other things). So I’m not arguing against something like the stream
as the underlying principle that makes UNIX so powerful, I’m arguing against
everything else added on top of it that was written in a non-human way and
makes it illegible and brittle.

Hardware has gone almost nowhere in 30 years. I’m still using basically the
same single-threaded CPU I was using the first time I saw a Mac in 1984. It’s
faster and cheaper but fundamentally the same. Video cards have kind of gone
somewhere, but with their proprietary/closed and narrowly scoped use case,
they have made 3D programming needlessly complex compared to a true
programmable multicore CPU. Until my computer can run a concurrent version of
Go on a realistic number of processors (say 256+, or unlimited via the web) I
don’t see any revolutionary advances in AI, physics, medicine, etc coming any
time soon.

Unfortunately I really can’t go into enough detail with this stuff on a forum.
I guess that’s the gist of the wetware issue. How do I take my experience and
distill it down into something others can use to avoid the same mistakes? More
importantly, how do I convey the greater expectations that would be possible
if we had better tools? Or even incorporate other people’s notions myself when
I’m knee-deep in code? I’m thinking in terms of things like freeing people
from labor and admittedly technology has gone a long way towards that end, but
for some reason it stuttered maybe in the mid 90s and I can’t quite figure out
how that happened or how to get past it. It’s a bit like living within the
Matrix and knowing that something is terribly wrong without being able to
fathom what would be beyond it. Maybe it’s something geopolitical and people
have decided that this level of progress is enough and just don’t care about
going further. I’m certain if there was a demand, manufacturers would sell the
tech that would run circles around what we’re having to use now. But there is
more money in video games and solving first world problems, so here we are.

~~~
angersock
Ah, thank you for the elaboration--I actually agree on several of your points
now that you've provided additional context. :)

3D programming is not overly complex; it maps straight to its problem domain
in most cases, along with some trivia about the underlying hardware. It's only
as we've tried to embrace more general computational models that things've
gotten more complicated--IrisGL/OpenGL 1.x were fairly clean conceptually, and
not hard to work with.

I disagree too with your statement about hardware--it's similar to saying that
airplanes have gone nowhere in 100 years: only true if you discount the
massive improvements in engineering and functionality. Modern CPUs do a great
deal more with virtual memory, SIMD instructions, and whatnot than old chips.
New chips like the Propeller or Mill have some new tricks as well.

(Now, whether or not fundamental _architectures_ are better or worse than,
say, old machines from Burroughs et al in the 60s/70s is a different matter
altogether....)

~~~
zackmorris
Ya I actually really miss OpenGL 1.x as well, as kind of a minimal environment
needed to do projections. Something has been lost with shaders. For example,
limitations on program size have been a real burden for me, and strike me as
not being general computation. My iPad 1 can only read about 4 source pixels at
a time, making blurs very difficult while they are comparatively easy with
handwritten blitters. Hopefully WebGL will nudge the spec back towards simpler
times.
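(To spell out why a 4-source-read cap hurts: a box blur needs 2*radius + 1
source taps per output pixel. A sketch in plain Python rather than shader
code, with a made-up `box_blur_1d` helper standing in for a fragment program:)

```python
def box_blur_1d(row, radius):
    """Average each pixel with its neighbors within `radius`.

    Each output pixel reads up to 2*radius + 1 source pixels, so hardware
    that caps source reads per fragment (say, at 4) forces multi-pass
    workarounds for any non-trivial blur radius -- easy in a handwritten
    blitter, awkward in a constrained shader.
    """
    out = []
    for i in range(len(row)):
        lo = max(0, i - radius)
        hi = min(len(row) - 1, i + radius)
        window = row[lo:hi + 1]
        out.append(sum(window) / len(window))
    return out

print(box_blur_1d([0, 0, 9, 0, 0], 1))  # [0.0, 3.0, 3.0, 3.0, 0.0]
```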

I kind of look at hardware today like the internal combustion engine. Great
strides have been made to speed up single threaded processing, and it is truly
remarkable that we run 3 GHz chips without even thinking about it, while most
other electronics have seen very little improvement except to go from vacuum
tubes to transistors. But if I had a billion transistors to work with, it
would be kind of like going to electric motors and the chip I came up with
would look nothing like an Intel Core i7. It would be more like an FPGA, with
10,000 or 100,000 simple cores and no caching or anything like that. So it
would be abysmal at running an operating system (basically taking us back to
the early 90s) but would be faster than any video card today for embarrassingly
parallel computation like image processing or search (kind of like the bitcoin
ASICs but general purpose).

------
rjknight
By telling someone else that 'X sucks', you're asserting your knowledge of X
(and often Y or Z too, one of which you might recommend instead of X). Within
limits, this is perfectly reasonable - there are pitfalls out there, and
advising other people to avoid them is helpful.

However, there are a few big risks which arise when doing this, particularly
around non-technical people:

1) The developer who cried wolf - if you have a very long list of things that
you think 'suck', it may be concluded that you're just particularly picky
(and, as the OP shows, it's not backed up by the reality of the many
_successful_ projects that exist). When you _really_ need to torpedo a
particularly bad idea, you might not be taken seriously.

2) Trying to persuade people by scaring them can backfire. Telling people that
their project will fail unless they use the One True Project Management
Methodology, or adopt some particular new library or framework might sound
like a good idea, but they're quite likely to conclude that if the chance of
failure is that high then it's better to avoid the risk altogether by doing
nothing.

3) Undermining the credibility of other developers might give you a short-run
reputation boost, but in the long-run it undermines the credibility of
everyone. Instead of "that guy sucks, but you're OK", you want people to think
"that guy is pretty good, but you're awesome!". To an extent, this applies in
other areas ("PHP is pretty good, but Clojure is awesome!" sounds like
something a happy customer might say, and we all want more happy customers).

As much as I love his writing, I think Dijkstra should take a large portion of
blame for the "considered harmful"[1] style of technical criticism. His claim
that anyone who ever learned BASIC is a brain-damaged individual incapable of
proper programming has done a lot to undermine the credibility of intelligent,
hard-working developers ever since. I, for one, would like to issue a
heartfelt "fuck you" in his general direction. Again, I love his writing but I
think this kind of disparagement has incredibly negative effects in the real
world.

We certainly do need to eliminate errors from our practices and our ways of
thinking. That's painful and does require criticism of those things. But the
main problem for the software development industry right now is not that we
are _insufficiently critical of our tools, practices or each other_ , it's
that we don't back each other up enough when it comes to dealing with shitty
project management or exploitative practices or discrimination that holds
people back in their careers. The low-hanging fruit is not _yet another web
framework_ , it's figuring out better ways of organising development.

[1]
[http://en.wikipedia.org/wiki/Considered_harmful](http://en.wikipedia.org/wiki/Considered_harmful)

~~~
informatimago
"Considered Harmful" was not Dijkstra's idea, but his editor's! And there's no
way to say that "PHP is good"; that's just untrue in all universes.

~~~
rjknight
OK, but he's still on the hook for "mentally mutilated".

Also, my point is that "good" is really a relative term. PHP is good, because
the baseline for programming language quality is quite high. You can do _so
much_ stuff using PHP - you can build YouTube[1], Facebook, Wikipedia,
WordPress and much more. I don't know what the total value of web applications
written in PHP is in billions of dollars, but it's a big number. My point is
that if that's what we can do with _PHP_ , just think what we could do with
better tools! In my definition, "good" means "enables the creation of valuable
things", and PHP does that. I hope that it gets replaced by something better,
for the same reason that I hope current medical treatments get replaced by
better ones, but I can't call it "bad".

It's weird, but programmers are sometimes the people with the least
appreciation for how valuable their creations are. We do things that non-
programmers simply cannot do, creating massive value in the process. Yet many
programmers are afflicted by impostor syndrome and the sense that what they do
is not really that valuable and that most of what they build is crap. This
negativity is misdirected, towards other programmers - "I'm better than her
because she uses PHP and I use Haskell" - when it should be directed towards
the things that cause _real_ project failure. A well-conceived and well-
managed project that creates a PHP web application is _much_ more valuable for
everyone than a badly-managed project that produces a Haskell application that
nobody uses.

[1] The first version, at least. Google doesn't do PHP.

~~~
krapp
It's not that PHP is good per se (although that is not nearly as objective a
term as language bikeshedders tend to believe), but that better languages are
still worse at the one thing PHP was intended to do, or at the very least,
better in ways which don't actually matter. Almost all web application
development is CRUD anyway, and PHP is perfectly adequate to the task.

------
m0th87
This is a good thing, it's a sign the industry is moving fast and hopefully in
the right direction. Constant self-reflection keeps us on our toes.

------
igl
Well, we are not driven by wanting something positive either.

------
logfromblammo
I don't think we are too negative as an industry. We are exactly as negative
as we need to be. We say everything is horrible, because everything actually
is horrible and broken and held together with baling wire and duct tape. We
always seem to be complaining because, despite what we may believe, we are not
in control. We are not now a meritocracy, and we never have been. We dedicate
our lives to continuous self-improvement and refining the skills of our craft.

But we are still subservient to people who have chosen to study human
interactions and business organization. And they follow business fads. They
mandate Waterfall or Scrum because other people use it, and they believe
incorrect decisiveness is better than endless vacillating trials and
experimentation for a piddling, measly, single-digit-percentage gain in
efficiency. We want to feel pride in our work, but someone else also wants it
to make money. We want to build bridges that last centuries, but those paying
us just want to cross an inconvenient obstacle and never look back. We want to
avoid reinventing the wheel. Those paying us would rather pay us to do that
than buy someone else's crappy wheel. Hell, sometimes we even build wheels to
give away to anyone who wants one, and we still have to reinvent them.

Everything is horrible. We are always complaining about it, and always trying
to fix things. But none of us have money, aside from those few who won a
startup lottery. The people who have the money call the shots, and people with
money only listen to people who have money. So they make Cargo Cult copy
environments from our startup lottery winners, with zero understanding about
the true source of their success. We often don't even understand it ourselves.

To a person that measures worth by profits, there is a natural inclination to
discount the sheer luck of being in the right place, at the right time, with
the right tool. And even if there is someone out there holding conductive rods
and a giant capacitor bank, lightning could still strike the guy with a
shovel.

Essentially, we have a lot of useful knowledge and skills, but still no
_respect_. People listen to what physicians say, even if they tell us that
magic beans can make us lose weight. People listen to what lawyers say, even
if they tell us to just slip this neck brace on before going into the
courthouse. People listen to what engineers say, even when they crash
spacecraft into planets because they forgot to label their distances with
units. Very few people listen to what a software professional says. They are
literally surrounded by our work, which is often done so well that they don't
even notice it is there, but when we speak, it sounds to them like Charlie
Brown's teacher: "wah wah whaaa ma waa na nah ba".

And to be honest, a lot of that is our fault. We learned to communicate
effectively with perfectly logical yet absolutely moronic machines rather than
emotional and intelligent humans. What we desperately need is the _prestige_
and _charisma_ to establish professional boundaries in such a way that the
people paying us don't cross them, and so that the quislings among us won't
support those breaches just because it's the only way for them to get promoted.

We need a cartel, folks. I know that most of us, myself included, find the
idea repugnant, but the current arrangement is not working for us.

~~~
informatimago
There's a fundamental dichotomy here between the logical machine (the logic
needed to do our job as programmers) and the emotional aspects. Some companies
achieve, at their level, a connection between the two (thinking of Apple
here), but this is done more by chrome than anything else.

Would the creation of an emotional computer system help? Would users be happy
when interacting with their computers to get answers like: "Sorry, I'm not in
the mood today!".

Or should hardware companies make things more interesting by making computers
explode like Hollywood cars when you open the wrong file?

"Halt and Catch Fire" indeed.

A cartel, or a guild...

------
michaelochurch
It's harem-queen behavior. The Business is the brash alpha male that plays
hot-and-cold with us, exhibits blatant favoritism, walks in and mushroom
stamps us and leaves us confused as to whether that's a good or bad thing.
We're all clawing at each other and trying to tear each other down in a
general "out-bitch the other bitches" contest.

See, this Tweet:
[https://twitter.com/MichaelOChurch/status/484126337335324672](https://twitter.com/MichaelOChurch/status/484126337335324672)

    
    
        Women and programmers share: undeserved low status to the group because 
        the most popular 5% tear the rest down.
    

Offensive? For sure. But _the underlying reality_ (in both examples) is
offensive, too.

We have people who publicly say things like, "We're all terrible at our jobs"
(meaning that humans are not great at reasoning about complex systems, and
cannot build maintainable systems under the deadlines The Business tends to
demand, especially when the requirements are nonsensical and contradictory) or
"95% of us are morons who should be fired" (actually, most of those mediocre
programmers are bad because _no one ever invested in them_.) The Business
hears this and sees us as a low-status tribe that'll take abuse without
fighting back.

In a contrast to our self-deprecation and acceptance of low status (and worse
pay) we have these brash, 24-year-olds who recently flushed out of McKinsey,
or who got "accelerated into the one-year analyst program" at Goldman because
they told too many sexist jokes or mistook one too many female MDs for
secretaries, but who can talk a good game about "crushing it" and raise
millions and become "founders".

It's no wonder that The Business has its way with us. It's a legitimately
negative situation, but it's mostly not our fault.

We need to be more progressive, more self-protecting, more collaborative and
organized, and (dare I say it) more political. Mindless optimism is not the
way out.

------
CmonDev
Well, Twitter is using Scala which is among the best modern languages (and I
am not a Scala/JVM developer). It is not about the end result (worry of the
business). It is about us developers enjoying the process (nobody else cares
about that). I am much happier using <decent modern language of choice> rather
than JavaScript for example, this is why I am pissed every time Mozilla
leverages the historical browser language monopoly and preaches it's mantra
that JS is the best choice. And indeed I have no choice but to use JS on the
client-side, because everyone seems happy with the status quo (please no
transpilation offers - it's demeaning crap). The more negative - the better.

~~~
rdtsc
> Well, Twitter is using Scala

Speaking of ranting and Twitter. On a higher level, I think Twitter is helping
this negativity crystallize and come through better by the nature of the
service. "Express your opinion in short bursts of quick messages." It is
sometimes hard to convey a more detailed description of what you mean, so what
comes out is often a mean, short response. Usually people don't talk in short
messages when they want to express an opinion or tell a story.

There is a whole slew of misunderstandings started on Twitter. Say someone
brags about writing a sorting algorithm. Well, it was for learning a language,
but that didn't fit in the 140-character limit. Then they get attacked with
"that's ridiculous, bubble sort has been around forever," also in 140
characters. It snowballs from there, and is often followed by an apology on
some blog somewhere.

