
Ask HN: What made you change your mind about a programming language/paradigm? - strangecasts
Books, courses, practical experiences, or just _a-ha_ moments that made you feel differently about a language/design pattern/etc.
======
DanHulton
Testing.

Unit testing to me seemed akin to drinking 8 glasses of water every day. A lot
of people talk about how important it is for your health, but it really tends
to get in the way, and it doesn't seem to really be necessary. Too frequently,
code would change and mocks would need to change with it, removing a good
chunk of the benefit of having the code under test.

Then I started writing integration tests while working on converting a bunch
of code recently, and it has been eye-opening. Instead of testing individual
models and functions, I was testing the API response and DB changes, and _who
really cares_ what the code in the middle does and how it interfaces with
other internal code? So long as the API and DB are in the expected state, you
can go muck about with the guts of your code all you want, while having the
assurance that callers of your code are getting exactly out of it what you
promise.

Unit test suites would break all the time for silly reasons, like someone
optimizing a function would mean a spy wouldn't get called with the same
intermediary data, and you'd have to stop and go fix the test code that was
now broken, even though the actual code worked as intended.

Integration tests (mainly) only break when the code itself is broken and
incorrect results are getting spit out. This has prevented all kinds of issues
from actually reaching customers during the conversion process, and this
approach isn't nearly as brittle as our unit tests were.
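
To make the style concrete, here's a minimal sketch of what such an integration
test can look like in Python/pytest terms (the "client" and "db_session"
fixtures, the /users endpoint, and the schema are all hypothetical):

    def test_create_user(client, db_session):
        # Drive the system only through its public API.
        resp = client.post("/users", json={"name": "Ada"})
        assert resp.status_code == 201

        # Assert on the externally visible contract: the API response above and
        # the resulting DB state below -- not on which internal functions ran.
        row = db_session.execute("SELECT name FROM users").fetchone()
        assert row[0] == "Ada"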

~~~
buzzy_hacker
> Unit test suites would break all the time for silly reasons, like someone
> optimizing a function would mean a spy wouldn't get called with the same
> intermediary data, and you'd have to stop and go fix the test code that was
> now broken, even though the actual code worked as intended.

Can you or others speak more about this? I was taught that verifying function
calls on spies/mocks was good practice. But I encountered this problem just
the other day when I refactored some Java code for a personal project.
Everything still worked perfectly, but, exactly as you said, the intermediate
function calls changed, so the tests would fail because the spies/mocks saw
different, "unexpected" calls.

I'm an intermediate programmer, so can someone with more experience fill me in
on what's best practice here and why? Do I update the test code to reflect
the new intermediate function calls? But this whole approach now seems silly,
since a refactoring that doesn't affect the ultimate behavior of the function
under test will break the test, and that seems wrong. So do I instead
not verify function calls when using spies/mocks? In that case, what is the
use case for verifying spies/mocks?

~~~
ht85
What you want to spy on are side-effects that are part of the function's
contract.

If you have a function that fetches data, you shouldn't test that it's hitting
the data layer, only that the correct data is returned. This way, when you
improve the function to not hit the data layer at all under some conditions,
your tests will keep passing.

On the other hand, if that function is supposed to log metrics or details
about its execution, you should test that, as it is part of its contract and
can't be inferred from the return value.
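
A small sketch of that distinction in Python's unittest.mock terms (get_user,
repo and metrics are made-up names, just for illustration):

    from unittest.mock import MagicMock

    def get_user(user_id, repo, metrics):
        user = repo.find(user_id)
        metrics.increment("user.lookups")   # a side-effect that IS part of the contract
        return user

    def test_get_user():
        repo = MagicMock()
        repo.find.return_value = {"id": 42, "name": "Ada"}
        metrics = MagicMock()

        assert get_user(42, repo, metrics) == {"id": 42, "name": "Ada"}
        metrics.increment.assert_called_once_with("user.lookups")
        # Deliberately no assertions about how repo was used internally, so a
        # refactor (e.g. adding a cache) won't break this test.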

------
spricket
Microservices. They seemed really cool until I worked on a few large projects
using them. A disaster so epic I watched most of engineering management walk
the plank. TL;DR: the tooling available is not good enough.

The biggest cause lies in inter-service communication. You push transaction
boundaries outside the database between services. At the same time, you lose
whatever promise your language offers that compilation will "usually" fail if
interdependent endpoints get changed.

Another big issue is the service explosion itself. Keeping 30 backend
applications up to date and playing nice with each other is a full time job.
CI pipelines for all of them, failover, backups.

The last was the lack of promised benefits. Velocity was great until we got big
enough that all the services needed to talk to each other. Then everything
ground to a halt. Most of our work was eventually just keeping everything
speaking the same language. It's also extremely hard to design something that
works when "anything" can fail. When you have just a few services, it's easy
to reason about and handle failures of one of them. When you have a monolith,
it's really unlikely that some database calls will fail and others succeed.
Unlikely enough that you can ignore it in practice. When you have 30+ services
it becomes very likely that you will have calls randomly fail. The state
explosion from dealing with this is real and deadly.

~~~
thoman23
> "all the services needed to talk to each other"

I'm not an expert by any means, but I'm pretty sure that statement indicates a
problem.

~~~
jmole
Yes, then you're just doing bad OOP with sockets instead of a language that
was designed for it.

~~~
spricket
Heh, definitely some truth to this.

What saved us before was that our forest of code could depend on the database
to maintain some sanity. And we leaned on it heavily. Hold a transaction open
while 10,000 lines of code and a few N+1 queries do their business? Eh, okay,
I guess.

Maybe we didn't have the discipline to make microservices work. But IMO our
engineering team was pretty good compared to others I've seen. All our
"traditional" apps chugged along fine during the same period.

~~~
raverbashing
Really, whenever someone blames "discipline", be suspicious.

Not even the army has perfect discipline, even with hard training. They have
cross-checks, piles of processes, and they move slowly for the most part.

(Software development shouldn't aim to be like the army though)

------
zaphar
Almost every time I experienced a shift like this in my thinking it was due to
experiencing a problem I hadn't experienced until that point.

I discovered the value of compile-time type checks when I worked on large
codebases in dynamic languages where every change was stressful. In comparison,
having the compiler tell you that you missed a spot was life-changing.

I discovered the value of immutable objects when I worked on my first codebase
with lots of threading. Being able to know that this value most definitely
didn't change out from under me made debugging certain problems much easier.

I discovered the value of lazy evaluation the first time I had to work with
files that wouldn't fit entirely in memory. Stream processing was the only
way you could reasonably solve that problem.
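
For instance, in Python terms, iterating a file object streams it line by line,
so only a small window is ever resident in memory (a generic sketch, not the
poster's actual code):

    def count_matches(path, needle):
        count = 0
        with open(path) as f:       # file objects are lazy iterators over lines
            for line in f:          # one line in memory at a time
                if needle in line:
                    count += 1
        return count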

Pretty much every paradigm shift or opinion change I've had was caused by
encountering a problem I hadn't yet run into and finding a tool that made
solving that problem practical instead of impractical.

~~~
AnimalMuppet
I might even go farther. I wonder if most of the techniques (and languages)
that we think are stupid are instead aimed at problems that we don't have. (Of
course, those techniques _become_ stupid when people try to apply them on the
wrong problems...)

~~~
zaphar
I suspect you are correct. I've definitely been guilty of drinking a
particular brand of kool-aid and applying it inappropriately before.

That in itself can turn into a learning experience if you stick around for the
aftermath.

------
dvirsky
I used to hate C and thought it was primitive, ugly, dangerous, tedious to
write in and annoying to read. While writing a big project in it
([https://github.com/RedisLabsModules/RediSearch/](https://github.com/RedisLabsModules/RediSearch/))
, I've discovered the zen of C I guess. Instead of primitive I started seeing
it as minimalist; I've found beauty in it; and of course the great power that
comes with the great responsibility of managing your own memory. And working
in and around the Redis codebase, I also learned to enjoy reading C. While I
wouldn't choose it for most projects, I really enjoy C now.

~~~
0815test
I also used to hate C and thought it was primitive, ugly, dangerous, tedious
to write in and annoying to read. Later, I too, discovered the actual point of
C, the beauty and minimalism of it. _Then_ I became a better coder, and once
again saw the nature of C as primitive, ugly, dangerous, tedious to write in
and annoying to read. I suppose that's what enlightenment feels like.

~~~
scoutt
_" Before I learned the art, a punch was just a punch, and a kick, just a
kick. After I learned the art, a punch was no longer a punch, a kick, no
longer a kick. Now that I understand the art, a punch is just a punch and a
kick is just a kick." \-- Bruce Lee_

------
salex89
Using it as my main language: Python (2.7). How the hell did this thing become
so popular?

I'd used it earlier for all sorts of less complex stuff: scripts, devops,
ETL... But then I got into a company that is using it for some quite serious
stuff, a large codebase. Holy smokes, this thing does not scale well (in terms
of development efficiency and quality). I swear at least 70% of our bugs are
because of the language, and half of the abstractions are there just to lessen
the chance of some stupid human mistake.

Sorry, but I will not pick it ever again for even a side project.

~~~
fierro
I think dynamically typed languages are on their way out. There aren't many
good arguments remaining in favor of them.

~~~
ken
I don’t think that’s the problem. I wouldn’t use Python (especially 2.x), but
I’d have no hesitation with using Common Lisp. It has lots of great features
for programming in-the-large that Python simply lacks.

~~~
yuchi
Genuinely asking: could you provide some specific examples? What does CL have
that Python doesn't? (Disclaimer: I've never worked with either.)

~~~
chalst
If I had to point to something, I would single out CL's powerful support for
multiple dispatch. I'd hesitate to recommend CL to undisciplined programmers,
because it is too easy to write code that works but you don't understand a
week later.

~~~
ken
(I hammered out that answer right before I had to run on stage. A couple other
big items I forgot:)

A proper numeric tower. Python has complex numbers, but they don't seem well
integrated (why is math.cos(1+2j) a TypeError?). Fractions are frequently very
useful, too, and Python has them, in a library, but "import fractions;
fractions.Fraction(1,2)" is so much more verbose than "1/2" that nobody seems
to ever use them.
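
The complaint is easy to reproduce in CPython 3 (results noted in comments):

    import cmath
    from fractions import Fraction

    # math.cos(1 + 2j)                       # TypeError: can't convert complex to float
    print(cmath.cos(1 + 2j))                 # works, but only if you know to reach for cmath
    print(Fraction(1, 2) + Fraction(1, 3))   # 5/6 -- exact, but verbose to write
    # In Common Lisp, rationals are literals: (+ 1/2 1/3) => 5/6, no imports needed.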

Conditions! Lisp's debugger capabilities are amazing. And JWZ was right:
sometimes you want to signal without throwing. Once you've used conditions,
you'll have trouble going back to exceptions. They feel restrictive.

(I've come to accept that in a language with immutable data types, like
Clojure, exceptions make sense. Exceptions feel out of place, though, in a
language with mutability.)

Other big wins: keywords, multiple return values, FORMAT (printf on steroids),
compile-time evaluation, a native compiler (and disassembler) with optional
type declarations.

Lisp is unique among the languages I've used in that it has lots of features
that seem designed to make writing large programs easier, and the features are
all (for the most part) incredibly cohesive.

~~~
chalst
Conditions, yes.

If you have multiple dispatch, then building/supporting a numeric tower is
natural.

In your other reply you pointed out macros. They are a mixed blessing, easily
misused. Other languages have them but use them more sparingly and make it
harder to overlook their special status, which leads to better "code smell" in
my opinion.

Do take a look at Julia. It has learned deeply from CL and innovated further.

~~~
ken
I can imagine how multiple dispatch could make a numeric tower a little easier
to implement, but the limitations I see in Python and other languages don't
appear to stem from that. You can already take the math.cos of most types of
numbers (int, float, even fractions.Fraction, ...) just fine, or add a complex
and a Fraction. Python has long dealt with two types of strings, several types
of numbers, etc., with the same interfaces. This isn't a difficult problem to
solve with single dispatch.

Macros are easily misused, true, but so is any language feature. I can go on
r/programminghorror and see misuses of _if_-statements. It's the classic
argument against macros, and I hear it a lot, but I can't say I've seen it
happen.

25 years ago, conventional wisdom said that closures were too complex for the
average programmer, and today they're a necessary part of virtually every
language. Could we be reaching the point where syntactic abstraction is simply
a concept that every programmer needs to be able to understand?

I think "macros are easy to misuse" comes from viewing macros as an island. In
some languages (like C), they are: they don't really integrate with any other
part of the language. In Lisp, they're a relatively small feature that builds
on the overall design of the language. Omitting macros would be like a
periodic table with one box missing. It'd mean we get language expressions as
native data types, and can manipulate them, and control the time of
evaluation, but we just can't manipulate them _at_ compile time.

------
ilovecaching
My first big Erlang project made me completely rethink my acceptance of object
oriented programming in C++, Java, Python, etc. I realized that I had blindly
accepted OO because it was taught as part of my college curriculum. After
several years in industry I had concluded that programming was just hard in
general. It wasn't until my first project in Erlang, where an entire team of
OO devs were ramping up on functional programming, that I discovered that the
purported benefits of OO were lies. I also realized that the idea that
concurrency and parallelism must be hard is untrue. OO simply makes it hard.

Now I see OO as something I have to deal with, like a tiger in my living room.
Thankfully, many new languages have come out recently (Go, Rust, and Elixir
being the ones that I use regularly) that have called out OO for what it is
and have gone in more compelling directions.

Hopefully one day they will teach OO alongside other schools of thought, as a
relatively small faction of programming paradigms.

~~~
dzik
Erlang. Same here. Reading the first few pages of a book describing the
principles of OTP (processes, share-nothing, messages, etc.) was mind-blowing.
The company I worked for at the time (and still do) decided to switch from
Java to Erlang in the middleware area. This decision seemed like a mixture of
insanity and
enlightenment. Do you switch from one of the most popular languages in the
world to something that most developers have never heard of? Surely exciting,
but will it work? How do we hire new staff? After our R&D confirmed it was
promising, a couple of other developers and I were tasked with rewriting quite an
expensive piece of middleware software that was unfortunately reaching its
maximum capacity. We had no knowledge of how the software worked, we just knew
its API. We were given time to learn Erlang, so we did. We all switched from
Eclipse to vim (some to Emacs). After a bit of playing around with Erlang we
did our job in just 3 months. The new app was much smaller and was easily
capable of handling many more messages than the previous one. And it was
written by Erlang newbies! We went on to create many more Erlang apps. It
turned out to be a really good choice. Also, the level of introspection you
get out of the box with Erlang is just amazing. I have never seen anything
like it before.

Now I can compare Erlang to Java, and it is really baffling how the heck Java
took over the world. To do Erlang I just need an editor with some plugins, an
SSH connection to a Linux box with OTP installed, and of course rebar3. To do
Java I need 4GB of RAM to simply run an IDE with a gazillion plugins, Maven to
cater for thousands of dependencies for the simplest app, and I need to know
Spring, Hibernate, AOP, MVC and quite a chunk of the other 26^3 three-letter
abbreviations. No thanks.

~~~
xyzal
May I ask what was the book? Do you recommend it?

~~~
dzik
There are two Erlang books: "OTP in Action" and "Learn You Some Erlang". I
highly recommend both of them.

------
arduinomancer
Static Type Checking.

I used to think Node.js was the greatest thing ever. I won't bother explaining
the benefits, but suffice it to say I now much prefer writing a server in Java
compared to Node.

I think it takes getting burned at least once for new developers to understand
why a lot of seasoned developers like types.

Over time I've realized that there's a simple principle that applies to a lot
of stuff in software and engineering in general:

"The bigger something is, the more structure it needs"

Writing a quick script or small application? Sure use Python, use Node, it
doesn't matter, but as size increases, structure needs to increase to stop
complexity from exploding.

It doesn't just apply to typing either. The bigger a project is, the more
you'll want frameworks, abstractions, tests, etc.

If you look around, this principle applies to a lot of things in life too; for
example, the bigger a company is, the more structure is added (stuff like HR,
job roles, etc.).

As a corollary, the inverse of the principle is:

"Don't add too much structure to a small thing"

~~~
pacala
Good insights. Modern platforms are neither fully dynamic nor rigidly static,
but gradual:
[https://en.wikipedia.org/wiki/Gradual_typing](https://en.wikipedia.org/wiki/Gradual_typing).
Start with a dynamic script, add typing as you go to strengthen the system.
Notable mentions: TypeScript, Python + mypy, C#, Dart.
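
For example, with Python + mypy the annotations are optional and can be layered
on one function at a time (a minimal sketch; the checker's message is
paraphrased):

    from typing import List

    def total(prices: List[float], tax_rate: float = 0.1) -> float:
        return sum(prices) * (1 + tax_rate)

    print(total([9.99, 4.50]))   # fine at runtime and under mypy
    # total("9.99")              # mypy rejects this call before the code ever runs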

~~~
willio58
This doesn’t encompass all modern platforms, just those that decide to go that
route.

~~~
pacala
Fair enough. 'Quite a few modern platforms...'

------
campezzi
Immutability. I didn't really understand the benefits of having immutable data
structures until I tried building a service with no mutations whatsoever...
and noticed I didn't get any weird, head-scratching bugs that took hours to
reproduce and debug. That led me to go down the Functional Programming rabbit
hole - thus changing my entire view on what code is/could be.

[edit: spelling]

~~~
onlyrealcuzzo
To add to this: functional programming. Getting rid of state in objects was a
DREAM for me.

I used to think: come on, you better-than-thou hipsters... this shit looks
ridiculous, and it isn't intuitive... There's no way it's worth it to learn.
It's just the new fad.

Oh, how I was so so wrong.

~~~
mesarvagya
Tell us what your starting point was and what references you used to learn FP.

~~~
onlyrealcuzzo
First -- over a few years -- I had slowly started writing functional-ish code
in Ruby on the backend and React/Redux on the front-end.

Ruby is kind of nice in that there's not an easy way to iterate over a list
without functional code. You start mapping and reducing pretty regularly, and
then discover the power of higher-order array functions, and how it lends
itself nicely to functional programming.
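
The same higher-order shape exists in most languages, of course; in Python
terms it looks roughly like this (made-up data, just to show the
map/filter/reduce flavour being described):

    from functools import reduce

    orders = [{"total": 30, "paid": True},
              {"total": 20, "paid": False},
              {"total": 50, "paid": True}]

    paid_totals = [o["total"] for o in orders if o["paid"]]    # map + filter
    revenue = reduce(lambda acc, t: acc + t, paid_totals, 0)   # reduce / fold
    print(revenue)                                             # 80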

React/Redux is nice in that it pretty much forces you to wrap your head around
the way functional programming works.

React/Redux was definitely a step up from Spaghetti-jQuery for me, but I'd
stop short of calling it an enjoyable experience. It wasn't until I started
playing around with Elixir that I really fell in love with functional code.

In a lot of ways, Elixir is really similar to Ruby, which makes it pretty easy
to dive in (for a Ruby-ist). But in subtle ways, its functional nature really
shines. The |> is perhaps my favorite programming concept I've come across.
It's so simple, but it forces you to think about -- and at the same time makes
it natural to comprehend -- the way data flows through your application.

Don't get me wrong, Elixir is still very much a functional language. Its
allure, in that it looks like an imperative language and has a lot of
similarity to Ruby, is misleading.

The learning curve might not be as steep as, say, Lisp's, but it's still quite
steep. And I think it'd take around the same time to be meaningfully
proficient in either.

Glad I got my Elixir/BEAM rave out for the day.

~~~
fluxinflex
A "me too" for Elixir. Coming from a Ruby background, picking up Elixir wasn't
hard at all (disclaimer: I have done some FP in the past). What I found really
good about Elixir is the solid documentation[1], easy comparing of
libraties[2], the mix[3] tool that made starting a project really simple.

But what really blew me away were doctests[4]. Basically I ended up writing my
unit tests where my code was. That was my documentation for the code, so there
was no need to maintain unit tests and documentation separatetly.
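
For comparison, Python's standard library has the same idea in its doctest
module; a rough sketch of the equivalent style:

    def slugify(title):
        """Lower-case a title and join the words with dashes.

        >>> slugify("Hello World")
        'hello-world'
        """
        return "-".join(title.lower().split())

    if __name__ == "__main__":
        import doctest
        doctest.testmod()   # runs the examples embedded in the docstrings as tests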

[1]: [https://elixir-lang.org](https://elixir-lang.org) &
[https://hexdocs.pm/elixir/Kernel.html](https://hexdocs.pm/elixir/Kernel.html)

[2]: [https://elixir.libhunt.com](https://elixir.libhunt.com)

[3]: [https://elixir-lang.org/getting-started/mix-otp/introduction-to-mix.html](https://elixir-lang.org/getting-started/mix-otp/introduction-to-mix.html)

[4]: [https://elixir-lang.org/getting-started/mix-otp/docs-tests-and-with.html](https://elixir-lang.org/getting-started/mix-otp/docs-tests-and-with.html)

------
mattnewport
Our introductory programming course at university used ML and I didn't like it
or get it. I already knew some C++, BASIC and Java and was mostly interested
in real time graphics programming and the kind of examples used in the ML
course were not interesting to me and I didn't see how it would help me tackle
the kinds of programming tasks I was interested in.

I found recursion pretty unintuitive and didn't find the way it was taught in
that course worked for me. At the time it mostly seemed like the approach was
to point at the recursive implementation and say "look how intuitive it is!"
while I completely failed to get it.

Many years later, after extensive experience with C++ in the games industry, I
discovered F#. With an appreciation of some of the problems caused by mutable
state, particularly for parallel and concurrent code, I was better prepared to
appreciate the advantages of an ML-style language. Years of dealing with large,
complex and often verbose production code also made me really appreciate the
terseness and lack of ceremony of F#, and experience with both statically typed
C++ code and dynamically typed Python made me appreciate F#'s combination of
type safety with type inference, which gives the benefits of static typing
without the verbosity (C++ has since gotten better about this).

I still struggle to think recursively and my brain naturally goes to
non-recursive solutions first, but I can appreciate the elegance of recursive
solutions now.

~~~
samdk
I think it's unfortunate that the first (and often only) exposure people get to
FP is a build-up-from-the-foundations approach that emphasizes recursion so
much. I think it leaves most students with a poor understanding of why
functional paradigms and practices are practical and useful.

I've written primarily in a functional language (OCaml) for a long time now,
and it's very rare I write a recursive function. Definitely less than once a
month.

In most domains, almost every high-level programming task involves operating
on a collection. In the same way that you generally don't do that with a while
loop in an imperative language, you generally don't do it with recursion in a
functional one, because it's an overly-powerful and overly-clunky tool for the
problem.

For me the real key to starting to be comfortable and productive working in a
functional language was realizing that they do actually all have for-each
loops all over the place: the "fold" function.

(Although actually it turns out you don't end up writing "fold"s all that
often either, because most functional languages have lots of helper functions
for common cases--map, filter, partition, etc. If you're solving a simpler
problem, the simpler tool is more concise and easier to think about.)
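
In Python terms, functools.reduce plays the role of that fold, and the common
helpers are just special cases of it (a toy illustration, not idiomatic
Python):

    from functools import reduce

    xs = [3, 1, 4, 1, 5]

    total   = reduce(lambda acc, x: acc + x, xs, 0)          # fold as a for-each with an accumulator
    doubled = reduce(lambda acc, x: acc + [x * 2], xs, [])   # map, expressed as a fold
    evens   = reduce(lambda acc, x: acc + [x] if x % 2 == 0 else acc, xs, [])  # filter as a fold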

~~~
chalst
Agreed. Point-free code is a better hallmark of good FP than recursion.

------
7373737373
The lack of secure composability in almost all existing languages. You cannot
import untrusted code and run it in a sandbox. Unless you are completely
functional, you cannot restrict the effects of your own code either.

The solution to this seems to be the object capability (key) security
paradigm, where you combine authority and designation (say, the right to open
a specific file, a path combined with an access right). There are only
immutable globals. Sandboxing thus becomes only a matter of supplying the
absolutely needed keys. This also enables the receiver to keep permissions
apart like variables, thus preventing the
[https://en.wikipedia.org/wiki/Confused_deputy_problem](https://en.wikipedia.org/wiki/Confused_deputy_problem)
(no ambient authority).
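
A toy Python sketch of the idea (names are made up, and Python itself can't
actually enforce this -- the callee could still call open() -- which is exactly
the point about most existing languages):

    class ReadCap:
        """A capability: authority (read access) bound to a designation (one file)."""
        def __init__(self, path):
            self._f = open(path, "r")

        def read(self):
            return self._f.read()

    def word_count(cap):
        # This code can read exactly what it was granted -- nothing else,
        # because it never sees a path or an ambient "open anything" authority.
        return len(cap.read().split())

    # word_count(ReadCap("/var/log/app.log"))   # the caller decides what to grant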

Even with languages that have security policies (Java, Tcl, ?), control is not
fine-grained, and other modes of interference are still possible: resource
exhaustion for example. Most VMs/languages do not keep track of execution
time, number of instructions or memory use. Those that do enable fascinating
use cases:
[https://stackless.readthedocs.io/en/latest/library/stackless...](https://stackless.readthedocs.io/en/latest/library/stackless/pickling.html)

All of this seems to become extremely relevant, because sufficient security is
a fundamental requirement for cooperation in a distributed world.

~~~
bryal
I recently had a lecture about something related to this in a course on
advanced functional programming. Basically, we were shown how you can implement
a monad in Haskell that lets you protect sensitive data when it is handled by
untrusted code. Together with the SafeHaskell language extension, which
disallows libraries from using operations that could potentially break the
invariants, this seems like a very cool concept!

Functional pearl:
[http://www.cse.chalmers.se/~russo/publications_files/pearl-russo.pdf](http://www.cse.chalmers.se/~russo/publications_files/pearl-russo.pdf)

Slides: [https://1drv.ms/p/s!Ahd2uwlk3jmIlCZr0spYc_I-OveR](https://1drv.ms/p/s!Ahd2uwlk3jmIlCZr0spYc_I-OveR)

Source: [https://bitbucket.org/russo/mac-lib](https://bitbucket.org/russo/mac-lib)

------
lukifer
Rich Hickey's "Value of Values"[0] is what finally sold me on the benefits of
pure functional programming and immutable data structures. (It remains
horrifying to continue working with MySQL in my day job, knowing that every
UPDATE is potentially a destructive action with no history and no undo.)

[0]:
[https://www.youtube.com/watch?v=-6BsiVyC1kM](https://www.youtube.com/watch?v=-6BsiVyC1kM)

~~~
spudlyo
Oftentimes MySQL is set up with auto-commit set to true, where every DML
statement (like UPDATE) is wrapped in an implicit BEGIN and COMMIT. It doesn’t
have to be that way though, you can manage the transaction yourself, and you
don’t have to COMMIT if you don’t want to, you can undo (ROLLBACK) if
necessary.
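
A minimal sketch of managing the transaction yourself (using the stdlib sqlite3
module here as a stand-in; MySQL client libraries expose the same
commit/rollback calls):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
    conn.execute("INSERT INTO accounts (balance) VALUES (100)")
    conn.commit()

    try:
        conn.execute("UPDATE accounts SET balance = balance - 250 WHERE id = 1")
        (balance,) = conn.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()
        if balance < 0:
            raise ValueError("would overdraw the account")
        conn.commit()          # nothing is permanent until this point
    except Exception:
        conn.rollback()        # undo the uncommitted UPDATE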

~~~
lukifer
It's true, transactions in MySQL work great. But once the change is committed,
the previous value is overwritten permanently. If the user wants to undo five
minutes later, or I want to audit based on the value a month ago, we're hosed
unless we've done backflips to bake versioning into the schema.

I think Hickey's comparison to git is apt: we don't stand for that in version
control for our code, so why should we find it acceptable for user data?

~~~
vp8989
Because there is vastly more state in the world than there is code. And
frankly, most state is not that important. Just wait until you work with a
sufficiently large immutable system; they are an operational nightmare.

You should opt in to immutability when the state calculations are very complex
and very expensive to get wrong.

I do wish mainstream languages had better tools for safe, opt-in immutability.
Something like a "Pure" attribute you assign to a function. It can only call
other pure functions, and the compiler can verify that it has no state changes
in its own code.

------
chadcmulligan
This is probably an unpopular opinion: Haskell. I used to think it must be
cool (and useful) since people go on about it so much. I spent quite a bit of
time learning it, and IMHO the usefulness to practicing programmers is
marginal at best. It does present some useful techniques that are making their
way into languages (e.g. Swift optionals), but in general it didn't live up to
the hype for me. I feel a lot of the things they go on about are overly
complicated just to impress.

~~~
lallysingh
I tinkered with it for years before finding the aha moments that really made
it way more productive than my day-job languages.

Now I feel I could race a team of programmers in those languages and be far in
front.

~~~
chadcmulligan
In what sort of application would you say that it's better?

~~~
lallysingh
Every one I've tried so far. Command line, server side, and some light web. I
haven't tried mobile.

~~~
darsnack
Could you comment on these “aha” moments?

~~~
lallysingh
Basically, you start to ingrain how to use the tools there for problems that
you would solve differently in other systems. Then you start looking for ways
to get rid of the awkward parts of your code.

The old list of official aha's is: monads, applicative functors, lenses. But
really it's about spending time learning them well enough to use them
normally.

------
ssivark
I was struggling with Mathematica in grad school (it's bread and butter for
theoretical physicists, since it is particularly good at symbolic math, and
good enough for ancillary numerics, plotting, etc). My previous programming
experience included C/C++, Python (numpy/scipy), Matlab/Scilab/Octave, and
somehow, getting Mathematica to do things seemed like pulling teeth.

Incidentally, around the same time, I was exploring lambda calculus and "Learn
You a Haskell" [1] for fun. Suddenly, at some point, several things about the
declarative paradigm just _clicked_ , and within a space of a few days, I went
from struggling against Mathematica and hating the experience, to appreciating
it's elegance and enjoying getting things done with it! For functional
programming in Mathematica, see [2, 3] and the fact that it gives you access
to the expression tree in a structured manner means that some really cool
stuff becomes really easy!

Later, I learned that Mathematica's computation model is based on pattern
matching and term rewriting. I don't know enough computing theory to
disambiguate functional/rewriting/declarative, but I find it very satisfying
to write code in roughly that style -- it feels clean and understandable. See
Yann Esposito's write-up [4] for a fantastic illustration of this.

[1]: [http://learnyouahaskell.com/](http://learnyouahaskell.com/)

[2]:
[https://reference.wolfram.com/language/guide/FunctionalProgr...](https://reference.wolfram.com/language/guide/FunctionalProgramming.html)

[3]:
[https://mathematica.stackexchange.com/questions/163992/what-...](https://mathematica.stackexchange.com/questions/163992/what-
makes-mathematica-a-functional-programming-language)

[4]: See Section 3.1 (you don't actually have to know any Haskell to get value
from this example): [http://yannesposito.com/Scratch/en/blog/Haskell-the-Hard-
Way...](http://yannesposito.com/Scratch/en/blog/Haskell-the-Hard-
Way/#functional-style)

In that illustration, Yann takes a simple problem with an obvious procedural
solution, and proceeds to rewrite it step-by-step in ten versions, each time
making it slightly more modular. The modularity and flexibility of the later
versions is breathtaking in its simplicity!

------
smt88
Using TypeScript for about 2 hours made me wonder why anyone would write
JavaScript ever again. This was several years ago, when TypeScript wasn't
nearly as good as it is now.

~~~
MarvelousWololo
I've been using TS with RxJS for the last 4 months and I think JS is totally
fine. What bothers me the most is JS abuse. Why does everything have to be a
SPA, and thus have to be recreated in JS (e.g. the history API)? And wtf, why
do we need RxJS at all? Other than that, management has decided to go with the
most strict rules for TS, so I'm spending countless hours fixing silly type
errors that don't add anything to the product. I've worked with huge JS
codebases a couple of times and we were fine without it. TS is overhyped in my
opinion. Let's make more use of the server instead of doing everything on the
client.

~~~
RomanPushkin
> I'm spending countless hours fixing silly type errors that don't add
> anything to the product

+1

Also: [https://medium.com/javascript-scene/the-typescript-tax-132ff4cb175b](https://medium.com/javascript-scene/the-typescript-tax-132ff4cb175b)

------
danra
Every once in a while I search for something I need in Python and find that it
only exists in Python 3, but I'm not _quite_ convinced it's worth the effort
to transition from 2.

However... I've just learned about Python 3's f-strings and they gave me a big
smile, and probably the final push I needed :)

(Yes, I do know that sounds silly and there are probably much better reasons
to transition to Python 3 ¯\\_(ツ)_/¯ )
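
For anyone who hasn't seen them, f-strings (Python 3.6+) let you put
expressions straight into the string literal:

    name, unread = "world", 3
    print(f"Hello {name}, you have {unread} new messages")            # f-string
    print("Hello {}, you have {} new messages".format(name, unread))  # the older way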

~~~
ahaferburg
The non-Python-specific term is string interpolation, and it should be in
every language.

From a language design perspective, the problem is that the compiler needs to
know about it, so it's a core feature, but for maximal efficiency it might
require some feature that isn't part of a language's core, like string
streams. Another possibility would be a powerful macro system that allows the
transformation of a format string into code, but few languages offer that.

------
pikamonad
Long-term job and total comp prospects. Having done tacked-on FP in Node.js
(Fantasy Land, Sanctuary, Ramda) and Scala (scalaz, cats) for quite some time
and now programming in "proper" FP languages like PureScript and Haskell
daily, I absolutely love life as a programmer.

However, having a decent-sized family with deep roots, wanting to pay for
college for my children as a sole earner, and not living near a tech hub of
any sort, I come away with this wish list, which I have to sustain for 20 to
30 years: ~150k TC, remote, FP languages, for the next 20 to 30 years. Seems
like a tall order. We could add up all the FP Scala, FP JS, Clojure, Rust,
Haskell, PureScript, Elm, OCaml, ML, etc. gigs and it would be a sliver
compared to the prospects of the same wish list with just any of the other top
10 languages.

I could hop into a remote Node.js/Scala gig and slowly sell them on FP, but
having done this for the past few years it gets a bit draining to fight over
small FP items when you are ready to do so much more.

------
simonh
I was working on a Perl script which kept throwing an error, and Perl was
somehow convinced the bug was about 15 lines further up the script, in a piece
of code that had nothing whatsoever to do with the bit that was failing.

I was dabbling with Python 2.4 at the time and also hit an error. The moment I
saw a Python stack trace, I was sold. All my anxiety about whitespace melted
away and I don't think I ever wrote another Perl script (from scratch, anyway)
again. I'll always be grateful to Python for making things I found painful and
confusing in Perl (which I used for over 10 years) trivially easy.

I do have my fair share of prejudices and probably misconceptions. For example,
I have a deep distrust of Java apps from working with several monstrously
unwieldy, poorly performing, hard-to-support applications developed in the
language. Yet to be fair, I also worked on a Java GUI client app that was fast,
flexible and had really great logs and diagnostics. Personal history and
experience are hard to transcend.

------
ridiculous_fish
Anyone want to swim upstream in the static types -> functional programming ->
immutability current?

I'll do it:

1. LLVM's IR. In many ways this is dorky OO: everything has a reference to
everything else (an Instruction knows its BasicBlock knows its Function...).
It also has many _dynamically-enforced_ constraints, e.g. a basic block must
have its phi nodes at its beginning, and terminators at its end. These are NOT
enforced via the type system but instead dynamically through verification. I
was surprised, but I have come to realize the ergonomic benefits of having
everything reachable from everything, allowing transient illegal states, and
how type-level enforcement exacts a complexity price.

2. Clojure's threading macros, in particular thread-as. This is basically
Haskell's do-notation, but more flexible and without involving the compiler.
LISP continues to be relevant and eye-opening - are there any static type
systems that have such a construct?

~~~
wqdqdwqdq
Nope. Haskell's do notation is a context-based computation-passing mechanism
that in most cases obeys laws, while Clojure's threading macro is a
programmer/human convenience. The two have nothing but a superficial similarity.

------
zyxzevn
1) This talk made me realize that changing a paradigm is not just about
changing a language. It is a way of changing the way you approach all problems
and solutions. Even in real life.

 _Keep Ruby Weird 2018 - Closing Keynote by Avdi Grimm_
[https://www.youtube.com/watch?v=UJnsXUVsr7w](https://www.youtube.com/watch?v=UJnsXUVsr7w)

2) And this talk made me very interested in Elixir:

 _Elixir: The only Sane Choice in an Insane World - Brian Cardarella_
[https://www.youtube.com/watch?v=gom6nEvtl3U](https://www.youtube.com/watch?v=gom6nEvtl3U)

------
redact207
DDD - Domain driven design. My entire career pivoted the day I worked with a
team that was building a large financial system (ie: multiple teams around the
world, hugely complex set of business domains, zero margin for error). The
entire approach to code, thinking, and organisation meant that a code base
that could easily be a mess at 1/10th the scale, was maintainable and actually
increased velocity of the teams over time.

It made me realise that technology is rarely a solution. In fact, every new
language promises a silver bullet, when in reality I've always seen that it's
the approach to writing software and handling complexity that kills projects.

DDD - a sane approach to keeping complex systems simple.

~~~
Naac
What language/framework did your team use for that project?

~~~
redact207
It was C# with the NServiceBus framework. Nowadays I'm doing similar work in
TypeScript. The approach is somewhat technology-agnostic.

~~~
lioeters
Would love to hear more about domain-driven design in TypeScript!

------
bayindirh
I didn't like Python at first. The mentality of "hardware is cheap, there's no
need to be that fast" drove me away from it at full speed. However, I was short
on time for something, so I gave it a try, and then I actually liked it a lot.
It's very practical and fast enough for most cases.

The string processing capabilities blew me away; however, they didn't change
my love for higher-performance, compiled languages. Now I use it for small
system tools or prototyping, but for the highest performance I still use C++.

Similarly, I disliked Java. It felt slow, heavy, bloated, _unnecessarily
Enterprise_. Again, using it changed most of that perception. I found out that
it's pretty competent for some stuff, and it's easy to write sophisticated
things. The GUI classes and event-based XML parsers are very well thought out.
I also found out that Java's infamy came from bad programming, especially in
the GUI part. The JDK tends to correct your GUI programming errors to simplify
things; however, this "auto-mending" comes with a performance penalty which
adds up with every small mistake you make. So you need to write your code
carefully. Frameworks like JaDE made me like Java even more.

In the end, most of the like/dislike comes from preconceptions and prejudice
if you ask me. If the language you're using is up to the task you have at
hand, and you understand the trade-offs you make with a language, there's no
need and reason to dislike a language.

~~~
Sohcahtoa82
> It felt slow, heavy, bloated, unnecessarily Enterprise.

You can avoid the "unnecessarily Enterprise" by not writing code in the
Enterprise fashion.

Not every class needs to be an extension of an abstract class that implements
an interface. Not every class needs a factory class. Not every operation needs
to be abstracted. If you have a class that has just a constructor and a single
function, you probably don't need a class and can just write a function.

Most complaints about Java are really complaints about some idiotic Java
paradigms that seem to have been invented by someone who measures
productivity in LoC written or classes implemented.

~~~
bayindirh
Java's "unnecessarily enterprise" feeling didn't come from the paradigms that
increase the LoC (abstraction, interfaces, factory classes, etc.), but from
the way it developed, where it used and how it behaved.

This feeling was also reinforced by the _prime_ editors of that time: Eclipse
and NetBeans. Both were behemoths written in Java: heavy, resource-intensive
and somewhat slow. In the 2000s, Java wasn't that fast or refined, and it
showed, at least to me.

With the open-sourcing of Java and the performance jump that came with Intel's
Core i series CPUs, Java got a lot lighter. I've never had hard feelings
against any programming language, and I hold no grudge against Java, but I
currently don't need it, so I'm not using it.

------
gumby
General hardware. I used to do all my work on dedicated hardware with support
for Lisp (CADRs, 36xxs, D-machines, and before that PDP-10s) because those
general purpose machines couldn't implement important features like
generational garbage collection without specific hardware assist.

Then one day in 84/85 I saw a generational GC using the MMU on a 68K and the
light dawned.

Thus while I'm pretty interested in ML chipsets, I figure the necessary
portion of functionality will ultimately end up subsumed in the generic chips
and most if not all those ML hardware startups will go bust.

~~~
chalst
I wouldn't bet on the ML hardware startups either, but don't you think that
what we regard as general hardware is always going to be too heavyweight to be
suitable for massive parallelisation?

~~~
gumby
Well, take a system approach to the hardware. It's possible you could have
multiple paths to memory. It's more likely you won't on non-high-end systems.
The primary path will be for the CPU, especially on mobile.

Also look at the workload: often many of the primary cores will be idle when
you're crunching a big dataset.

Those two factors suggest that you want to use CPU hardware for multiply-add
FLOPs, especially when you consider using smaller float sizes (not just f16
but even f8). And then you consider that the compiler can perhaps properly
interleave these ops with the regular instruction mix...

This is how general purpose von Neumann architectures ate hardware, both RISC
and CISC.

Now the current crop is pretty dire (e.g. AVX* is hard to use with a typical
workload), but perhaps these won't be typical workloads? Or the power/timing
problems will be solved in other ways.

~~~
comex
Perhaps.

But before ML was on dedicated hardware it was on GPUs, and programmable GPUs
have been around for almost two decades now. They're getting more and more
CPU-like, but so far there's no sign of them actually being subsumed by CPUs.
If anything, attempts to make CPU-GPU hybrids have historically tended to
fail, such as Larrabee or the Cell processor on PlayStation 3. Of course, ML
is a somewhat different workload from the usual use of GPUs... but not that
different. The future could be different from the past... but as CPU speed
increases get slower and slower, why not expect things to shift even further
in the direction of dedicated hardware?

------
harel
Using it.

I try to learn a new platform every once in a while. The best way to learn is
to do, so I pick a project from my list and try it out. That's how I learned I
didn't like Meteor and Angular when they came out and were hyper-hyped, but
that I enjoy Go very much, though only for certain things. Or how I learned I
disliked PHP and loved working with Python, or that MongoDB is OK for some
things and less so for others (or that Postgres is bliss). I use them for
something and find out what problems those "tools" are best at solving and
where they don't fit. Then some of them I keep using over and over again, and
some I file under been-there-tried-that. What I really, really try to do is
not get peer-biased. Many times the internet hypes something for 5 minutes and
dumps it later. Or ignores something for 5 years only to rediscover it and
rave about it.

~~~
repsilat
Totally agreed. I started respecting dynamic languages after I got a job
writing Python. I'd underestimated a lot of the upsides and overestimated a
lot of the downsides.

Ditto JS (and web apps in general), testing, functional(ish) design, package
management and code autoformatters. Containers to a lesser extent. A bit of
getting used to it, and then a very clear sense that actually this _does_ make
my life easier, that the benefits aren't just propaganda.

For a counterexample, I "used" and rolled my eyes at daily stand-ups, close
in-team communication and project management process, and now that I'm working
on a team that doesn't do any of that I miss it a lot.

------
EdwardDiego
Kotlin's non-null type checks seemed nice, but I didn't start appreciating
them until I had to debug an NPE being thrown 100 frames down a stack trace in
an entirely unsuspected piece of code in a legacy Java app. I had to release
new debug code to production just to figure out where the hell it was
happening.

Just having the type system make 'this could be null' and 'this is assured
to never be null' explicit would've saved me hours.

Also, playing with Erlang made me really appreciate the actor model of
concurrency.

------
nurettin
Python. Having picked it up and dropped it like fire during the 2.7 era, I
never looked back. For a long time, my main all-purpose scripting language was
Ruby.

Then I had to write Python 3 for my current project, and wow. Type annotations,
async, mypy, context managers, actually supporting Unicode... the language has
improved over the last decade.

Not to say list comprehensions are any good beyond the most mundane tasks, but
you can now write a decent library that has the equivalent of Ruby's enum and
array methods to bypass that limitation.

------
jl2718
Labview. I hated graphical programming. I don’t even know why. Then I worked
on this fairly complex real-time system with FPGAs and lasers and whatnot with
several people. Works first time. How does it work? Obvious. Encapsulation is
obvious from the diagram. Comments barely needed. Need to make a change deep
in the code that breaks an interface? Everything you need to change turns red.
If something doesn't look good, it probably isn't. Design patterns are
obvious. Was it slow to write? Yes. Definitely. Were there times I was
frustrated because I wanted to write some hasty junk imperative code? Yes.
Labview made me do it right. Could all this be done easier with lexical code?
Probably, but I rarely see it. Can you make a disaster out of labview? Yes;
even worse maybe. But... it makes bad code harder to write than good code, and
quite obvious visually.

~~~
rthomas6
I could not disagree more. Sorry... But fuck Labview. I'll take VHDL over that
mess any day.

------
lcuff
My opinion about Perl went from "It's hideous" to "It's okay". I tried to
learn Perl by just reading code, like I'd learned several other languages
(Fortran to Pascal to C). I finally read the Camel book. On page 83 it said
"Everything in Perl happens in a context. Until you learn this you will be
miserable." I was.

~~~
cutler
Perl was my first love. If you take the time to really grok it Perl is a
beautifully designed language.

------
alexbanks
Types - I worked on an enterprise Java application for a few years and grew to
hate them. What I actually hated was J2EE, but at the time I didn't know that.
So I swung hard into "You don't need types", only to work for another two
years on the world's worst Node.js app. I was young, I was naive, I didn't
realize how bad things could get. We didn't have anybody that cared about
testing, and I'd never been around testing, so I didn't either. It was a mess.
More of a mess than J2EE ever was - we were constantly writing code on
quicksand. So now I'm back to types for the most part. I think 2012-but-
actually-2007 era Java was also part of my pain, and a few years ago I moved
on to Go/C# and have been very happy.

Testing - as noted above, prior to my last two jobs I had never had any
experience with people pragmatic/dogmatic about testing. At my first few,
there just wasn't any testing, for better or worse. I worked on a team with a
person that was fairly dogmatic about TDD, and I spent almost a year
painstakingly focused on caring about testing. What I found was that focusing
on 100% test coverage across each layer of your app is a fool's errand -
you'll never get to where you think you're going. All the tests in the world
are useless if you're bad at writing tests, which (at that job) we were. I've
now moved onto "Just enough E2E/Integration tests" as a paradigm. If a PR
contains _some_ testing, and that testing is focused on making sure units of
logic work together, I feel much better. Prioritizing fast and easy response
times to bugs, and making sure that all the tests we contribute have meaning
and add value, as opposed to just mindlessly boosting our stats, has been a
huge boon.

~~~
twic
> I've now moved onto "Just enough E2E/Integration tests" as a paradigm.

Another soul is saved!

------
vbsteven
Parse.com shutting down is what turned me off from BaaS services and made me
realize that vendor lock-in is real and that all future services I write
should be as generic as possible so I can easily switch platforms when needed.

Besides that, spending months fixing issues that pop up in production but
should have been caught at compile time has turned me off from dynamically
typed languages in favor of static typing.

------
enriquto
I was a fan of object oriented programming (and over-engineering for the sake
of it) until I came to know the McIlroy pipe for computing a histogram. It was
so terse and beautiful that it flipped my world upside down. Since then, I
have been hooked on the Unix philosophy.

See here an enlightening read about this programming style:
[https://www.oilshell.org/blog/2017/01/15.html](https://www.oilshell.org/blog/2017/01/15.html)

------
wmnwmn
Changed my mind on OO after seeing how difficult it is for ordinary humans to
design objects that can withstand the spec changes that are inevitable in any
real project. Bad procedural code is a mess but bad object code can be
completely irreparable.

~~~
techsin101
This. A bird is an animal, but now we add airplanes. They fly too. Is a bird
an animal or a flying object? Already, problems.

Reusability was the biggest reason I liked OOP. But you can't have that if the
whole world doesn't cleanly derive from a single root.

It gets screwy fast if the world changes.

------
ianai
Not what you’re asking for but related.

I've always found programming interesting. It's possibly one of the neatest
ways to figure things out with computation. But early on I got the impression
the profession was abusive, maybe inherently. I got fired from my first
programming job for prioritizing finishing high school over the job. I dabbled
after that, and every project related to computers had this "natural" way of
feeling like something I have to do "now", even if it meant not sleeping. Then
I saw the CS majors I knew in college doing the same. I heard about the
industry being the same way. Not only that, most programming work was
downright boring to the point of tedium. CRUD apps, basically everywhere.
Nothing ever very original. Maybe it's different now, but it sure doesn't look
like it's progressed very far over the years.

~~~
icebraining
Maybe it's different in other places, but I've worked in a few EU countries
and I don't agree that abusiveness is inherent. I see the company culture and
attitude of the bosses as the determinant factor, but overall most places I've
worked have been fine. And even in the worst, where overtime was rampant and
stress was high, it was unthinkable to fire one of the intern students for
prioritizing school.

I can't argue with boring; CRUD is rampant. But I think you can escape it if
you really care.

------
badrabbit
As someone who appreciated C and C-style languages, I hated on Python for a
long time.

I disliked Python because of syntactical annoyances, mistakes I would normally
iron out at compile time now happening at run time, and version fragmentation
with no backward compatibility at times. On that last point, new Java versions,
for example, would deprecate features over time while allowing the programmer
the option of enabling those features. I still think simple things like mixing
tabs and spaces shouldn't have caused a hard failure; warnings or the ability
to override the deprecation would have been nice. Not only that, I hated how I
would ship a Python program having tested it in one Python version, and five
years from now the default Python version on Linux distros (or the default
installer for Windows) would install an incompatible version; I hated how users
of my program would have to figure out the correct version.

But I no longer hate Python, mostly as a result of having seen how easily it
has helped me solve different types of problems that are otherwise difficult or
time-consuming to solve. At the end of the day, it gets the job done well, is
easy to learn, is harder to screw up in, and has saved me a ton of time.

~~~
pbourke
This sounds similar to my journey with Python. For years, I used it as a go to
for programs that were bigger than a bash script but smaller than an
application or service. I mostly wrote single .py files with maybe the odd
module and avoided setuptools and friends. In the last year, I’ve been working
on a larger library in Python, which has meant getting into the packaging and
deployment tool chain and structuring a project with dozens of files, tests,
etc. My background has been in mainline languages such as Java and C#, with
excursions into Clojure, F#, etc. We adopted the new Python type annotations
and they seem to increase the visual clutter of the language without
delivering many benefits in terms of confidence or ease of development/code
completion. Our library is aimed at data science, so I’m sure not many users
will actually bother setting up the tools to do type checking.

I do still appreciate Python, but I wonder whether my issues are due to our
approach, my current level of familiarity with Python and the ecosystem or the
fact that we’re working in the data science world which is new to me as well.

------
i_feel_great
The Smalltalk feedback/living-in-the-debugger coding style. I was happily
coding in Python, C#, Java etc when I ran into Squeak, and then Pharo and
Dolphin Smalltalks.

This style presents the smallest barrier between idea and working code for me.

~~~
brianberns
Smalltalk is great. I haven’t used it in years, but I miss the feeling that
everything is part of one big Smalltalk environment.

------
ken
I knew about bootstrapping (e.g., from college compilers class) but I never
really appreciated how you could build something _using that same thing you’re
building_ , simply by breaking the chain and providing a base case, until the
_second_ time I read AMOP.

The first time I read it, I completely missed the point. Now I agree with Alan
Kay that it’s one of the most important books in the field.

------
ionforce
Scala.

It really opened my mind up to what I was missing, as a recovering Perl
programmer. Perl folk tend to think their shit doesn't stink. But it's the
Blub Paradox in full effect.

Types matter. Functional programming matters.

Anyone advocating for dynamic typing in the year 2019 is captain of the USS
Blub; and it's sinking.

~~~
icebraining
What do you think about the fact that the person who coined the Blub Paradox
was an advocate and creator of dynamic languages?

------
yellowapple
I've always strongly disliked Python. Every time I used it, it felt like I was
banging my head on a wall all the time. The ongoing quagmire that is Python 2
v. Python 3 certainly didn't help.

My current job is with a company that primarily uses Python, and I still
strongly dislike it even after using it for a lot more things. I'm very much a
proponent of using the right tool for the job, and Python is almost never it.

Almost.

One of my solo projects was to write a desktop client to unify a couple
different warehouse systems. I experimented with quite a few different
languages and toolkits in the hopes of finding something that'd satisfy all my
conditions:

- Support for an embedded WebKit widget (one of the systems being integrated
was entirely web-based)

- Portable (I needed to support Windows clients, but didn't want to use
Windows as my dev environment, so the app would need to support Linux; plus, I
wanted to extend the possibility of replacing the Windows machines with Linux
at some point in the future)

- Support for Windows' winspool API (I needed to send raw ZPL data to label
printers; in Linux this is easy to do by wrapping the lpr command, but with
Windows this entails either winspool or a bunch of special filename hackery)

- Preferably no C++ or Java, since I dislike those languages quite a bit, too
(and statically-typed languages like that didn't feel like the right fit for
the amount of JSON mangling I needed to do)

Ultimately, I ended up settling on Python 3 + Qt5, since it was the only stack
that actually checked all those boxes, and consequently that stack is in my
"right tool for the job" list (when "job" equals "graphical desktop
application with the need to embed a browser view"). I'm still on the hunt for
something better (and for cases where I don't need an embedded browser, Tcl/Tk
is still that "something better"), but it does the job well enough for now.

------
lbill
I had my doubts about Rust. I knew it was a powerful tool, but I thought it
was too hard to learn, too complex, and reserved for heavily
resource-constrained environments, or any other situation where pure
performance was one of the prime concerns.

Then I attended a brilliant conference talk/demo about Rust, in which the
speaker showed that one could build something with Rust without giving much
thought to the complex principles of the language. You could get familiar with
the language by using it, and then have a much better chance of understanding
the core concepts of Rust.

The talk is here (in French):
[https://touraine.tech/talk/o9KtP8ZrZ130zLZPzdqn/](https://touraine.tech/talk/o9KtP8ZrZ130zLZPzdqn/)

~~~
twic
> Rust is often presented for its advanced memory-management concepts, such as
> lifecycles or borrowing.

"Lifecycles or borrowing", oof!

------
johnramsden
I used to dislike Java, then recently I discovered Java streams and some of
the nice features in newer versions, like more immutable types. After that I
found Java a whole lot more enjoyable to use.

~~~
anoncake
But in this case you didn't change your mind. You dislike old Java but like
new Java.

------
revskill
Ladder of abstraction and Functional Programming

[1]
[http://worrydream.com/LadderOfAbstraction/](http://worrydream.com/LadderOfAbstraction/)
[2]
[https://en.wikibooks.org/wiki/Computer_Programming/Functiona...](https://en.wikibooks.org/wiki/Computer_Programming/Functional_programming)

~~~
beidou
Very illuminating.

------
vannucci
I realized just how useful shell scripting (bash) is when it occurred to me
that a simple five-line script I wrote to erase and re-clone my main work repo
is the single most useful piece of code I've written in my first year as a web
developer. God, that's a time saver.

~~~
h_r
The first thing I thought when I read this is: why do you have to do this so
often that you scripted it? It's good that you thought of a way to save time,
but this solution seems like a very blunt instrument to get back to some
previous state.

~~~
randomsearch
The most likely answer is "git's UX", in my experience. So often it's just
faster and more reassuring to re-clone rather than resolve some unusual state
you've accidentally put your local repo in because a command didn't do what
you expected it to do.

------
mnemotronic
Not in chronological order...

Peer pressure. --- "What? You DON'T know regular expressions?!?! I thought an
ace like you would be all over those."

Not enough official work. --- "I think I'll try learning Java. Maybe use it
to create a moving average stock buy/sell tool..."

Need it for work. --- "Stop bothering the database guys. They can't be
answering your WHERE clause questions all the time. Just go learn SQL
yourself."

Work (again). --- "Well, we don't have a Pascal compiler for CP/M. You're
gonna have to learn C. Like it or not."

Work (again). --- "The 3060 uses an HP-9825. The HP-9825 runs HPL. Not BASIC.
Here's the manuals for it. Now, get busy and stop complaining."

------
int_19h
The more code I write in dynamically typed languages, the more I believe that
static typing is a must.

Generally speaking, I noticed that I'm shifting more and more away from "stop
annoying me and let me do what I want, I know better!" part of the PL/API
design spectrum, and towards "better safe than sorry". Static typing, runtime
checks, data schemas, design by contract, fail fast etc. Yes, it's overhead,
but it's _predictable_ overhead - unlike, say, trying to debug an
incomprehensible error the night before release.
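For example, the sort of "fail fast" runtime check I mean (a made-up sketch,
not from any real codebase):

    from dataclasses import dataclass

    @dataclass
    class Order:
        quantity: int
        unit_price: float

        def __post_init__(self):
            # Fail fast: reject bad data at construction time,
            # not the night before release.
            if self.quantity <= 0:
                raise ValueError(f"quantity must be positive, got {self.quantity}")

    Order(quantity=3, unit_price=9.99)    # fine
    # Order(quantity=0, unit_price=9.99)  # raises ValueError immediately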

~~~
vivekseth
One downside to static typing is the overhead required when writing/running
programs.

For example, let's say you have a class `Dog` that you want to rename to be
more generic so you now call it `Animal`.

In Python you can test out snippets of code with `Animal` without necessarily
having to worry about other pieces of code still referring to `Dog`. You would
just need to make sure that the code you want to run happens before later
pieces that might use `Dog`.

In C/C++ you'd have to change all the references to `Dog` in your codebase,
comment them all out, or add a new target to your project to build a subset of
files which do not include an invalid reference to `Dog`.

This extra friction can make rapid prototyping a little harder in statically
typed languages.

IDK if any language supports this today, but ideally I'd like to be able to
run a language without compile-time type checking for prototyping and be able
to progressively turn on type checking as I'm nearing the final stages of
iteration on a project/feature.
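Roughly what I mean, as a sketch (class and function names made up):

    class Animal:                     # renamed from Dog
        def speak(self) -> str:
            return "woof"

    def old_code():
        d = Dog()                     # stale reference to the old name
        print(d.speak())

    def new_code():
        a = Animal()
        print(a.speak())

    # Without compile-time checking this only fails if old_code() actually runs,
    # so I can keep iterating on new_code() first; a checker switched on later
    # could flag the stale reference without ever executing it.
    new_code()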

~~~
koyote
In C# with Visual Studio and Resharper:

* Have cursor on 'Dog'
* Ctrl-R-R
* Type 'Animal'
* Press Enter

Done. And your code won't be littered with 'old' types and other garbage.

~~~
brianberns
This is native to Visual Studio now. You don’t even need Resharper.

~~~
kkarakk
I've only used the native one, but I've found it fails on some edge cases
where it can't detect variables outside a certain scope. I'm still trying to
reproduce it, but it has made me double-check all references since then.

------
iElectric2
Types, because of testing.

Python code is really hard to maintain once you grow over a few thousand lines
of code and a few developers.

I used to believe that a good culture and seasoned developers could deliver
testable code, but it rarely happens in practice. Even when it does, deadlines
kill the testing budget.

Types make refactorings sane; they are your contracts. I ended up in Haskell.

~~~
throwaway_391
[http://mypy-lang.org/](http://mypy-lang.org/)

------
analog31
Way back in the 80s, I looked into Pascal and C, and was dead certain that
Pascal would win out. Oops.

Many years later I was dragged into writing C because it's the de facto
language for microcontrollers, and my assembly language programs were getting
too big to manage. So I've finally gained an appreciation for C, even though
I'm far from being a great C programmer yet.

I've often got two IDEs running on my desktop -- C for the microcontroller
and Python for testing my embedded designs.

------
danbolt
Writing OCaml for a programming languages course in university.

The blend of statically-typed functional programming with eager performance
characteristics changed how I thought about writing computer programs. I
started using C after that and found the language much easier to use.

------
jacobush
I have been using Python in anger for just about anything I could get away
with since the year 2000. Shunning Java if I could help it, later the same
with .NET etc. Feeling mighty snug with duck typing, using Python as sort of
my Lisp substitute. (I managed to inflict Lisp on my colleagues once, but it
was not a match made in heaven.)

So, recently I was tasked to fix a (pretty small) codebase in VB.net. It was
not a pretty code, but not awful either. I went through it, cleaned up as I
saw fit. (I had a lot of freedom to play around, since the code was basically
in undead state. No one was going to come after me and say "why did you change
so much" or anything like that.)

And I discovered some things. Visual Studio is magically good. (Take it for
what it is, I am sure you have some fav IDE. As a Unix neckbeard, VS is
unicorn _magic_.)

VB.net is _nice_. Auto variables, but still static typing. The full .NET
toolset and datatypes, iterators etc. The fluffiness of the syntax doesn't
matter when the IDE is closing your blocks for you etc.

Also VB.net is _easy_ - it feels a bit like I imagine Python with types.

So, long story short - having been helped by the static types so many times, it
felt downright awkward going back to Python for other projects. :-/

I am using MyPy now, which adds some static typing as annotations, but I wonder
if I'm just prolonging my stay in a local optimum. I should probably break
free from the stranglehold Python has on me. But where do I go? I'm not a
native Windows citizen.

IntelliJ looks great, but super complicated. Might try that and the Java
world. Or Rust?

It's a big world and I'm not getting any younger.

------
Insanity
Usually I recommend that people actually _use_ a language to build something
small rather than judging it just by the looks.

------
a-saleh
Actually using go-lang for a project.

I really don't like the language. It is verbose, with really simplistic error
handling and an over-reliance on accepting interface{} in methods (meaning an
arbitrary object).

Moreover, I really like the basic building blocks of functional programming,
being able to map, fold/reduce, and filter my way through the data I am dealing
with, and with the lack of generics the golang attitude seems to be "just
write the damn for-loop".

But after ~3 months of writing golang services, it is actually quite pleasant
to work with. Especially after they introduced `go mod` for dependency
management. The ecosystem is nice. Libraries are nice. I even enjoy the damn
multi-megabyte statically-linked binaries. I know I could have them in other
languages. But I would have to fiddle.

With golang, I don't have to fiddle. That is really nice.

------
Ephiiz
Types, I never could stand them. I started out programming C++ for tiny side
projects, eventually moving to Python 2 for the majority of my projects, until
I got into a programming class in high school and was taught Java. I hated
Java as soon as I touched it. Java's type system was gross to me; I never
really got advanced enough in C++ to use types much, and Python has no types,
so I just assumed Java was an outlier and types were a stupid addition of its
own. I continued to believe this even once I got a job in Node, until I used
TypeScript. Actually seeing why you would want a static typing system, and
being old enough to appreciate the structure it provided, changed me. I would
hardly consider using a dynamically typed language for even a reasonably sized
project now.

------
watwut
Working with that crappy thing for a few months. At some moment the crappy
disappears and it may even become actually cool.

Alternatively, working with that cool thing for a few months on something
complex. At some moment the cool disappears and it may even become actually
crappy.

------
jtreminio
I avoided having to learn Javascript in-depth because I thought its type
system was like coding in the dark.

Your IDE would either return all possible methods of all possible objects in
your whole project, or wouldn't return anything at all.

You need to either memorize the API of whatever object you're currently
working with, or have its API open in a separate window to always see what's
what.

Wait, no, I still think this is true. I discovered Typescript a while ago and
it is a joy to work with. I don't understand why developers would subject
themselves to the modern equivalent of coding in notepad.exe when they could
use a language that actively helps them.

~~~
colejohnson66
Modern editors such as VS Code have type inference, and you can add special
comments above the functions to list the types.

------
zzzeek
ORMs, until i wrote my own.

~~~
JesseAldridge
How exactly did your mind change?

~~~
dvh
Not op but:

Type "user.", waits 500ms, see complete db schema of user table with types.
Foreign keys and comments 1 Ctrl-click away.

No more typos in column names.

Db schema changed? Press build and you have 234 errors in 56 files.

What was the name of that table? Adr(alt space)essCommentFooBar

Need to insert a new row? Ctrl-C the class interface and just assign the values.

But selects I write my own.
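
The same idea, roughly, in Python terms (a sketch using SQLAlchemy 2.0-style
declarative mapping, as an illustration; not the parent's actual stack):

    from sqlalchemy import String
    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    class Base(DeclarativeBase):
        pass

    class User(Base):
        __tablename__ = "user"
        id: Mapped[int] = mapped_column(primary_key=True)
        email: Mapped[str] = mapped_column(String(255))

    u = User(email="a@example.com")
    print(u.email)   # column names autocomplete; a typo like u.emial gets flagged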

------
vqng
Peter Norvig's Design of Computer Programs

He showed how to express complicated business logic in Python in a simple and
elegant way.

------
analyzethis
Having worked with .Net for many years, I always had a bad feeling about
expanding my experience with a closed-source, Windows-only platform (I liked
the C# language and tooling, though). I kept looking for an open-source,
cross-platform language with the same features and characteristics as .Net.
.Net Standard and .Net Core being open source (MIT-licensed), cross-platform,
more performant than .Net Framework and Mono, and supported by free cross-
platform tooling really gives me peace of mind in this regard.

~~~
IanSanders
I really enjoy C#, .Net and VS features, but man VS2017 has been broken for a
while. Numerous 5 year old bugs either not fixed or coming back. (SQL Schema
compare not authenticating, Intellisense not working, compilation errors not
displaying in the errors window, winforms designer messing up form layout when
opened on high DPI screens, slowness and freezing on medium-sized projects and
decent hardware, the list can go on and on)

~~~
Volrath89
MS resources and focus are now on VS Code, which is a shame because I'm sure
the market of companies that pay for VS Pro/Enterprise + Resharper is not small.

------
js8
I was a fan of Python and Lisp, and skeptical of functional programming (with
full referential transparency). I knew many people praised Haskell, but in
particular, I didn't understand how it could replace Lisp macros for building
abstractions (the article [http://www.haskellforall.com/2012/06/you-could-
have-invented...](http://www.haskellforall.com/2012/06/you-could-have-
invented-free-monads.html) changed my mind on that).

I was also skeptical about static typing. Then I tried to program in Haskell
and I was able to do what I was unable to do in previous languages - write
short enough functions that focus on one thing and compose in a type-safe
way. So now I am a big fan of Haskell; if you don't particularly need speed
(Haskell can be made fast, but it takes quite an effort), then it's a great
choice of language.

(Interestingly, a common refrain in many comments here is "then I tried it"; I
suggest that you really do try it if you are personally skeptical about
something but there is a reason to suspect you might be wrong.)

------
sirsuki
Any one of Sandi Metz's talks. They made me realize just how much easier things
are when you can abstract them into an object with well-defined messages.

------
lmilcin
I decided to switch to Clojure recently. It all started after I used Swagger
Editor to generate REST service clients in a dozen languages and noticed the
Clojure one was at least a couple of times smaller and much cleaner-looking
than the ones in other languages. I already knew Lisp; revisiting why the
notation in Clojure is so clean, and why it is not so in other languages, was
the push I needed.

------
ggm
Repeated re-use of forms of code, with no library to call on, made me value
Python's 'import' and 'from ... import' moments, because I stopped
re-inventing the wheel.

Learning to use lambda functions lightly made for code which I could
understand and explain to people, because reducing more complex functions to
repeated applications of a function of one (changing) argument (with the other
arguments possibly bound statically) reduced the complexity of the explanation
to the one moment of change across the mapped lambda call.
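For example (a made-up case):

    from functools import partial

    def scaled(factor, x):                 # 'factor' is the statically bound argument
        return factor * x

    double = partial(scaled, 2)            # bind it once
    print(list(map(double, [1, 2, 3])))    # only the mapped argument changes: [2, 4, 6]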

yield() taught me the distinction between a bounded list or set of things,
which you know exists in its entirety, and a sequential stream of things,
which may be so big you cannot hold it, but you can still compute over it.
Sequences of yield() become compositions of functions.
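A tiny sketch of that kind of composition (made-up example):

    import itertools

    def numbers():                    # an unbounded stream, never held in memory at once
        n = 0
        while True:
            yield n
            n += 1

    def squares(xs):                  # generators compose like functions over the stream
        for x in xs:
            yield x * x

    print(list(itertools.islice(squares(numbers()), 5)))   # [0, 1, 4, 9, 16]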

I disliked python for years. I was a perl and C hacker. I now realize that
what I disliked is the necessary clarity of thought over what I am doing.

My coding style is finger-painting but I am in a world where fine brushwork is
needed.

------
elamje
Java/JVM perception changed by learning Clojure.

I did a lot of Java/Android/Hadoop back in college and honestly was under the
impression that Java was for legacy enterprise apps and college classes. With
the exception of Android, I felt that its prevalence was dying, as many
companies are now using MVC-style products in simple frameworks like Django
and Rails.

I was a little disenchanted that I had to do so many assignments in Java in
college, then I found Clojure.

My entire programming mindset has changed after working with a FP language, in
my case, Clojure. This has got me more excited about the JVM than ever before
because I see it as having new abilities I didn't see before, and the added
benefit of great libraries that have been rigorously tested.

I highly recommend learning a Lisp. A great beginner starting point is Clojure,
with Clojure for the Brave and True as a way to learn. The other go-to is
Structure and Interpretation of Computer Programs, which is in Scheme.

------
moonlet
Used to hate Ruby, then I picked it up for a side project. I wouldn’t exactly
write a compiler in it, but it’s great for small scripting tasks that don’t
need to be fast. I’m still not entirely sure why people do so much with it,
but it’s much more friendly than bash scripts for doing many of the same small
tasks.

------
prasanna83
Scala.

When I started learning Scala, it was such a breath of fresh air and I
enjoyed it. Cut to 5 years later: it offers so much flexibility that using it
in a team results in significant overhead, and it becomes very easy to
introduce unnecessary complexity into the code. My personal experience has
been that proficient Scala programmers are so much more into the language than
the product that all they want to do is refactor the code with the next
strongly typed library (scalaz, cats, akka streams to monix). Combine this
with a not-so-involved manager, and it affects the team dynamics and leads to
regularly missed deadlines.

I don't see the language itself as an issue - I do like Scala - but it is one
of those languages where one can do the same thing in so many different ways
that there needs to be strong guidance and a captain to steer the team and
keep it on course.

------
mlthoughts2018
For a long time while my day jobs were centered in Python, C and C++, I was
separately very interested in strict functional programming, particularly in
Haskell and slightly less but somewhat in Scala. I taught myself enough to be
roughly an “advanced intermediate” programmer in both languages just via
tutorials and side projects.

In the time since, I’ve had a job working in a large Haskell codebase and a
separate job in a large Scala codebase. I remember feeling a lot of pride and
self-confidence when I passed difficult technical interviews hitting on
pragmatic day-to-day issues in these languages despite having not had prior
job experience with them.

After years of working in these languages, I’ve basically had a 180 degree
flip in my opinion of them at least in terms of solving business problems.

I no longer have any desire to work in functional programming, and believe
that many of the promises it makes in regards to type safety, encoding IO or
side-effectful operations into the type system, automatic parallelism, fast
compile times, etc., are mostly just fantasies achievable only in idealized
settings, and that there are fundamental incongruencies between the types of
mutable systems that customer-facing products represent and immutable design
patterns that make the goals of software as a product development activity
incompatible with functional programming.

Since my background is in mathematics, I had always loosely believed that
programming languages are supposed to go in the direction of removing any
mental burden placed on the programmer to know how to represent a concept
inside the syntax of the language.

What I mean is, I had always thought for example that if a language requires
me to know something about computer memory layouts or the data structure
internals of say, floating point values, then it’s a failure of the language
and it should instead present me with an abstraction that renders those memory
details or data structure internals to be unnecessary in all possible
situations.

In many ways I think functional programming and the idea of formal
verification generally is a type of expression of this way of thinking.

But my 180 degree change of heart has made me feel based on my experiences
that it’s really the opposite. In all situations, I will frequently need
access to every layer of abstraction possibly all the way down to the actual
chemistry or physics of hardware components, and at the very least I’ll need
no abstractions wrapping memory layout or data structure internals. I'll
_always_ need to get outside of the abstraction and “do it myself” so to
speak.

This has in turn led me back to C/C++, some assembly and direct llvm
programming, and writing a lot of Python tools that expose more of these lower
level concerns at the Python level.

In a sense, I feel now more like there are not really programming languages so
much as there are specific pieces of software tightly coupled with specific
pieces of hardware, and I may need to modify or subvert fixed rules or
abstractions of any part of it. Then a programming language just ends up being
whatever common pattern of stuff on the software side ends up being useful
enough to get repeated from project to project, and I don’t think it’s an
accident that C is such a ubiquitous language in this sense.

------
reacweb
When I was a student, I learned that many languages are equivalent to a Turing
machine, and my teacher said that the expertise level in a language is
generally more important than the choice of the correct language for a task.
About 7 years later, I learned Perl. Using Perl, I was able to code in a
couple of hours programs that would have required a couple of days in my
favorite language (C/C++). I was shocked by the huge difference in
productivity. Before that, I was sold on the strong typing side because of my
experience in software maintenance. I still believe in strong typing, but I
have more insight to choose the most effective tool.

------
AllegedAlec
Minimalist languages, as in languages that support a single kind of paradigm.
The more I work with C#, the more I find that supporting all kinds of different
paradigms severely hampers a language, since you never know how a module is
going to do things.

I didn't notice this until I started learning Pharo and F#. Both of them were
just so incredibly clean, and so much simpler. I never have to think too hard
about how to do anything in those languages, since there's one good way of
doing it, and if you're doing it another way, you'll run into so many ugly
bits of code that you instantly see that it isn't working.

------
hestefisk
So called ‘BPM suites’ such as jBPM, Oracle BPM etc. These monolithic monsters
are supposed to do away with code and make systems more agile. It’s just
another money drain for expensive consultants and buzz words.

------
crb002
Ruby got a generation of programmers interested in letting the compiler keep
track of types, and created broad demand for C++ auto in 2011. Now if only C
had auto ...

------
oldie
In the early Nineties, I had a good idea of what an ideal programming language
would be. Then I encountered Perl, which was diametrically opposite to
everything I thought I wanted. Perl enabled me to do in a morning what would
have taken a week in C. I was blown away. I was converted. I became a Perl
evangelist. I used Perl everywhere I could.

20-odd years later, writing Perl for fun and C++ for money, I had a Perl
program which, over ten years, had gradually grown to about 5,000 lines and 25
classes. It needed more testing than anything else I maintained. I missed the
help I would have got from a C++ compiler -- what some Perl people
disparagingly call BDSM. In C++, if I need to add a Widget argument to a leaf
method, the compiler will find all the call sites that need updating. If they
in turn need updating to accept a Widget, the compiler will find _their_ call
sites, and so on. By the time the program compiles again, it's much of the way
towards working. But Perl doesn't do that for me. I can't even specify the
number of arguments a method should take, let alone their types.

I eventually rewrote the whole thing in D. It came out at almost exactly the
same length, which was a surprise, but I got native speed, much-reduced memory
consumption, and proper type-safety. My hunch is that a translation to modern
C++ would have been longer and taken more work than doing it in D, but not
_that_ much more, and the performance would have been better still.

This is not so much a story about Perl, C, C++ and D as about weakly- and
strongly-typed languages. (Perl 6, in particular, remedies some of the
deficiencies in Perl 5's type system, and some of the bolt-on object systems
for Perl 5 take steps in that direction as well. These are heroic and
ingenious efforts to give users the best of all possible worlds, to the extent
that that's possible, and I wish them well.) Languages that are too weakly
typed, too dynamic, too flexible, are fine for short programs that are
maintained over a short period of time, but they just don't scale --
maintenance becomes too difficult, too time-consuming and too error-prone. At
some point, it's best to rewrite in a well-chosen language that imposes more
structure and provides more type-safety, and work with that structure (not
against it) to make your program rigid enough to stand up under its own
weight.

As a result of that experience, I no longer use Perl, or any other very
dynamic language, for anything but the smallest throwaway programs. For
everything else, I anticipate the need to grow by using C++ or D, and the next
language I learn is more likely to be Rust than, say, Python.

A more general point: if you only know one composer, one band or one style of
music and you think the rest aren't worth knowing, you don't know music. Even
though I don't use dynamic languages much nowadays, I'm a stronger engineer
for having done so. Do take the time to learn several different languages that
differ widely from each other and from what you already know -- preferably
including some assembler -- and don't assume that the language of the month is
best suited to your unique temperament or to the job in front of you. Having
an abundance of tools and choosing between them wisely is better than having
one and using it for everything. It doesn't matter how dexterously you can use
a soldering iron when what you need is a chainsaw.

------
flyinglizard
1. Python. I was amazed at how awkward it was to scale a Python program and,
coming from C's type strictness, how much time I would spend running a program
again and again to get over silly typos and stuff. I absolutely refuse to use
Python for anything beyond a one-page-long program. It's still awesome for
transforming data, but nothing beyond that. It really felt to me like "the
king is naked", because everyone around me was/is heavily using Python and I
just couldn't stand it (and I've been using Python here and there for over 15
years now). I felt like I was doing something wrong - no, guys, _you_ were
doing something wrong. Not even talking about the pathetic performance and the
abysmal concurrency; just the quality of tooling and static checks. (Much of
the criticism obviously applies to other dynamic, scripted languages like JS,
but at least everyone admits JS is shit up front; not so with heavy Python
users.)

2. C# and its tooling. I consider it the apex language. If there's a useful
feature or convention somewhere else, Microsoft will make sure it makes its
way into the C# standard in the next version. Super elegant, readable,
straightforward and powerful. Superb tooling all around, from the blink fast
compilation to catching most issues with static analysis before you even get
to build. Linq and fantastic abstractions. Now that .NET Core is open source,
there's very little reason left not to use it as the go-to language for
backend programming. It got me excited about programming again, because my
ideas would translate nearly instantly into high quality, high performance
code that I could almost always rely on to run right the first time.

3. The Linux kernel. I've done kernel development on many occasions over the
last two decades, and it's always a great example of an elegant, no
bullshit codebase. It's a window into the working of many great minds. Early
on, it got me educated on proper error handling in low level code (goto
stacks).

4. Actors for concurrency. I started using the actor pattern in a commercial
embedded RTOS I designed, and it has been the answer to all my concurrency
needs many times since (rough sketch after this list). If you're extensively
calling lock/unlock in your user code, you're doing it wrong and it won't
scale.

5. Functional programming principles, most notably immutability and avoiding
stateful objects. They make complicated logic so much more scalable and avoid
ripple effects when modifying the program flow.
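
For point 4, a minimal sketch of the idea in Python terms (not the actual RTOS
code, just the shape of it):

    import queue, threading

    class CounterActor:
        """All state lives on one worker thread; other threads only send messages."""
        def __init__(self):
            self._mailbox = queue.Queue()
            self._count = 0
            self._worker = threading.Thread(target=self._run)
            self._worker.start()

        def send(self, msg):
            self._mailbox.put(msg)

        def stop(self):
            self._mailbox.put("stop")
            self._worker.join()

        def _run(self):
            while True:
                msg = self._mailbox.get()
                if msg == "stop":
                    return
                if msg == "inc":
                    self._count += 1      # no lock: only this thread ever mutates it
                elif msg == "report":
                    print(self._count)

    actor = CounterActor()
    for _ in range(3):
        actor.send("inc")
    actor.send("report")
    actor.stop()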

------
JesseAldridge
The best code is code you don't have to write, because someone else has
already written it. I've decided that the number of libraries is the most
important feature of a language, which is of course strongly correlated with
popularity. Javascript is king in this regard.

~~~
Daishiman
Close, but much more important than the number of libraries is the number of
high quality libraries for all purposes you could ever need.

In this regard, JavaScript is far from good. There's huge amounts of package
instability, a lot of mucking with node_modules, a lot of useless breakage.

I have found that Python and Ruby are much better in this regard. The
libraries serve way more purposes, they're stable and work well, and I can
count on library improvements being incremental and easy to digest

~~~
chalst
Perl too. The CPAN guys are obsessed with testing.

------
Jeaye
I built games in C++, game engines in C++, and real-time communication servers
in C++. As I pushed the language more and more, I wanted to squeeze out as
much safety from the compiler as possible. My object-oriented polymorphism
(like a function taking a ref or pointer to a base class) became parametric
polymorphism (templates). Then my templates became template meta-programming
(TMP). Then I was trying to enable/disable safe code paths using SFINAE, but
this was only helping me on a type level. I still had undefined behavior (UB)
all over the place, due to C++'s history. I still had issues with mutability
and parallelism, but modern processors and game requirements demanded that
things must not be single-threaded.

So how can all of this be addressed? Well, safer systems languages like Rust
can help, but another approach is to remove the mutation entirely. If we could
have a practical language which was built around immutability, we'd have
thread safety by default. So I found Clojure and thus began my foray into
lisps and functional programming. After spending a while completely baffled,
things began to click. I would think of each of my tasks as a series of pure
data transformations instead of a class + members + methods to mutate those
members. It was data in, data out. Alas, none of my colleagues were so
interested, and I was still their go-to C++ guy for anything non-trivial. So
why was the "C++ guy" giving talks about referential transparency?

Ultimately, I had had enough. To me, though I had spent several years building
an encyclopedia of C++ knowledge, the state of things didn't seem practical.
Not if we wanted to take full advantage of parallelism. Not if we wanted to
feel confident in the safety of our code at compile-time. I gave a final talk
at my day job, regarding C++ value categories. It's a nasty subject and a deep
dive into something many professional C++ developers still don't grok.
Ironically, I wrote it in Clojure and it became the go-to cheat sheet for
everything value category in C++14 and before:
[https://github.com/jeaye/value-category-
cheatsheet](https://github.com/jeaye/value-category-cheatsheet)

So, finally, we're still left with the incomplete journey: Clojure's type
system. Alas, there's nothing to be done about that. Similarly, as someone who
enjoys game engine development, kernel development, and other systems work,
Rust is likely the best option for me there. However, it's not functional-
first. It doesn't have persistent, immutable data structures (which trade data
locality for thread safety; a worthy trade in many cases) in the stdlib. We
have other functional languages which do have much stronger type systems, but,
in my humble opinion, they often lack the practicality of Clojure. Adhoc side
effects, s-expressions (very little syntax), and utter simplicity.

That's why I've been working on
[https://github.com/jeaye/jank](https://github.com/jeaye/jank) for a few
years. It's slow going, but I want it to be a statically typed Clojure
dialect, basically, which compiles to native code. Similar to Clojure's spec,
jank should allow folks to start with a baseline of data transformations and
then build up stronger type checking after things are in place. The big
difference is that jank isn't dynamically typed at all; its baseline is much
more secure and the additional validations which can be presented at compile-
time are akin to dependent types. So, static typing, functional-first stdlib
with immutable data structures, s-expressions (Clojure-compatible syntax),
gradual dependent typing, and a compilation target of LLVM. To me, that's the
dream.

As a last addition, trying to write C++, or most any similar language, these
days is quite upsetting to me. I no longer have a taste for it.

------
yxhuvud
The type system of Crystal finally convinced me that type systems could be a
net ergonomic positive.

------
rmoskal
Many things have changed my mind. Most recently it's been the move towards
thinking in terms of data processing. Solving problems by transforming data
structures.

The gateway drug was transforming json with lodash/underscore/ramda. Clojure
cinched it.

------
rafiki6
I haven't ever changed my mind about a language, because instead I choose not
to have strong opinions about something to begin with. Fact is most developers
are not experts in a language or framework, even if they've worked with it for
10+ years, including myself. The only thing I've learned is that developers
are humans and like to rely on their uninformed opinions and biases. All
languages and frameworks have their strengths/weaknesses and their appropriate
use cases. You can likely hack any language or framework or toolkit to do what
you want it to do. At the end of the day these are all just layers of
abstraction. It's important that you do a deep dive into something before you
form an opinion about its appropriateness for a given use case. Deep diving
can include reading the informed opinions of experts (generally the people who
wrote the language or are actively maintaining it).

------
pjmlp
Trying it out, plus a couple of life experiences that give me a gut feeling
for how things are going to turn out - the ones that went bad are especially
helpful.

------
RandomGuyDTB
Thought Python was great. Then I found out that ifs rely on indentation in
order to be parsed correctly. As someone who generally uses
whitespace-independent languages (Lisp and Processing mostly cover my bases),
I really despise the fact that I _have_ to indent my code a particular way and
that I can't add leading spaces just because.

Normally I use three spaces instead, and when naming variables I add leading
whitespace to the line to prettify it up and make the "="s align. It irks me
but I can get past it. Still wish it could've been done better.

------
CalChris
Force. Literally I was forced to use it for some practical reason and then
discovered it wasn't as bad as I'd thought.

------
Octoth0rpe
Seeing redux dev tool replaying state changes with a connected router. This is
what completely sold me on react/redux.

------
baybal2
I was a critic of JS, and still am, but JS _works_. It's pretty much today's
PHP.

------
known
Memory.

Scripting is perfectly rational given the abundant memory available.

------
chillacy
Monorepos. They started making sense when we started refactoring code into npm
packages, and one-repo-per-package would have been insane to manage as 30
separate build scripts.

------
laserson
Hadley Wickham

------
tsuberim
I got into Haskell a while back and dove deep into the purely functional,
strongly typed, category-theory rabbit hole. This small but very hard-core
community exposed me to completely different (and radical) ideas about
programming, software engineering and PL design, after I was sure I'd seen
everything under the sun when it comes to PLs. I soon turned into one of those
annoying zealots who keeps preaching functional programming. After a while, I
wanted to write my own language(s) and set out to build a compiler in Haskell.
I kept wanting to make it more and more general, brushing against the edges of
what the type system could handle and thinking in more and more abstract
category-theory terms, until I just gave up. I felt that the type system was
restricting my thinking. Another example of this absurdity in the Haskell way
of thinking is a very general Scala library:
[https://github.com/slamdata/matryoshka](https://github.com/slamdata/matryoshka).

TL;DR: Haskell makes you think in a deep, general and often very beautiful
mathematical way which makes you write beautiful but unintelligible code for
the non-mathematician programmer.

The switch came when I became increasingly fascinated by machine learning
(which is written mostly in Python). Programming in Python felt extremely
liberating to me: I could express complex ideas in a minimal number of lines
that were also obvious for others to read. This is not to say that I abandoned
the ideas I learned from functional land; on the contrary, I still think in a
very functional way, and Python only made it so much easier to express those
concepts. A powerful middle ground between the two approaches is gradual
typing (i.e. mypy or TypeScript), which lets you take advantage of a type
checker without letting it get in your way; this is the best of both worlds
for me.

TL;DR: Strong types are for weak (or lazy) minds. Python FTW.

------
ratling
Most of the time the answer is that Google or Amazon uses it (the exception
being Java - nothing will induce me to ever approve of Java).

NodeJS is the main example for me.

------
faissaloo
I discovered the value of EAFP after learning about race conditions.
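For instance, the classic file-opening case (a minimal sketch; the filename is
made up):

    import os

    # LBYL: the file can vanish between the check and the open (a race window).
    if os.path.exists("config.txt"):
        data = open("config.txt").read()

    # EAFP: just try it and handle the failure; no window between check and use.
    try:
        data = open("config.txt").read()
    except FileNotFoundError:
        data = ""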

