
I love Python as a glue language. So much heavy lifting done in numpy or opencv or whatnot. But Python as the interface makes it trivial to explore, experiment, and glue together a workflow, especially when the solution is unclear.

Then at some point, once you know exactly what you want your software to do and no longer need Python, rewrite it in C++ or whatever.

Also, with CFFI and other interop libraries, it's really quite easy to write the heavy work in a more appropriate language and call into it.
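
A minimal sketch of what that looks like with CFFI in ABI mode -- here just calling printf from the standard C library, but the same pattern applies to your own compiled code:

    # Minimal CFFI sketch (ABI mode): declare the C signature,
    # open the shared library, call it from Python.
    from cffi import FFI

    ffi = FFI()
    ffi.cdef("int printf(const char *format, ...);")   # copied from the man page
    C = ffi.dlopen(None)                               # the standard C namespace (POSIX)

    arg = ffi.new("char[]", b"world")                  # like C: char arg[] = "world";
    C.printf(b"hi there, %s.\n", arg)                  # prints "hi there, world."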




For that kind of workflow you would be far better off with e.g. Julia. You get the same advantage as with Python: a language you can experiment in until you find a solution. The only difference is that the later optimization step does not involve rewriting in another language.

If you already know Python, and Python packages already do everything you will ever need, then sure, stick with that. But I don't get why people go to such lengths to avoid using a new language. Being proficient in Julia is a lot less work than maintaining proficiency in both Python and C++.


The last time I checked, using Julia was clunky at best, with ridiculously high JIT compile times, packages that refused to build on my machine, etc. What's more, many of the "best" Julia libraries were seemingly just Python code linked in.

I don't mean to discredit the advantages Julia clearly has over Python, but these are just the kinds of problems that make people like me stick with tried and tested last-gen languages like Python.


Did you ever check again after 1.0 was released? In the early days there were a lot of problems with packages, totally agree. JIT compile times are much better now.

A lot of the issues are simply that people have not learned a sensible workflow with Julia. Python guys have a lot of habits that don't translate well to Julia. I know because I work daily with two hardcore Python guys, and I notice all the time how we approach problems in very different ways.

Python guys seem to love making lots of separate little programs they launch from the shell. Or they just relaunch whole programs all the time.

In Julia, in contrast, you focus on packages from the get-go and work primarily inside the Julia REPL. You run the Revise.jl package, which picks up all the changes you make to your Julia package.

I guess it just depends on the workflows you are used to. For me it is the opposite. Whenever I have to jump into our Python code base I absolutely hate it. It is very unnatural for me to work in the Python way. I also find Python code kind of hard to read compared to Julia code.

But I know Python coders have the opposite problem. Basically Python guys look a lot at module names when reading code. Julia developers look more at types. The difference makes some sense since you don't really write types in Python code.

I found that the new Python type annotation system helped me feel at home in Python.
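
For what it's worth, even lightweight annotations give readers much of the type information Julia folks are used to scanning for. A tiny illustration (hypothetical names):

    # Small illustration of Python type annotations (hypothetical names):
    # the types live in the signature instead of only in the docs.
    from dataclasses import dataclass

    @dataclass
    class Point:
        x: float
        y: float

    def midpoint(a: Point, b: Point) -> Point:
        return Point((a.x + b.x) / 2, (a.y + b.y) / 2)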


Likely because of this:

https://pypl.github.io/PYPL.html


OMG. Until now I didn't realize COBOL is more popular than Haskell and Delphi. I had just read that there are ~2.5 COBOL programmers left in the world, old like Gandalf, enjoying ridiculously high salaries and unable to retire because nobody learns COBOL any more while the world needs somebody to maintain the legacy systems that still run its economy. Meanwhile, Delphi used to be the PHP of the desktop world until very recently (and could hardly be expected to decline so fast), and Haskell seems to be the computer-science lingua franca.


>The more a language tutorial is searched, the more popular the language is assumed to be. It is a leading indicator. The raw data comes from Google Trends.

That's the problem: nobody is actively learning COBOL these days, and nobody knows how much of it is running out there in the wild, because none of the big banks or credit companies will actually admit to it.

You joke, but the jobs are still out there, still being posted, and companies are still hiring. https://www.wellsfargojobs.com/job/irving/apps-systems-engin...



For a pretty much perfect language, try Erlang or Elixir. The runtime guarantees are like no other language's, and they can easily drop down to C, C++, Rust and several others when needed.


Arguably, neither of those is very expressive as a language, though. It's the runtime and the built-in libraries around it (OTP) that hold the power.


What do you mean by "expressive"?


Able to abstract concisely at a high level, provide decent facilities for modeling business logic in a succinct yet correct manner, etc.

Elixir and Erlang both lack a static type system and maintain a focus on keeping the language constructs simple rather than providing the various high-level abstractions some other, more expressive languages have.


I mean sure, you are right, but is there a language that is expressive by your criteria?

You seem to be describing DSLs -- that can be done by all LISPs and some others -- and not programming languages per se.

Elixir has very powerful macros, by the way; they are compile-time, not the runtime `method_missing` stuff that Ruby does, so they are basically code generation that you don't get to see but still get all the benefits from.

With those macros you can go quite far creating business DSLs with Elixir (and a bunch of other languages, in the interest of full honesty). That's why many people describe Elixir as "LISP-y".

The lack of static typing is indeed something that pokes me in the eye as well, but so far I have been managing. The extra cognitive load from it -- plus the need to work with the strong but dynamic type system -- is definitely there though, can't deny that.


I'm aware, I'm running Elixir in production, including custom macros.

That said, even the macro system pales in comparison with, say, Haskell in terms of expressiveness. Haskell has such a depth of constructs and allows such highly sophisticated (& safe) abstractions that you often don't feel like you'd even need a separate DSL or an embedded DSL.

If anything, writing Elixir macros feels a lot like writing Typescript AST transformers to me.


That they are indeed.

And yeah, I am aware others have more powerful macro systems. OCaml is also pretty amazing at similar workloads.

Doesn't seem that we disagree on anything. You seem dissatisfied with the maturity of certain areas and that's okay.


Static typing (especially HKT), macros, or laziness? The BEAM is amazing, but Erlang is kind of meh. Also, the BEAM is very slow, so you end up writing C to get any real work done. I haven't used Elixir, but I suspect it suffers from the same problems. When the JVM finally supports preemptive multi-tasking, Erlang and the BEAM will become completely unnecessary.


As mentioned in a reply to your sibling comment, the lack of static typing is definitely a minus. Not denying it.

I found full laziness quite overrated while trying to write several very small business services in Haskell. Even in Elixir, where you can use laziness in limited areas (the I/O processing stdlib; there are almost no lazy data structures), benchmarks show that unless your data is at least a list of 500+ elements, laziness introduces both a performance penalty and complexity. Not bashing the idea, but it's not as universally good as several groups of rather religious programmers make it out to be. Right tool for the right job and all. ;)
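
As a rough analogue in Python rather than Elixir or Haskell (just to show the eager/lazy split being discussed), a generator pipeline defers the work to a small per-item cost, which only pays off once the input is large or unbounded:

    # Rough eager-vs-lazy analogue in Python (not Elixir/Haskell code):
    # the list version materialises everything up front, the generator
    # version computes items on demand with a small per-item overhead.
    data = range(1_000_000)

    eager = [x * x for x in data if x % 3 == 0]    # builds the whole list now
    lazy = (x * x for x in data if x % 3 == 0)     # does no work yet

    first_ten = [next(lazy) for _ in range(10)]    # only ten items computed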

Macros in Elixir are quite powerful and very different from how it's done in Ruby. They are basically compile-time code generators, used with crushing success in Phoenix (web framework), Absinthe (GraphQL) and Ecto (DB data-mapper library). Friendly programmer advice: don't underestimate Elixir macros. Many people stayed with Elixir mostly because of them.

---

> When the JVM finally supports preemptive multi-tasking, Erlang and the BEAM will become completely unnecessary.

1. Maybe, but I was hearing about preemptive multi-tasking on the JVM when I gave up on Java and that was around 2009. Believe me when I tell you, I really want at least 4-5 other runtimes to gain the capabilities of the BEAM. But alas, they still don't. Only Rust 3rd party runtime devs seem to be truly trying to push the envelope. Everybody else is like "yeah, we're gonna get it done by 2050, no worries". Sorry if that comes across as a bit cynical and dismissive, but I do read history.

2. The preemptive multi-tasking of the BEAM is indeed a huge selling point, but not the only one. The actor model -- basically green threads with message inboxes -- has historically proven, again and again, to be a much more successful parallel computing model than... pretty much anything else. Really good Golang devs can kinda-sorta-partially emulate it with channels and mutexes/semaphores, but it's obviously manual and error-prone. Rust's `actix` devs also agree, and if you look at TechEmpower's benchmarks that try to emulate real web apps, you'll notice that `actix-web` is on top. Kinda funny when you think about how many people here on HN keep saying the actor model is only a good idea on paper and would never perform well in practice.
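
To make the "green threads with message inboxes" idea concrete, here is a toy actor in Python: a thread that owns its state and is driven only through a queue acting as its mailbox. Purely illustrative of the shape of the model; the BEAM adds preemptive scheduling, supervision and distribution on top.

    # Toy actor sketch: a thread owns its state and reacts only to
    # messages arriving in its mailbox (a queue). Illustrative only.
    import threading
    import queue

    class CounterActor:
        def __init__(self):
            self.mailbox = queue.Queue()
            self.count = 0
            threading.Thread(target=self._run, daemon=True).start()

        def send(self, msg):
            self.mailbox.put(msg)          # the only way in from outside

        def _run(self):
            while True:
                msg = self.mailbox.get()
                if msg == "increment":
                    self.count += 1        # state touched only by this thread
                elif msg == "stop":
                    break

    actor = CounterActor()
    actor.send("increment")
    actor.send("stop")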

Additionally, if you keep an eye on the whole ecosystem (as much as that is even possible) you'll notice that almost everyone, JS included, is begrudgingly moving to immutable data and lock-less synchronisation. A lot of tech is starting to come to grips with the reality of us humans being flawed and not as bright as we'd like -- me included. I paid my dues with manually crafting mutex and condition variables workflows in C for years, then translating that to Java, then Go, etc. And I can confidently say: it doesn't work sustainably well. You can build a few simple pieces with it, but for long-term projects, just go with immutability and lock-less sync.

---

I guess my point is, Erlang/Elixir are still quite strong and don't have much serious competition in their niche (where they are doing well).

Lastly, I agree that the BEAM in general isn't the fastest thing to run code on. Absolutely. In practice however -- and I mean importing a few million lines of CSV/XML a day -- I found that if you invest just a little more effort, Elixir will stay out of your way and your code will be mostly I/O bound. I mean yeah, Elixir isn't as fast as C++ or Rust. But I have already used it for several pretty different projects and it has performed excellently.

I am looking forward to learning Rust's `actix-web` though. I feel it could be a better fit for dramatically more computationally demanding web projects.


The actor model is not for parallel computing; it's for concurrency. It introduces unnecessary overhead for parallel computing and is not ideal for it. For concurrency, on the other hand, it's probably superior to anything else out there.


Sure. Just saying that the BEAM lends itself to parallelism rather excellently. Conceptually you are correct, but in practice -- in the BEAM at least -- those two ideas converge.


You're right, Erlang still has an important feature that (almost?) all other languages lack: preemption. I think the actor model is great, but it is crippled if required to use cooperative multi-tasking (like Akka); the code starts simple and then becomes increasingly tangled and complicated. I just think that the Erlang ecosystem has stagnated and the language itself feels dated. If/when the JVM or maybe Rust gets preemption, I think it would be best if we all moved on to an actor system built on a more performant, robust, and expressive ecosystem.


> If/when the JVM or maybe Rust gets preemption, I think it would be best if we all moved on to an actor system built on a more performant, robust, and expressive ecosystem.

If Rust gained all the BEAM's capabilities I'd switch tomorrow, dude.

At the moment, though, Elixir is pretty much the only choice for excellent parallelism and concurrency.

I know Rust is working hard to get there. It's not there yet though.


"...I paid my dues with manually crafting mutex and condition variables workflows in C for years, then translating that to Java, then Go..."

Quite frankly, most multithreading and synchronization work is quite simple and doesn't bother me, except in some rare cases (dealing with DirectShow, for example). If some languages fail to support real-world constructs directly, that's a problem for their designers and users. All this stuff about horrible, horrible manual memory management and awful synchronization issues is overblown, I think.


We'll have to respectfully agree to disagree here. My first job was with 50+ year old guys who chased such bugs in micro-kernels and drivers literally every day. They swore the world had never seen a dumber abstraction than pthreads, lol.

They were exaggerating of course -- and they were like gods; they fixed hundreds of sync bugs in C code mixed with assembly for something like 15 embedded platforms.

To this day I am not as hardcore a programmer as they were, even though I am almost 40, but the mutex/condition-variable combo has bitten me many, many times. Spurious wakeups, for example: not all platforms get them right (side note: I realise these may remain a fact of life forever, and may even be key to having preemptive scheduling like in the BEAM VM [Erlang/Elixir]). A lot of defensive coding is needed. And that's only one example out of dozens.
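
The standard defensive pattern is to re-check the wait predicate in a loop rather than trusting a single wakeup; a minimal sketch with Python's threading module (the same discipline applies to raw pthreads):

    # Defensive condition-variable use: always re-check the predicate in a
    # loop after wait() returns, because the wakeup may be spurious or
    # another consumer may have taken the item first.
    import threading
    from collections import deque

    items = deque()
    cond = threading.Condition()

    def producer():
        with cond:
            items.append("work")
            cond.notify()

    def consumer():
        with cond:
            while not items:        # a loop, never a bare `if`
                cond.wait()
            return items.popleft()

    threading.Thread(target=producer).start()
    print(consumer())               # -> "work"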

IMO you are just used to it. If you code in Erlang/Elixir on very parallel-friendly projects I am pretty sure you'll never go back to the pthreads model!

It's just the conclusion of the group of programmers to whom I philosophically belong (basically people who ran away screaming from C/C++ after working 2 to 10 years with them) that this stuff looks deceptively easy and works quite well in many cases... until the project gets big, many people get involved and bugs start to creep in. In those conditions, immutability + the actor model have so far proven to be much more people-friendly paradigms that make introducing a certain class of bugs much harder.


"We'll have to respectfully agree to disagree here"

Sure thing; I have just stated my personal opinion, not looking for anyone to agree. To each their own. Everything has a price, immutability/actors included. The rest is all about the trade-offs one is willing to make.


That's unequivocally true! Immutability + actor model definitely punish you with some performance losses, especially in dynamic languages like Erlang and Elixir. No two ways about it.

I am looking forward to seeing whether the Rust community will manage to make a statically-typed BEAM-like runtime with much stronger performance characteristics. That would be amazing.


But this is a benefit not of Python itself but of the ecosystem around it. We should be making those libraries just as easy to use in other languages.



