I see Python as the BASIC of the 2020s.

That's not to knock it, but Python's real strength is as a language that the non-professional programmer can pick up right away and use to put their non-programming skills on wheels.

Jupyter, numpy, pandas, scikit-learn and such are the center of the ecosystem for many people. You can point your finger at these people and tell them to try Julia because it is better on paper... And they will, and they won't like it and they'll go back to Python.




Julia is only better on paper, unless you write the missing parts yourself or trade only in potential. There are still many more ready-made libraries for Python, with more features.


> Jupyter, numpy, pandas, scikit-learn and such are the center of the ecosystem for many people.

Exactly this. At this point I'm not sure where I'd even go for the tasks I use this suite of tools for. I'd check out Julia but I haven't run into many problems that would inspire me to switch.


What languages do folks “graduate” to after Python?

(this is a genuine question from someone who primarily writes Python for fun, not professionally, and is looking to level up for fun)


It depends on what "graduate" means (I note your "scare quotes"). For some people, lexical styles matter a lot and it is easier to answer this question with that kind of constraint.

While its ecosystem is surely much smaller than even Julia's, Nim [1] has a lot of the concise lexical feel of Python, but semantics more like Modula-3/Ada, plus Lisp-like syntax macros/metaprogramming to fill in other gaps.

Another possibility is Cython [2], which is basically a superset of Python with gradual typing. It is more targeted at writing fast C-like Python modules and leveraging the Python runtime environment than at "standing on its own".
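A minimal sketch of that gradual typing, in Cython's "pure Python mode" (assuming Cython is installed; the function is just illustrative). The file runs unchanged under plain CPython, and compiling it with cythonize turns the annotated variables into C types:

    import cython

    def fib(n: cython.int) -> cython.double:
        # annotated locals become plain C variables when compiled
        a: cython.double = 0.0
        b: cython.double = 1.0
        i: cython.int
        for i in range(n):
            a, b = a + b, a
        return a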

[1] https://nim-lang.org/

[2] https://cython.org/


Python has PyPy and numba. PyPy is good for branchy things like RDF triple stores, business rules engines, etc.
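For the numba half, a minimal sketch (assuming numba and numpy are installed; the function is just illustrative). The decorated function is JIT-compiled on first call, so the explicit loop runs near C speed:

    import numpy as np
    from numba import njit

    @njit
    def mean_abs(a):
        # compiled to machine code on first call
        s = 0.0
        for x in a:
            s += abs(x)
        return s / len(a)

    print(mean_abs(np.random.rand(1_000_000)))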

CPython (and most other interpreters) has the big problem of the global interpreter lock (GIL), which prevents it from taking advantage of threads... And that's a problem in a world where (1) 8-core laptops are common, (2) much bigger machines can be found in the server room, and (3) many workloads can be parallelized over threads with just a little work.
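You can see the problem for yourself with a quick sketch (exact timings vary by machine and CPython version): two threads doing pure-Python CPU work take about as long as doing the same work sequentially, because the GIL lets only one thread execute bytecode at a time.

    import threading
    import time

    def count(n):
        # pure-Python CPU work; the GIL serializes it across threads
        while n:
            n -= 1

    N = 20_000_000

    t0 = time.perf_counter()
    count(N); count(N)
    print("sequential:", time.perf_counter() - t0)

    t0 = time.perf_counter()
    threads = [threading.Thread(target=count, args=(N,)) for _ in range(2)]
    for t in threads: t.start()
    for t in threads: t.join()
    print("2 threads: ", time.perf_counter() - t0)  # about the same, or worse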


Cython lets you do `with nogil`. That said, processes with communication are often enough to leverage multi-core with no GIL. CPython's pickling junk for its multiprocessing library can add a lot of overhead to that, though. A simpler, pure-binary version of multiprocessing could lower that overhead enough to make multiprocessing competitive with multithreading, unless inputs/outputs are really big (in which case tossing them in files and passing around pathnames is a not-crazy fallback...).

(EDIT: In the Linux kernel, processes and threads are both just created by a system call named "clone" with various sharing flags. Threads are just processes with very unsafe settings - settings that many prog.langs think they can "tame well enough".)
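A sketch of that pathname fallback (the chunk filenames are hypothetical; assumes numpy): only short strings cross the process boundary, and each worker loads its own input from disk.

    import multiprocessing as mp
    import numpy as np

    def work(path):
        # only a short pathname was pickled, not the big array
        return float(np.load(path).sum())

    if __name__ == "__main__":
        paths = []
        for i in range(4):
            p = f"chunk{i}.npy"        # hypothetical chunk files
            np.save(p, np.random.rand(1_000_000))
            paths.append(p)
        with mp.Pool(4) as pool:
            print(sum(pool.map(work, paths)))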


In the ‘00s I found a Python/C combination very effective: all control in Python, compute in C, with the class mechanism easily tying things together. I would recommend this for any new CPU-heavy development, as the needs of each portion are so different.
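A minimal sketch of that split using ctypes (the C kernel, library name, and build line are all hypothetical): Python owns the control flow and the class; C owns the inner loop.

    # kernel.c (hypothetical), built with: cc -O2 -shared -fPIC kernel.c -o libkernel.so
    #   double dot(const double *a, const double *b, int n) {
    #       double s = 0.0;
    #       for (int i = 0; i < n; i++) s += a[i] * b[i];
    #       return s;
    #   }

    import ctypes
    import numpy as np

    lib = ctypes.CDLL("./libkernel.so")     # hypothetical library name
    DBL_P = ctypes.POINTER(ctypes.c_double)
    lib.dot.restype = ctypes.c_double
    lib.dot.argtypes = [DBL_P, DBL_P, ctypes.c_int]

    class Vector:
        """Control logic in Python; the hot loop stays in C."""
        def __init__(self, data):
            self.a = np.ascontiguousarray(data, dtype=np.float64)

        def dot(self, other):
            return lib.dot(self.a.ctypes.data_as(DBL_P),
                           other.a.ctypes.data_as(DBL_P),
                           len(self.a))

    print(Vector([1, 2, 3]).dot(Vector([4, 5, 6])))  # 32.0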


Right, that's one of the classic architectures for complex programs: use a scripting language for some parts and a systems language for others.

Video games are a great example. You'd like a level designer to be able to edit scripts that control arbitrary things about the gameplay, but you also need a performance and GFX pro to program the other parts.
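A toy sketch of that division (all names hypothetical): the engine exposes a small API, and the level designer edits only the script that drives it.

    class Engine:                     # stand-in for the compiled core
        def spawn(self, kind, x, y):
            print(f"spawned {kind} at ({x}, {y})")

    def level_script(engine):         # the part a level designer edits
        for i in range(3):
            engine.spawn("goblin", 10 * i, 0)

    level_script(Engine())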


The way I frequently hear it put is: declarative shell, imperative core.



