
What are the advantages of Cython over something like C++ with pybind11, or whatever the equivalent is in Rustland?


Cython is similar to Python and is fairly easy to write for a Python user, while C++ or Rust have much larger learning curves.


However, if one is using C++/Rust anyway, then it's a good idea to stay away from Cython. From afar, Cython seems like a viable solution for Python/C++ interop. But the details get messy: you need to clone the .h headers into .pxd Cython-readable headers, and more advanced template-magic C++ constructs may not be directly usable in Cython due to missing features or bugs in its C++ support.

In the end, we ended up with quite a number of layers wrapping each other:

  1. actual C++ implementation
  2. actual C++ header
  3. C++ wrapper implementation, avoiding constructs that Cython doesn't support
  4. C++ wrapper header
  5. Cython .pxd for step 4
  6. Cython .pyx exposing `cdef class`es to Python with a nice Python-style API for the original C++ library.
  7. Hand-written .pyi for type checking the remaining Python code, because Cython doesn't have support for auto-generating these yet.
Had we used pybind11 / nanobind instead, we could have stopped at step 3. Cython started easy, but ended up being a major maintenance burden.
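To make that concrete, here is a minimal sketch of what that single extra layer could look like with pybind11; `Widget`, `widget.hpp`, and the method names are hypothetical stand-ins, not the actual library:

  // Hypothetical sketch: the real C++ header is included as-is and the
  // whole binding lives in one .cpp file -- no .pxd mirror, no wrapper
  // header, and .pyi stubs can be generated from the compiled module
  // (e.g. with pybind11-stubgen) instead of being written by hand.
  #include <pybind11/pybind11.h>
  #include <pybind11/stl.h>   // automatic conversions for std:: containers

  #include "widget.hpp"       // the actual C++ header from step 2

  namespace py = pybind11;

  PYBIND11_MODULE(widgets, m) {
      py::class_<Widget>(m, "Widget")
          .def(py::init<std::string>(), py::arg("name"))
          .def("process", &Widget::process, py::arg("values"),
               "Run the C++ implementation on a list of floats")
          .def_property_readonly("name", &Widget::name);
  }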


It lets you refactor your code line by line.


I've been using ChatGPT to develop the concept of a novel or TV show in which Al-Andalus never fell to the Reconquista, but rather conquered chunks of Central Europe... and fast forward we're traveling the stars. It's pretty good at following the "story A in the present / story B in historical flashback / stories converge in major themes" pattern.


This is unintuitive, but modern Fortran is a shorter jump from Matlab than Python is.

The greatness and wretchedness of Python is "pythonicity". It's very hard to miss the mark with Python (all things considered: life is finite, most people are more like analysts than developers, analysts generally deliver more value per hour than developers, Rust is hard to learn), but you basically need to submit to the "Pythonic" brainwashing -- which includes "anything performance-critical should be written in $fast_lang and then used from Python; but almost nothing is perf-critical, and most of what's conceivably needed already exists".


I wish the Python<->Rust interop story were a little better. I learned to write some C++ for an embedded thing (smart flashlight, story for another time) and immediately started writing the Python extensions I had struggled to write in Rust.

(The average data science/ML-ish person encounters or figures out custom algorithms maybe three or four times a year, and three of those are fast enough with vectorization contortions. I've had two cases that were recursive and 25X faster in $fast_lang than anything I could possibly write in Python.)
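To illustrate the kind of case that resists vectorization: a recurrence whose branch depends on the value produced at the previous step can't be turned into whole-array NumPy operations, but it is easy to push into C++ and expose with pybind11. Everything below (module name, function, and the recurrence itself) is an invented toy, not one of the actual cases:

  // Invented toy example of a state-dependent recurrence: each step branches
  // on the running state, so the loop cannot be rewritten as vectorized
  // NumPy array operations and pays full interpreter overhead in pure Python.
  #include <pybind11/pybind11.h>
  #include <pybind11/stl.h>   // converts std::vector<double> <-> Python lists

  #include <vector>

  namespace py = pybind11;

  std::vector<double> recurse(const std::vector<double>& x, double threshold) {
      std::vector<double> out(x.size());
      double state = 0.0;
      for (std::size_t i = 0; i < x.size(); ++i) {
          // The branch depends on the state carried over from step i-1,
          // not just on x[i], which is what defeats vectorization.
          state = (state > threshold) ? 0.5 * state + x[i]
                                      : state * state + x[i];
          out[i] = state;
      }
      return out;
  }

  PYBIND11_MODULE(fastpath, m) {
      m.def("recurse", &recurse, py::arg("x"), py::arg("threshold"),
            "Sequential, state-dependent recurrence (toy example)");
  }

From Python this hypothetical module would just be `import fastpath; fastpath.recurse(samples, 0.5)`.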


I wish there were more serious studies on "Matlab risk". There's a lot on $old_lang risk already, and Excel risk is now a serious thread of research, but many in academia (and very possibly greybeards in industrial R&D) just won't budge from Matlab. So e.g. all the codes from my thesis are in Matlab because my advisor wouldn't Python.

Matlab in theory has OO, but it's very slow, and who bothers to verify that? So practically everything is matrices and there's very little "semantics expressed in the code itself" (to paint the issues with a broad brush). Also, matrix calculations can get botched numerically, particularly matrix exponentials, but the whole culture (which I can't even expunge from my own thinking) is "I have a theorem, so tests schmests".


There have been numerous articles in nearly every field of science about this!


Matlab is great for a lot of things, though it's not really a great general-purpose language. Its main problem is that it's not free (though Octave exists, obviously), which limits interoperability. Octave can trivially be embedded in C++ and Python; if it were easier to do that with Matlab, I wonder if numpy would ever have existed... (after all, numpy is essentially a port of Matlab semantics to Python... most Python numeric programmers are unwittingly writing Matlab already...).


This process of language love and hatred over a short period is what's called a language fad. Ten years ago, people wrote articles praising MATLAB over established languages. I do not recall any of those writings ever mentioning MATLAB's licensing as an issue. MATLAB is now >1000-fold better than it was ten years ago. Yet the new generation throws it under the bus daily because it's not their favorite. Change is the only constant in the world of programming fashion and language fads.


Matlab is incredible for "I have some data, I want to do a bunch of calculations and then spit out nice plots". It's why matplotlib is a thing. But it's not at all well suited for OO / building larger software applications / CI & testing, which is at least partially why the former is the case.


Underperformance in OO is not why MATLAB is problematic.


A better idea yet is to spend the time with Ralph Waldo Emerson.


Not without some solid companions that bludgeon transcendentalism. For contemporaries, probably Poe (on New England Transcendentalism, specifically). Melville and Hawthorne make good reads as well.

Emerson, like so many self-help books, offers what looks trivial and immediately obvious on the surface, providing a seemingly simple "if-only" path. Meanwhile, his ideas on individualism have done untold damage to society as a whole.

This isn't to imply something as simple as "Emerson is wrong". He has valid insights. But reading them in a vacuum, assuming Emerson alone is sufficient reading, is not the best approach.

Ultimately, it is no accident that Emerson suggests we "set at naught books and traditions", because that is the only way his ideas can survive unscathed.

(fwiw, the advice to "read widely" holds for any given book. Never believe one person has the answers.)


What would be a good starting point with Emerson?


Presumably… Waldo?


My biggest issue with i18n is not the grammar. It's the sheer uncanny valley of it.

There are translations of "Hello" in Portuguese, but I'd cringe at being greeted with them by a webmail client instead of the more formal Good Morning/Afternoon.

The formal/informal gradient is very culture-bound and even hard to pin to a scalar space of possibilities. In a work environment people will fluently code-switch too -- say, between ranks or in the middle of a tiresome meeting when everyone takes five minutes to kick back and comment on lighter matters. It's hard to situate a computer in this social context.


I used to be all in on the Blub Paradox discourse -- but I've been learning some C++ for the Arduino, and the differences in power compared with, say, Python are mainly about how easy it is to think-as-you-code in a dynamic language.

Of course, my life story is such that I've used Python professionally to the extent that I've started to forget all other languages -- so I naturally find the Blub Paradox discourse favorable. But all the footguns that come with passing around pointers to memory addresses also come with a whole different way of expressing problems; a whole different range of thoughts that are thinkable.

(I valiantly await the Rust brigade in my replies...)


What. There are no proofs of convergence for many of the most popular NN optimization algos. IIRC Adam is known not to converge in some cases.

The bitter lesson is that Messy AI is better able to cope with Messy World Problems than Neat AI (by light-years at this point), not that it can hack Neat Problems.


I remember the hype around XGBoost, and Kaggle contests asking competitors to solve problems in prime number theory.

