
The implementation of static typing by way of processing type annotations is an implementation choice, and not a part of the Python language specification.

That comes close to being the worst idea in the history of programming languages. Types aren't checked, you can't trust the annotations, and you give up a big optimization opportunity. But Guido's naive interpreter will still work.
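
Concretely (a toy example of my own, nothing from CPython itself): the hints are just stored on the function object, and nothing stops you from passing the wrong types.

    # Annotations are recorded but never checked by CPython at runtime.
    def add(x: int, y: int) -> int:
        return x + y

    print(add("a", "b"))        # prints "ab" -- no error despite the int hints
    print(add.__annotations__)  # {'x': <class 'int'>, 'y': <class 'int'>, 'return': <class 'int'>}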

If you say something is a float, the compiler needs to both enforce that and make use of it. Then maybe the Python crowd wouldn't need to call out to C code whenever they need to do some number crunching.




> Then maybe the Python crowd wouldn't need to call out to C code whenever they needed to do some number crunching.

Numpy and Scipy use a lot of Fortran at the backend, and unfortunately, beating Fortran when it comes to number crunching is insanely difficult, not just because of the language, but because libraries like LAPACK have spent decades being refined into incredibly fast and accurate systems.

Python's ability to talk to those libraries is a bonus, but Python itself could never compete directly with them. Pure-Python code isn't capable of the insane speed those libraries can pull off.
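
For a feel of the gap, here's a rough micro-benchmark sketch (names and numbers are mine, and the timings depend entirely on the machine): numpy hands the inner loop to compiled BLAS code, while pure Python runs it one bytecode at a time.

    # Rough sketch: pure-Python dot product vs numpy's BLAS-backed one.
    import timeit
    import numpy as np

    n = 1_000_000
    a = [1.0] * n
    b = [2.0] * n
    xs = np.ones(n)
    ys = np.full(n, 2.0)

    py_time = timeit.timeit(lambda: sum(x * y for x, y in zip(a, b)), number=10)
    np_time = timeit.timeit(lambda: xs.dot(ys), number=10)
    print(f"pure Python: {py_time:.3f}s, numpy: {np_time:.3f}s")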


>because those libraries like LAPACK have spent decades being refined into incredibly fast and accurate systems.

In addition, due to their use in benchmarks, x86 has in part been designed to make LAPACK fast.


>That comes close to being the worst idea in the history of programming languages. Types aren't checked, you can't trust the annotations

As for types not being checked, that's not a problem. JITs do extra specialization on assumed types all the time, even without annotations, and simply throw the specialized code away at runtime when the assumption stops holding (e.g. a variable that only ever pointed to integers now stores a string). That deoptimization overhead aside, those optimizations are what make e.g. v8 hella fast.

(And of course you could also just enable optimizations after you've statically checked the whole program's types with something like mypy -- this is even easier and less dynamic than what v8 does.)
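
Something like this (made-up file name; the message is roughly what current mypy prints):

    # example.py -- mypy flags the bad call before the program ever runs
    def mean(values: list[float]) -> float:
        return sum(values) / len(values)

    mean([1.0, 2.0, 3.0])  # fine
    mean("oops")           # mypy: Argument 1 to "mean" has incompatible type "str"

Run "mypy example.py" to get the error statically; only after that gate would an implementation turn the checked hints into specialized code.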

>and you give up a big optimization opportunity

You don't give anything up, since an implementation can take advantage of this "big optimization opportunity" if it wants and has the manpower to add it.

>But Guido's naive interpreter will still work.

It's also less work, which, unless we have a volunteer here, was and remains the intention.


Why is calling out to C code via something like numpy a bad thing?

Python is never going to be as fast as C. Ever. Even if you somehow managed to decouple numeric types from PyObject in a backwards-compatible way, it would not be as fast.

Type hints can't be used as an optimization in the interpreter due to the way they are resolved. And in any case, it would not be safe to do so, and there are very few cases where such optimizations would make sense.
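
To make the resolution point concrete (my own toy example; SomeAliasDefinedLater is just an arbitrary name): with "from __future__ import annotations" the hints are kept as plain strings and only evaluated when something like typing.get_type_hints asks for them, so the bytecode compiler can't lean on them.

    from __future__ import annotations  # PEP 563: keep annotations as strings
    import typing

    def scale(x: float, factor: SomeAliasDefinedLater) -> float:
        return x * factor

    SomeAliasDefinedLater = float  # only has to exist by the time someone resolves the hint

    print(scale.__annotations__)
    # {'x': 'float', 'factor': 'SomeAliasDefinedLater', 'return': 'float'}
    print(typing.get_type_hints(scale))
    # {'x': <class 'float'>, 'factor': <class 'float'>, 'return': <class 'float'>}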



