
Where I work, there is currently a push to get Python on the computers that manage physical equipment operation. These computers are not allowed to connect to the internet, and have extremely limited connectivity to the rest of the business network. Installing anything new on them requires risk assessments like you wouldn't believe, since the consequences of malicious code could easily hit tens of millions of dollars and a nonzero number of lives.

If your risk assessment says that the exact same tkinter code is riskier outside the Python stdlib than inside it, maybe your risk evaluation process needs reevaluating.

I hear this sentiment frequently. Come on, one software engineer cannot steer the huge ship that is BigCo Risk Assessment. And even if they could, they couldn't do that and their original task at the same time.

It might be more helpful to think of these types of external factors as fixed points that cannot be moved, and just engineer around them.

You'll burn out if you try to boil the ocean on every business process that doesn't seem "logical" from your cursory examination.

On one hand, this is true. On the other hand, this is being put forth as a reason to not make a change in the entire Python ecosystem, and it's not really Python's job to bend over backwards for shops that have bad risk assessment either.

As it stands, you cannot even prove the package is unmodified, because Python lacks a code-signing infrastructure for packages (wheels can carry signatures, but it is far from widespread).
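Hash-pinning in pip is the closest thing available today: it does not prove who published a wheel, but it does prove the artifact you install is byte-identical to the one you audited. A sketch (the hash value is elided, you would paste the real sha256 of the audited wheel):

```shell
# requirements.txt pins the exact artifact, not just the version:
#   requests==2.31.0 --hash=sha256:<sha256 of the audited wheel>
# Hash-checking mode then refuses anything unpinned or mismatched:
pip install --require-hashes -r requirements.txt
```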

And setup.py is a trainwreck: some packages download and compile huge dependencies at install time (e.g. a full Apache httpd...), the default compiler flags may lack mandatory security hardening (e.g. flags enabling ASLR on Python 2.x), or packages statically ship their own copy of OpenSSL and break your FIPS 140 certification that way...

And since setup.py is a Python file, you can't express build-time dependencies properly. pyproject.toml lets you do that, but it's new, nobody knows about it, and older pip clients don't support it.
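For illustration, this is what declaring build-time dependencies in pyproject.toml looks like under PEP 518 (the Cython pin is just an example dependency):

```toml
[build-system]
# Declared up front, so pip can install these *before* running the build --
# something a setup.py that imports them at the top can never guarantee.
requires = ["setuptools>=61", "wheel", "cython>=0.29"]
build-backend = "setuptools.build_meta"
```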

Yes, but it won't get it. And at the end of the day, people need to be able to get work done.

The corporate world is full of stupid things that will never change, or will take years to change.

Where I work the solution was to use a proxy to pypi. Basically an internal pip repo (and docker, npm, maven, everything else...). All internal apps go through the internal repository that creates a local version of the package from pypi. That gives the security / compliance folks a way to block packages with security issues, etc. and at the same time provide the developers flexibility to get most of what is needed.
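For illustration, pointing pip at such an internal mirror is a one-line config change (the host name here is hypothetical):

```ini
# /etc/pip.conf (or ~/.pip/pip.conf) -- route all installs through the proxy
[global]
index-url = https://pypi-mirror.internal.example/simple
```

The same pattern works for npm (`registry` in .npmrc) and Maven (`<mirror>` in settings.xml), so one proxy product can front all the ecosystems.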

In a large company this gives the compliance folks a central place to blacklist packages - along with a trail of what systems have downloaded the package to target for upgrades.

Many technical solutions exist, but the problem is political or organisational.

Agree. At this point it was more a case of executives saying they wanted internal dev teams to use and contribute to open source and supporting orgs to come up with solutions on how that can be possible with a 0-touch approach. That’s what tipped the balance.

I disagree. tcl/tk is written in C, and C can be compiled very, very badly indeed (from a security perspective).

Maybe a stupid question but can you ship code on the machine? If you can, what is stopping you from including the source of the library that you're trying to 'install'?

Many do, but you can get fired for it.
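Mechanically there is usually nothing stopping you, at least for pure-Python libraries. A minimal sketch of vendoring source alongside your app (the module name and contents are stand-ins for a real library; in practice you would check the library's source into your repo, e.g. under a vendor/ directory):

```python
import sys
import tempfile
from pathlib import Path

# Stand-in for a vendored library: a directory of source shipped with the
# app instead of being pip-installed. Here we fabricate it in a temp dir
# purely so the sketch is self-contained.
vendor_dir = Path(tempfile.mkdtemp())
(vendor_dir / "somelib.py").write_text("VERSION = '1.0'\n")

# Prepend the vendor directory so imports resolve locally -- no network,
# no installer, no new packages on the machine.
sys.path.insert(0, str(vendor_dir))

import somelib
print(somelib.VERSION)  # prints 1.0
```

This only works cleanly for pure-Python packages; anything with compiled extensions needs a build step, which is exactly where the setup.py problems above come back.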
