Which suggests that, because Python's on-ramp just got easier, its long-term prospects just got better.
What I'm saying is that bundling a specific Python interpreter and a separate package environment should be the default. I'm not sure what it'd take to get there for a vanilla Python install, but more system-level interpreters are not it.
Currently, the way I know of to do that is pyenv. And I know PyCharm lets you pick a Python version when you start a project (though I don't use PyCharm, so that may not be true anymore).
Though ironically, in a very unpythonic way, there are a bunch of these Python environment creators (conda, virtualenv, pipenv...).
PyCharm lets you select your favorite, which is pretty nice.
I'm a seasoned programmer and I absolutely hate how you're forced to use virtualenv with Python. It's one of the biggest warts of its already poor package management.
It turns everything that should be simple into something you suddenly have to manage. Atrocious.
I much prefer how .NET, Node, or Rust (and many others) do it, where dependencies are handled entirely within a project, completely obsoleting the need for a hack like virtualenvs in the first place.
Even now, in many languages I shy away from installing globally (I usually install to user if I'm lazy). I feel like in almost any language a "project" (venv) should be taught early on and be simple to use: basically, from the first time you need a third-party module. I feel like newcomers should be shown the REPL, how to save code to a file and run it, the standard library, then a workspace and how libraries work. Python could be a great example because of its batteries-included model: you don't need external libraries for many tasks.
I would like to reverse the question: if you can't create a virtualenv as step 1, then the process of creating a virtualenv is simply too difficult and should be streamlined.
There is a lot of work in this area going on now, though. I think RHEL 8's application streams concept is pretty good: https://developers.redhat.com/blog/2019/05/07/what-no-python...
Fedora has an analogue to this as well. It seems like a good move in the right direction.
Maybe one day we'll take the lessons learned and build them natively into the OS, but distros are still obsessed with distro-specific kludges that require extra packaging work.
We have great universal packaging when it comes from the community at large, but not from single organizations. Pip is a platform-independent package manager! Why don't we have that for Linux software in general? Because each distro already has their own solution, and they won't work together to build a common one.
By this logic, GNOME should have died a long time ago.
Focusing on the desktop makes Flatpak better at managing desktop apps. For servers we have, well, Docker.
You may be right, but I don't think you should be quite so cynical.
18.04 has ten years of long-term support, and Ubuntu was first released in 2004.
Figuring out how to establish a build environment for any program you want to install really can't be the best way to do things.
(This was on a Ubuntu LTS system)
It's really not hard. I do this with other compilers and don't really have issues, though I tend to package it up into a container if I'm going to ship it (e.g. a build environment for a team, an interpreter for a production service, etc.).
This stuff is definitely quite new, and I guess the need for it has risen as a consequence of all the great new features that have shipped in Python during the last few years.
I recently had to write a utility script in Python. I chose to use Python 3 and type-hint syntax, requiring at least Python 3.6. This became a problem when I realized some of my team use Ubuntu and Debian versions that ship with Python 3.5 (not to speak of Macs with Python 2.x, but those guys use Homebrew anyway). Plenty of time was spent researching virtual environments and writing deployment instructions, and I ended up using pip inside pyenv (but stopped short of using pipenv, since creating private distributable packages with it seemed convoluted).
Not to mention there are distribution tools for acquiring a pre-built Python release other than installing a .deb.
~ $ docker run -it --rm ubuntu:xenial
root@b8cda311d766:/# apt-get update -qqy
root@b8cda311d766:/# apt-get install -qqy software-properties-common
root@b8cda311d766:/# add-apt-repository --yes ppa:deadsnakes/ppa
gpg: keyring `/tmp/tmpe354n0av/secring.gpg' created
gpg: keyring `/tmp/tmpe354n0av/pubring.gpg' created
gpg: requesting key 6A755776 from hkp server keyserver.ubuntu.com
gpg: /tmp/tmpe354n0av/trustdb.gpg: trustdb created
gpg: key 6A755776: public key "Launchpad PPA for deadsnakes" imported
gpg: Total number processed: 1
gpg: imported: 1 (RSA: 1)
root@b8cda311d766:/# apt-get update -qqy
root@b8cda311d766:/# apt-get install -qqy python3.7
debconf: delaying package configuration, since apt-utils is not installed
Selecting previously unselected package libpython3.7-minimal:amd64.
(Reading database ... 7603 files and directories currently installed.)
Preparing to unpack .../libpython3.7-minimal_3.7.3-1+xenial1_amd64.deb ...
Unpacking libpython3.7-minimal:amd64 (3.7.3-1+xenial1) ...
Selecting previously unselected package python3.7-minimal.
Preparing to unpack .../python3.7-minimal_3.7.3-1+xenial1_amd64.deb ...
Unpacking python3.7-minimal (3.7.3-1+xenial1) ...
Selecting previously unselected package libpython3.7-stdlib:amd64.
Preparing to unpack .../libpython3.7-stdlib_3.7.3-1+xenial1_amd64.deb ...
Unpacking libpython3.7-stdlib:amd64 (3.7.3-1+xenial1) ...
Selecting previously unselected package python3.7-lib2to3.
Preparing to unpack .../python3.7-lib2to3_3.7.3-1+xenial1_all.deb ...
Unpacking python3.7-lib2to3 (3.7.3-1+xenial1) ...
Selecting previously unselected package python3.7-distutils.
Preparing to unpack .../python3.7-distutils_3.7.3-1+xenial1_all.deb ...
Unpacking python3.7-distutils (3.7.3-1+xenial1) ...
Selecting previously unselected package python3.7.
Preparing to unpack .../python3.7_3.7.3-1+xenial1_amd64.deb ...
Unpacking python3.7 (3.7.3-1+xenial1) ...
Processing triggers for mime-support (3.59ubuntu1) ...
Setting up libpython3.7-minimal:amd64 (3.7.3-1+xenial1) ...
Setting up python3.7-minimal (3.7.3-1+xenial1) ...
Setting up libpython3.7-stdlib:amd64 (3.7.3-1+xenial1) ...
Setting up python3.7-lib2to3 (3.7.3-1+xenial1) ...
Setting up python3.7-distutils (3.7.3-1+xenial1) ...
Setting up python3.7 (3.7.3-1+xenial1) ...
root@b8cda311d766:/# python3.7
Python 3.7.3 (default, Mar 26 2019, 01:59:45)
[GCC 5.4.0 20160609] on linux
Type "help", "copyright", "credits" or "license" for more information.
I'd say yes, the coupling of operating system and distribution of software applications has always seemed very wrong to me.
Software should run on the operating system, which functions as a platform, not be coupled with the operating system. Linux package management essentially works like an app store with the additional burden of maintenance loaded onto the distribution owners. I'd be very happy if Flatpaks and Snaps took off quickly, because distribution-agnostic, up-to-date software maintained by the developer should be the norm.
I agree with this in principle, but how far down the chain do you go? Obviously, you need to have some base libraries included. You just need to accept that some of those libraries will be old, and that they aren't going to change rapidly enough for this to be a problem.
But more broadly speaking, I'm not sure that's going to be enough. Even if you're using virtualenvs, having something installed at the system level means it's shared by all the software you're using. There's bound to be some incompatibility eventually, and then you're stuck waiting for one piece of software to support the latest version of Python, so you have to keep putting off updates.
I think a better solution is to use something that sets the Python version for each program (e.g. pyenv or Docker).
But then I remember one scene in high school. I was talking to a friend who coded in IB American History and asked him how to get started with Python. He said just go to this website and download this tarball. I had no idea what a tarball was and was afraid it contained something bad. So I didn't download it and didn't play with Python until a few years later in college. If I had started playing with Python earlier, my life could be a lot different. I was not a guy back then who made programs on TI-84 calculators; I thought that was magic. I was planning on econ/finance. I used Windows (!) and didn't know anything about what Unix was, and barely knew about OS X.
It's hard to see why some decisions pan out and some don't; if you install a system version of Python how would the general public / PSF know how many people use it? But I don't think that's a good enough reason to remove system installs entirely. I think my younger self would have liked it.
Ubiquitous access to these tools provides wonderful opportunities, and the drawbacks are minimal.
How come you didn't start playing with the TI-84 programming language instead?
There's nothing more "batteries included" than that, and it's not only a pun: you can program and run on the calculator itself (not a GREAT way to program, but quite a nice way to spend your time instead of paying attention in class :^) ). If you bought it new, I guess you had a paper manual for the language (I had one for my TI-89), and there's so little you need to learn and take care of: libraries, dependencies, versions, execution bits, paths... none of these are issues the novice user will be confronted with. It's like a C64 all over again.
People wondering why you didn't just power through your doubts may be forgetting what the world was like not too long ago. 'Nerds' weren't cool. If you chose to be a nerd, it meant _offering_ yourself up as the _loser_ in every sitcom, movie, or book. Playing video games was something you were supposed to grow out of. No one was throwing 80k at a graduate just because they could pronounce 'WWW'. Perhaps most importantly, _every_ school was pushing the race to nowhere. If you were thinking about a 'trade', it meant you had given up and would crack out fixing sinks for the rest of your life. At _best_, computers were taught as something you'd use at other careers. Not a career itself, really.
* Please note the last line is less an insult to plumbers and more a commentary on how the perception of success is hammered into you as a very white collar thing.
However, the software distribution model right now is fundamentally broken. And I do not think it is worth leaving broken simply because it makes learning easier for some people.
In general, I think the decision to push people to tie their app less to the system configuration and rely on virtualenv, containers etc more is a good move in the long run.
What about people who dip their toes in Python, then switch to another language? What about people who just want to replace their VBScript integrations with Excel and don't care about Python beyond 1-2 local dependencies? You'd just push them back towards the less optimal situation.
Oh, and I just remembered, Windows 10 Home edition does not ship with a hypervisor, so you can't run Docker on Windows without Windows 10 Professional. That screwed my team over at a data science hackathon once.
If you're at the point where you're worrying about distribution environments, you're probably ahead of most people already, and you have the wherewithal to figure out what the state-of-the-art is in packaging and distribution, whatever that may be and regardless of how broken it is.
That's how Apple does it. Ubuntu LTS, Debian Stable, CoreOS/Red Hat, etc., are useful because the old versions of the software are maintained. Though Windows has not traditionally bundled GNU or open-source packages, it is also built with a strong commitment to long-term maintenance: consider Notepad, MSPaint, and .NET.
Microsoft is including Python 3.7, which has planned support until at least 2023 and possibly 2026: https://www.python.org/dev/peps/pep-0537/#lifespan
Also, as the post sort of points out, this is being built by the Python community itself (rather than by Microsoft directly, in contrast with the distro model), using mostly the same CI/build infrastructure already used for the python.org builds, so presumably it can be as up to date as the Python team wishes.
It has nothing to do with the Apple scenario, where the system-included interpreter is vendor-supported and used by other OS components; this is just a built-in shortcut to the Store version if Python is not installed.
I think a big difference here is expectations: the Apple way is to have the tool itself installed, which creates various expectations with respect to presence and capabilities (or the lack thereof).
Because Microsoft only adds a hook/stub, there's no expectation that the tool will be present (as by default it's not); it just makes it easier to enable the tool.
What? The reason to use the distro-provided python package is because it's maintained by someone and gets you security updates. If Apple is just shipping a version and never bug fixing that version, well Apple sucks.
As it happens, Python upstream does micro releases, which other upstream projects might not. But those releases might take longer to reach users than, say, a Linux distro takes to ship fixes. So while developing against an upstream codebase might be nice, what you need to focus on is the deployment lifecycle of your project.
These tools aren't (or at least shouldn't be) only meant for professional engineers.
Only if your system isn't getting any updates. It would be up to Apple to provide a recent version.
No one seems to have a problem with that if it's bash.
You definitely shouldn't be willy-nilly modifying the environment of system interpreters, but relying on them for ops-like uses and bootstrapping dev environments is sensible.
Why can't Apple just install a few languages and keep them up to date? Not being in common use will basically kill those languages on the platform. I suspect the people making these decisions believe it's good for Apple's brand, but I think it will lead to an inward-looking, insular culture.
I guess Microsoft agrees with you.
On the other hand, if you can bundle dependencies with your code, you are sure that a certain version of the language will be present, and to some degree you can just rely on that.
There goes probably 10% of the software I can apt-get install.