Python virtual environments
In a short session at the 2018 Python Language Summit, Steve Dower brought up the shortcomings of Python virtual environments, which are meant to create isolated installations of the language and its modules. He said his presentation was "co-written with Twitter" and, indeed, most of his slides were of tweets. At the end, he also slipped in an announcement of his plans for hosting a core development sprint in September.
The title of the session was taken from David Beazley's tweet on May 1: "Virtual environments. Not even once." Thomas Wouters defended virtual environments in a response.

![Steve Dower](https://static.lwn.net/images/2018/pls-dower-sm.jpg)
But Beazley and others (including Dower) think that starting Python tutorials or training classes with a 20-minute digression on setting up a virtual environment is wasted time. It does stop pip install from messing with the global environment, but it has little or nothing to do with actually learning Python. Dower noted that Pipenv is supposed to solve some of the problems with virtual environments, but it "feels a bit clunky", according to a tweet by Trey Hunner.
In another Twitter "thread", there was a discussion of potential changes to pip so that it would gain the notion of local versus global installation. That might be a path toward solving the problems that folks see with virtual environments and Pipenv. Dower said he is willing to create a PEP if there is a consensus on a way forward.
He would like to see a way to do local package installation without using virtual environments. He also would like to have a way to invoke the "right" Python (the right version from the right location) without using virtual environments. But for those who are using virtual environments, he would like them to be relocatable, so that users can copy them elsewhere and have them still be functional. Barry Warsaw suggested making pip --user the default as it is in Debian and Ubuntu; Dower said that only "localizes the damage" and doesn't really solve the problem.
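The per-user versus global distinction under discussion is visible from the standard library's `site` module; a minimal sketch (the exact paths vary by platform and Python version):

```python
import site
import sys

# "pip install --user" puts packages in the per-user site directory,
# which the interpreter consults in addition to the global prefix.
# This only relocates installs per-user; it does not isolate projects
# from each other, which is why Dower said it "localizes the damage".
print(site.getusersitepackages())  # e.g. ~/.local/lib/python3.X/site-packages
print(sys.prefix)                  # the global (or venv) install prefix
```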
Core development sprint
Dower has volunteered to host a core development sprint to work on CPython. He has scheduled it for September 10-14, 2018 in Redmond, Washington on the campus of his employer, Microsoft. They will have an entire building to use for the sprint. There will be a hotel block reserved in Bellevue, since it is a more interesting place to stay, he said. Around 25-30 developers will be invited to attend; active developers or those with a PEP under consideration should expect to get an invite. He is hoping that the Python Software Foundation will pick up the travel expenses for the invitees, but any core developer is welcome to attend.
| Index entries for this article | |
|---|---|
| Conference | Python Language Summit/2018 |
| Python | Virtual environments |
Posted Jun 13, 2018 18:59 UTC (Wed) by pj (subscriber, #4506)

Posted Jun 13, 2018 21:30 UTC (Wed) by unsignedint (subscriber, #92715)

Posted Jun 13, 2018 23:48 UTC (Wed) by jhoblitt (subscriber, #77733)

Posted Jun 14, 2018 7:50 UTC (Thu) by karkhaz (subscriber, #99844)
Even that hasn't entirely mitigated environmental problems, though, so I'm now considering buying a new machine for each python package. Py3 packages will each have an Arch Linux box, while Py2 (or "Python Heritage") programs can have a PDP-11 to run on, with power supplies separated by audiophile-grade isolation transformers, each located in a separate Biosafety Level 4 containment facility. Typing "pip install --user" through the gauntlets of my positive air-pressure spacesuit will be a tad cumbersome, but it sure beats expending the modicum of effort required to get something packaged by my distribution.
Posted Jun 22, 2018 8:11 UTC (Fri) by fredcadete (guest, #81023)
If only you would give it a cool name, logo, and set up conferences with lots of handouts... then I could use it in production.
Posted Jun 14, 2018 7:50 UTC (Thu) by robert_s (subscriber, #42402)

Posted Jun 14, 2018 14:54 UTC (Thu) by servilio-ap (subscriber, #56287)
Posted Jun 15, 2018 11:16 UTC (Fri) by sbaugh (guest, #103291)
Apparently people find it painful to have to start any kind of Python training with 20 minutes of Python package manager/virtual environment tooling training. I know that one reason I would find it painful to teach people about those kinds of tools is that they are not tools for life. Effort invested into learning Python tools will just be wasted when the tool of the day changes, or when I want to use a different language.
But if people had a long-lasting packaging tool they could use for all languages, I don't think I would mind teaching that tool at the start of Python lessons; I'd know it would be long-term useful knowledge for them. I think Nix (or something like it) could meet this criterion.
Posted Jun 18, 2018 15:58 UTC (Mon) by civodul (guest, #58311)

Posted Jun 14, 2018 13:13 UTC (Thu) by gdamjan (subscriber, #33634)
A single line, no need for virtualenv, the directory is relocatable, no need for hardlinking or copying of the python executable (ugh, how I hate that), and it only needs a single environment variable to be set up.
Posted Jun 14, 2018 13:41 UTC (Thu) by cortana (subscriber, #24596)

Posted Jun 15, 2018 18:27 UTC (Fri) by k8to (guest, #15413)
Essentially it's an extremely overwrought way to say something like `export PYTHON_LIB=/a/path`.
Adding onto this, its magical nature and overuse mean that you can get extremely hard-to-debug problems when you create a virtualenv and install a tool that itself uses a virtualenv.
It's just brain damage, and it's high time that Python simplified this stuff.
Posted Jun 18, 2018 21:32 UTC (Mon) by stevedower (guest, #116614)

Posted Jun 19, 2018 15:57 UTC (Tue) by k8to (guest, #15413)
However, that's obviously not sufficient for some problems, so I salute modernizing sys.path setup.
Posted Jun 25, 2018 13:37 UTC (Mon) by jsmith45 (guest, #125263)
You cannot just type `/path/to/my/script.py` from a newly opened shell and have it work, unless you install globally, which is not recommended. This also means you cannot simply run it by double-clicking on Windows either.
Compare that with node.js. Sure, you do need to run `npm install` once, but after you have done so, as long as you don't change the script, you can just execute it directly in a newly opened shell and everything works fine.
That alone is a large part of the reason that many projects try rather hard to avoid using anything not in the standard library: if they do use something, they must either contaminate the global environment or lose the ability to run the script directly (they would need to create some shell-script wrapper to enter a virtual environment and then execute it).
Posted Jun 15, 2018 0:04 UTC (Fri) by flewellyn (subscriber, #5047)

Posted Jun 15, 2018 1:40 UTC (Fri) by rahulsundaram (subscriber, #21946)
https://fedoraproject.org/wiki/Changes/Making_sudo_pip_safe
Posted Jun 15, 2018 3:23 UTC (Fri) by flewellyn (subscriber, #5047)
Makes sense. I suppose there's less issue if you use pip to install inside a container?
Posted Jun 15, 2018 10:28 UTC (Fri) by rahulsundaram (subscriber, #21946)

Posted Jun 15, 2018 6:16 UTC (Fri) by ceplm (subscriber, #41334)

Posted Jun 15, 2018 7:02 UTC (Fri) by kushal (subscriber, #50806)

Posted Jun 20, 2018 2:19 UTC (Wed) by k8to (guest, #15413)
Personally I've found virtualenvs to be so fragile that I will never use them again. I would advise teaching students a path that doesn't fall over so easily.
Posted Jun 15, 2018 10:32 UTC (Fri) by rahulsundaram (subscriber, #21946)
Not everything is packaged by distributions and available in the latest version. Pip is unaware of distribution-installed versions and will overwrite them if a module is a dependency, and vice versa.
Posted Jun 20, 2018 17:52 UTC (Wed) by cortana (subscriber, #24596)

Posted Jun 21, 2018 7:55 UTC (Thu) by callegar (guest, #16148)
Because Python does not support versioned packages, and never has.
So you cannot have two different versions of the same package on your system in the same way that you can have two different versions of, e.g., a C library.
You cannot say anything like `import xxxyyyzzz as xyz at {versions}` or `from xxxyyyzzz at {versions} import abc`, where versions is some version specifier, e.g. `['>2.5.0', '<3.0.2']`.
So, when you pip install something, you end up "hiding" or "breaking" another version of that same something that you may already have had installed (e.g. by your distro) and on which some part of your system may depend.
Nor is this a problem that gets magically solved by virtual environments; they only mitigate it.
If you write program A that depends on package B, which in turn depends on C at version Y, and then you realize that you also need D, which depends on C at version Z, then you are in big trouble.
So the real question should probably be: "why doesn't Python support versioned packages?"
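Python has no per-import version syntax like the hypothetical one above. The closest a program can get with the standard library (3.8+) is to check the installed distribution's version at startup; a sketch using a naive dotted-integer comparison rather than a real version parser:

```python
from importlib import metadata

def require(dist_name, minimum):
    """Fail fast if the installed distribution is older than `minimum`.

    Naive comparison on dotted integers; real code would use
    packaging.version. Note this only *checks* a version; it does not
    let two versions of one package coexist, which is the commenter's
    actual complaint.
    """
    installed = metadata.version(dist_name)
    key = lambda v: [int(p) for p in v.split(".") if p.isdigit()]
    if key(installed) < key(minimum):
        raise ImportError(f"{dist_name} {installed} is older than {minimum}")
    return installed

try:
    require("pip", "1.0")  # pip is usually, but not always, installed
except metadata.PackageNotFoundError:
    pass
```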
Posted Jun 21, 2018 18:06 UTC (Thu) by raven667 (subscriber, #5198)

Posted Jun 22, 2018 6:49 UTC (Fri) by marcH (subscriber, #57642)
Source code used to be versioned file by file. Git and others made the single version number popular. Then submodules and repo struck back.
Etc. Modularity is *hard* and what goes around comes around.
Posted Jun 22, 2018 12:15 UTC (Fri) by callegar (guest, #16148)
As mentioned before, if you write program A that depends on package B, which in turn depends on C at version Y, and then you realize that you also need D, which depends on C at version Z, then containers or virtual environments are not going to help.
But what I find weird is that versioned "libraries" or "packages" (a simple solution invented together with shared libraries many decades ago, one that would cost very little to implement in a high-level interpreted language and that would not preclude the use of containers or virtual environments at all) never made it into Python, which in other areas does not seem to shy away from rather complex solutions and their associated cost.
The funny thing is that the very language developers feel the need to introduce "naming" conventions to work around this limitation: because versioned packages do not exist, they end up attaching the major version number to the package name itself, as in urllib vs urllib2. The same goes for extension packages on PyPI (e.g., gmpy vs gmpy2).
Posted Jul 3, 2018 1:24 UTC (Tue) by ghane (guest, #1805)

Posted Jun 15, 2018 9:56 UTC (Fri) by auc (subscriber, #45914)

Posted Jun 18, 2018 21:33 UTC (Mon) by stevedower (guest, #116614)
I use pipsi for 'local' (per-user) Python tool installs. It sets up a new virtualenv per tool, so 1) requirements can't conflict and 2) it doesn't pollute the global system install.
When I have the choice I skip language "toy" package managers entirely and use Nix shells.
And this has the advantage of being able to express dependencies that are not part of the language-specific package manager, e.g. OpenSSL or libxml2.
Agreed, I feel that Nix shells and guix environment provide similar functionality but are more robust (they make no assumptions about the environment) and not limited to Python.
```shell
PYTHONUSERBASE=$PWD/py-env pip install --user A-LOT-OF-DEPENDENCIES
```

(ps. in uwsgi just set `env = PYTHONUSERBASE=/here/there/anywhere`)
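What the PYTHONUSERBASE trick does can be checked from Python itself: the stdlib `site` module derives the per-user install directories from that variable. A small verification sketch (the exact layout under the base directory differs between platforms):

```python
import os
import subprocess
import sys

# Ask a fresh interpreter where "--user" installs would land when
# PYTHONUSERBASE points at a project-local directory.
env = dict(os.environ, PYTHONUSERBASE="/tmp/py-env")
out = subprocess.run(
    [sys.executable, "-c", "import site; print(site.getusersitepackages())"],
    env=env, capture_output=True, text=True, check=True,
).stdout.strip()
print(out)  # a site-packages directory somewhere under /tmp/py-env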
Yep, it really is this simple. I too do it all the time, which is why I was confident enough to stand up and say it should just be automatic.
The only challenge is agreeing on "what magic marker means we should put a certain directory in `sys.path`", and I think we'll be okay there (likely `__pypackages__`: if that directory exists, you can import everything from it and pip will default to installing into it).
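The `__pypackages__` behavior described here (what later became the PEP 582 proposal) can be approximated today with a few lines at the top of a script. This is a sketch of the idea only, not how any released CPython behaves:

```python
import os
import sys

# If a __pypackages__ directory sits next to the script, put it at the
# front of sys.path so imports resolve there first. Under the proposal,
# the interpreter would do this automatically and "pip install" would
# default to installing into that directory.
script = sys.argv[0] if sys.argv and sys.argv[0] else os.getcwd()
pkg_dir = os.path.join(os.path.dirname(os.path.abspath(script)), "__pypackages__")
if os.path.isdir(pkg_dir):
    sys.path.insert(0, pkg_dir)
```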