I want to routinely test out various CPython branches, typically main and whatever happens to be closest to release (currently 3.12.0rc1). No problem: update my repo, switch to the relevant branch, build, then create a virtual environment like so:
./python.exe -m venv DESTDIR
Then I switch to that environment and use pip to install from a requirements.txt file and I’m good to go.
Except the python.exe file (I’m on a Mac) is a symlink into my git repo, and as far as I can tell all stdlib modules/packages are referenced in place from my git repo. That’s fine as long as I don’t later switch branches in the repo, which I tend to do. It also prevents me from having, say, both a 3.12rc1 and a 3.13a0 virtual environment fed from the same repo. I can’t see a way to use venv to create a wholly contained virtual environment in which all the relevant bits are copied. Is the solution to just make install into my destination directory instead?
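To make that concrete: the environment’s interpreter is just a symlink back into the checkout, and pyvenv.cfg records the checkout as home, which is how the stdlib gets found (the repo path here is illustrative):

ls -l DESTDIR/bin/python
# DESTDIR/bin/python -> /Users/me/src/cpython/python.exe
grep home DESTDIR/pyvenv.cfg
# home = /Users/me/src/cpython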
I use make altinstall into a ~/lib/cpython/$version directory and then have symlinks like ~/bin/python3.12 → …/lib/cpython/3.12.0rc1/bin/python3.12 (with ~/bin in my personal $PATH) so that I can have as many minor versions available as I like, and can also switch between different patchlevel versions just by altering a symlink. Of course, if I update a particular CPython build or switch where a symlink is pointing, I also rebuild any venvs I use it with, but that’s just a trivial matter of some scripting and being diligent about tracking them.
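A rough sketch of that setup (version and paths illustrative):

version=3.12.0rc1
./configure --prefix="$HOME/lib/cpython/$version"
make -j
# altinstall deliberately skips the unversioned python3/pip3 names,
# so different minor versions don't clobber each other
make altinstall
ln -sf "$HOME/lib/cpython/$version/bin/python3.12" "$HOME/bin/python3.12"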
I think if you try to run the interpreter and stdlib directly out of a git checkout, you’re going to be swimming upstream.
“A wholly contained virtual environment” seems like a contradiction in terms to me. The whole reason for having virtual environments is to share most of the files with the base Python installation, which is exactly what you don’t want here.
Two possible solutions for what you’re trying to do:
1. Use multiple checkouts for the various branches instead of switching branches. This uses more disk space for the git clones, although there are ways to mitigate that (see the sketch after this list). The advantage of this is that you don’t have to run “make install” after updating a Python module.
2. Use “make install”. This uses more disk space for the installation, and you have to run “make install” every time you change the source tree.
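On the disk-space point in option 1: cloning from a local path hardlinks the object store by default, so the extra checkouts are fairly cheap. A minimal sketch, assuming an existing clone at ~/src/cpython:

# objects are hardlinked, so this costs little beyond the working tree
git clone ~/src/cpython ~/src/cpython-3.12
cd ~/src/cpython-3.12
git switch 3.12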
When I test with multiple versions I generally use throwaway virtual environments (if any) and make sure I either remove them before switching branches or am careful about which environment I use when. I also use out-of-source-tree builds (e.g. mkdir build; cd build; ../configure ...).
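A sketch of that combination, assuming the checkout lives at ~/src/cpython:

# build outside the source tree so the checkout stays clean
mkdir -p ~/build/cpython-main
cd ~/build/cpython-main
~/src/cpython/configure
make -j
# throwaway venv made from the build directory; remove it before switching branches
./python.exe -m venv /tmp/venv-main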
Ah, thanks. This is exactly what I needed. I completely missed that in the --help output. My mind told me to look for --nosymlinks. When I didn’t see it, I assumed this feature didn’t exist. I guess I was in too much of a hurry.
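For anyone else searching that --help output: the option is venv’s --copies flag, which copies the interpreter binaries into the environment instead of symlinking them (the stdlib is still located through home in pyvenv.cfg, per the sharing point above):

./python.exe -m venv --copies DESTDIR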
I agree my use case is a bit out in left field, but let me explain. Python 3.12 has reached the rc1 state. I have a Flask-based project I want to test with that and with cpython main. It currently runs in “prod” using 3.10, and my day-to-day dev environment is a Conda-based 3.11. I want to answer the call for testing beyond the rather trivial
./configure && make -j && make test
routine. It makes sense (to me, at least) to be able to easily create multiple (non)virtual environments which I can quickly switch between for testing. I don’t care about the small extra disk space usage. It’s more important to be able to have multiple isolated environments. I could use tox I suppose, but my current testing regime has me running a shell script which does more than just run unit tests.
If you’re actively working on multiple Python versions in parallel, you can use git worktree to have multiple branches checked out from a single repo at the same time.
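A minimal sketch, run from inside an existing clone (branch name illustrative):

# each worktree is a separate working directory sharing one object store
git worktree add ../cpython-3.12 3.12
git worktree list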