Migrating from Python3.9 to Python3.12 issues

I recently migrated from Ubuntu 20.04 LTS to Ubuntu 24.04 LTS, which also meant moving from Python 3.9 to Python 3.12.

I had to learn what a virtualenv was when I tried to install non-Debian software.

I set up my virtualenv in the following way

virtualenv venv

Then followed by

source ./venv/bin/activate

Then within the virtualenv I typed

python3.12 -m pip install . 

Everything goes well and I get a message saying it installed fine. However, when I check under
/usr/local/lib/python3.12/dist-lib I see nothing. No egg file (which was there under Python 3.9).

And when I type python3.12 test.py (my code includes an import pandas, which installed successfully),
I get a message that module pandas is not found, even though it is successfully installed on my machine.

I did a machine-wide find for the egg file and found nothing.

Where am I going wrong? Can someone explain?

Regards,
Ashwin.

Which directory were you in when you ran this?

That looks like a global Python, not a venv Python

Check in the actual venv, in ‘./venv’. Or try reinstalling via a plain python or pip, once the venv is activated. I wouldn’t assume virtualenv adds an alias for python3.12 to the $PATH.

Pandas may well be installed on the machine globally too. But once a venv is activated (if it was created without the --system-site-packages option), Pandas must be installed into it as well. That’s a use case of venvs: testing different versions of Pandas on the same host, without them interfering with each other.
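For example, once the venv is activated you can check which interpreter and install location you are actually using (a quick sketch, assuming the venv directory is ./venv as in your commands):

source ./venv/bin/activate
which python                # should point at ./venv/bin/python, not /usr/bin/python3.12
python -m pip install .
python -m pip show pandas   # the "Location:" line should be somewhere under ./venv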

I was in the pandas source code directory when I typed

python3.12 -m pip install .

So if I understood you correctly, you want me to set up a virtualenv again and look in the venv directory for the egg file? I did deactivate the venv when I finished compiling, and when I look now there is no venv directory. There is no egg file associated with any of the 3.12 packages that I have built so far - scipy, pandas, cython, etc.

There is no pandas installed on the machine globally AFAIK.

How do I move forward?

Reactivate the one you already made, as long as you didn’t actually delete the directory
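For example (assuming the venv is still at ./venv):

source ./venv/bin/activate
python test.py   # should now find pandas, because it was installed into this venv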

Yes, when I activate the venv again there is clearly an egg file under venv. Should I move this egg file to the global Python dist directory? Is there any way to automate this process? I have over 100 non-Debian Python packages. I don’t want to do this for all of them.

Should I move this egg file to the global Python dist directory?

No

Is there any way to automate this process? I have over 100 non-Debian Python packages.

List them all in a requirements.txt file (or even better, in a requirements.in) or in pyproject.toml.

pip install -r requirements.txt

OK, this seems like the final question.

When you mention “list them all in a requirements.txt file”, do you mean you want me to enter the location of the egg files (the full path) in a text file and run pip install -r requirements.txt outside the venv environment?

I presume you want me to run this command within the venv environment, as the egg files won’t be visible outside the venv environment.

Just the names of the libraries, as they are on PyPI.

There’s no need to do anything with egg files anymore, unless you need some really old libraries.
Up-to-date packages and pip prefer wheels.
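For example (just a sketch, using the packages you mentioned), the requirements.txt would simply contain the PyPI names, one per line:

# requirements.txt - one PyPI package name per line (version pins optional)
pandas
scipy
cython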


Just to add a bit of context here, the purpose of the changes is to nudge people away from getting Python packages installed to the global python dist directory, unless those packages need to be installed there for use by other system-level packages. For your own projects, you shouldn’t want anything in the global dist directory. The reason is that mixing pip-installed libraries and apt-installed libraries in the global dist directory is likely to lead to conflicting dependencies and errors that can potentially break system-level tools.

Installing 100 Python packages should not be any harder now than it was before. You can still run all the same pip commands you used to run to install everything (or put the package names in a requirements.txt and install from that as @JamesParrott said). The only difference is you should activate the venv first, and then all the packages will be installed into that venv, leaving your global python environment untouched.
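In concrete terms, something like this (a sketch; the venv path and file names are just the ones you have been using):

python3.12 -m venv venv            # or: virtualenv venv
source ./venv/bin/activate
pip install -r requirements.txt    # everything lands in ./venv; the system Python stays untouched
python test.py
deactivate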

If a package installed in a venv needs a global Python package, I presume it knows how to look for it?

As for the format of the requirements.txt file, should it be as given in this SO answer: python - How to install from requirements.txt - Stack Overflow?

You can set it up that way, but by default no. The idea is that the venv is isolated from the system environment. You would separately install into the venv whatever you need, even if you already have another version in the system environment.
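If you do want the venv to see system-wide packages, both tools have a flag for that (shown as a sketch; it is not something you normally need):

python3.12 -m venv --system-site-packages venv
# or, with virtualenv:
virtualenv --system-site-packages venv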

As shown in the answer, yes.