Python source code to dynamic-link C library

Imagine you have just developed a Python project consisting of a bunch of .py scripts. You can run it on your own Linux server, which of course has a Python environment.

But then you have to deploy this Python project to another Linux server, one which has no Python environment, or has one with a version mismatch, or has one that is missing some third-party packages your project relies on (numpy, pandas, tornado, tensorflow…). Tracking down and setting up everything the environment needs can be very troublesome. And even once you fix one Linux server, someday you may get another one to deploy to, and you have to perform the same procedure again, again and again…

My goal is to wrap the Python interpreter and all frequently used packages (numpy, pandas, tensorflow…) into one big dynamic library (.so) on Linux, built on my own Linux server, which of course has a Python interpreter. I will also convert my project's .py scripts to .c or .so using cython and gcc on my own Linux server.

That way, after this big dynamic library is generated, I can copy it to any other Linux server. Then, after I copy my project (already converted to .c and .so) onto that server, I can run it by linking against the big dynamic library above. I don't need a Python interpreter, and I don't have to install any third-party packages like numpy or tensorflow. Is this possible?

PyOxidizer is actually fairly close to what you’re asking for.

Docker is also a good solution, with the added benefit of consistent and containerised environments.

That’s a great way to ensure that security bugs in your application
never get fixed.

When I install a security fix to the Python interpreter on my systems, I
expect that every application and script that runs in Python will use
the upgraded interpreter with the new fix. Your system would ensure that
there were hidden interpreters with dozens of third-party libraries
buried in .so files that would probably never get the necessary security
fixes.

No thank you.

Of course this is “possible”, but honestly you would be better off
looking at writing a simple deployment script and running that.

Pretty much every Linux server now has Python, at least for the most
popular distros (Debian, Ubuntu, Fedora, Red Hat);

Windows now supports installing Linux from the Microsoft Store (via WSL).

So at least two of the three major platforms make installing Python easy.
Once you have Python, you can run your deployment script that installs
pip, installs the packages you need, and copies your application onto
the system. Write it once, run it over and over again.
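A minimal sketch of such a deployment script, assuming (for illustration only) that the project ships a requirements.txt at its root:

```shell
# deploy: hypothetical "write it once, run it over and over" install step.
# Assumes the project has a requirements.txt at its root.
deploy() {
    src="$1"; dst="$2"
    python3 -m venv "$dst/venv"                       # fresh, isolated env
    "$dst/venv/bin/pip" install -r "$src/requirements.txt"
    cp -R "$src" "$dst/app"                           # ship the code itself
}

# Demo against a throwaway project with an empty requirements file:
mkdir -p /tmp/demo_src /tmp/demo_dst
: > /tmp/demo_src/requirements.txt
deploy /tmp/demo_src /tmp/demo_dst
```

Because each server builds its own environment from the same script, every machine ends up with current, patchable packages rather than a frozen bundle.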

And each server will have the most up to date environment instead of an
old, obsolete, insecure environment.

Howdy Xixiang Yu,


And for commercial projects you also need to ensure that every customer works with the very same set of versions of said packages - as you can only test your software against one, or at most a limited, small number of such sets - unless you want to sell untested software.

That is exactly the reason why I developed

blythooon · PyPI

Python Runtime Environment for Scientific Applications with Qt based GUI - Blythooon, Part 1 - YouTube

(Accompanying example project: COVID Demo App - YouTube )

Nice idea :+1:! Good luck and keep us informed!

Cheers, Dominik

That’s a great way to ensure that you have to care for customers with thousands of different systems, and that every new bug or incompatibility also finds its way into at least one of said systems:

I agree with @steven.daprano
Don’t forget about the standard Python module venv and pip freeze / pip install -r, which solve some of the issues raised.
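For reference, the venv plus pip freeze round trip looks roughly like this (the /tmp paths are placeholders; the install step on each target server needs network access or a local wheel cache):

```shell
# On the machine where the app was developed and tested:
python3 -m venv /tmp/appenv                          # isolated environment
/tmp/appenv/bin/pip freeze > /tmp/requirements.txt   # pin exact tested versions

# On each target server, reproduce exactly that set (sketch):
# python3 -m venv /tmp/appenv
# /tmp/appenv/bin/pip install -r /tmp/requirements.txt
```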

Hi Peter,

I have no objection to using venv / pip; by the way, Blythooon does exactly this…

But I likewise have no objection to Xixiang's idea of creating such a self-contained library.

Both approaches have advantages as well as disadvantages.

Blythooon is a net installer. That is nice if the computer on which you want to install is connected to the internet - not so nice for the rest (although I made it possible to let Blythooon download the necessary packages on another computer).

My objection just was against “the most up to date” part. I prefer “the tested” ones…

Cheers, Dominik

First of all, I have to clarify that all the scenarios we're discussing are about Python 3. It's meaningless to discuss Python 2, since basically every Linux system already comes with a Python 2.

I'm trying to solve the problem on Linux servers. Below are the procedures, but there are still some problems. Let's start with the simplest one.

On a Linux server with Python 3 installed:

  1. Use the cython --embed command to convert hello_world.py into a hello_world.c file.
  2. Get the python3.7m folder from /root/anaconda3 and copy it into the same directory as hello_world.c.
  3. Get the libpython shared libraries (the ones step 5 links with -lpython3) from /root/anaconda3 and copy them into the directory /usr/lib.
  4. Run the command ldconfig.
  5. Run gcc -I ./python3.7m/ -lpython3 hello_world.c -o hello_world.out && ./hello_world.out, SUCCESS!!!
    But all of the above is on a Linux server with Python 3 installed.
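Incidentally, instead of hard-coding the /root/anaconda3 locations in steps 2 and 3, the stdlib sysconfig module can report where the headers and libpython of the current interpreter actually live:

```python
# Locate the header directory and libpython of the running interpreter,
# i.e. the files that steps 2 and 3 above copy around by hand.
import sysconfig

print(sysconfig.get_paths()["include"])       # directory containing Python.h
print(sysconfig.get_config_var("LIBDIR"))     # directory containing libpython
print(sysconfig.get_config_var("LDLIBRARY"))  # e.g. libpython3.7m.so
```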

On a Linux server without Python 3 installed, apart from uploading all the files from the other Linux server, the procedure is basically the same.

  1. Upload the hello_world.c file from the other Linux server.
  2. Upload the python3.7m folder from the other Linux server.
  3. Upload the libpython shared libraries from the other Linux server and copy them into the directory /usr/lib.
  4. Run the command ldconfig.
  5. Run gcc -I ./python3.7m/ -lpython3 hello_world.c -o hello_world.out && ./hello_world.out, FAIL!!!
    The error message is below:

Does anyone know how to deal with this error? Keep in mind we want to fix the problem on a Linux server without Python 3 installed. Thank you!
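Without the actual error text this is only a guess, but a common culprit in this setup is the linker (or the runtime loader) not finding libpython3. ldd shows which shared objects a binary actually resolves; it is demonstrated here on /bin/sh, and the hello_world commands in the comments are hypothetical:

```shell
# Check which shared libraries a binary needs and whether they resolve;
# any line reading "not found" is a missing dependency.
ldd /bin/sh

# For the hello_world case, the usual fixes are (hypothetical paths):
#   tell gcc where the copied libpython lives at link time:
# gcc -I ./python3.7m/ -L /usr/lib hello_world.c -lpython3 -o hello_world.out
#   and tell the loader where to look at run time:
# LD_LIBRARY_PATH=/usr/lib ./hello_world.out
```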

I just came across PyEmpaq, which looks like it might also solve some of these needs. (Disclaimer: I haven't used it yet.)
