Think about it: you have just developed a Python project consisting of a bunch of .py scripts. You can run it on your own Linux server, which of course has a Python environment.
But now you have to deploy this Python project to another Linux server that has no Python environment, or has one with a version mismatch, or has one that is missing some third-party packages your project relies on (numpy, pandas, tornado, tensorflow…). Tracking down and installing every dependency you need can be very troublesome. And even once you get one Linux server set up, someday you may have to deploy to another one and go through the same procedure again, again and again…
My goal is to wrap the Python interpreter and all the frequently used packages (numpy, pandas, tensorflow…) into one big dynamic library (.so) on Linux, built on my own Linux server, which of course has a working Python environment. I would also convert the .py scripts of my project to .c and .so files using cython and gcc on my own server.
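To make the conversion step concrete, this is roughly what I mean by compiling with cython and gcc (mymodule.py and main.py are hypothetical file names; the python3-config invocation assumes Python 3.8+, where the --embed flag is available):

```bash
# Compile an ordinary project module into a C extension (.so)
cython -3 mymodule.py -o mymodule.c
gcc -shared -fPIC mymodule.c -o mymodule.so $(python3-config --includes)

# Compile the entry-point script with an embedded main()
cython -3 --embed main.py -o main.c
gcc main.c -o main $(python3-config --includes) $(python3-config --ldflags --embed)
```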
In this way, once this big dynamic library is generated, I can copy the file to any other Linux server. Then, after I copy my project (already compiled to .c and .so) onto that server, I can run it by linking against the big dynamic library above. I wouldn't need a Python interpreter there, and I also wouldn't have to install any third-party packages like numpy or tensorflow. Is this possible?
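For illustration, here is the kind of deployment I am imagining on the target server. The bundle name libpybundle.so and the /opt/myproject path are purely hypothetical, and whether such a single all-in-one .so can actually be built is exactly what I'm asking:

```bash
# Hypothetical layout on a target server with no Python installed:
#   /opt/myproject/libpybundle.so  <- interpreter + numpy/pandas/tensorflow, bundled
#   /opt/myproject/main            <- my "cython --embed" entry point
#   /opt/myproject/mymodule.so     <- my compiled project modules

# Let the dynamic linker find the bundled library, then run the binary
export LD_LIBRARY_PATH=/opt/myproject:$LD_LIBRARY_PATH
/opt/myproject/main
```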