I am replying here rather than directly to you because the discussion is beyond my expertise.
A few personal observations:
- Using the option:
--check-build-dependencies
makes pip abort, because it detects unresolvable conflicts for me, for example:
ERROR: Some build dependencies for scipy from … conflict with the backend dependencies: numpy==1.24.2 is incompatible with numpy==1.19.5; … , pybind11==2.10.3 is incompatible with pybind11==2.10.1.
If, as I believe, I must pin every package version myself to get a compatible stack, I will not enable this check, since I am unable to understand its implications.
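If I understood the check correctly, with --no-build-isolation the build dependencies must already be installed in the build environment, so the pre-installed tool versions have to match what each package's build system declares. I am not sure which side of each conflict pair is the installed version and which is the declared requirement, but aligning them should clear the check. A sketch only; the pins below are copied from the error message above and are purely illustrative:

```dockerfile
# Hypothetical sketch: with --no-build-isolation the build dependencies come
# from the already-installed environment, so pre-installing the exact versions
# the backend declares should satisfy --check-build-dependencies.
# The pins below are illustrative, taken from the error message above.
RUN pip install --no-cache-dir numpy==1.19.5 pybind11==2.10.1
RUN pip install --no-cache-dir --use-pep517 --no-build-isolation \
    --check-build-dependencies --requirement dependencies.txt
```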
- System packages and pip packages are different things. I had to install
Cython
with pip even though I had previously installed
cython3
with apt-get.
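As far as I can tell, this is because the Debian package installs the module into the system interpreter's dist-packages, while a pip-driven build resolves Cython from the pip environment, so the two do not see each other. A sketch of the distinction, using the package names mentioned above:

```dockerfile
# cython3 from apt installs into the system interpreter's dist-packages;
# pip-driven builds resolve Cython from the pip environment instead, so
# both can coexist without seeing each other.
RUN apt-get update && apt-get install -y --no-install-recommends cython3
RUN pip install --no-cache-dir Cython
```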
- Some dependencies are not straightforward to fix: a missing module
mesonpy
was fixed by installing a package with a different name,
meson-python
; and installing both scipy and scikit-learn in one go fails because scikit-learn cannot find scipy during its own build step (scipy was already processed but not found): I don't know whether that is because scikit-learn needs scipy to be installed already.
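For the scipy/scikit-learn ordering problem, one workaround I would try is splitting the install into two steps, so scipy is fully installed before scikit-learn's build starts. This is a sketch under the unverified assumption that scikit-learn's build needs an importable scipy; the flags mirror the Dockerfile below:

```dockerfile
# Sketch: install the build-time prerequisites first, then the package
# whose build imports them. Assumes scikit-learn needs scipy importable
# at build time, which I have not verified.
RUN pip install --no-cache-dir --use-pep517 --no-build-isolation numpy scipy \
 && pip install --no-cache-dir --use-pep517 --no-build-isolation scikit-learn
```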
Anyway, given these problems and the time already spent on them, I suspect this is a rabbit hole for me, considering also my lack of competence in this area.
At the moment I am trying to build like this:
# Create virtual environment without bootstrapped pip
#!!! TODO: check if setuptools is bootstrapped; if yes, it should be deleted to optimize image size
# https://docs.python.org/3/library/venv.html
RUN python -m venv --without-pip ${VIRTUAL_ENV}
# tools needed to build requirements from source:
# https://docs.scipy.org/doc//scipy-1.4.1/reference/building/linux.html
# https://numpy.org/doc/stable/user/building.html
# https://numpy.org/install/
# https://packages.debian.org/source/stable/cython
RUN set -eux \
&& buildScientificPackagesDeps=' \
build-essential \
cmake \
ninja-build \
gfortran \
pkg-config \
python-dev \
libopenblas-dev \
liblapack-dev \
#cython3 \
#patchelf \
autoconf \
automake \
libatlas-base-dev \
# TODO: check if python-ply is needed
python-ply \
libffi-dev \
' \
&& apt-get update \
&& apt-get install -y --no-install-recommends $buildScientificPackagesDeps
# Install dependencies list
# --prefix
# used to install inside the virtual environment path
# --use-pep517 --check-build-dependencies --no-build-isolation
# used to solve https://github.com/pypa/pip/issues/8559
# "# DEPRECATION: psycopg2 is being installed using the legacy 'setup.py install' method, because it does not have a 'pyproject.toml' and the 'wheel' package is not installed"
# --compile --global-option=build_ext --global-option=-g0 --global-option=-Wl
# used to pass flags to C compiler and compile to bytecode from source, see:
# https://towardsdatascience.com/how-to-shrink-numpy-scipy-pandas-and-matplotlib-for-your-data-product-4ec8d7e86ee4
# https://blog.mapbox.com/aws-lambda-python-magic-e0f6a407ffc6
#
# https://pip.pypa.io/en/stable/cli/pip_install/#options
RUN pip install --upgrade --no-cache-dir pip wheel setuptools Cython meson-python pythran pybind11 \
&& pip install --prefix=${VIRTUAL_ENV} --no-cache-dir --use-pep517 --no-build-isolation \
#--check-build-dependencies \
--requirement dependencies.txt \
# https://discuss.python.org/t/how-to-use-pip-install-to-build-some-scientific-packages-from-sources-with-custom-build-arguments/
# https://github.com/pypa/pip/issues/11325
--no-binary numpy,scipy,pandas --config-settings="build_ext=-j4" \
&& pip cache purge
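Regarding the TODO at the top of the Dockerfile: as far as I can tell, a venv created with --without-pip skips ensurepip entirely, so setuptools is never bootstrapped and there is nothing to delete. A quick check (outside Docker) that site-packages starts out empty:

```shell
# Create a throwaway venv without bootstrapping pip/setuptools,
# then list its site-packages: it should print nothing (empty directory).
python3 -m venv --without-pip /tmp/checkvenv
ls -A /tmp/checkvenv/lib/python*/site-packages
```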
I will report the results when the build ends. Feel free to share your opinions, thanks.