Thanks for showing the structure of your project; it’s very helpful for giving advice specific to your situation. One key point to get out of the way first, in case you (or others) aren’t already aware: the term “package” is overloaded to mean two entirely different things, an “import package” and a “distribution package”:
- An import package is any directory containing an `__init__.py` file (and typically, Python code), and the name of that import package directory is what you `import` in Python.
- A distribution package is how your project is distributed to others (on PyPI, etc.), and may contain one or more import packages. Each `pyproject.toml` corresponds to one distribution package, and the `project.name` specified therein is the name under which you find the project on PyPI, install it with `pip`, specify it as a dependency, etc.
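To make the distinction concrete, a minimal layout for one of your projects might look like this (a sketch; everything beyond `pyproject.toml` and `__init__.py` is hypothetical):

```
tnc-connector-core/        # distribution package root
├── pyproject.toml         # contains: name = "tnc-connector-core"
└── core/                  # import package: `import core` in Python
    └── __init__.py
```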
In your case, you have two top-level import packages, `core` and `postgres`. Each top-level import package is contained within its own distribution package, with the respective names `tnc-connector-core` and `tnc-connector-postgres` (I can see these names due to your use of legacy editable installs with `.egg-info` files). `core` and `postgres` would be the names you’d `import` them under in Python, while `tnc-connector-core` and `tnc-connector-postgres` would be how you’d find and install them from PyPI, and how you’d specify them as dependencies.
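In other words (assuming these were published to PyPI under those names), the two kinds of names show up in different places:

```
pip install tnc-connector-postgres    # distribution package name
python -c "import postgres"           # import package name
```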
As others have mentioned, assuming you’re using a current version of any modern backend that supports pyproject (“PEP 621”) metadata (Setuptools, Flit, Hatch, PDM, Meson-Python, Scikit-Build-Core, etc.; basically everything but Poetry), you can specify your project’s dependencies in `pyproject.toml` following the standard specification, like this:
```toml
[project]
name = "tnc-connector-postgres"
version = "0.2"
dependencies = ["tnc-connector-core>=0.2"]
```
The dependencies specified in `pyproject.toml` are, in general, abstract dependencies: they state what distribution package names and versions your project requires in order to function. However, what you might actually be asking for is a way to tell packaging tools to build and install your dependency locally, rather than from PyPI by default. This corresponds to concrete dependencies, i.e. specifying the exact dependency versions and artifacts you want installed, and where to get them from. For that, you’ll probably want to use a requirements file or lock file, which also lets you specify an editable install. Support for relative local paths appears to be patchy at the moment, but you should be able to get it to work.
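For example, a development requirements file in the postgres project could point at the local `core` checkout (the relative path here is hypothetical; adjust it to your actual layout):

```
# requirements.txt (development sketch)
# Build and install tnc-connector-core from the sibling directory, editable:
-e ../core
# Then this project itself, also editable:
-e .
```

Installing with `pip install -r requirements.txt` then uses your local copies rather than resolving the dependency from PyPI.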
Sidenote: giving the import packages such generic names, if they aren’t going to be under another top-level import/namespace package (like `luci`), isn’t usually the best idea, since they will clash with any other import package that happens to have the same generic name (e.g. a `postgres` binding library of that name, or a `core` from another project you’ve created).
The simplest solution is to incorporate `luci`, `tnc`, etc. into the import package (directory) name, e.g. `luci_core` or `tnc_postgres`. Alternatively, you could make a top-level `luci` namespace package, so you’d have one top-level import package (`luci`, with your subpackages accessed as `luci.core`, `luci.postgres`, etc.)