PEP 722: Dependency specification for single-file scripts

You’re hardly an average user, so I’m taking this with a huge grain of salt. Could you elaborate where the user feedback came from and where the feature was requested?

I’m uncertain if it’s considered bad, but it’d be interesting to understand what’s wrong with a pip requirements file or pyproject.toml in this context. Is that what you mean by “very bad”? Are you suggesting that the user experience with those real solutions is bad and we should improve it by adding yet another way?

Sure, off the top of my head I can only think of making existing solutions “official”, mostly centered around my perception that this is an issue of users struggling to turn their one-off scripts into executables, or at least into easy-to-execute CLI scripts.

Essentially, I think this is an application packaging issue in disguise, e.g. to turn a script into an executable. We could improve this by promoting projects like pyinstaller and PyOxidizer that bake in the dependencies. Could the PEP describe a standard way to do that, where we reuse pyproject.toml? It might solve the dependency caching problem that PEP 722 tools will surely have to face on consecutive runs.

That’s… not what I said. I’m saying that telling them that for this use case there is a slightly different way to define requirements via docstrings, instead of the many other ways, is not going to make things magically easier; it just increases the cognitive effort needed to conduct Python packaging. Yep, I’m basically calling this a case of xkcd://927. Could you add a comparative analysis to the PEP showcasing what people currently do?

Imagining the PEP goes ahead:

  • Would it essentially support all of pip’s requirements file syntax (and/or PEP 508) in the docstring of files (including pip’s hash-checking mode)?
  • Would this also be supported in more complex projects that are not single-file scripts (e.g. would it be allowed in files under my_package/)?
  • Are there other features from pyproject.toml that would become needed for real world usage (e.g. optional dependencies)?

Good call-out, I didn’t have my conda hat on in this case, but it’s a good question. I’m going to have to think about it. My feedback was purely focused on Python packaging.

Well, I’m for progress, for the record, in case that is in doubt. I’m submitting that adding yet another way to specify requirements is not progress but only distraction and more choice doesn’t equal a delightful user experience.

I definitely would say this PEP might harm what we do in the area, by not making it more obvious what users should do to package their scripts (which implies best practices around versioning etc). As such, I would strongly encourage presenting the PEP draft to real world users (not you!), if that would help them before we go ahead.

No apologies needed, as always, I’m providing this with an open mind and am looking for more clarification, obviously you’ve put lots of thoughts into this proposal.

That said, I wasn’t aware that you had argued for years about this! It might be helpful for the proposal if you’d add some of the past feedback you’ve received to the proposal and how the historic conversation around this topic went?


Is there something you want here beyond multiple folks saying they want this as well as the information provided in the Rationale section of the PEP?

This is covered in the PEP and the discussion here: it supports PEP 508, and pip’s various features aren’t supported because they’re pip-specific.
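
To make that concrete, here is a minimal sketch (not the PEP’s reference implementation) of how a tool might read a “Script Dependencies” comment block containing PEP 508 specifiers, assuming the block ends at the first non-comment or empty comment line; details of the format may still change based on feedback:

```python
# A minimal sketch of reading a "Script Dependencies:" comment block.
# Each dependency line is assumed to hold one PEP 508 specifier.
SCRIPT = """\
#!/usr/bin/env python3
# Script Dependencies:
#     requests
#     rich >= 13.0
#     click; python_version >= "3.8"
import requests
"""

def read_dependencies(source: str) -> list[str]:
    deps: list[str] = []
    in_block = False
    for line in source.splitlines():
        if not line.startswith("#"):
            if in_block:
                break  # block ends at the first non-comment line
            continue
        text = line[1:].strip()
        if text.lower() == "script dependencies:":
            in_block = True
        elif in_block:
            if not text:
                break  # an empty comment also ends the block
            deps.append(text)
    return deps

print(read_dependencies(SCRIPT))
# → ['requests', 'rich >= 13.0', 'click; python_version >= "3.8"']
```

Note that nothing pip-specific (hashes, index URLs, editable installs) appears here; the block is just a list of PEP 508 strings for any tool to consume.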

Allowed, yes. Something you’d want to do? Maybe. I don’t have concerns about this since it’s not changing how python runs code but rather how pipx run behaves.

We’ve already discussed how shared code across scripts is possible with this setup (PEP 722: Dependency specification for single-file scripts - #101 by pf_moore).

We aren’t really adding one more way either TBH, since it’s already something that multiple/all(?) tools do in the “run a script” space.

A quick pointer - this implies that Paul isn’t a real-world user of Python, which is probably not what you wanted to imply. :sweat_smile:

Beyond what @pradyunsg already noted, I worked for many years in a large consultancy company, where we used many languages and scripting solutions. Python was just one option of many. Every Python user I knew in that environment would write single-file Python scripts, and would view Python as being similar to (but better than) shell scripts or batch files. They would never even conceive of giving a Python script its own directory, or of having to build the script in one location and then deploy it, and if they had been expected to do so, they would have considered Python unsuitable for many of the tasks they used it for.

This is the experience I share, and as a user I was unable to find a really good way of handling 3rd party dependencies. Yes, I concede that I’m not an average user, but if I, with all my experience, can’t find a satisfactory approach, maybe that is stronger evidence that there’s an issue, not weaker?

This really has been discussed repeatedly already. I suggest that (if you haven’t already) you read the posts that have already been made here. Basically:

  1. There’s a real use case for single Python files living in a shared directory with other scripts (Python or otherwise).
  2. There’s no common idea of a unified “project” that encompasses all of the scripts in that directory.
  3. Building the code in a “project directory” and deploying it is not suitable for this use case.

Yes. Python doesn’t have a good approach for allowing those scripts to have dependencies. The common existing solution is creating (and managing) one or more virtual environments, containing a bunch of dependencies. This could be one “dump everything in here and hope there’s no conflicts” environment, or one environment per script, or something in between. But the environments need managing - for example, where do we store them? Do we back them up or have another script (which has its own dependency, virtualenv) to rebuild them on demand? All of this is extra admin work that (for example) shell scripts don’t need.
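
For illustration, the kind of ad-hoc per-script environment management described above can be sketched with just the stdlib. The cache location and layout here are made up for the example; real tools such as pip-run and pipx are more careful about caching and reuse:

```python
# Sketch: create (once) a dedicated venv per script and install its
# dependencies into it. Illustrative only; paths and layout are made up.
import subprocess
import sys
import venv
from pathlib import Path

def ensure_env(script: Path, deps: list[str], base: Path) -> Path:
    """Create (on first use) and return a venv dedicated to `script`."""
    env_dir = base / script.stem
    if not env_dir.exists():
        venv.create(env_dir, with_pip=True)
        bin_dir = "Scripts" if sys.platform == "win32" else "bin"
        python = env_dir / bin_dir / "python"
        if deps:  # pip errors out if invoked with nothing to install
            subprocess.run(
                [str(python), "-m", "pip", "install", *deps], check=True
            )
    return env_dir
```

Even this toy version shows the admin burden: something has to decide where `base` lives, when to rebuild, and how to clean up, which is exactly the overhead shell scripts never have.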

pip-run (and now pipx) offer an alternative approach. It’s not caught on widely yet, and it’s not necessarily the ultimate approach, but it’s an option. This PEP doesn’t change anything there, it simply acknowledges that there’s work going on to improve this area, and if we standardise the means of storing per-script dependencies, that means that users can store their dependencies without having to commit to a specific tool - which is good because I don’t think we have arrived at the ideal solution yet.

I’m really getting frustrated repeating this so much, but those “real solutions” are not solutions for this use case. They solve a different problem, but they violate a key requirement in this case, namely the “single file” requirement.

We’re not adding “yet another way”. We’re adding one way for a use case that isn’t addressed by existing solutions.

OK, that’s the problem. Your perception is wrong. This is not a matter of “turning one-off scripts into executables”. It’s about one-off scripts being a perfectly acceptable and valid use case in their own right, except for the fact that they have no means to declare dependencies on 3rd party libraries.

Nobody who’s interested in this solution wants to “turn” their scripts into anything else. They want to leave them exactly as they are. All they want is to enable tools to manage dependencies transparently.

You’re wrong. Would you be happy if I told you that all your shell scripts and batch files were no longer considered the “right way” to do things, and you had to create a project directory for each one and run a compiler on it to build an executable? Why do you think that it’s unacceptable to use Python in the same way as you use a batch file?

I’m sorry, but it really is. See my previous paragraph. In my ~/.local/scripts directory I have exactly such a script. You are explicitly saying that the problem is “to turn a script into an executable”. I don’t want to turn that script into an executable. I want to be able to run it as is, without having to manually create or manage a virtualenv. That’s it. You’re telling me that what I’m explicitly saying I want isn’t how I should work.

No, I can’t, because the current standards explicitly exclude this use case by requiring a project directory.

  • No
  • No (except by accident, it’s not an intended use case)
  • No

Agreed. But adding a way to specify requirements for a use case that currently isn’t supported at all is progress, and will delight users who are frustrated that currently their requirements are being ignored.

I’ll take that on board. I’ll try to make it clearer in the PEP that “if you want to package your scripts for distribution, or for project management purposes, you should use the existing Python packaging workflows”. But the key is if you want to.

Apart from the fact that I resent being described as not a “real world user”, what do you think this thread (and the positive feedback from various real world users) is doing?

There’s not much I can link to. It’s more a case of linking to Nathaniel’s post (which I linked above, you should be able to find it by reading the thread) on multiple occasions, supporting tools like pip-run and pipx, my comments on the pip issue about adding a pip run command (again I’ve already linked this) and generally presenting my use cases as a user in various discussions. Mostly, I’ve just had my comments dismissed with the same “you can create a project” sort of arguments you’re repeating now. And I’ve tried - I am genuinely willing to follow the “official workflow” if it does the job - but not once have the results been as effective as simply writing a Python script and only using the stdlib (or hacky adhoc solutions for vendoring 3rd party dependencies).

Oh, and you’ll also see me regularly arguing against stripping down the stdlib, and quoting single-file scripts as one example of why “using 3rd party dependencies” isn’t as easy as people would have you believe. Again, I usually get shot down with “just use a virtualenv” (as if I wasn’t aware of that option…)

(BTW, sorry for the excess of emphasis and boldface. I’m getting really tired of repeating myself here.)


I’d like that, yeah: conducting user interviews outside the self-selecting group here, e.g. with people in Python groups at various companies, educators, admins.

Of course, making it a Python packaging PEP elevates it to one of the “official” ways, at least in the eyes of users. That’s what I meant by “adding another way”. I want to know whether the use case warrants making it a standard like this, and I have not been convinced so far by the arguments here.

Hey now, some self-reflection is important here, so let me elaborate on what I mean: given our many hats and deep experience with the packaging internals, I don’t think it’s always easy to keep a beginner’s mindset. Whether “real-world” is the right term for that, I don’t know, and I didn’t have any ill intent. I meant to say that it’s worth speaking to people using Python who aren’t in the Python packaging bubble sometimes.


Feel free to do that. I don’t have the resources or contacts for that. Make sure to represent the PEP fairly, though - as I’ve already noted, some of your interpretations are simply incorrect.

And I’d question why it’s necessary for this PEP (which is literally just standardising some existing behaviour) rather than any other one? To my knowledge, no packaging PEP in the past has ever been backed by a user interview process like you’re suggesting[1]. Please do point out examples if I’m wrong.

  1. We’ve had many examples of users complaining when pip has implemented agreed PEPs - clearly those users didn’t get asked about the PEPs in question… ↩︎

The way I solve this problem (and perhaps many other users, too) is simple: I don’t use virtual environments. So I pip install everything that I need for such scripts in my main (not system) python and then everything… just works. (I agree that knowing the dependencies is useful even in my case, but using something like pip-run seems heavyweight.)

This is a case where venvs are a classic non-solution to a non-problem, except in the edge case of conflicting dependencies (see also my rant here). But those are not really solved by this proposal, since it doesn’t specify version numbers.

That might be a part of the problem. :wink:

I’m only half-joking there, and I’ll also caution that we shouldn’t really take an all-or-nothing approach – doing a round of UX research isn’t “free” and we ought to be cognizant of that. At the same time, doing a round of UX research isn’t a panacea either; there are definitely ways to do it that are meaningfully worse than not having done it in the first place.

While I agree that having a better sense of what users want and need is valuable – and we have some of that through the survey conducted – we also know that there’s knowledge above and beyond what’s in the surveys, collected through interacting with people and projects directly. Having a round of user interviews done would benefit a spec like this and (assuming positive feedback) make acceptance more likely, but not doing one shouldn’t be a blocking concern.

Thanks for the extensive response, I appreciate it and will have to chew a bit on our misalignment regarding the problem space.

Oh, actually, let me write down my thoughts on this conda/PyPA compatibility issue quickly:

I don’t worry about the PEP related to conda specs any more than before since it’s essentially the same mapping issue between PEP 508 specifiers and the conda matchspecs, as we have elsewhere. Tools like conda-lock have been able to work on rectifying that, and I believe the mapping issue could also be solved for conda users if there is a need (which is a big if, which has been discussed elsewhere).

The only wrinkle is a UX issue around the already existing conda run which behaves differently than the PoCs (pip-run and pipx run) mentioned in the PEP: conda run requires specifying a conda environment by name or path and only works with executable scripts/binaries installed in that environment. It might be possible to retrofit it with a --file option and default to a temporary environment if no env name or path is given. Let’s see how the PEP goes before we dive deeper into that.

I haven’t looked at the existing pip-run implementation, but it’s possible it could be extended with the option of specifying the name of a persistent environment (venv) to reuse directly, or to create if it doesn’t exist, which sounds like it would bring it at least superficially closer to the conda run behavior.



The use of venvs seems orthogonal to this, at least to me. I personally would use it to determine what needs to be in the venvs I create on the fly for my random one-off scripts. As I mentioned earlier, this idea is something I (and many others, I think?) are already doing, but it offers a standardized format for listing the Python package dependencies of a script within the script itself, so that creating the environment that script needs (either manually in advance, or in an automated fashion on the fly) becomes a task we can collaborate on interoperable solutions for.

Well, the proposed PEP explicitly says

Of course, declaring your dependencies isn’t sufficient by itself. You need to install them (probably in some sort of virtual environment) so that they are available when the script runs.

And, indeed, that is the way both pip-run and pipx would handle the situation as far as I understand…


Sure, but it doesn’t mandate the use of venvs, so I don’t understand the point of your rant about venvs. Just because the author of the proposal, and the authors of the tools already doing something along these lines, use venvs, that doesn’t mean you have to use a venv to get some value out of the proposal. I get that you don’t like venvs, so… just don’t use them? And accept that there are others who do find them useful as actual solutions to broader problems than you’ve encountered (or perhaps ever will encounter), rather than derisively declaring them a “non-solution” or pretending the problems some of us deal with are nonexistent.

I could see, for example, using a tool which checks your preferred environment for the presence of the packages specified in a PEP 722 compliant comment block in a script before running that script, and either warning you that you’re missing dependencies (by telling you exactly which packages you need) or simply installing them for you, whatever your comfort level. That could work equally well in a non-isolated shared environment or with a persistent venv (and the latter is how I would use it, because I need the additional isolation in many cases). It’s something I find useful and already do, because I need to rebuild my environments semi-regularly, so I welcome the opportunity for an interoperable specification around it.
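
A sketch of that kind of check, using only the stdlib. This assumes bare distribution names; handling full PEP 508 specifiers (versions, markers) would need something like the third-party packaging library:

```python
# Sketch: report which declared dependencies are absent from the
# current environment (venv or otherwise). Names only; no version checks.
from importlib.metadata import PackageNotFoundError, version

def missing_dependencies(declared: list[str]) -> list[str]:
    """Return the declared distributions not installed in this environment."""
    missing = []
    for name in declared:
        try:
            version(name)  # raises if the distribution isn't installed
        except PackageNotFoundError:
            missing.append(name)
    return missing

for name in missing_dependencies(["requests", "rich"]):
    print(f"warning: dependency {name!r} is not installed")
```

A wrapper could then either stop with that warning or hand the missing list to an installer, matching whichever comfort level the user prefers.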

Of course, and if my “rant” (which I admit is my own word for what I wrote!) implied otherwise I certainly should not have done so. And indeed a tool which just ensured that my environment (virtual or otherwise!) had the correct packages installed would indeed be useful!

I could say more about virtual environments but that is not appropriate for this thread.

To be clear, those are not PoC implementations of the PEP. They are existing tools that have implemented a “run-with-dependencies” operation for scripts, because there was demand for that feature. They will continue to exist whether or not this PEP is accepted.

All the PEP does is document, in a common place, the format that both tools use for extracting dependencies. It also tries to make the format useful should people write other tools that need this data, and it will almost certainly pick up slight changes to the details of the format based on feedback here (which pip-run and pipx will probably implement, because following standards is a good thing, but they don’t have to, of course).

Think of pip-run and pipx more as existing use cases for this PEP, and it might make more sense.


But this expansion of covered use-cases (great!) should not make the situation worse for everything else, and the packaging survey could not have been clearer about reducing the number of divergent tools in packaging, so adding yet another way to specify dependencies understandably meets resistance – and it’s on the PEP to prove this necessity.

It’s also a really ugly mechanism: magic comments. They break syntax highlighting and much automated tooling (due to parsing two different syntaxes in the same file), and are prone to diverging in semantics from requirements.txt / pyproject.toml / poetry lock files, etc.

So I do not buy the “single file” requirement, or at least not that it trumps all of the above. This does not mean that I’m dismissing your use-case, but I do believe the same result (having a script without too much structure or ceremony & a reasonable way to specify its dependencies) could be achieved differently, for example by:

  - my_fancy_script.requirements.toml
  - ye_old_workhorse.requirements.toml
  - [...]

That would still give a clear approach, without much overhead: start hacking away in your script, and once you need third-party dependencies, add xyz.requirements.toml. The actual suffix for that is bikeshed central, but that way we could:

  • reuse some existing infrastructure (e.g. a reduced form of pyproject.toml), rather than having yet another way of specifying dependencies.
  • let users who want to “graduate” their script for some reason just rename that file and add some extra metadata to make it a full-fledged project.

True. My point was more that as you say, UX research is costly, and it’s not always easy either to ensure the data is representative and unbiased, or to interpret the results accurately. And while the participants in this discussion are certainly self-selected, I don’t imagine that the sort of user research we could reasonably undertake would avoid at least some level of self-selection. Even the “big” user survey that @jezdez has referred to involves a certain amount of self-selection, even if it’s only selecting “people willing to take a survey” (which is likely to be biased towards “people who have a point they want to make”).

Eliminating such bias is a complex, specialist task. I have some background in statistics, so I know enough to know I don’t know how to do it properly, but that’s all :wink:

User research is absolutely a good thing, and we should do more of it. But it’s not a way of avoiding having to make choices based on our experience and knowledge. And sometimes choosing what (in our view) is right over what’s popular.


This PEP adds literally no new tools, and no new data formats. All it does is make one existing format (used by two existing tools) into a standard, so that if we (for example) later replace those two tools with a single new one (reducing the number of tools?) then users don’t have to change their code (reducing churn for users).

That’s a fair criticism. I’m open to other suggestions. But many other languages use the “structured comments” approach, so it seems like it isn’t so bad in practice.

… and we’re back here again. How many people stating on this thread that they have a requirement for being able to declare dependencies in a single-file Python script are needed to demonstrate that this is a real-world use case?

OK. Maybe that would work. My gut instinct is that it would be something I’d use reluctantly, and be frustrated by various “papercut-level” annoyances. But I don’t want to reject a reasonable proposal just because it’s not my favourite. Also, none of the other languages mentioned in the survey of languages linked above use a separate file[1], so it feels like it’s going against common practice. Do you have examples of other languages using this approach that you can point to?

If you’re serious about this suggestion, are you willing to get it added to pip-run and pipx? What’s the transition plan from the existing behaviour to this proposal? There’s a whole “backward compatibility” section of the PEP that will need writing if we go down this route.

  1. Yes, I concede that’s at least partly because the survey is of single-file solutions. ↩︎


You snipped my statement in a somewhat unflattering way; I do accept the use-case. Luckily, your PEP is named “dependency specification for single-file scripts”, which I have no problem with as a requirement. My point was that the dependencies do not have to be in the same file to achieve that.

My response was aimed at pointing out the potential solution space between “single-file script” and “single-file script+requirements”, and that it’s possible to support the former in a way that doesn’t (a priori) create yet more UX & teachability problems.

I do care about python packaging (and not increasing divergence further), but between 2 jobs, my FOSS “responsibilities”, and a sliver of social life, I don’t have time to write, much less implement, a PEP, sorry.

I know this is addressed and currently rejected in the PEP, but something like __dependencies__ with a restricted syntax (only string literals, for instance) could be a simple solution that doesn’t require a complete parser.
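
As a sketch of how a tool could read such a `__dependencies__` assignment without executing the script (the dunder name and the restricted only-literals syntax are just the suggestion above, not anything standardised):

```python
# Sketch: statically extract __dependencies__ with the stdlib ast module,
# so the script never has to run for its dependencies to be discovered.
import ast

SCRIPT = '''
__dependencies__ = ["requests", "rich >= 13.0"]

print("the script itself never needs to run for this to work")
'''

def read_dunder_dependencies(source: str) -> list[str]:
    tree = ast.parse(source)
    for node in tree.body:
        if (isinstance(node, ast.Assign)
                and len(node.targets) == 1
                and isinstance(node.targets[0], ast.Name)
                and node.targets[0].id == "__dependencies__"):
            # literal_eval enforces the "only literals" restriction:
            # it raises on anything computed at runtime.
            value = ast.literal_eval(node.value)
            return [str(dep) for dep in value]
    return []

print(read_dunder_dependencies(SCRIPT))
# → ['requests', 'rich >= 13.0']
```

This avoids a magic-comment syntax at the cost of requiring a Python parser in every consuming tool, which is presumably part of why the PEP currently rejects it.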
