PEP 722: Dependency specification for single-file scripts

Like I said before, it feels like a very small expansion of the current situation. A script like that is explicitly downloading and installing things, and someone who downloads the script can read it (and the helpful comment) and see that it's not doing anything nefarious.

Whereas I’m imagining an evil pf_mօօre[1] on StackOverflow solving questions with a standalone script that genuinely works and answers the question, like this:

#!/usr/bin/env pip-run --extra-index-url

# Dependencies:
#     rеquеsts

import rеquеsts


This would, at the time of running, Just Work™ and earn a shiny green checkmark on SO and therefore be downloaded by other people who find the question. And the malicious rеquеsts installs itself because this feature is allowed to use extra indexes.

I think this situation is just slightly more tricky and dangerous than the status quo, where at least you can read the code you’re executing and need to explicitly install the dependencies.

But again, there are existing security efforts to prevent dependency confusion attacks, and that might be adequate protection from this already. I just thought it was worth thinking about more.

  1. note the օ ↩︎


But also, with a minor tweak to your example shebang, you can do what you describe now without needing a PEP 722 requirements comment block. Just stuff the package names into the shebang itself.
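For example (hedged: pip-run does accept requirement names on its command line, but I haven’t verified this exact shebang, and env -S is a GNU coreutils extension):

#!/usr/bin/env -S pip-run rеquеsts --

(still the lookalike rеquеsts from the example above) — the interpreter line itself names the packages to install, no comment block required.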


The feature doesn’t allow extra indexes. The security issue (if you want to call it that) here is with pip-run, if anywhere. And you can do this right now with pip-run. You don’t even need the Dependencies: block, because you can specify packages to install in the pip-run command line.

So no, this isn’t a new security risk introduced by this PEP.


Ah, I didn’t know this as I am not familiar with what pip-run can currently do. So it’s not a new risk, although it might make the attack more attractive if more people use this.

Since these typo/lookalike packages aren’t allowed on PyPI, the attempt is more obvious anyway.

edit: regarding “this feature”, I would just add that it’s possible to specify that this method of specifying dependencies can’t use an index besides PyPI. As you discussed in the PEP, there’s no requirement for it to replicate everything that pip can do.

I’ll just quickly say that while I have spoken on behalf of VS Code about supporting this PEP if it gets accepted, as a user I would also use this. I too have utility scripts scattered around my machine where I have to keep some virtual environment around just to install one or three dependencies.

As an example, I have a script that just checks that the images in a certain directory are of a certain size and follow a certain file name pattern. To check the image sizes I use Pillow, so I have to keep a virtual environment in that directory, alongside the script, just for this single purpose. Having a pipx run style workflow would be much nicer. And if I needed it to work like a binary I would just use #! /usr/bin/env -S pipx run [1].

  1. But since these are utility scripts for specific instances, I really don’t feel the need to have them work like a binary on PATH. ↩︎


Looking at this, I got the following impressions:

  1. Lots of languages use “structured comments”.
  2. Having a separate “script runner” isn’t uncommon.
  3. The list doesn’t mention isolation. Some (Mix and Ruby) look like they install things in your “main environment”, but that’s just a guess.

So overall, this is reasonable evidence (IMO) that the proposal here isn’t particularly unusual. Which is great news - I’ll add this list as a reference in the PEP.

Of course, if I’ve missed anything, or misinterpreted something critical, please do let me know.


One advantage of the inline TOML route that hasn’t been mentioned: many tools could piggyback on it and use it for standardized per-file configuration. For example,

# Inline TOML section start marker.
# [tool.ruff]
# ignore = ["B008", "B011", "D100"]
# Inline TOML section end marker.

So it could do more than just dependencies, but also provide a standard way to do whole-file tool configuration? Or maybe that’s too far afield.

The PEP as written really only talks about scripts – personally, I think when you’re getting into tool configuration or similar it is quite divorced from a simple script. It might be useful to standardise ‘tool metadata’ or similar, but I think it should be done as a new PEP rather than as part of this one.


I doubt I would use this personally, but it seems to me like this could be useful to others. Especially now that “dead batteries” are getting removed, such a proposal making it easier for simple scripts to depend on 3rd party libraries might be welcome.

Regarding the format, my mind went towards something like the following, because it seems to fit with things that already exist in Python. But I think there were already plenty of arguments in this thread against such a notation.

#!/usr/bin/env python

"""This script does foo.

:Requires-Dist: SomeLibrary==1.2.3
:Requires-Dist: AnotherLib<2
"""

def foo():
    ...

if __name__ == '__main__':
    foo()
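For what it’s worth, a tool could read such a field list with a few lines of stdlib code. A minimal sketch (`:Requires-Dist:` here is just the hypothetical field name from the example above, not an existing standard for scripts):

```python
import re

# The docstring format sketched above, as raw script text.
source = '''"""This script does foo.

:Requires-Dist: SomeLibrary==1.2.3
:Requires-Dist: AnotherLib<2
"""
'''

def requires_dist(text: str) -> list[str]:
    # Grab every ":Requires-Dist:" field line; a robust tool would locate
    # the docstring with ast.get_docstring instead of scanning raw text.
    return re.findall(r"^:Requires-Dist:\s*(\S.*?)\s*$", text, flags=re.MULTILINE)

print(requires_dist(source))  # -> ['SomeLibrary==1.2.3', 'AnotherLib<2']
```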

You’re hardly an average user, so I’m taking this with a huge grain of salt. Could you elaborate where the user feedback came from and where the feature was requested?

I’m uncertain if it’s considered bad, but it’d be interesting to understand what’s wrong with a pip requirements file or pyproject.toml in this context. Is that what you mean with “very bad”? Are you suggesting that the user experience with those real solutions is bad and we should improve it, by adding yet another way?

Sure, off the top of my head I can only think of making existing solutions “official”, mostly centered around my perception that this is an issue of users struggling to turn their one-off scripts into executables or at least easy-to-execute CLI scripts.

Essentially, I think this is an application packaging issue in disguise, e.g. to turn a script into an executable. We could improve this by promoting projects like pyinstaller and PyOxidizer that bake in the dependencies. Could the PEP describe a standard way to do that, where we reuse pyproject.toml? It might solve the dependency caching problem that PEP 722 tools will surely have to face on consecutive runs.

That’s… not what I said. I’m saying that telling them that for this use case there is a slightly different way to define requirements via docstrings, instead of the many other ways, is not going to make it magically easier; it just increases the cognitive effort needed to conduct Python packaging. Yep, I’m basically calling this a case of xkcd://927. Could you add a comparative analysis to the PEP showcasing what people currently do?

Imagining the PEP goes ahead:

  • Would it essentially support all of pip’s requirements file syntax (and/or PEP 508) in the docstring of files (including pip’s hash-checking mode)?
  • Would this also be supported in more complex projects that are not single-file scripts (would it be allowed in my_package/…)?
  • Are there other features from pyproject.toml that would become needed for real world usage (e.g. optional dependencies)?

Good call-out, I didn’t have my conda hat on, in this case, but it’s a good question. Going to have to think about it. My feedback was purely focusing on the Python packaging.

Well, I’m for progress, for the record, in case that is in doubt. I’m submitting that adding yet another way to specify requirements is not progress but only distraction and more choice doesn’t equal a delightful user experience.

I definitely would say this PEP might harm what we do in the area, by not making it more obvious what users should do to package their scripts (which implies best practices around versioning etc). As such, I would strongly encourage presenting the PEP draft to real world users (not you!), if that would help them before we go ahead.

No apologies needed; as always, I’m providing this with an open mind and am looking for more clarification. Obviously you’ve put a lot of thought into this proposal.

That said, I wasn’t aware that you had argued for years about this! It might be helpful for the proposal if you’d add some of the past feedback you’ve received to the proposal and how the historic conversation around this topic went?


Is there something you want here beyond multiple folks saying they want this as well as the information provided in the Rationale section of the PEP?

This is covered in the PEP and the discussion here: it supports PEP 508 and pip’s various features aren’t supported because they’re pip-specific.

Allowed, yes. Something you’d want to do? Maybe. I don’t have concerns about this since it’s not changing how python runs code but rather how pipx run behaves.

We’ve already discussed how shared code across scripts is possible with this setup (PEP 722: Dependency specification for single-file scripts - #101 by pf_moore).

We aren’t really adding one more way either TBH, since it’s already something that multiple/all(?) tools do in the “run a script” space.

A quick pointer - this implies that Paul isn’t a real-world user of Python, which is probably not what you wanted to imply. :sweat_smile:


Beyond what @pradyunsg already noted, I worked for many years in a large consultancy company, where we used many languages and scripting solutions. Python was just one option of many. Every Python user I knew in that environment would write single-file Python scripts, and would view Python as being similar to (but better than) shell scripts or batch files. They would never even conceive of giving a Python script its own directory, or of having to build the script in one location and then deploy it, and if they had been expected to do so, they would have considered Python unsuitable for many of the tasks they used it for.

This is the experience I share, and as a user I was unable to find a really good way of handling 3rd party dependencies. Yes, I concede that I’m not an average user, but if I, with all my experience, can’t find a satisfactory approach, maybe that is stronger evidence that there’s an issue, not weaker?

This really has been discussed repeatedly already. I suggest that (if you haven’t already) you read the posts that have already been made here. Basically:

  1. There’s a real use case for single Python files living in a shared directory with other scripts (Python or otherwise).
  2. There’s no common idea of a unified “project” that encompasses all of the scripts in that directory.
  3. Building the code in a “project directory” and deploying it is not suitable for this use case.

Yes. Python doesn’t have a good approach for allowing those scripts to have dependencies. The common existing solution is creating (and managing) one or more virtual environments, containing a bunch of dependencies. This could be one “dump everything in here and hope there’s no conflicts” environment, or one environment per script, or something in between. But the environments need managing - for example, where do we store them? Do we back them up or have another script (which has its own dependency, virtualenv) to rebuild them on demand? All of this is extra admin work that (for example) shell scripts don’t need.

pip-run (and now pipx) offer an alternative approach. It’s not caught on widely yet, and it’s not necessarily the ultimate approach, but it’s an option. This PEP doesn’t change anything there, it simply acknowledges that there’s work going on to improve this area, and if we standardise the means of storing per-script dependencies, that means that users can store their dependencies without having to commit to a specific tool - which is good because I don’t think we have arrived at the ideal solution yet.
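For concreteness, the amount of machinery a runner needs to read such a block is tiny. A minimal sketch, using the `Dependencies:` header from the examples earlier in this thread (the PEP itself settles on the exact header name and parsing rules):

```python
def script_dependencies(source: str) -> list[str]:
    """Extract requirements from a draft-PEP-722 style comment block.

    Simplified sketch: the real spec defines the header and the rules
    for ending the block more carefully than this.
    """
    deps = []
    in_block = False
    for line in source.splitlines():
        stripped = line.strip()
        if stripped.lower() == "# dependencies:":
            in_block = True
        elif in_block and stripped.startswith("#") and stripped != "#":
            deps.append(stripped.lstrip("#").strip())
        elif in_block:
            break  # first non-comment line ends the block
    return deps

example = """\
#!/usr/bin/env python
# Dependencies:
#     requests
#     rich >= 13.0
import requests
"""
print(script_dependencies(example))  # -> ['requests', 'rich >= 13.0']
```

A runner would then hand that list to an installer and execute the script in the resulting environment; the point is that the format itself is trivial to consume.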

I’m really getting frustrated repeating this so much, but those “real solutions” are not solutions for this use case. They solve a different problem, but they violate a key requirement in this case, namely the “single file” requirement.

We’re not adding “yet another way”. We’re adding one way for a use case that isn’t addressed by existing solutions.

OK, that’s the problem. Your perception is wrong. This is not a matter of “turning one-off scripts into executables”. It’s about one-off scripts being a perfectly acceptable and valid use case in their own right, except for the fact that they have no means to declare dependencies on 3rd party libraries.

Nobody who’s interested in this solution wants to “turn” their scripts into anything else. They want to leave them exactly as they are. All they want is to enable tools to manage dependencies transparently.

You’re wrong. Would you be happy if I told you that all your shell scripts and batch files were no longer considered the “right way” to do things, and you had to create a project directory for each one and run a compiler on it to build an executable? Why do you think that it’s unacceptable to use Python in the same way as you use a batch file?

I’m sorry, but it really is. See my previous paragraph. In my ~/.local/scripts directory I have a script You are explicitly saying that the problem is “to turn a script into an executable”. I don’t want to turn that script into an executable. I want to be able to run it as is without having to manually create or manage a virtualenv. That’s it. You’re telling me that what I’m explicitly saying I want, isn’t how I should work.

No, I can’t, because the current standards explicitly exclude this use case by requiring a project directory.

  • No
  • No (except by accident, it’s not an intended use case)
  • No

Agreed. But adding a way to specify requirements for a use case that currently isn’t supported at all is progress, and will delight users who are frustrated that currently their requirements are being ignored.

I’ll take that on board. I’ll try to make it clearer in the PEP that “if you want to package your scripts for distribution, or for project management purposes, you should use the existing Python packaging workflows”. But the key is if you want to.

Apart from the fact that I resent being described as not a “real world user”, what do you think this thread (and the positive feedback from various real world users) is doing?

There’s not much I can link to. It’s more a case of linking to Nathaniel’s post (which I linked above, you should be able to find it by reading the thread) on multiple occasions, supporting tools like pip-run and pipx, my comments on the pip issue about adding a pip run command (again I’ve already linked this) and generally presenting my use cases as a user in various discussions. Mostly, I’ve just had my comments dismissed with the same “you can create a project” sort of arguments you’re repeating now. And I’ve tried - I am genuinely willing to follow the “official workflow” if it does the job - but not once have the results been as effective as simply writing a Python script and only using the stdlib (or hacky ad hoc solutions for vendoring 3rd party dependencies).

Oh, and you’ll also see me regularly arguing against stripping down the stdlib, and quoting single-file scripts as one example of why “using 3rd party dependencies” isn’t as easy as people would have you believe. Again, I usually get shot down with “just use a virtualenv” (as if I wasn’t aware of that option…)

(BTW, sorry for the excess of emphasis and boldface. I’m getting really tired of repeating myself here.)


I’d like that, yeah: conducting user interviews outside the self-selecting group here, e.g. with people in Python groups at various companies, educators, admins?

Of course making it a Python packaging PEP elevates it to be one of the “official” ways, at least in the eyes of the users. That’s what I meant with “adding another way”. I want to know if the use case warrants making it a standard like this and have not been convinced so far from the arguments here.

Hey now, some self-reflection is important here, so let me elaborate on what I mean: given our many hats and deep experience with the packaging internals, I don’t think it’s easy to have a beginner’s mindset sometimes. Whether “real-world” is the right term for that, I don’t know, and I didn’t have any ill intent. I meant to say that it’s worth speaking to people using Python who aren’t in the Python packaging bubble sometimes.


Feel free to do that. I don’t have the resources or contacts for that. Make sure to represent the PEP fairly, though - as I’ve already noted, some of your interpretations are simply incorrect.

And I’d question why it’s necessary for this PEP (which is literally just standardising some existing behaviour) rather than any other one? To my knowledge, no packaging PEP in the past has ever been backed by a user interview process like you’re suggesting[1]. Please do point out examples if I’m wrong.

  1. We’ve had many examples of users complaining when pip has implemented agreed PEPs - clearly those users didn’t get asked about the PEPs in question… ↩︎


The way I solve this problem (and perhaps many other users, too) is simple: I don’t use virtual environments. So I pip install everything that I need for such scripts in my main (not system) python and then everything… just works. (I agree that knowing the dependencies is useful even in my case, but using something like pip-run seems heavyweight.)

This is a case where venvs are a classic non-solution to a non-problem, except in the edge case of conflicting dependencies (see also my rant here). But those are not really solved by this proposal, since it doesn’t specify version numbers.


That might be a part of the problem. :wink:

I’m only half-joking there, and I’ll also caution that we shouldn’t really take an all-or-nothing approach: doing a round of UX research isn’t “free” and we ought to be cognizant of that. At the same time, doing a round of UX research isn’t a panacea either; there are definitely ways to do it that are meaningfully worse than not having done it in the first place.

While I agree that having a better sense of what users want/need is definitely something we want & need (and we have some of that through the survey conducted), we also know that there’s knowledge above and beyond what’s in the surveys, collected through interacting with people and projects directly. Having a round of user interviews done would benefit a spec like this and (assuming positive feedback) make it more likely to succeed, but not doing that shouldn’t be a blocking concern.


Thanks for the extensive response, I appreciate it and will have to chew a bit on our misalignment regarding the problem space.

Oh, actually, let me write down my thoughts on this conda/PyPA compatibility issue quickly:

I don’t worry about the PEP related to conda specs any more than before since it’s essentially the same mapping issue between PEP 508 specifiers and the conda matchspecs, as we have elsewhere. Tools like conda-lock have been able to work on rectifying that, and I believe the mapping issue could also be solved for conda users if there is a need (which is a big if, which has been discussed elsewhere).

The only wrinkle is a UX issue around the already existing conda run which behaves differently than the PoCs (pip-run and pipx run) mentioned in the PEP: conda run requires specifying a conda environment by name or path and only works with executable scripts/binaries installed in that environment. It might be possible to retrofit it with a --file option and default to a temporary environment if no env name or path is given. Let’s see how the PEP goes before we dive deeper into that.

I haven’t looked at the existing pip-run implementation, but it’s possible it could be extended with the option of specifying the name of a persistent environment (venv) to reuse directly, or to create if it doesn’t exist, which sounds like it would bring it at least superficially closer to the conda run behavior.


The way I solve this problem (and perhaps many other users, too) is simple: I don’t use virtual environments. So I pip install everything that I need for such scripts in my main (not system) python and then everything… just works. (I agree that knowing the dependencies is useful even in my case, but using something like pip-run seems heavyweight.)

This is a case where venvs are a classic non-solution to a non-problem, except in the edge case of conflicting dependencies (see also my rant here). But those are not really solved by this proposal, since it doesn’t specify version numbers.

It seems orthogonal to the use of venvs, at least to me. I personally would actually use it to determine what needs to be in the venvs I create on the fly for my random one-off scripts. As I mentioned earlier, this idea is something I (and many others, I think?) am already doing, but it offers a standardized format for listing the Python package dependencies of a script within the script itself, so that creating the environment that script needs (either manually in advance, or in an automated fashion on the fly) is a task we can collaborate on interoperable solutions for.