These are more about giving maintainers the ability to require 2FA for their project's collaborators, not about PyPI itself requiring some subset of projects to use 2FA, as npm is doing.
I also don’t think the latter is something that we’re currently able to do: not for any technical reason, but because PyPI does not have a large support staff like npm/GitHub/Microsoft do. Account recovery requests due to lost 2FA are already a huge drain on staff/volunteer resources because of how time-consuming they are and their sensitive nature. A 2FA mandate at this time, without additional support staff, would likely result in an overwhelming backlog of requests.
But time-scoping them like that will still break workflows if your project isn’t that active. If I set up automated releases, but I only do a release a couple of times a year, then a month-long expiry on a token will kill that workflow.
I do agree with the sentiment of stopping accepting user passwords and account-wide tokens for uploads.
I agree with Brett. Time-scoping prevents some abuse, but it wouldn’t prevent immediate abuse of a token. If somebody gets hold of the project token for urllib3, then that person can do a lot of harm in a very short time.
Instead of time-scoping, I suggest an optional two-step release process. In the first step, automation uses a project-scoped token to create a new staging release and upload artifacts. In the second, a human needs to log in with MFA and press the release button. This workflow lets users automate all the annoying bits while still requiring MFA for the actual release.
In the case of projects I work on, where everything is driven from code review and continuous integration, we’d likely implement a job which confirmed the token was still valid immediately prior to attempting to use it, in order to block release requests from merging and triggering attempts to upload new artifacts to PyPI until $HUMAN does the thing with their fingers to generate a new token via 2FA and replace the old one in the job definitions. Is there a mechanism to test a token or otherwise check its validity?
> Instead of time-scoping I suggest an optional two-step release process. In the first step automation uses a project-scoped token to create a new staging release and upload artifacts. In the second a human needs to log-in with MFA and press the release button. The workflow allows users to automate all the annoying bits and still require MFA for the actual release.
That sounds like a great option, but definitely not something I’d want to see required for every project on PyPI. I work in a community which has volunteer release managers approving dozens (and sometimes hundreds) of release requests a week on demand. Requiring them all to have access to PyPI accounts and click buttons in a browser would be a step backwards.
Hypothesis also has a weekly automated PR, and “auto-merge if CI passes” is an obvious extension. I’d highly recommend both auto-releases and auto-weekly-maintenance to anyone; they’re lovely workflows.
Personally I’d like to keep using long-lived and non-interactive tokens, but would be happy to adopt other restrictions like “only allowed to publish new package versions with a later version number” to exclude attacks which add new wheels, or .post1 versions, or 1.9999 to get anyone who has pinned to pkg < 2.
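To sketch why a “later version number only” rule needs care: under PEP 440 ordering (here using the third-party `packaging` library, which implements it), a new-but-lower version string like `1.9999` or a post-release like `1.0.post1` is a previously unused version, so a naive “must be a new version” check would still accept it. The restriction has to compare against the highest version ever published. The project list below is illustrative:

```python
from packaging.version import Version

# PEP 440 ordering: both of these would land inside a "pkg < 2" pin range.
assert Version("1.9999") < Version("2.0")     # 1.9999 still satisfies pkg < 2
assert Version("1.0.post1") > Version("1.0")  # post-release sorts after its base

# Hypothetical release history for an example project:
published = [Version(v) for v in ("1.0", "1.9", "2.0")]


def upload_allowed(new: str) -> bool:
    """Strict rule: the new version must exceed every version ever published."""
    return Version(new) > max(published)
```

With this rule, `upload_allowed("1.9999")` and `upload_allowed("1.0.post1")` are both rejected even though neither version string exists yet, which is exactly the class of attack described above; `upload_allowed("2.0.1")` is accepted.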
More broadly, it sounds like we have a couple of good ideas that are largely blocked on (funding for) implementation in Warehouse. In particular, I don’t see much point discussing whether we should require MFA when projects can’t yet opt-in to enforcement!
I do strongly support blocking uploads from passwords or user-scope tokens, especially but not only for users with MFA enabled. Valuable even as an opt-in.
@pradyunsg - can I suggest adding MFA-and-token-related enhancements to the fundables page? They do seem to meet the criteria, and e.g. the OpenSSF might be interested.
That sounds perfect to me. Ideally I would like a UI where I can see what files are uploaded and their checksums and I can download them for sanity checking before signing off.
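As an illustration of that kind of sanity check (the file names here are hypothetical; the expected digests would come from whatever the staging UI displays), verifying downloaded artifacts against their published SHA-256 checksums is a few lines of stdlib Python:

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Hex SHA-256 digest of a file, read in chunks to handle large artifacts."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify(expected: dict[str, str], directory: Path) -> list[str]:
    """Return the names of downloaded files whose digest does not match."""
    return [
        name
        for name, checksum in expected.items()
        if sha256_of(directory / name) != checksum.lower()
    ]
```

A human reviewer could run `verify()` over the downloaded staging artifacts and only press the release button if it returns an empty list.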
I stopped just short of enabling automated uploads for SymPy releases because I didn’t quite figure out a good way to control access for which contributors would be able to trigger the upload process from GHA. When I considered this I was worried about accidental triggers as well as malicious ones. Controlling access at PyPI rather than GHA is much simpler to audit because the number of people with PyPI access is much smaller. I would also be happier working on the release script itself if I knew that the final publish step could not possibly happen automatically (debugging a script that I definitely don’t want to fully execute makes me a bit nervous).
I agree that this should be optional but it’s definitely an option I would choose.
The PyPI maintainers have long expressed a dislike for OpenPGP as a means of establishing provenance or attestation for packages. The accepted plan to solve this is instead integration of TUF (The Update Framework) per PEP 458.

Authentication of users to the platform serves a separate purpose; it can be used not only to upload new packages, but also to yank or hide existing ones, or to delegate access to other accounts. While I agree that some cryptographic attestation is desirable, it should be in addition to strong authentication of accounts, not a replacement for it.
PEP 458 and OpenPGP signatures solve two related but different issues. PEP 458 signatures are created by the PyPI backend and only protect PyPI’s storage and downloads from PyPI. PEP 458 does not establish provenance of packages either.
Is this something we could raise money for? We could appeal to donors/companies by saying: once we implement a better workflow and hire a paid support person for this, we can help project owners turn on multi-factor auth requirements and better secure PyPI, which reduces your supply-chain risk.
@smm maybe this is already in the works and I’m behind the times?
It’s essentially done; we just need to change the current notification of pending failure into an actual failure. Since we notified on Feb 25th, I think a ~6-month window suffices, and we can resolve this sometime after Aug 25th.
I’ll let @smm expand on the project more thoroughly, but the scope in this RFP, which is underway, is intended to build a more sustainable PyPI by delivering features that the PSF can provide to companies at a cost and to community organizations for free.
Part of this will require additional paid staff to support PyPI in the future, not only for paying customers but for the community. If PyPI’s revenue from Organization Accounts and additional developed features isn’t enough, the PSF will certainly look to fundraise to better support all users of PyPI; that is the basis of sustainability for the service.
And as more short-term relief, although once again only a small portion of the role, the upcoming Infrastructure staff hire will be brought up to speed to provide some of the support I currently do, to help ease the current bottleneck (me).