Yes that’s totally reasonable. I wouldn’t personally be upset if it wasn’t possible for Cython to support on Python 3.15 (for example).
If it were possible to implement using only Python API (and it sounds like it might be…) then that would also work well for an initial release and the faster C API could come later.
In short - I don’t think Cython (or similar) matters much for the PEP. We’ll have to implement it eventually but it’s pretty tangential to us. And API access can come later and doesn’t need to be the PEP.
Taking the other angle, Cython itself uses function-local imports a lot because its large modules suffer from circular dependencies. Being able to postpone a global import until it’s used would certainly help clean that up. However, the PEP won’t help here for a while, because resolving circular imports is not an optimisation but genuinely requires lazy imports, and thus isn’t backwards compatible with the proposed opt-in mechanism. So we won’t use the feature for years to come, until we drop support for Py3.14. (We may try to fix the actual problem of having circular imports in the first place, but that’s unrelated to this discussion.)
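For readers who haven’t seen the pattern, the function-local workaround looks like this (a minimal stdlib sketch; `render` is a made-up name):

```python
def render(data):
    # The import only runs when render() is first called -- the classic
    # workaround for import cycles (and for deferring a heavy dependency).
    # With PEP 810 this could instead become a module-level
    # `lazy import json`, keeping the import visible at the top of the file.
    import json
    return json.dumps(data)

print(render({"a": 1}))  # {"a": 1}
```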
Yes, but I also think that the same would end up applying to a small package or any nontrivial script as well. Basically, once you have lazy imports working, why would you want to use eager imports anywhere?
I think that where you will end up is that you just want to mark all imports as lazy, but that means needing to list all of them in __lazy_imports__ and then keeping that list in sync. If the PEP were proposing to make lazy imports the default in future, it would be introduced behind a future import, so you could just add one line at the top of every module/script:
from __future__ import lazy_imports
It would be confusing to use a future import for something that is not proposed to change in future, but the same point applies: ideally you would add a single line to enable this, and something like the suggestion from way above would give that:
__lazy_imports__ = ['*']
To answer the first question, I think that you would not prevent it. Any public function will reify some portion of the library that it needs. You want to rely on reification to figure out which bits are actually needed.
For the second question, I think it could be very difficult to “prove” compatibility. Every object in every module is a potential starting point for reification, so this is something you would want to analyse statically rather than trying to test everything. I’m not sure yet how to define simple rules such that reifying any possible object, or any sequence of objects in any order, would be guaranteed to work.
To me the premise of this line of questioning seems flawed. If I understand the concern correctly, it’s that if you need to explicitly mark imports as lazy then a project that has a bunch of imports that all work fine lazily will need to put in a bunch of effort editing the imports. But I’d assume that generally, the complicated part of updating a package to be compatible with fully lazy imports is restructuring and testing the code. As Brénainn and you already pointed out, it’s not even clear how to actually test for this. On the other hand, replacing imports with lazy variants can be done with a simple search-and-replace. And while keeping __lazy_imports__ in sync with the imports in a file can be annoying, it can be checked for with a basic linter setting. We’re already fine with __all__ having the same issue.
We’ve already had a similar PEP with lazy imports being the default rejected because it’s fairly complex to change the entire ecosystem to that. I don’t see the need to add some kind of backdoor into that implicitness just because it might save some basic text editing in some, presumably rare, cases.
Howdy. I’ve come to add my voice of support for this proposal, with a story about how it would benefit my use case specifically. Most of the development I do is deployed on AWS Lambdas, usually a backend built with FastAPI, with all routes handled by a single Lambda (commonly known as a “lambdalith”). Being able to defer imports from cold-start execution to when a route is actually matched and executed would benefit me greatly. So much so that I would happily accept the trade-off of having to rewrite a fair bit of how I’m currently handling imports and “lazy loading”.
That being said, I hope the implementation of this feature gives absolute certainty, at the time I type it out in my IDE, about whether my lazy import will in fact be lazy. I would therefore appreciate it if the design of this feature could add a section on how Python language servers would be able to pick that up.
I’m currently in a similar situation to the post above (FastAPI in a serverless environment). PEP 810 is an exciting boon.
Has the PEP author team considered inviting commentary from maintainers of popular web-server dependencies with heavy runtime type introspection? E.g., Pydantic and SQLAlchemy. They both do a fair amount of that at import time, both in their package internals and in their model declaration APIs (in user modules).
Based on this PEP’s reification approach, my educated guess is that PEP 649 and PEP 810 together can very likely shift plenty of work from import time to first use. But it would be great to hear that from the maintainers of these packages as well.
This inquiry is quite off-topic and I certainly don’t expect an answer from the PEP authors, but since Cython has already been brought up: can anyone assess whether mypyc will be capable of compiling Python modules that make use of lazy imports?
PEP 810 lazy imports won’t be triggered by basic class definitions but will be triggered if a decorator or a base class attempts to retrieve the annotations.
For example, @dataclass will trigger the imports for any annotations that use lazy imports as it calls get_annotations(..., format=Format.FORWARDREF). Similarly Pydantic calls this function in the metaclass for its BaseModel so it too will trigger the imports on class creation. Anything which analyses annotations heavily in class construction is likely to do the same.
Demonstration with dataclasses
```python
import sys
from dataclasses import dataclass

lazy import typing as t

class Example:
    a: t.Any

print("typing" in sys.modules)  # False

@dataclass
class Example2:
    a: t.Any

print("typing" in sys.modules)  # True
```
This would be status quo based on my understanding, yes.
My point is rather to seek commentary from these package communities on whether they can take advantage of PEP 649 and PEP 810. More specifically:
If they can likely defer work that calls get_annotations away from import time, riding on the interaction between PEP 649 and PEP 810
If they see value in doing so.
If PEP 810 as proposed would be enough to enable them to do so.
If not, what changes PEP 810 would likely need now, or later as a future iteration.
As far as I can tell, there is plenty of context from the Scientific Python community rolled up into this PEP and its predecessor, PEP 690. But the PEP seems to have a light touch on the runtime (annotation) introspection side of Python that is more commonly seen in web applications.
Thanks for bringing this point here. I’m one of Pydantic’s maintainers, and PEP 810 could probably solve some annoying issues arising from circular imports. I had an initial proposal for “typing imports” with runtime support, but lazy imports provide a generalized solution that fits better, so a big thanks to the authors of this PEP!
Technically, with the introduction of deferred annotations in PEP 649, we could lazily build the Pydantic models only when actions are performed on them (instantiation/validation, using model_json_schema(), etc.). In practice, we still have to eagerly evaluate annotations because we can’t assume every user is on Python 3.14, or that they don’t make use of string annotations [1]. But yes, with some work, we should be able to fully leverage PEP 649.
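As an illustration only (this is not Pydantic’s actual code), the “build on first use” idea can be sketched with a metaclass that reads annotations lazily:

```python
class LazySchemaMeta(type):
    # Build the "schema" the first time the class is instantiated rather
    # than at class-creation (import) time.  Under PEP 649 (3.14+), the
    # __annotations__ access below is also the point where deferred
    # annotations would actually be evaluated.
    def __call__(cls, *args, **kwargs):
        if "_schema" not in cls.__dict__:
            cls._schema = dict(getattr(cls, "__annotations__", {}))
        return super().__call__(*args, **kwargs)

class Point(metaclass=LazySchemaMeta):
    x: int
    y: int

    def __init__(self, x, y):
        self.x, self.y = x, y

assert "_schema" not in Point.__dict__        # nothing built at import time
Point(1, 2)
assert Point._schema == {"x": int, "y": int}  # built on first use
```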
Regarding lazy imports, I believe they will be beneficial to work around circular imports. Right now users usually have to declare one of the import statements under an if TYPE_CHECKING: block, and somehow make Pydantic aware of the actual import somewhere else [2].
With some changes (we make assumptions about sys.modules[...].__dict__ being safely accessible in several places), we would be able to leverage PEP 810 to greatly simplify circular imports. I haven’t taken a deep look at the PEP specification, but I can’t think of any changes that would be needed for such use cases.
We have some additional logic to resolve annotations from inner functions’ locals, which explains why this is done eagerly. ↩︎
This is usually done using the model_rebuild() method, where a types namespace can be provided. ↩︎
As someone who has burned out on a topic, I can say that it generally means you are not coming back to it. And that’s especially true with something as big and “heavy” as this PEP and topic.
Your other option is to have the people doing the work for the community (writing this PEP, managing this discussion, and doing the coding) not admit who their employers are and what the benefit is to said employers (employers who also help fund other improvements to the language that we all benefit from, BTW). And since the Python core team prefers transparency, they are being transparent. It is up to the SC to decide whether this feature serves specific companies more than it would the community overall, which legitimately does include corporations with large codebases.
And for transparency: I have contributed to Python in my spare time, as an academic, as a contract worker for the PSF, and while working for major corporations. I trust the PEP authors not to be trying to dupe us into a feature that is detrimental to the language for the benefit of their employers, and the SC to be watching out for this as well. I also know at least Pablo, Thomas, and Dino well enough to know they would at least refuse, if not quit, if their employer asked them to do something nefarious.
I realize some folks feel like this is “rushed” because this specific topic was open for less than two weeks before the PEP authors contemplated submitting it to the SC (they have since given folks another week or two). But remember:
This is the 2nd PEP on this topic.
The PEP authors have implemented this at their workplaces and have experience with the ramifications of lazy imports.
This was discussed with other core developers at the core dev sprints (e.g. at least me).
I think in this particular case, the PEP lays out everything well enough that there should not be compatibility concerns. There may be other concerns people have, but the lack of any reason to think this could break existing code without changes (i.e., someone using the advanced feature prior to a library being updated for it) makes many of the “this is rushed” concerns carry less weight for me.
(The rest of this post below is only for those who believe I made an argument to slow this PEP; that’s not my stance, and if you don’t care about that, please just scroll past the rest.)
I somewhat did, but not to the extent of slowing this PEP’s acceptance. The implication of what I wrote about this not breaking well-behaved code is that, unlike some other recent issues, nobody has shown a concern about negative impact not covered by the PEP itself. There are many open questions here, like the C API, who can use it, and whether this is enough or just a start. But because it has navigated around the issues raised in the last round, this PEP stands on exceptionally well-explained grounds and appears to avoid all issues that would negatively impact existing code bases, by having no effect on unchanged code and interpreter invocation.
This PEP comes at a rather unfortunate time, through no fault of the PEP authors: a few other parts of the language were recently changed in ways that gave existing code new issues, so I think people are in general more alert to this possibility and may be overcorrecting in this case. I’m trying not to undermine anyone’s thoughts on what’s important in saying this, but it’s a lot harder to equate a change that should have no user-visible effect on unchanged code with those other recent disagreements.
I would be appreciative of process changes that help prevent both negative outcomes that have come up (in either direction), but I don’t know if I’m the right person to propose them[1], and while I have some mild agreement and can see some negative effects of the current “1 week” general rule, I’m not fully convinced that extending it in all cases would be beneficial.
I believe my experiences don’t match the general case here, as I’ve been content to spend years working on intersections for python’s type system and have felt that people are trying to accelerate that proposal before we have answers to certain important questions. ↩︎
I deeply appreciate @brettcannon’s trust, but in case anyone has any doubt I want to be crystal clear: I will absolutely quit my employer (current or future) if I were ever asked to push something fishy or detrimental to Python or the community just because it benefits them. No hesitation. This is a line I will not cross, period. And frankly, if you look at my track record, I regularly do things that make life harder for my employer or even my own work. For instance, when Mark and I implemented Python-to-Python inline calls, that meant months of extra work to adapt our internal tools. And that’s just one of many examples.
That said, I also want to address the corporate-involvement concerns directly, because there’s a fundamental misunderstanding here. The implementation in this PEP is fundamentally different from any other implementation, including all the implementations made by my co-authors at their respective companies. Meta, Google, HRT, and other major companies already have working lazy-import implementations of their own; they don’t need this PEP. Those implementations are often more aggressive and tailored specifically to controlled environments where they can make assumptions that simply don’t hold in the general Python ecosystem. They control their entire stack and have already solved their problems by forking CPython entirely.
This PEP is about bringing a well-designed, community-focused solution to everyone else: application developers who don’t have the resources to maintain CPython forks, the Scientific Python community that’s been asking for this, CLI tool developers dealing with startup-time issues, and anyone who wants to opt into lazy imports without rolling their own solution. You can check all the enthusiastic comments from users in this thread, from all sorts of backgrounds. In just two weeks this is already the most-commented PEP thread ever on discuss.python.org, and the sentiment is (in my own view) extremely positive. We designed this with the many similar use cases across the community in mind, not for any single corporation’s benefit.
And frankly, if you look at this thread and other forums, many people have been consistently asking for more aggressive versions where lazy imports would be the default behavior. Since making this the default isn’t possible, as we know from the previous PEP and the compatibility issues, the global mode and the filter function are specifically designed to give application developers the tools to achieve similar results in their own codebases when they need it. We deeply believe this addresses a real use case that a subset of the user base has been asking for, not just something companies wanted. And that’s why it’s there. You can of course disagree and say here that you dislike these features, and that’s completely fine (reasonable people can disagree on design decisions). You can be sure the Steering Council will take these opinions into consideration when evaluating the PEP.
Do we expect companies with existing forks might eventually adopt this? Yes, because we designed something better for everyone, not something that exclusively serves them. And I think almost everyone would agree that having everyone working on the same Python, instead of maintaining fragmented forks, is better for the entire ecosystem: improvements benefit everyone, bugs get fixed once, and the community stays unified. Also, as a minor note, and as @brettcannon mentions, companies with large codebases are legitimate Python users too, and bringing their use cases back into mainline Python strengthens the language for all of us. The alternative to this PEP isn’t “corporations don’t get lazy imports”: it’s that they continue maintaining fragmented forks while the rest of the community lacks this capability. That serves no one. And just to be clear: avoiding these forks is far from being the major point of the PEP.
I hope this clarifies things. Continuing to debate corporate motivations isn’t productive for evaluating the technical merits of this proposal, and I’d much rather focus the discussion on whether this feature serves the Python community’s needs and how to make it the best proposal possible.
There was never any doubt in my mind about that, I didn’t intend to impugn the motives of the PEP authors in any way, and I’m sorry if my message came across that way. As I said, I think the PEP is good. I was just responding to a few posts[1] that implied that that particular corporate audience was being kept particularly in mind.
Oh, Brett, no! I wouldn’t quit! I would refuse, yes, as I’ve made it clear throughout my hiring and my tenure that I do not and will not represent the company that way, even for innocent, non-nefarious changes. But I wouldn’t quit over that. They have no leverage over me. They’d have to fire me instead, and that matters because I’m not in a US “at-will” employment situation.