The idea is simply to create a new type called DefRef that produces classes representing deferred references at run time. Combining the new type with two context managers that produce these classes results in a small, mostly backportable API that allows for consistent evaluation both statically and at runtime.
DefRef is a simple class that tracks how it was created and produces more DefRefs upon attribute access. DefRef classes would be interpreted as themselves everywhere in Python. Static and runtime type checkers, or even the get_type_hints function itself, can then resolve a DefRef to the implied type in annotations. This allows them to be used easily not only in CPython but in other implementations, with consistent and easy-to-comprehend results. No weird depends-on-when-you-look-at-it side effects; just a consistent, real object.
A simple example:
# DefRef is a metaclass
a = DefRef("a", (), {})

class a:
    var: a          # evaluated while `a` is still the DefRef

class b:
    var: a          # evaluated after `a` is the real class

a                           # class a
a.__annotations__["var"]    # DefRef(name="a")
b.__annotations__["var"]    # class a
a is replaced in the module immediately, but class a's annotation does not get updated. This allows for easy-to-understand behavior. Further, it means Python's normal scoping rules apply, and you will not have to mentally keep one set of scopes for annotations and another for normal Python code.
Using context managers to control the creation of DefRefs allows Python's default scoping rules to be used while producing types that static type checkers can explicitly understand.
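To make the behavior above concrete, here is a minimal sketch of how such a DefRef metaclass might work. Every detail (attribute names, repr format) is my assumption, not the actual implementation:

```python
class DefRef(type):
    """Sketch: a metaclass whose instances are placeholder classes that
    remember how they were created and track dotted attribute access."""

    def __new__(mcls, name, bases=(), ns=None):
        cls = super().__new__(mcls, name, bases, dict(ns or {}))
        cls._defref_name = name  # record how this placeholder was created
        return cls

    def __getattr__(cls, attr):
        # Attribute access yields another DefRef tracking the dotted path.
        if attr.startswith("_"):
            raise AttributeError(attr)
        return DefRef(f"{cls._defref_name}.{attr}")

    def __repr__(cls):
        return f"DefRef(name={cls._defref_name!r})"
```

With this sketch, `a = DefRef("a", (), {})` behaves like the example above: annotating `var: a` before the real `class a` exists simply stores the placeholder class, a perfectly ordinary object, in `__annotations__`.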
The first context manager (and likely the main one) is defer_imports. It temporarily hijacks the import statement to produce DefRefs instead of actually importing the module.
from defref import defer_imports

with defer_imports(globals()):
    from json.encoder import JSONEncoder as jsonenc
    import json
After this, json is not the module json but instead a DefRef that can be used in annotations. Unlike PEP 649, this actually resolves the large number of circular import issues that currently exist. Further, it preserves the import information so it can be used at runtime.
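A rough, self-contained sketch of the hijacking mechanism as I understand it (the inline DefRef is a minimal stand-in, and all names here are my assumptions, not the WIP code):

```python
import builtins
from contextlib import contextmanager

class DefRef(type):
    # Minimal stand-in for the DefRef metaclass described earlier.
    def __new__(mcls, name, bases=(), ns=None):
        cls = super().__new__(mcls, name, bases, dict(ns or {}))
        cls._defref_name = name
        return cls
    def __getattr__(cls, attr):
        if attr.startswith("_"):
            raise AttributeError(attr)
        return DefRef(f"{cls._defref_name}.{attr}")
    def __repr__(cls):
        return f"DefRef(name={cls._defref_name!r})"

@contextmanager
def defer_imports(ns):
    """Temporarily replace builtins.__import__ so `import x` binds a DefRef
    recording the module path instead of executing the import. The namespace
    argument mirrors the proposed API; this sketch does not use it."""
    real_import = builtins.__import__
    def fake_import(name, globals=None, locals=None, fromlist=(), level=0):
        # `from a.b import X` getattrs X off the returned stub, yielding
        # DefRef("a.b.X"); a plain `import a.b` binds the top-level package.
        return DefRef(name if fromlist else name.partition(".")[0])
    builtins.__import__ = fake_import
    try:
        yield
    finally:
        builtins.__import__ = real_import
```

The swap-and-restore in the try/finally is the "hackily replacing __import__" approach mentioned later in this post; a production version would need to handle relative imports, caching, and re-entrancy.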
The second context manager is for representing in-module DefRefs. Currently I am calling it a deferral window context; the function is defer_window.
with defer_window(globals(), locals()) as defer:
    a, b = defer("a", "b")
Ideally, if a and b are NOT defined before __exit__ is called, an exception is raised. The DefRefs a and b would resolve to whatever non-DefRef objects are bound to the names a and b in the local scope when __exit__ is called.
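A minimal sketch of what defer_window could look like under those rules. The resolution strategy (checking the namespace on exit, raising NameError for still-unbound names) and every attribute name are my assumptions:

```python
from contextlib import contextmanager

class DefRef(type):
    # Minimal stand-in for the DefRef metaclass described earlier.
    def __new__(mcls, name, bases=(), ns=None):
        cls = super().__new__(mcls, name, bases, dict(ns or {}))
        cls._defref_name = name
        cls._defref_resolved = None   # filled in when the window closes
        return cls
    def __repr__(cls):
        if cls._defref_resolved is not None:
            return (f"DefRef(name={cls._defref_name!r}, "
                    f"resolved_to={cls._defref_resolved!r})")
        return f"DefRef(name={cls._defref_name!r})"

@contextmanager
def defer_window(globalns, localns=None):
    """Hand out DefRef placeholders; on exit, require every deferred name
    to be bound to a real object and record it on the placeholder."""
    ns = localns if localns is not None else globalns
    handed_out = {}
    def defer(*names):
        refs = tuple(DefRef(n) for n in names)
        handed_out.update(zip(names, refs))
        return refs
    yield defer
    for name, ref in handed_out.items():
        value = ns.get(name)
        if value is None or isinstance(value, DefRef):
            raise NameError(f"deferred name {name!r} was never defined")
        ref._defref_resolved = value
```

At module level this would be used exactly as in the examples here, `with defer_window(globals(), locals()) as defer: ...`, since the module's locals dict is live.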
with defer_window(globals(), locals()) as defer:
    a, = defer("a")

    class c: ...

    class d:
        var: a

    a = c

d.__annotations__["var"]  # DefRef(name="a", resolved_to=class c)
This allows for predictable results that do not depend on when inspect or get_type_hints is called, something neither PEP 649 nor PEP 563 can offer. It also works with locally renamed variables.
A full example of code that would run and give deterministic results would be as follows.
from defref import defer_imports, defer_window

with defer_imports(globals()):
    from json.encoder import JSONEncoder as jsonenc
    import json

with defer_window(globals(), locals()) as defer:
    a, b, c = defer("a", "b", "c")

    class a:
        name: b
        val: c

    class b:
        val: a

    class c:
        main: jsonenc
        mod: json.JSONDecoder
This is currently a WIP. Currently working: import hijacking, scoping rules, context managers, and resolving non-import deferred references. Import hijacking relies on hackily replacing the __import__ builtin and then swapping it back afterwards. I am aware the docs say not to do that; I am fairly confident any issues with it can be overcome, though, and it was simply the easiest way to wholesale replace import. I would also like to add some form of caching for the modules so that it does not end up creating a bunch of unneeded extra objects, and to prevent unnecessary delays in resolving deferred references. It would probably also be beneficial to add some way to distinguish between deferred modules and deferred classes.
One other item I would like to add that would, IMO, greatly assist runtime type checking is the ability to request a callback once a type is resolved. This would prevent the need to constantly fire functions that blindly check whether something is resolvable yet, which is how the problem is handled right now. DefRefs are intended to more closely resemble weakrefs than forward refs.
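The callback API might look something like the following sketch. The names `on_resolve` and `resolve` are hypothetical, my guess at an interface in the spirit of weakref callbacks, not anything that exists yet:

```python
class DefRef(type):
    # Minimal stand-in for the DefRef metaclass described earlier.
    def __new__(mcls, name, bases=(), ns=None):
        cls = super().__new__(mcls, name, bases, dict(ns or {}))
        cls._defref_name = name
        cls._defref_resolved = None
        cls._defref_callbacks = []
        return cls

    def on_resolve(cls, callback):
        # Like a weakref callback: fire when the reference becomes real.
        if cls._defref_resolved is not None:
            callback(cls._defref_resolved)   # already resolved: fire now
        else:
            cls._defref_callbacks.append(callback)

    def resolve(cls, value):
        # Called by the window/import machinery once the real object exists.
        cls._defref_resolved = value
        for cb in cls._defref_callbacks:
            cb(value)
        cls._defref_callbacks.clear()
```

A runtime type checker could then register a callback once per deferred annotation instead of re-polling every annotation on every check.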
There is another benefit I have not discussed yet: by restoring normal scoping rules, in-function imports will once again be a tool to prevent unnecessary imports. PEP 649 does not solve this problem.
with defer_imports(globals()):
    import expensive_module as em

def rarefunction(a: int) -> em.internal:  # annotation is DefRef(expensive_module.internal)
    import expensive_module as em
    em.internal  # not a DefRef
This is also adoptable alongside PEP 649, though I think that would realistically make the code far harder to understand. The code is currently around 150 lines of Python without docstrings; it probably needs another 200 lines or so to cover the remaining features. It requires zero modification to the interpreter and should be nearly fully backportable, which would really help adoption in codebases targeting older Python versions. For a non-CPython implementation, depending on what can be done with __import__, between zero and two methods would need to be replaced, both isolated to a single context manager, making this by far the most compatible solution.
This does, however, have more runtime overhead, as it goes back to using real objects instead of, well, not using real objects. I am really not sure I have a good grasp of how much of an issue that actually is. I would imagine that in the most time-sensitive code bases, command line programs, the benefit of not importing far outweighs the cost of creating the extra objects.