f-strings, t-strings, d-strings… The idea is to go meta on prefixed strings and get func-fix strings, where @<name><string literal> is equivalent to the function call name(<string literal>).
so:
...
d = textwrap.dedent
...
print(@d"""\
This little piggy
Little Bo Peep\
""")
This won’t work because different string literal types require different parsing and compiler rules at the language level.
For f/t-strings, the contents of the curly braces are expressions that must be eagerly evaluated in the scope where the f/t-string is defined.
For d-strings, if the line-continuation marker \ is given significant meaning, as is currently being discussed, that would also require a different parsing rule, so your proposal would not work for them either.
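To illustrate the f-string point, here is a small (purely illustrative) example: the braces hold arbitrary expressions that the compiler turns into code evaluated at the point of the literal, something a generic @name prefix applied to an opaque string literal could not know about.

def greet(name):
    # The expression inside the braces is compiled and evaluated here,
    # in greet's local scope, not deferred to some wrapper function.
    return f"Hello, {name.upper()}!"

print(greet("world"))   # Hello, WORLD!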
You can do this today by writing name(<string-literal>).
It’s the most elegant possible syntax for it!
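For example, the dedent case from the proposal already reads fine as a plain call (textwrap.dedent is real; the indentation here is only there to give dedent something to strip):

import textwrap

d = textwrap.dedent

print(d("""\
    This little piggy
    Little Bo Peep\
"""))
# prints the lines with the common leading whitespace removed:
# This little piggy
# Little Bo Peep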
Seriously though: I’m really struggling to understand the current fad for “lots of slightly different string prefixes”.
In the PEP 750 discussion I opposed allowing (1) some_html = html'<br/>', on the grounds that it's far more beneficial to write it as (2) some_html = html(t'<br/>'). Your suggestion is approximately the same as advocating syntax (1); the only differences are that it applies to non-t-strings and adds an @.
String prefixes are for the Python parser. When you just want to turn a string into something else, use a function call or a class constructor.
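For contrast, here is a minimal sketch of what option (2) can look like, assuming the string.templatelib API accepted in PEP 750 (Template and Interpolation, Python 3.14+); the html function and its escaping policy are hypothetical illustrations, not a real library:

from html import escape
from string.templatelib import Interpolation, Template

def html(template: Template) -> str:
    # Static text passes through untouched; interpolated values get escaped.
    parts = []
    for item in template:
        if isinstance(item, Interpolation):
            parts.append(escape(str(item.value)))
        else:
            parts.append(item)
    return "".join(parts)

user_input = "<script>boom()</script>"
some_html = html(t"<p>{user_input}</p>")
print(some_html)   # <p>&lt;script&gt;boom()&lt;/script&gt;</p>

The point stands: html here is an ordinary function the reader can look up, test, and replace, while a new prefix would have to be baked into the parser.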