I propose to add something like this to our AI policy. While I wrote the original version of this, I asked Copilot to review it, and got what was (to my eyes) very valuable feedback: that I have “too much experience” to make the really scary issues clear to newer programmers. So I gratefully accepted most of the bot’s suggested rewordings, and rejected some others. Any errors that remain are entirely mine. I am, deliberately, trying to change our policies to actively encourage full disclosure.
Copyright and Responsible Use of AI Tools
AI assistants can be helpful when drafting code or documentation, but they also introduce specific risks that contributors must consider. These tools generate output based on patterns learned from large training datasets, and may produce material that is derived from copyrighted or licensed sources. Such derivation can occur even when the resulting text or code is extensively rewritten or does not resemble the original in form or wording.
Because the PSF can only accept contributions that the contributor has the legal right to license, all submitted material must be free of copyright or licensing conflicts, including material produced with the assistance of AI tools. Contributors are responsible for ensuring that their submissions meet this requirement.
To support effective review, please clearly mark any parts of a contribution where an AI tool materially influenced the content. A brief comment is sufficient. This allows reviewers to apply additional scrutiny where it is most needed and helps reduce the risk of unintentionally incorporating derivative material.
AI assistance is permitted, but it requires careful and transparent use. Clear disclosure helps maintain the integrity of Python’s codebase and ensures that contributions can be safely accepted under the project’s licensing terms.