I just merged an addition to Loupe’s contribution guidelines that bans the use of AI generated content in contributions.
For now, I came up with the following text. I’m in favor of adopting a similar policy for all official GNOME software.
Use of Generative AI
This project does not allow contributions generated by large language models (LLMs) and chatbots. This ban includes, but is not limited to, tools like ChatGPT, Claude, Copilot, DeepSeek, and Devin AI. We are taking these steps as a precaution due to the potential negative influence of AI-generated content on quality, as well as likely copyright violations.
This ban on AI-generated content applies to all parts of the project, including, but not limited to, code, documentation, issues, and artwork. An exception applies to purely translating the text of issues and comments to English.
AI tools can be used to answer questions and find information. However, we encourage contributors to avoid them in favor of existing documentation and our chats and forums. Since AI-generated information is frequently misleading or false, we cannot provide support for anything referencing AI output.
This text is inspired by Servo’s contribution guidelines. However, I wanted to keep it more concise for Loupe’s guidelines.
At the same time, there are many more reasons why I am in favor of avoiding AI. They go far beyond the often-cited environmental cost and the ethical issues of companies capitalizing on open source work and the work of many artists and other creators, without permission or compensation. Basing much of what a society does on AI will also cement many structures of oppression against marginalized people. While these aspects might be more relevant in areas where AI decides who gets a job, what medical diagnosis someone receives, or who becomes a criminal suspect, I think it’s important to take a categorical stance against the use of AI, except in areas where it immediately helps people with their lives.
Edit March 29 2025: Added “, but is not limited to,”