The newest open-source concern around AI, and one drawing a lot of interest this weekend, is what happens when large language models / AI code generators rewrite large parts of a codebase and the “developers” then claim an alternative license incompatible with the original source license. This became a real concern this week when a popular Python project underwent an AI-driven code rewrite and was republished under an alternative license that its original author does not agree with and that is incompatible with the original code.
Chardet is a Python character encoding detector, and its v7.0 release last week was billed as a “ground-up, MIT-licensed rewrite of chardet.” The rewrite was largely driven by AI/LLM tooling and claims to be up to 41x faster while offering an array of new features. But with this AI-driven rewrite, the license shifted from the LGPL to MIT.
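For readers unfamiliar with what a library like chardet does: it guesses the character encoding of raw bytes. The toy sketch below is not chardet's actual algorithm (chardet uses statistical models and state machines, and returns a confidence score); it is only a minimal, stdlib-only illustration of the task, with a hypothetical `guess_encoding` helper:

```python
# Toy illustration of encoding detection (NOT chardet's algorithm):
# try a few candidate encodings and report the first that decodes cleanly.
# Real detectors like chardet use statistical analysis and confidence scores.
CANDIDATES = ["ascii", "utf-8", "utf-16", "latin-1"]

def guess_encoding(data: bytes) -> str:
    for enc in CANDIDATES:
        try:
            data.decode(enc)
            return enc
        except (UnicodeDecodeError, UnicodeError):
            continue
    return "unknown"

print(guess_encoding(b"hello"))                  # ascii decodes cleanly
print(guess_encoding("héllo".encode("utf-8")))   # ascii fails, utf-8 succeeds
```

Note that `latin-1` decodes any byte sequence, so ordering the candidates from strictest to most permissive matters.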


The LGPL makes it impossible to use it while keeping your code private or profiting off it. MIT, however, is much friendlier to commercial use, and corporate lawyers will permit it in the company ecosystem.
Basically, it completely neuters the express purpose of GPL-based licenses.
In this case, one of the stated goals was inclusion into the main Python distribution.
Okay, I haven’t read the article; I was just responding to the question about the licensing problem. From what I see in the brief summary here, this particular item is a rewrite of an existing work but was given a license incompatible with that work. That is different from whether its license is compatible with Python. I looked up the Python license.
So I see that point as relevant if inclusion renders Python no longer GPL-compatible. The real issue appears to me to be that AI makes it very easy to write a (theoretically clean-room) implementation of a product - in this case chardet.
The problem here is that what was once something that took real effort and dedicated developer interest to “clone” legally is now easier (perhaps trivial) to do and license differently. This would threaten the GPL model, which is to democratize software and keep it from being entirely owned by entities that could then restrict the software or otherwise destroy the value of competing products.
I’d say there’s a real problem here, as people’s significant efforts for the greater community could be co-opted and eventually rendered “pointless” when many people move away from a project to “improved” versions, or when the “new” versions add features that promote lock-in to their commercialized variant. Eventually that open software is no longer viable, and people have to use the proprietary one. I don’t know if that is necessarily how things would actually play out, but it would at least dilute the GPL-based licensing power.