On Fri, Feb 20, 2026 at 9:18 PM Peter Dimov via Boost <boost@lists.boost.org> wrote:
> Klemens Morgenstern wrote:
> > On Fri, Feb 20, 2026 at 8:47 PM Peter Dimov via Boost <boost@lists.boost.org> wrote:
> > > So, basically, they asked the LLMs to commit copyright infringement, and they complied.
> > > I'm shocked.
> >
> > It proves that LLM output isn't inherently transformative;
>
> It doesn't need to be "inherently transformative". Wanting it to be "inherently transformative" is again asking for "legal protection", and that simply doesn't exist. It doesn't exist for people, and it doesn't exist for LLMs.
>
> > it's sort of an important point, because any work generated by an LLM then needs to be shown individually not to infringe copyright, since it could be a plain copy.
>
> That's not how it works. If that were how it worked, _you_ would also have needed to show, for every line of code you wrote, that it's not infringing copyright.
I think my language wasn't clear - I didn't mean in a court. If I write a piece of code, I usually have a good sense of what it's based on, so I can slap a license on it. With an LLM you just have no idea, because you cannot trace the output back to the input. And there is always input, since LLMs have no thoughts or ideas of their own. So we know the output is derived from something, i.e. it's never original, but we don't know how transformative it is, and there's no person to ask.
> Your output isn't "inherently transformative" either.
Correct - I could be infringing copyright by reproducing a copyrighted work from memory.