Illegally, maybe. Immorally, probably not. It’s fine for a human to read something and learn from it, so why not an algorithm? All of the original content is diluted into statistics so much that the source material does not exist in the model. They didn’t hack any databases; they merely used information that’s already available for anyone to read on the internet.
Honestly, the real problem is not that OpenAI learned from publicly available material, but that something trained on public material is privately owned.
I’d say that’s the more controversial opinion. From a purist perspective, I tend to believe that intellectual property in general is not ethical and stifles innovation.
Is that really a problem? If I create something new based on public knowledge, should I not be able to profit from it?
If I learn to paint from YouTube, should I have to paint for free now?
I’ll admit that the scope of ChatGPT is MUCH bigger than one person painting.