ChatGPT does not “know” anything; it only knows what things “look like”. It can construct something that looks like a legal citation because it has seen enough examples to recognise the pattern, but it doesn’t actually know anything.
Any time it gives you facts, it’s giving you sentences where the statistical correlations between those words are higher than not, i.e. enough other people have said it that it’s probably true. But that’s the extent of its cleverness.
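To make the “statistically likely words” idea concrete, here’s a toy sketch. This is NOT how ChatGPT actually works (real LLMs use transformer networks over subword tokens, and the corpus here is made up for illustration), but the final step is the same in spirit: rank candidate continuations by how often they’ve been seen, and emit a likely one.

```python
from collections import Counter, defaultdict

# Toy "what word usually comes next" model (a bigram model).
# The corpus is invented legal-sounding filler purely for illustration.
corpus = (
    "the court held that the statute was valid "
    "the court held that the claim was barred "
    "the court found that the statute was void"
).split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def most_likely_next(word):
    # Pick the continuation seen most often after `word`.
    return follows[word].most_common(1)[0][0]

print(most_likely_next("court"))  # "held" — seen 2 of 3 times after "court"
```

Nothing here checks whether “the court held” is *true*; the model only knows that those words often appear together, which is exactly why plausible-looking but fabricated citations come out the other end.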
Actually, there is no such thing as originality. Unless you are writing about new events that no one has written about, everything you write will be duplicate content. If you do online research to write your content, it is duplicate content. The only difference is who produces it, a machine or a human.
This happens regularly enough with or without AI. AI just makes it much faster to do, though it’s often quicker to spot too.
But the notion that nothing is original unless you’re doing something entirely new is… complicated, because there are 8 billion people all having thoughts and ideas, and it’s surprisingly common for two or more to have the same idea independently of each other and then compete for the rights to profit off it later.
I also don’t think the world is that reductive about originality, as much as our corporate overlords would like it to be so.