I'm not a lawmaker, but it's probably pretty hard to write a law that effectively distinguishes between "doing X" and "doing X at scale". As another commenter mentions, if you target the means of doing it (human doing X vs. machine doing X), someone will just use Mechanical Turk or something to hire 10,000 humans to do X.
If telling AI to study Spiderman and then output 10 pictures of Spiderman is illegal, how is that different from hiring 10,000 artists to study Spiderman, having them each do a drawing, and then hiring 100 talent judges to pick the top 10?
Isn't the latter already copyright infringement? In that case, shouldn't both be forbidden?
I think the more immediate issue with AI is that it's like having access to a close-to-zero-cost worker who doesn't care whether the content they produce would, had a human made it, be considered copyright infringement. They care so little about copyright (and other data rights) that they're basically incapable of even warning you when their output comes close to an existing character or living person.
I don't know how this is going to play out, but right now we're getting a more polite re-run of the Luddites smashing early industrial equipment. I can sympathise with the loss of purpose and economic disenfranchisement, but the economic power in that revolution went to those who did the most automating, and I expect the same to be true this time.
Not at all. We do it all the time. E.g. it's legal to eat fruit found in a National Park, as much as a single person can consume. It's illegal to harvest and take away anything more than that.