I think you may be underselling what's going on here, because the entire point of AI-generated artwork is that it is based on some chain of relationships to the source material: there is no actual creativity involved.
So the AI is going to take what you give it and combine it with what others have given it, based on a bunch of similarity weights. If the AI is any good at its job, it will produce an image that is remarkably correlated with the undisclosed original, since that is exactly what AI is doing every time it's asked to do anything at all: "I have an incomplete thought/image in my mind; please produce a written or visual completion of it based on a model built on several billion best guesses from real-world data." And if the victim of the harassment has had the misfortune (or lack of foresight) to have their actual nude images somehow slip into the AI's training data, it is likely the AI will prioritize surfacing them in its output.
I do not think we have to seriously entertain the thought that a random 15-year-old will have nudes in the first place, much less have them end up in a training dataset.
Either way, the AI will now give you a body that is roughly correlated with what your head looks like.