You're asking the wrong question. How can you prove that children were involved? In most developed jurisdictions, you must be proven guilty, not innocent.
> How does the model generate CSAM without it either being in the training material or fed to the model as an input?
How can a skilled artist (if forced to) draw CSAM without ever having seen any? How can anyone draw something they have never seen the exact likeness of before? Both humans and generative models can extrapolate from what they have learned.
It somehow generates flying pigs without them existing. Swap in another animal if your next argument is that flying pigs might be in the training data.
How can you prove that zero children were involved at any point?