If it's AI-generated, it is fundamentally not CSAM.
The reason we shifted to the terminology "CSAM", away from "child pornography", is specifically to indicate that it is Child Sexual Abuse Material: that is, an actual child was sexually abused to make it.
You can call it child porn if you really want, but do not call something that never involved the abuse of a real, living, flesh-and-blood child "CSAM" (or "CSEM", with "Exploitation" in place of "Abuse", as it's known in some circles). That includes drawings, CG animations, written descriptions, anything AI-generated, and videos in which such acts are simulated with a consenting adult (or, tbh, a non-consenting one: that can be horrific, illegal, and unquestionably sexual assault without being CSAM).
These kinds of distinctions in terminology are important, and yes, I will die on this hill.