This was in reaction to outrage over what amounted to massive, instant, free-for-all AI image manipulation without even consent.
While the backlash against Grok manipulation had a strong showing, still… remember the adage: “Some are more equal than others.”


“By definition, Grok can’t make child sexual abuse material, because it’s fake. Those posts need community notes.”
The post also needs accessibility.
Images of text break much that text alternatives do not. An image of text lacking an alternative loses capabilities such as links:
Contrary to its age and humble appearance, text is an advanced technology that provides all these capabilities absent from images.
A weird thing to go to bat for, but the emphasis is on material. Even if it is generated or drawn, if it depicts CSA acts it can constitute CSAM, and it is treated as such in many jurisdictions; generated imagery more so than drawn, due to its appearance of being genuine. It feels like saying “this isn’t porn of celebrity Y because it’s fake,” but it’s still porn.