Cosplayers Should Leave Twitter For Their Own Safety

The year 2026 started off with examples of Twitter users allegedly taking images of women (and girls) from the site and asking Grok, the generative AI tool built into the platform now known as X, to remove the clothes from the women (and girls) in the images. I use the word “allegedly” to stave off any lawsuits, but the evidence is all but undeniable that it happened, and the company behind Grok admitted as much.

I left Twitter more than a year ago, so I initially only saw reports of this happening on the social media platform Bluesky. Before I started writing this article, I needed to see how reliable those reports were, and I found plenty of news coverage. But it was the BBC (of course) that had actual reporting, citing examples of images its reporters had seen, without posting the images themselves, for security and legal reasons. It also interviewed one woman who had this happen to her images on Twitter.

The BBC article states that it is technically against the policy of Grok’s parent company, xAI, a spinoff of X (Twitter), to use the tool for generating pornographic material. But back in August of 2025, xAI made it clear that Grok could be used for exactly that, and the digital photo website PetaPixel wrote about it, with links to examples. So it seems Twitter leadership is fine with that use of Grok.

Which means that every photo a cosplayer posts to Twitter, or has previously posted to Twitter, can be used in this fashion. And, in my opinion from years of covering technology business, it is already happening, possibly to any existing cosplay photo on the platform. Rule 34 exists not because it was something funny to make up, but because it was an accurate reflection of the way the Internet works, from its very beginning.

On Friday it was reported that xAI issued a statement acknowledging that the image manipulation had happened, calling it a “lapse” in the company’s “safeguards” that should prevent Grok from making pornographic images of minors. There was no mention of nonconsensual porn of adults. There was also no mention of whether the account holders who made the images of minors had been reported to authorities, as is required in many jurisdictions. Update 1/6/2026: According to a Rolling Stone article, Grok is now estimated to be creating “one nonconsensual sexualized image per minute.”

Everything in life is a choice. I left Twitter because I chose not to support Elon Musk as he descended further into Nazi-coded racism. Others stayed because they chose to overlook that in favor of retaining the no doubt impressive follower reach they had built. Choices.

Cosplayers now face a choice: allow Twitter users to turn their photos into pornography, or delete their entire Twitter profile and stave off as much of that as they can. To be clear, it won’t be possible to stop it entirely, as many cosplay images have already been shared or downloaded. Also to be clear, I am in no way attacking cosplay pornography. This opinion article is an attack on nonconsensual pornography of any kind.

This advice also applies to cosplay photographers. Prioritize the safety and consent of your cosplay subjects and get off Twitter, scrubbing as much of your photo content from that platform as you can.

While I do think everyone should have left Twitter long ago, I won’t judge any cosplayer or photographer for making this the line that finally got crossed for them. But this really needs to be that line in the sand. Cosplay is not consent, and that applies the moment the photo is taken, as well as days, weeks, or years after.
