    Tech & Innovation

    Grok’s ‘Spicy’ Mode Makes NSFW Celebrity Deepfakes of Women (But Not Men)

By Finsider | August 6, 2025 | 7 Mins Read

    This week, Elon Musk officially launched Grok Imagine, xAI’s image and video generator for iOS, for people who subscribe to SuperGrok and Premium+ X. The app allows users to create NSFW content with its “Spicy” mode, and The Verge reported on Tuesday that users are able to create topless videos of Taylor Swift easily—without even asking for it. But it’s not just Swift who should be concerned about Musk’s new AI-generated softcore porn tool.

    Gizmodo created about two dozen videos of politicians, celebrities, and tech figures using the Grok Spicy mode, though some were blurred out or came back with a message reading “video moderated.” When Grok did make scandalous images, it would only make the ones depicting women truly not-safe-for-work. Videos of men were the kind of thing that wouldn’t really raise many eyebrows.

    X has been swamped over the past two days with AI-generated images of naked women and tips on how to achieve the most nudity. But users, who’ve created tens of millions of Grok Imagine images according to Musk, don’t even need to go to some great effort to get deepfakes of naked celebrities. Gizmodo didn’t explicitly ask for nudity in the examples we cite in this article, but we still got plenty of it. All we did was click on the Spicy button, which is one of four options, along with Custom, Fun, and Normal.

    Gizmodo tested Grok Imagine by generating videos of not just Taylor Swift, but other prominent women like Melania Trump and historical figures like Martha Washington. Melania Trump has been a vocal supporter of the Take It Down Act, which makes it illegal to publish non-consensual “intimate imagery,” including deepfakes.

Grok also created a not-safe-for-work video of the late feminist writer Valerie Solanas, author of 1967’s SCUM Manifesto. Almost all of the women we tested were depicted shedding their clothes until naked from the waist up, though the video of Solanas was unique in that it showed her completely naked.

What happens when you try to generate Spicy videos of men? The AI will have the male figure take off his shirt, but nothing more scandalous than that. Once Gizmodo figured out that it would only remove a man’s shirt, we prompted the AI to create a shirtless image of Elon Musk to see what it might do with that. The result was an extremely ridiculous (and safe-for-work) video.

    Attempts to make videos of Mark Zuckerberg, Jeff Bezos, Joaquin Phoenix, and Charlie Chaplin, as well as Presidents Barack Obama, Bill Clinton, and George Washington, ran into the same limitation. The AI-generated videos will have the men take their shirts off most of the time, but there’s nothing beyond that. And if there is anything more, it’s usually so cringe that we’d worry about users dying from secondhand embarrassment. Making a Spicy video of Errol Musk, Elon’s father, produced the same thing. He just took off his shirt.

    When we made a generic man to see if Spicy mode would be more loose with its sexual content since it wasn’t a known public figure, it still just made a bizarre, awkward video of a man tugging at his pants. The pants, it should be noted, seem to be a combination of shorts for one leg and long jeans for the other before transforming into just shorts. The audio for each video was also auto-generated without any further instruction.

    Trying the same thing with a generic woman rendered much more revealing images of a woman in a swimsuit who pulls down the top to reveal her naked breasts.

    Most mainstream AI video generators, like OpenAI’s Sora and Google’s Veo, have guardrails to protect against the creation of things like revenge porn and images of celebrities. And it seems like xAI does in some ways, at least for men. But most people would probably object to their image being used to create an AI avatar in various states of undress. Gizmodo reached out to Musk through xAI to ask about safeguards and whether it’s acceptable for users to create topless videos of celebrities. We haven’t heard back.

    One of the most striking things about Grok’s AI image generator is that it’s often terrible at making convincing celebrity fakes. For example, the images below were generated when asking for Vice President JD Vance and actress Sydney Sweeney. And unless we completely forgot how those two people look, it’s not even close. That could turn out to be Musk’s saving grace, given the fact that a tool like this is bound to attract lawsuits.

[Image: Phone screenshots of images created by SuperGrok that are supposed to depict Sydney Sweeney and JD Vance. Screenshots: xAI]

    There were other glitches, like when we created an AI-generated image of President Harry Truman that looked very little like him, and the man’s nipples appeared to be on the outside of his dress shirt. Truman, in Spicy mode, did take off his shirt to reveal his bare chest, which had identical nipples.

    When Gizmodo created images using the prompt “Gizmodo writer Matt Novak,” the result was similar to what we saw with videos for Elon Musk and generic men. The figure (who, we should note, is in much better shape than the real Matt Novak) took off his shirt with a simple click of the Spicy button.

    As The Verge notes, there is an age verification window when a user first tries to create a video with Grok Imagine, but there doesn’t appear to be any kind of check by the company to confirm the year a given user was actually born. Thankfully, Gizmodo’s generation of a cartoon Mickey Mouse in Spicy mode didn’t render anything scandalous, just the animated character jumping harmlessly. An AI image of Batman yielded a “Spicy” result not unlike other male figures, where he only stripped his top off.

    Gizmodo did not attempt to create any images of children, though The Verge did try that in Spicy mode and reports nothing inappropriate was rendered. The “Spicy” mode was still an option that was listed, however. “You can still select it, but in all my tests, it just added generic movement,” The Verge notes. Elon Musk very infamously reinstated an account on X that posted child sexual abuse material in 2023, according to the Washington Post.

    It’s perhaps not surprising that Elon Musk’s new NSFW video creator has different standards for men and women. The billionaire recently retweeted a far-right figure who claimed that women are “anti-white” because they’re “weak.” The Tesla CEO, who suggested in 2024 that he wanted to impregnate Taylor Swift, isn’t exactly known for being a champion of women’s rights.

    Gizmodo signed up for the $30 per month SuperGrok subscription and only got to test it for about 1.5 hours before we were told we’d reached our image creation limit. Strangely enough, users can still create a single still image for a prompt after getting the warning and generate NSFW videos using that lone image, but it’s much more limited than what was previously available.

    We were told to upgrade to SuperGrok Heavy for $300 per month if we wanted to continue using the tool with all its features. But given the fact that we didn’t need any more shitty images of naked celebrities to write this article, we declined. We got the answers we were looking for, unfortunately.
