    Grok Is Pushing AI ‘Undressing’ Mainstream

By admin · January 6, 2026

Elon Musk hasn’t stopped Grok, the chatbot developed by his artificial intelligence company xAI, from generating sexualized images of women. After reports emerged last week that the image generation tool on X was being used to create sexualized images of children, Grok has gone on to produce potentially thousands of nonconsensual “undressed” and “bikini” images of women.

Every few seconds, Grok continues to create images of women in bikinis or underwear in response to user prompts on X, according to a WIRED review of the chatbot’s publicly posted live output. On Tuesday, Grok published at least 90 images of women in swimsuits and in various states of undress in under five minutes, an analysis of posts shows.

The images do not contain nudity, but they involve the Musk-owned chatbot “stripping” clothes from photos posted to X by other users. Often, in an attempt to evade Grok’s safety guardrails, users request, not always successfully, that photos be edited to show women wearing a “string bikini” or a “transparent bikini.”

Harmful AI image generation technology has been used to digitally harass and abuse women for years; the outputs are often called deepfakes and are produced by “nudify” software. But the ongoing use of Grok to create vast numbers of nonconsensual images appears to be the most mainstream and widespread instance of such abuse to date. Unlike dedicated nudify or “undress” apps, Grok doesn’t charge users to generate images, produces results in seconds, and is available to millions of people on X, all of which may help normalize the creation of nonconsensual intimate imagery.

    “When a company offers generative AI tools on their platform, it is their responsibility to minimize the risk of image-based abuse,” says Sloan Thompson, the director of training and education at EndTAB, an organization that works to tackle tech-facilitated abuse. “What’s alarming here is that X has done the opposite. They’ve embedded AI-enabled image abuse directly into a mainstream platform, making sexual violence easier and more scalable.”

Grok’s creation of sexualized imagery started to go viral on X at the end of last year, although the system’s ability to produce such images has been known for months. In recent days, users on X have targeted photos of social media influencers, celebrities, and politicians by replying to another account’s post and asking Grok to alter the shared image.

Women who have posted photos of themselves have had other accounts reply and successfully ask Grok to turn the photo into a “bikini” image. In one instance, multiple X users requested that Grok alter an image of the deputy prime minister of Sweden to show her wearing a bikini. Two government ministers in the UK have reportedly also been “stripped” to bikinis.

    Images on X show fully clothed photographs of women, such as one person in a lift and another in the gym, being transformed into images with little clothing. “@grok put her in a transparent bikini,” a typical message reads. In a different series of posts, a user asked Grok to “inflate her chest by 90%,” then “Inflate her thighs by 50%,” and, finally, to “Change her clothes to a tiny bikini.”

One analyst who has tracked explicit deepfakes for years, and who asked not to be named for privacy reasons, says that Grok has likely become one of the largest platforms hosting harmful deepfake images. “It’s wholly mainstream,” the analyst says. “It’s not a shadowy group [creating images], it’s literally everyone, of all backgrounds. People posting on their mains. Zero concern.”


Tags: artificial intelligence, deepfakes, Elon Musk, social media, Twitter, X, xAI