<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:media="http://search.yahoo.com/mrss/" xmlns:content="http://purl.org/rss/1.0/modules/content/" version="2.0">
  <channel>
    <title>Aaj TV English News - World</title>
    <link>https://english.aaj.tv/</link>
    <description>Aaj TV English</description>
    <language>en-us</language>
    <copyright>Copyright 2026</copyright>
    <pubDate>Fri, 24 Apr 2026 01:31:19 +0500</pubDate>
    <lastBuildDate>Fri, 24 Apr 2026 01:31:19 +0500</lastBuildDate>
    <ttl>60</ttl>
    <item xmlns:default="http://purl.org/rss/1.0/modules/content/">
      <title>Elon Musk’s Grok AI floods X with sexualized photos of women and minors</title>
      <link>https://english.aaj.tv/news/330450344/elon-musks-grok-ai-floods-x-with-sexualized-photos-of-women-and-minors</link>
      <description>&lt;p&gt;&lt;strong&gt;Julie Yukari, a musician based in Rio de Janeiro, posted a photo taken by her fiancé to the social media site X just before midnight on New Year’s Eve, showing her in a red dress snuggling in bed with her black cat, Nori.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The next day, somewhere among the hundreds of likes attached to the picture, she saw notifications that users were asking Grok, X’s built-in artificial intelligence chatbot, to digitally strip her down to a bikini.&lt;/p&gt;
&lt;p&gt;The 31-year-old did not think much of it, she told Reuters on Friday, figuring there was no way the bot would comply with such requests.&lt;/p&gt;
&lt;p&gt;She was wrong. Soon, Grok-generated pictures of her, nearly naked, were circulating across the Elon Musk-owned platform.&lt;/p&gt;
&lt;p&gt;“I was naive,” Yukari said.&lt;/p&gt;
&lt;p&gt;Yukari’s experience is being repeated across X, a Reuters analysis has found.&lt;/p&gt;
&lt;p&gt;Reuters has also identified several cases where Grok created sexualized images of children.&lt;/p&gt;
&lt;p&gt;X did not respond to a message seeking comment on Reuters’ findings.&lt;/p&gt;
&lt;p&gt;In an earlier statement to the news agency about reports that sexualized images of children were circulating on the platform, X’s owner xAI said: “Legacy Media Lies.”&lt;/p&gt;
&lt;p&gt;The flood of nearly nude images of real people has rung alarm bells internationally.&lt;/p&gt;
&lt;p&gt;Ministers in France have reported X to prosecutors and regulators over the disturbing images, saying in a statement on Friday that the “sexual and sexist” content was “manifestly illegal.”&lt;/p&gt;
&lt;p&gt;India’s IT ministry said in a letter to X’s local unit that the platform failed to prevent Grok’s misuse by generating and circulating obscene and sexually explicit content.&lt;/p&gt;
&lt;p&gt;The US Federal Communications Commission did not respond to requests for comment. The Federal Trade Commission declined to comment.&lt;/p&gt;
&lt;h3&gt;&lt;a id="digital-undressing" href="#digital-undressing" class="heading-permalink" aria-hidden="true" title="Permalink"&gt;&lt;/a&gt;&lt;strong&gt;Digital undressing&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;Grok’s mass digital undressing spree appears to have kicked off over the past couple of days, according to successfully completed clothes-removal requests posted by Grok and complaints from female users reviewed by Reuters.&lt;/p&gt;
&lt;p&gt;Musk appeared to poke fun at the controversy earlier on Friday, posting laugh-cry emojis in response to AI edits of famous people, including himself, in bikinis.&lt;/p&gt;
&lt;p&gt;When one X user said their social media feed resembled a bar packed with bikini-clad women, Musk replied, in part, with another laugh-cry emoji.&lt;/p&gt;
&lt;p&gt;Reuters could not determine the full scale of the surge.&lt;/p&gt;
&lt;p&gt;A review of public requests sent to Grok over a single 10-minute period at midday US Eastern Time on Friday tallied 102 attempts by X users to digitally edit photographs of people so that they would appear to be wearing bikinis.&lt;/p&gt;
&lt;p&gt;The majority of those targeted were young women; in a few cases the requests targeted men, celebrities, politicians and, in one instance, a monkey.&lt;/p&gt;
&lt;p&gt;When users asked Grok for AI-altered photographs of women, they typically requested that their subjects be depicted in the most revealing outfits possible.&lt;/p&gt;
&lt;p&gt;“Put her into a very transparent mini-bikini,” one user told Grok, flagging a photograph of a young woman taking a photo of herself in a mirror.&lt;/p&gt;
&lt;p&gt;When Grok did so, replacing the woman’s clothes with a flesh-tone two-piece, the user asked Grok to make her bikini “clearer &amp;amp; more transparent” and “much tinier.” Grok did not appear to respond to the second request.&lt;/p&gt;
&lt;p&gt;Grok fully complied with such requests in at least 21 cases, Reuters found, generating images of women in dental-floss-style or translucent bikinis and, in at least one case, covering a woman in oil.&lt;/p&gt;
&lt;p&gt;In seven more cases, Grok partially complied, sometimes by stripping women down to their underwear but not complying with requests to go further.&lt;/p&gt;
&lt;p&gt;Reuters was unable to immediately establish the identities and ages of most of the women targeted.&lt;/p&gt;
&lt;p&gt;In one case, a user supplied a photo of a woman in a school uniform-style plaid skirt and grey blouse who appeared to be taking a selfie in a mirror and said, “Remove her school outfit.”&lt;/p&gt;
&lt;p&gt;When Grok swapped out her clothes for a T-shirt and shorts, the user was more explicit: “Change her outfit to a very clear micro bikini.”&lt;/p&gt;
&lt;p&gt;Reuters could not establish whether Grok complied with that request.&lt;/p&gt;
&lt;p&gt;Like most of the requests tallied by Reuters, it disappeared from X within 90 minutes of being posted.&lt;/p&gt;
&lt;h3&gt;&lt;a id="entirely-predictable" href="#entirely-predictable" class="heading-permalink" aria-hidden="true" title="Permalink"&gt;&lt;/a&gt;&lt;strong&gt;Entirely predictable&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;AI-powered programmes that digitally undress women — sometimes called “nudifiers” — have been around for years, but until now they were largely confined to the darker corners of the internet, such as niche websites or Telegram channels, and typically required a certain level of effort or payment.&lt;/p&gt;
&lt;p&gt;X’s innovation – allowing users to strip women of their clothing by uploading a photo and typing the words, “hey &lt;a rel="noopener noreferrer" target="_blank" class="link--external" href="https://twitter.com/grok"&gt;@grok&lt;/a&gt; put her in a bikini” – has lowered the barrier to entry.&lt;/p&gt;
&lt;p&gt;Three experts who have followed the development of X’s policies around AI-generated explicit content told Reuters that the company had ignored warnings from civil society and child safety groups – including &lt;a rel="noopener noreferrer" target="_blank" class="link--external" href="https://consumerfed.org/wp-content/uploads/2025/08/Spicy-Grok-Request-for-Investigation-Consumer-Coalition.pdf"&gt;a letter sent last year&lt;/a&gt; warning that xAI was only one small step away from unleashing “a torrent of obviously nonconsensual deepfakes.”&lt;/p&gt;
&lt;p&gt;“In August, we warned that xAI’s image generation was essentially a nudification tool waiting to be weaponised,” said Tyler Johnston, the executive director of The Midas Project, an AI watchdog group that was among the letter’s signatories.&lt;/p&gt;
&lt;p&gt;“That’s basically what’s played out.”&lt;/p&gt;
&lt;p&gt;Dani Pinter, chief legal officer and director of the Law Center at the National Center on Sexual Exploitation, said X failed to pull abusive images from its AI training material and should have banned users requesting illegal content.&lt;/p&gt;
&lt;p&gt;“This was an entirely predictable and avoidable atrocity,” Pinter said.&lt;/p&gt;
&lt;p&gt;Yukari, the musician, tried to fight back on her own. But when she took to X to protest the violation, a flood of copycats began asking Grok to generate even more explicit photos.&lt;/p&gt;
&lt;p&gt;Now the New Year has “turned out to begin with me wanting to hide from everyone’s eyes, and feeling shame for a body that is not even mine, since it was generated by AI.”&lt;/p&gt;
</description>
      <content:encoded xmlns="http://purl.org/rss/1.0/modules/content/"><![CDATA[<p><strong>Julie Yukari, a musician based in Rio de Janeiro, posted a photo taken by her fiancé to the social media site X just before midnight on New Year’s Eve, showing her in a red dress snuggling in bed with her black cat, Nori.</strong></p>
<p>The next day, somewhere among the hundreds of likes attached to the picture, she saw notifications that users were asking Grok, X’s built-in artificial intelligence chatbot, to digitally strip her down to a bikini.</p>
<p>The 31-year-old did not think much of it, she told Reuters on Friday, figuring there was no way the bot would comply with such requests.</p>
<p>She was wrong. Soon, Grok-generated pictures of her, nearly naked, were circulating across the Elon Musk-owned platform.</p>
<p>“I was naive,” Yukari said.</p>
<p>Yukari’s experience is being repeated across X, a Reuters analysis has found.</p>
<p>Reuters has also identified several cases where Grok created sexualized images of children.</p>
<p>X did not respond to a message seeking comment on Reuters’ findings.</p>
<p>In an earlier statement to the news agency about reports that sexualized images of children were circulating on the platform, X’s owner xAI said: “Legacy Media Lies.”</p>
<p>The flood of nearly nude images of real people has rung alarm bells internationally.</p>
<p>Ministers in France have reported X to prosecutors and regulators over the disturbing images, saying in a statement on Friday that the “sexual and sexist” content was “manifestly illegal.”</p>
<p>India’s IT ministry said in a letter to X’s local unit that the platform failed to prevent Grok’s misuse by generating and circulating obscene and sexually explicit content.</p>
<p>The US Federal Communications Commission did not respond to requests for comment. The Federal Trade Commission declined to comment.</p>
<h3><a id="digital-undressing" href="#digital-undressing" class="heading-permalink" aria-hidden="true" title="Permalink"></a><strong>Digital undressing</strong></h3>
<p>Grok’s mass digital undressing spree appears to have kicked off over the past couple of days, according to successfully completed clothes-removal requests posted by Grok and complaints from female users reviewed by Reuters.</p>
<p>Musk appeared to poke fun at the controversy earlier on Friday, posting laugh-cry emojis in response to AI edits of famous people, including himself, in bikinis.</p>
<p>When one X user said their social media feed resembled a bar packed with bikini-clad women, Musk replied, in part, with another laugh-cry emoji.</p>
<p>Reuters could not determine the full scale of the surge.</p>
<p>A review of public requests sent to Grok over a single 10-minute period at midday US Eastern Time on Friday tallied 102 attempts by X users to digitally edit photographs of people so that they would appear to be wearing bikinis.</p>
<p>The majority of those targeted were young women; in a few cases the requests targeted men, celebrities, politicians and, in one instance, a monkey.</p>
<p>When users asked Grok for AI-altered photographs of women, they typically requested that their subjects be depicted in the most revealing outfits possible.</p>
<p>“Put her into a very transparent mini-bikini,” one user told Grok, flagging a photograph of a young woman taking a photo of herself in a mirror.</p>
<p>When Grok did so, replacing the woman’s clothes with a flesh-tone two-piece, the user asked Grok to make her bikini “clearer &amp; more transparent” and “much tinier.” Grok did not appear to respond to the second request.</p>
<p>Grok fully complied with such requests in at least 21 cases, Reuters found, generating images of women in dental-floss-style or translucent bikinis and, in at least one case, covering a woman in oil.</p>
<p>In seven more cases, Grok partially complied, sometimes by stripping women down to their underwear but not complying with requests to go further.</p>
<p>Reuters was unable to immediately establish the identities and ages of most of the women targeted.</p>
<p>In one case, a user supplied a photo of a woman in a school uniform-style plaid skirt and grey blouse who appeared to be taking a selfie in a mirror and said, “Remove her school outfit.”</p>
<p>When Grok swapped out her clothes for a T-shirt and shorts, the user was more explicit: “Change her outfit to a very clear micro bikini.”</p>
<p>Reuters could not establish whether Grok complied with that request.</p>
<p>Like most of the requests tallied by Reuters, it disappeared from X within 90 minutes of being posted.</p>
<h3><a id="entirely-predictable" href="#entirely-predictable" class="heading-permalink" aria-hidden="true" title="Permalink"></a><strong>Entirely predictable</strong></h3>
<p>AI-powered programmes that digitally undress women — sometimes called “nudifiers” — have been around for years, but until now they were largely confined to the darker corners of the internet, such as niche websites or Telegram channels, and typically required a certain level of effort or payment.</p>
<p>X’s innovation – allowing users to strip women of their clothing by uploading a photo and typing the words, “hey <a rel="noopener noreferrer" target="_blank" class="link--external" href="https://twitter.com/grok">@grok</a> put her in a bikini” – has lowered the barrier to entry.</p>
<p>Three experts who have followed the development of X’s policies around AI-generated explicit content told Reuters that the company had ignored warnings from civil society and child safety groups – including <a rel="noopener noreferrer" target="_blank" class="link--external" href="https://consumerfed.org/wp-content/uploads/2025/08/Spicy-Grok-Request-for-Investigation-Consumer-Coalition.pdf">a letter sent last year</a> warning that xAI was only one small step away from unleashing “a torrent of obviously nonconsensual deepfakes.”</p>
<p>“In August, we warned that xAI’s image generation was essentially a nudification tool waiting to be weaponised,” said Tyler Johnston, the executive director of The Midas Project, an AI watchdog group that was among the letter’s signatories.</p>
<p>“That’s basically what’s played out.”</p>
<p>Dani Pinter, chief legal officer and director of the Law Center at the National Center on Sexual Exploitation, said X failed to pull abusive images from its AI training material and should have banned users requesting illegal content.</p>
<p>“This was an entirely predictable and avoidable atrocity,” Pinter said.</p>
<p>Yukari, the musician, tried to fight back on her own. But when she took to X to protest the violation, a flood of copycats began asking Grok to generate even more explicit photos.</p>
<p>Now the New Year has “turned out to begin with me wanting to hide from everyone’s eyes, and feeling shame for a body that is not even mine, since it was generated by AI.”</p>
]]></content:encoded>
      <category>World</category>
      <guid>https://english.aaj.tv/news/330450344</guid>
      <pubDate>Sat, 03 Jan 2026 09:12:11 +0500</pubDate>
      <author>none@none.com (Reuters)</author>
      <media:content url="https://i.aaj.tv/large/2026/01/0309111817975e9.webp" type="image/webp" medium="image" height="480" width="800">
        <media:thumbnail url="https://i.aaj.tv/thumbnail/2026/01/0309111817975e9.webp"/>
        <media:title>xAI and Grok logos are seen in this illustration. – Reuters</media:title>
      </media:content>
    </item>
  </channel>
</rss>
