<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:media="http://search.yahoo.com/mrss/" xmlns:content="http://purl.org/rss/1.0/modules/content/" version="2.0">
  <channel>
    <title>Aaj TV English News - World</title>
    <link>https://english.aaj.tv/</link>
    <description>Aaj TV English</description>
    <language>en-US</language>
    <copyright>Copyright 2026</copyright>
    <pubDate>Tue, 07 Apr 2026 12:56:13 +0500</pubDate>
    <lastBuildDate>Tue, 07 Apr 2026 12:56:13 +0500</lastBuildDate>
    <ttl>60</ttl>
    <item>
      <title>OpenAI unveils voice-cloning tool</title>
      <link>https://english.aaj.tv/news/30356543/openai-unveils-voice-cloning-tool</link>
      <description>&lt;p&gt;&lt;strong&gt;OpenAI on Friday revealed a voice-cloning tool it plans to keep tightly controlled until safeguards are in place to thwart audio fakes meant to dupe listeners.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;A model called “Voice Engine” can essentially duplicate someone’s speech based on a 15-second audio sample, according to an OpenAI blog post sharing results of a small-scale test of the tool.&lt;/p&gt;
&lt;p&gt;“We recognize that generating speech that resembles people’s voices has serious risks, which are especially top of mind in an election year,” the San Francisco-based company said.&lt;/p&gt;
&lt;p&gt;“We are engaging with U.S. and international partners from across government, media, entertainment, education, civil society and beyond to ensure we are incorporating their feedback as we build.”&lt;/p&gt;
&lt;p&gt;Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year thanks to proliferating voice-cloning tools, which are cheap, easy to use and hard to trace.&lt;/p&gt;
&lt;p&gt;Acknowledging these problems, OpenAI said it was “taking a cautious and informed approach to a broader release due to the potential for synthetic voice misuse.”&lt;/p&gt;
&lt;p&gt;The cautious unveiling came a few months after a political consultant working for the long-shot presidential campaign of a Democratic rival to Joe Biden admitted being behind a robocall impersonating the US leader.&lt;/p&gt;
&lt;p&gt;The AI-generated call, the brainchild of an operative for Minnesota congressman Dean Phillips, featured what sounded like Biden’s voice urging people not to cast ballots in January’s New Hampshire primary.&lt;/p&gt;
&lt;p&gt;The incident caused alarm among experts who fear a deluge of AI-powered deepfake disinformation in the 2024 White House race as well as in other key elections around the globe this year.&lt;/p&gt;
&lt;p&gt;OpenAI said that partners testing Voice Engine agreed to rules including requiring explicit and informed consent of any person whose voice is duplicated using the tool.&lt;/p&gt;
&lt;p&gt;It must also be made clear to audiences when the voices they are hearing are AI-generated, the company added.&lt;/p&gt;
&lt;p&gt;“We have implemented a set of safety measures, including watermarking to trace the origin of any audio generated by Voice Engine, as well as proactive monitoring of how it’s being used,” the company said.&lt;/p&gt;
</description>
      <content:encoded><![CDATA[<p><strong>OpenAI on Friday revealed a voice-cloning tool it plans to keep tightly controlled until safeguards are in place to thwart audio fakes meant to dupe listeners.</strong></p>
<p>A model called “Voice Engine” can essentially duplicate someone’s speech based on a 15-second audio sample, according to an OpenAI blog post sharing results of a small-scale test of the tool.</p>
<p>“We recognize that generating speech that resembles people’s voices has serious risks, which are especially top of mind in an election year,” the San Francisco-based company said.</p>
<p>“We are engaging with U.S. and international partners from across government, media, entertainment, education, civil society and beyond to ensure we are incorporating their feedback as we build.”</p>
<p>Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year thanks to proliferating voice-cloning tools, which are cheap, easy to use and hard to trace.</p>
<p>Acknowledging these problems, OpenAI said it was “taking a cautious and informed approach to a broader release due to the potential for synthetic voice misuse.”</p>
<p>The cautious unveiling came a few months after a political consultant working for the long-shot presidential campaign of a Democratic rival to Joe Biden admitted being behind a robocall impersonating the US leader.</p>
<p>The AI-generated call, the brainchild of an operative for Minnesota congressman Dean Phillips, featured what sounded like Biden’s voice urging people not to cast ballots in January’s New Hampshire primary.</p>
<p>The incident caused alarm among experts who fear a deluge of AI-powered deepfake disinformation in the 2024 White House race as well as in other key elections around the globe this year.</p>
<p>OpenAI said that partners testing Voice Engine agreed to rules including requiring explicit and informed consent of any person whose voice is duplicated using the tool.</p>
<p>It must also be made clear to audiences when the voices they are hearing are AI-generated, the company added.</p>
<p>“We have implemented a set of safety measures, including watermarking to trace the origin of any audio generated by Voice Engine, as well as proactive monitoring of how it’s being used,” the company said.</p>
]]></content:encoded>
      <category>World</category>
      <guid>https://english.aaj.tv/news/30356543</guid>
      <pubDate>Sat, 30 Mar 2024 22:01:36 +0500</pubDate>
      <author>none@none.com (AFP)</author>
      <media:content url="https://i.aaj.tv/large/2024/03/30220204a4e5d8f.webp?r=220218" type="image/webp" medium="image" height="480" width="800">
        <media:thumbnail url="https://i.aaj.tv/thumbnail/2024/03/30220204a4e5d8f.webp?r=220218"/>
        <media:title>AFP</media:title>
      </media:content>
    </item>
  </channel>
</rss>
