<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:media="http://search.yahoo.com/mrss/" xmlns:content="http://purl.org/rss/1.0/modules/content/" version="2.0">
  <channel>
    <title>Aaj TV English News - Technology</title>
    <link>https://english.aaj.tv/</link>
    <description>Aaj TV English</description>
    <language>en-us</language>
    <copyright>Copyright 2026</copyright>
    <pubDate>Thu, 23 Apr 2026 17:21:07 +0500</pubDate>
    <lastBuildDate>Thu, 23 Apr 2026 17:21:07 +0500</lastBuildDate>
    <ttl>60</ttl>
    <item>
      <title>OpenAI’s GPT-4.1 can now code better than most developers</title>
      <link>https://english.aaj.tv/news/330411696/openais-gpt-41-can-now-code-better-than-most-developers</link>
      <description>&lt;p&gt;&lt;strong&gt;OpenAI on Monday launched its new AI model GPT-4.1, along with smaller versions GPT-4.1 mini and GPT-4.1 nano, touting major improvements in coding, instruction following, and long-context comprehension.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;The new models, available only on OpenAI’s application programming interface (API), outperform the company’s most advanced GPT-4o model across the board, the ChatGPT maker said.&lt;/p&gt;
&lt;p&gt;With improved context understanding, they can support up to 1 million “tokens” — a term that refers to the units of data processed by an AI model. The models are also equipped with refreshed knowledge up to June 2024.&lt;/p&gt;
&lt;p&gt;GPT-4.1 showed a 21% improvement over GPT-4o and 27% over GPT-4.5 on coding. Meanwhile, the improvements in instruction following and long-context comprehension also make the GPT-4.1 models more effective at powering AI agents.&lt;/p&gt;
&lt;blockquote class="blockquote-level-1"&gt;
&lt;p&gt;&lt;a href="https://english.aaj.tv/news/330400902/openai-announces-new-deep-research-tool-for-chatgpt"&gt;OpenAI announces new ‘deep research’ tool for ChatGPT&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;“Benchmarks are strong, but we focused on real-world utility, and developers seem very happy,” CEO Sam Altman said in a post on social media platform X.&lt;/p&gt;
&lt;p&gt;The family of models also operates at a “much lower cost” compared to GPT-4.5, OpenAI said. The company added it would turn off the GPT-4.5 preview that is available in the API in July, as the new models offer “improved or similar performance.”&lt;/p&gt;
</description>
      <content:encoded><![CDATA[<p><strong>OpenAI on Monday launched its new AI model GPT-4.1, along with smaller versions GPT-4.1 mini and GPT-4.1 nano, touting major improvements in coding, instruction following, and long-context comprehension.</strong></p>
<p>The new models, available only on OpenAI’s application programming interface (API), outperform the company’s most advanced GPT-4o model across the board, the ChatGPT maker said.</p>
<p>With improved context understanding, they can support up to 1 million “tokens” — a term that refers to the units of data processed by an AI model. The models are also equipped with refreshed knowledge up to June 2024.</p>
<p>GPT-4.1 showed a 21% improvement over GPT-4o and 27% over GPT-4.5 on coding. Meanwhile, the improvements in instruction following and long-context comprehension also make the GPT-4.1 models more effective at powering AI agents.</p>
<blockquote class="blockquote-level-1">
<p><a href="https://english.aaj.tv/news/330400902/openai-announces-new-deep-research-tool-for-chatgpt">OpenAI announces new ‘deep research’ tool for ChatGPT</a></p>
</blockquote>
<p>“Benchmarks are strong, but we focused on real-world utility, and developers seem very happy,” CEO Sam Altman said in a post on social media platform X.</p>
<p>The family of models also operates at a “much lower cost” compared to GPT-4.5, OpenAI said. The company added it would turn off the GPT-4.5 preview that is available in the API in July, as the new models offer “improved or similar performance.”</p>
]]></content:encoded>
      <category>Technology</category>
      <guid>https://english.aaj.tv/news/330411696</guid>
      <pubDate>Mon, 14 Apr 2025 23:17:24 +0500</pubDate>
      <author>none@none.com (Reuters)</author>
      <media:content url="https://i.aaj.tv/large/2025/04/14231556b7d41f2.webp?r=231724" type="image/webp" medium="image" height="720" width="1200">
        <media:thumbnail url="https://i.aaj.tv/thumbnail/2025/04/14231556b7d41f2.webp?r=231724"/>
        <media:title>OpenAI logo is seen in this illustration taken May 20, 2024. REUTERS</media:title>
      </media:content>
    </item>
  </channel>
</rss>
