
Published 19 Nov, 2025 01:32pm

Pakistan struggles to trust human talent in the age of AI

Artificial intelligence has infiltrated nearly every aspect of our lives. From mobile apps that recommend music to machines that can analyse medical data, AI has become part of how the world works.

Among these tools, ChatGPT has received the most attention. It can write essays, emails, articles, and even poetry within seconds.

For many people, it feels like a miracle of technology. For others, it feels like a threat.

In Pakistan, this debate has grown particularly heated. Many writers, teachers, and professionals have started to view every well-written piece of work with suspicion, assuming it must have been produced by ChatGPT.

This reaction reflects a mix of misunderstanding, insecurity, and fear about what AI means for originality and creativity.

ChatGPT is not magic. It is a tool built on large amounts of data that helps people express ideas more clearly or find information faster.

It can be a great assistant for anyone who writes, from students to journalists to business professionals.

But like every tool, it depends on how people use it. A pen, after all, can write a masterpiece or a lie.

The problem begins when people stop trusting human ability altogether and start believing that only AI can produce good writing.

In Pakistan, this misunderstanding is spreading fast. Whenever someone writes a well-structured or thoughtful piece, others often accuse them of using ChatGPT, as if humans have suddenly lost the ability to think and write for themselves.

There are several reasons behind this negative attitude. The first is the lack of understanding about what AI really does.

Many people think ChatGPT can read minds or produce perfect ideas on its own. In reality, it works by predicting patterns in language.

It does not know the truth or feel emotions; it only creates sentences that sound natural. Those who have never used it deeply often imagine it as a replacement for human thought.

This fear has made many people defensive about their own writing. Teachers now suspect students of using it for assignments, editors question contributors, and friends doubt each other’s creativity.

The word “AI-generated” has started to carry a negative meaning, almost like an insult.

Another reason is the deep insecurity that exists in Pakistan’s creative and academic circles.

For decades, the country’s education system has not encouraged original thinking. Most students are trained to memorise, repeat, and follow templates.

When a tool like ChatGPT comes along and suddenly produces content that sounds professional, it shakes their confidence.

Instead of seeing it as a chance to learn and improve, many people treat it as a competition.

This defensive reaction is natural, but it also limits growth. Instead of asking how they can use AI to become better thinkers, they waste time trying to find out who is using it. The focus shifts from quality to suspicion.

The issue is also tied to how we define originality. Many people believe that originality means creating something completely new, with no influence or inspiration from anywhere else.

But true originality is about perspective: how we see, interpret, and express ideas. Even the greatest writers in history learned from others.

ChatGPT can provide structure or help with language, but the ideas, experiences, and emotions that shape writing still belong to the human being behind it.

The tool may help a writer polish their sentences, but it cannot understand their life, their society, or their heart.

Yet in Pakistan, this difference is often ignored. As a result, people label any polished or fluent writing as AI-generated, as if good English or clear thought is now suspicious.

There is also a cultural angle. Pakistan is a country where technology often arrives before understanding.

Many people use tools they do not fully comprehend. When social media first became popular, it was used to spread rumours and personal attacks instead of healthy debate.

Similarly, AI is now being seen through the same lens. Instead of learning how to use it responsibly, people either depend on it completely or reject it entirely.

Both extremes are harmful. This tool can make life easier for those who know how to guide it, but it can never replace human intelligence.

Sadly, those who misuse it for shortcuts, copying AI responses without adding any thought of their own, have given a bad name to everyone who uses it properly.

In universities and workplaces, the reaction has become especially strong. Teachers complain that students submit essays that sound too perfect to be real.

Editors find it hard to trust freelance writers. Employers suspect employees of using AI to prepare reports.

Some institutions have even banned ChatGPT altogether, thinking that it encourages dishonesty.

But this ban does not solve the problem. It only hides it. The better solution would be to teach people how to combine human creativity with AI assistance, to write with thought and use AI for support, not substitution.

In a way, the tool can actually make us more creative if we use it to explore ideas and save time for deeper thinking.

Many Pakistani writers and professionals also worry that AI will take away jobs. They fear that editors will no longer need journalists, or that companies will stop hiring content creators.

But this fear ignores the fact that AI still depends on human guidance. It cannot interview people, understand context, or express emotion in the same way a person can.

It can help write a report, but it cannot understand the politics behind that report. It can generate text, but it cannot feel the pain, joy, or struggle that gives writing its soul.

Those who learn to use AI as a partner will always be more valuable than those who either ignore it or misuse it.

The future will not belong to machines, but to those who know how to work with them wisely.

The suspicion around AI writing in Pakistan also exposes a larger issue: our struggle with trust and intellectual respect.

We are often quick to doubt others’ abilities. When someone writes well, we ask, “Who wrote it for you?” instead of appreciating their skill.

This attitude existed even before ChatGPT, but AI has made it worse. It is easier to blame a tool than to accept that someone else might simply be talented or hardworking.

This mindset damages our creative culture. Instead of inspiring each other, we discourage talent.

We turn every achievement into a question mark. In such an environment, innovation becomes difficult.

It is also important to understand that AI cannot replace authenticity. The best writing, whether in journalism, literature, or business communication, comes from experience. It reflects a person’s journey, emotions, and understanding of the world.

ChatGPT can imitate style but not substance. It can guess what a human might say, but cannot truly mean it.

If we depend only on AI, our writing will become hollow, without depth or identity.

On the other hand, if we use it to improve our grammar, expand our vocabulary, or find new ways to express our thoughts, it can be an amazing learning tool.

The problem is not AI itself but the attitude towards it.

In Pakistan, where communication skills and English proficiency are often seen as measures of intelligence, tools like ChatGPT create even more tension.

People who were once respected for their writing now feel challenged, while others who never had confidence in their language suddenly feel empowered.

This shift has created jealousy and confusion. Instead of celebrating the fact that more people can now express themselves clearly, many choose to criticise.

It shows how society sometimes fears equality: when a tool gives everyone access to knowledge, it threatens old hierarchies of talent.

Ultimately, we must accept that technology will keep evolving. Just as calculators did not destroy mathematics, AI will not destroy writing. It will only change how we approach it.

The real question is whether we will adapt or resist. Pakistan’s creative and academic communities need to build trust, not suspicion.

They must learn to tell the difference between thoughtless AI copying and thoughtful writing assisted by AI.

They must also understand that even if AI helps with language, the real strength of any piece lies in the ideas behind it. Machines can never replace that human spark.

If we continue to see every piece of good writing as AI-generated, we will only discourage our own people from thinking deeply or expressing themselves well.

We will create a culture of fear instead of curiosity. ChatGPT should be treated as a tool for learning and improvement, not as a threat.

The real danger is not that AI will make us less human, but that our fear of it will make us less trusting, less confident, and less open to progress.

Pakistan’s writers, teachers, and thinkers can either use this new technology sensibly or continue resisting something they do not fully understand.

The future will support those who pick learning instead of doubt, and creativity instead of fear.

The writer is a seasoned journalist and a communications professional. He can be reached at tariqkik@gmail.com
