Shashi Tharoor slams fake AI video that uses fabricated voice and language praising Pakistan

Reality just got another dent. This time, the victim is Shashi Tharoor, India’s resident wordsmith and arguably the only politician whose vocabulary requires a supplementary dictionary. A video started doing the rounds on the usual toxic corners of the internet, showing Tharoor offering a glowing tribute to Pakistan. It was a lie. A digital marionette show.

Tharoor didn’t take it lying down. He took to X—the platform formerly known as Twitter and currently known as a digital dumpster fire—to clarify that the clip was "neither my language nor my voice." He’s right, of course. For a man who built a career on the precise, rhythmic application of the English language, being mimicked by a clunky algorithm is more than just political sabotage. It’s an aesthetic insult.

We’ve reached the "cheap fake" era of political warfare. You don’t need a state-sponsored lab or a Hollywood VFX budget to ruin a reputation anymore. All it takes is a twenty-dollar subscription to an AI cloning tool and a few minutes of clean audio. The friction here isn’t the technology; it’s the price of entry. For less than the cost of a mediocre steak dinner, anyone with a grudge can make a Member of Parliament say whatever they want.

The tech is fundamentally lazy. These models aren’t "thinking." They’re just predicting the next most likely syllable based on thousands of hours of data. In Tharoor’s case, the AI probably struggled. How do you train a model to replicate a man who uses words like "floccinaucinihilipilification" without sounding like a glitching Speak & Spell? The result was a synthetic slurry that looked just real enough to fool anyone who already wanted to hate him.

That’s the real grift. These fakes don’t need to be perfect. They just need to be plausible for thirty seconds on a grainy smartphone screen. By the time a fact-check catches up, the emotional damage is done. The outrage has already been harvested. The algorithm has already pushed the clip to three million people who will never see the correction.

Silicon Valley keeps telling us these tools are about democratizing creativity. They aren’t. They’re about democratizing harassment. When a voice can be stolen, the person behind it loses the one thing they actually own: their reputation. Tharoor’s specific brand of eloquence is his armor. This video was an attempt to melt that armor down into a puddle of digital sludge.

The platforms will tell you they’re working on it. They’ll point to their "robust" policies and their "integrity teams," most of whom were probably laid off six months ago to satisfy a quarterly earnings report. They’ll talk about watermarking and metadata, technical Band-Aids for a gaping wound in the social fabric. But let’s be honest. The platforms love the engagement that controversy brings. A fake video of a high-profile politician creates more clicks than a boring, verified truth ever could.

We are watching the total collapse of the shared visual record. If you can’t trust your own eyes and ears when a public figure speaks, what do you have left? You have tribalism. You have people believing what they want to believe, because the "evidence" is now a commodity that can be manufactured on demand.

Tharoor called it out, but he’s playing a losing game of Whac-A-Mole. For every video he flags, ten more can be generated before he finishes his morning tea. The trade-off for our shiny new world of instant synthetic media is a permanent state of paranoia. We used to say "seeing is believing," but that phrase is a relic of a pre-algorithmic age.

What happens when the fakes get better? When the cadence is perfect, the blinking is natural, and the "language" is exactly what the victim would say? We’re not there yet, but we’re close enough to smell the exhaust.

If a man whose entire public persona is built on the mastery of speech can be silenced by a cheap imitation, what hope is there for the rest of us?

The real question isn't whether we can spot the lie, but whether we even care to anymore.

© 2026 DailyDigest360