Reality is leaking. Shashi Tharoor—parliamentarian, vocabulary wizard, and the internet’s favorite source of high-octane English—just hit the "not me" button on a viral video. It’s the latest episode in our collective slide into the digital uncanny valley.
The clip in question showed Tharoor offering some uncharacteristically warm words for Pakistan. To anyone who’s spent ten minutes watching Indian cable news, the premise alone was a red flag. But in the era of the rage-bait algorithm, context is a luxury nobody can afford. Tharoor took to X, the platform formerly known as Twitter and currently known as a dumpster fire, to set the record straight. His verdict: "Neither my language nor my voice."
He’s right. The AI didn’t just fail the Turing test; it failed the Tharoor test. The man speaks in a particular blend of Oxford-inflected vowels and labyrinthine syntax that generative models still find a bit too spicy to mimic convincingly. The deepfake was clunky. It was robotic. It lacked the rhythmic sass of a man who once dropped "floccinaucinihilipilification" into a casual tweet.
But here’s the problem. It doesn’t have to be good to be dangerous.
We’re living through the democratization of the lie. A decade ago, if you wanted to frame a high-profile politician, you needed a professional studio, a world-class impressionist, and a six-figure budget. Today? You need a fifteen-dollar subscription to a voice-cloning SaaS and a mediocre laptop. The friction is gone. The barrier to entry has dropped so low it’s practically underground.
This isn't about some high-minded debate over the future of truth. It's a logistical nightmare. For fifteen dollars a month, an anonymous troll can generate a crisis that requires a fifty-thousand-dollar PR cleanup and forty-eight hours of frantic damage control from a national political party. That's an asymmetry that favors the chaos agents every single time.
The tech bros will tell you that the solution is more tech. They’ll talk about "digital watermarking" or "blockchain-verified media." It’s a nice dream. In reality, by the time a video is debunked by a team of experts, it’s already been shared four million times on WhatsApp. It’s already been discussed in the local tea shop. The damage isn't just done; it's baked into the social fabric.
Tharoor’s specific rebuttal—pointing out that the AI couldn't grasp his linguistic style—is a temporary defense. It relies on the victim being uniquely articulate. What happens when the target is someone with a generic accent? What happens when the AI stops hallucinating and starts nailing the subtle intake of breath between sentences? We’re mocking the "slop" now, but the slop gets better every Tuesday.
The platforms don’t care. Why would they? Outrage is engagement. A deepfake of a politician saying something controversial generates ten times the traffic of a boring, factual correction. Every time a video like this goes viral, the platform wins, even if the truth loses. It’s a parasitic relationship where the host—public discourse—is slowly being drained of its blood.
There’s a certain irony in Tharoor being the one to sound the alarm. He is a man of words in a world that is increasingly allergic to them. He relies on the precision of language to make his point. But precision is the first casualty of the generative age. When anyone can make anyone else say anything, the very idea of a "quote" becomes obsolete.
We are moving toward a "liar’s dividend." This is a cynical little loophole where real people can start dismissing real evidence of their actual mistakes by claiming it’s just a deepfake. "That wasn't me taking a bribe," they'll say. "That was just a clever AI." By flooding the zone with garbage, the fakers haven't just made us believe the lies; they've made us stop believing the truth.
Tharoor caught this one early. He used his platform to slap it down before it could mutate too far. But most people don't have a million followers and a direct line to the national press. Most people are just noise in the machine.
If the best we can do is hope the AI isn't smart enough to mimic our vocabulary, we’ve already lost the war. The machines don't need to be geniuses. They just need us to be tired. And looking at the state of the internet, we’re exhausted.
If we can’t buy the sound of a man’s own voice, what’s left to trust?
