“This show is, and has always been, anti-atrocity,” is a disclaimer one would hope applies to every TV show. (Seriously, has anyone checked out Fuller House? Like, really given it a thorough vetting?) But if there’s one show that gives more than a tacit nod toward the fight against creeping evil, it’s John Oliver’s Last Week Tonight, and Sunday’s show dug deep to root out yet another attack on our collective privacy and civil liberties, this time in the form of facial recognition technology.
You know, that visage-scanning tech that allowed Denzel Washington to find one suspicious knapsack in all of post-Katrina New Orleans by scanning its, um, face, in a movie you forgot existed. (It’s called Déjà Vu, suitably enough.) But, as Oliver lays out in signature, molar-grinding detail, putting an all-seeing AI database in the hands of government, law enforcement, and the occasional white supremacist Republican candidate is, let’s call it, “troubling.” Nothing if not timely, Oliver noted how civil rights abuses and authoritarianism go together like Donald Trump and authoritarianism, showing how law enforcement has been using buggy, unregulated facial recognition during the recent (and ongoing) massive citizen protests against—wait for it—abuses by law enforcement. Showing how flagging people with outstanding warrants who also think that kneeling on a black man’s throat until he dies is wrong legitimately chills public protest, Oliver called the practice a way to “undermine the right to assemble.” Toss in the fact that the still-developing technology has disproportionate trouble differentiating between non-white faces, and, well, the word “dystopian” gets thrown around a lot these days, but when one of the largest Chinese facial recognition companies unironically names itself “Skynet,” you try coming up with another.
One of Oliver’s gifts is putting a face on encroaching, seemingly faceless potential evil, and he did so this week by profiling one Hoan Ton-That, a blandly douchey “serial entrepreneur” (Wikipedia’s words), whose company Clearview AI is at the forefront of this new frontier of ubiquitous, no-rules privacy invasion. Oh, sure, the former app developer (including of one called ViddyHo that stole and spammed users’ contact lists) claims the company would never, ever sell its services to such “adverse to the U.S.” human rights abusers as China and North Korea. But his answer to a reporter’s question about selling to countries where—just to pick one example—governments kill gay people is, as Oliver shows, a smirking mask of deflection and flashing dollar signs for eyes. (Ton-That’s widely reported ties to “alt-right” online garbage people are also, you know, a worry.)
Showing how Clearview AI’s serial capitalism is too creepy even for the likes of IBM, Google, Amazon, Facebook, Twitter, and YouTube (at least for the moment), Oliver helpfully explained that scraping billions of personal photos from users and uploading them onto an unregulated database for sale to the highest bidder isn’t so much, as Clearview farcically claims, a First Amendment issue as it is something “too Pandora’s Box-y” even for Silicon Valley. (Although not for Walmart, apparently.) Noting how some work by the usual good guys (the ACLU, the Urban Justice Center, and others) has forced at least some laws onto the books protecting our faces, Oliver pointed to the very real probability of this technology being used to target members of Black Lives Matter protests, saying, “we need a comprehensive, nationwide policy, and we need it right now.”
And while Oliver—cheeky gadfly that he is—can point out the need for sweeping societal changes with all the barely contained ironic rage a British comedian can muster, he knows his platform is more of the “come up with a silly but pointed stunt” variety. So, showing that Clearview AI is likely scanning every single photo of you in a funny hat that you put on Instagram, and that those photos are likely being scanned further by law enforcement and/or unscrupulous authoritarians the world over, Oliver suggested viewers start incorporating clearly written signs expressing whatever message they think such people really need to hear. (See header photo for one idea to get you started.)