If Hoan Ton-That is feeling the pressure, he isn’t showing it.
Over the last month, fears about facial recognition technology and police surveillance have intensified, all thanks to Ton-That’s startup, Clearview AI.
First came a front-page investigation in The New York Times, revealing Clearview has been working with law enforcement agencies to match photos of unknown faces to people’s online images. Next, cease-and-desist letters rolled in from tech giants Twitter, Google and Facebook. Lawmakers made inquiries and New Jersey enacted a statewide ban on law enforcement using Clearview while it looks into the software.
But during an interview at CNN’s studios in New York City last week, Ton-That didn’t seem particularly fazed, saying the last few weeks were “interesting.”
He demonstrated the technology and described himself as “honored” to kick off a broader conversation about facial recognition and privacy. He’s eager to build a “great American company” with “the best of intentions” and wouldn’t sell his product to Iran, Russia or China, he said. He claimed the technology is saving kids and solving crimes. And he said he welcomes government regulation.
But so far, Ton-That and Clearview have triggered more concerns than acclaim.
A face in the crowd
Clearview AI is controversial for many reasons, but perhaps the most important is its massive database. The company claims to have scraped more than 3 billion photos from the internet, including from popular social media platforms like Facebook, Instagram, Twitter and YouTube. Not only that, but Clearview retains those photos in its database even after users delete them from the platforms or make their accounts private.
Clearview sells access to its database to law enforcement agencies, so those agencies can match unknown faces to other images.
CNN Business saw firsthand how the technology works in a demonstration last week.
First, Ton-That ran a photo of my face through the database, pulling up multiple pictures of me from across the internet in seconds.
Most jarringly, he found a photo that I had probably not seen in more than a decade, a picture that ran in a local newspaper in Ireland when I was 15 years old and in high school. Needless to say, I look a lot different now than I did then; in fact, my producer, who has to spend far more time than she’d like looking at me through a camera, didn’t even recognize me. But the system did.
Clearview has given similar demonstrations to law enforcement, and some have been convinced to hand over taxpayers’ dollars for the tool. The Chicago Police Department, for instance, is paying almost $50,000 for a two-year Clearview “pilot,” a police spokesperson confirmed to CNN Business.
But clearly, I wasn’t a random person Ton-That had pulled from a crowd. He knew he was coming to CNN to meet me and he knew I’d ask him to run my face through his system. He even admitted he had searched my images before we met. (And, it’s worth noting, though the photo is old and I’m almost unrecognizable, the page it’s on does include a caption with my name.)
So we surprised him and also asked him to run a search for my producer.
That at least appeared to make Ton-That a little nervous. “Can we cut this if it doesn’t work?” he quipped. We said no.
But it did work. As we scrolled through the images it had found, my producer noticed that Clearview had surfaced pictures from her Instagram account, even though her account had been private, accessible only to her followers.
Ton-That explained that Clearview had probably downloaded the photos from her account before she had made it private last year.
Ton-That’s representative had my producer’s name in advance of the interview, but Ton-That said he had not run her face before the live demonstration.
Both Clearview searches, for my producer and for me, returned no false positives.
Scary but effective?
The parts of Ton-That’s demonstration that spooked my producer and me — his access to photos that are no longer publicly available online and his ability to find a photo of me as a minor — are likely among the things his law enforcement clients find appealing.
He said more than 600 law enforcement agencies in the US and Canada are using the tool, a number CNN Business has not independently verified, and when asked, he wouldn’t specify how many are paying customers versus those using free trials. He also said that a number of banks are using Clearview software for fraud investigations, but declined to name any of the banks. CNN Business reached out to America’s 20 largest bank chains. JPMorgan Chase, Bank of America, Wells Fargo, US Bank, Ally Bank and SunTrust all denied using the software. The others either declined to comment or didn’t respond to CNN Business’ request for comment.
At least some of Clearview’s clients, like the Chicago Police Department, appear to be under the impression that the company only has access to public images that anyone could find online — but clearly, in the case of my producer, Clearview also has access to some information that is no longer public. When asked to clarify its stance, a Chicago Police Department spokesperson told CNN Business “the message here is that the information gained from Clearview was at one point placed in the public domain.”
Ton-That claims Clearview is 99% accurate and doesn’t return higher error rates when searching for people of color, a problem that’s well-documented among other facial recognition tools. CNN Business has not conducted a full analysis of Clearview’s software.
Some law enforcement agencies do report instances in which they believe the tool has been effective.
In New Jersey, Clearview was used as part of an investigation into a child predator ring. Police there used Clearview in a sting to identify a man before he showed up for what he believed was a meeting with a minor, Gurbir Grewal, the New Jersey Attorney General, told CNN Business. In that instance, Clearview helped police look into the man’s background before he walked into the sting and helped them determine if he was likely to be armed, Grewal said.
Despite its apparent utility, the attorney general — who only learned about Clearview and its use in New Jersey after The New York Times’ report on the company — ordered law enforcement in the state to stop using the technology until a review is completed.
“I was deeply disturbed,” he told CNN Business. “I was concerned about how Clearview had amassed its database of images that it uses with its technology. I was concerned about its data privacy and cybersecurity measures that it takes.”
Shutting the barn door
According to Ton-That, Clearview has downloaded billions of images from major social media platforms and from all different kinds of websites across the internet — including, evidently, from my local newspaper.
Downloading and storing pictures this way is against most of the major social media platforms’ policies.
The practice has prompted the likes of Facebook, Twitter and YouTube to send Clearview cease-and-desist letters.
Despite Facebook’s concerns about the company, Peter Thiel, who sits on Facebook’s board, was an early investor in Clearview. Facebook, Thiel and Ton-That all declined to comment on whether Thiel knew how Clearview was downloading Facebook data, and that doing so was against Facebook’s rules.
“In 2017, Peter gave a talented young founder $200,000, which two years later converted to equity in Clearview AI. That was Peter’s only contribution; he is not involved in the company,” Jeremiah Hall, a spokesperson for Thiel, told CNN Business.
The cease-and-desist letters don’t seem to faze Ton-That, and maybe for good reason.
Technology companies have essentially no control over what happens to data, in this case pictures, after it is downloaded from their platforms.
Ensuring someone actually complies with a cease-and-desist letter when it comes to data is also essentially impossible. Once images are downloaded, as they have been by Clearview, they can be copied again and again and stored on multiple computers and servers around the world, and that’s before they are distributed or made available to third parties. Clearview’s clients, for instance, can access the images.
In 2015, Facebook asked Cambridge Analytica to delete Facebook data it had gathered. When it emerged in 2018 that not all of the data may have been deleted, the episode became one of the greatest scandals Facebook had confronted in its history.
Ton-That’s defense of his technology and his collection methods may land him in court and make him party to landmark rulings that set precedent for how America grapples with artificial intelligence in the 21st century.
Asked if he is ready to step out from behind the computer screen to face days in court, he said, “Sure. Yeah. I don’t think there’ll be that many.”