'Too dangerous': Why even Google was afraid to release this technology
Imagine strolling down a busy city street and snapping a photo of a stranger, then uploading it into a search engine that almost instantaneously helps you identify the person.

That's PimEyes. Founded in 2017 by two computer programmers in Poland, it's an AI tool that works like a reverse image search on steroids: it scans a face in a photo and crawls dark corners of the internet to surface photos many people didn't even know existed of themselves, in the background at a restaurant or attending a concert.

"Something happens on the train, you bump into someone, or you're wearing something embarrassing, somebody could just take your photo, and find out who you are and maybe tweet about you, or call you out by name, or write nasty things about you online," said Kashmir Hill, a reporter for The New York Times who recently published a book on facial recognition technology called "Your Face Belongs to Us."

The technology Google dared not release

Hill said super-powerful face search engines have already been developed at Big Tech companies like Meta and Google. But while those companies have been holding back, smaller startups pushing the technology are gaining momentum, among them PimEyes and another called Clearview AI, which provides AI-powered face search engines to law enforcement.