Clearview AI seeking to put 100 billion photos in facial recognition database

Clearview AI has announced it aims to put almost every human’s face in its facial recognition database, making ‘almost everyone in the world identifiable’

A controversial AI company has announced it aims to put an image of nearly every human face in its facial recognition database, making it possible for ‘almost everyone in the world [to] be identifiable.’ 

In its latest report in December, facial recognition firm Clearview AI told investors that the company is currently collecting 100 billion photos of human faces for the unprecedented campaign, which will be stored in its dedicated database.

The collection of images – approximately 14 photos for each of the 7 billion people on the planet, scraped from social media and other sources – would vastly bolster the company’s surveillance system, already the most elaborate of its kind.

The American company headquartered in Manhattan further told investors that its ‘index of faces’ has grown from 3 billion images to more than 10 billion since the start of 2020. 

The firm’s technology has already been used by myriad law enforcement and government agencies around the world, helping police make thousands of arrests by aiding in various criminal investigations.

Clearview fills its database by scouring sources like Facebook, YouTube, Venmo and millions of other sites, according to the company.

The company, founded in 2016 by Australian CEO Hoan Ton-That, 34 – and currently valued at more than $100 million – is seeking to expand its facial recognition empire beyond law enforcement.

Some now predict Clearview — a tiny set-up which has also licensed its software to a string of private companies for supposed security purposes — could end up destroying privacy as we know it by exploiting its vast database and access to social media [File photo]

So how does facial recognition software actually work?

Step one

Using a complex algorithm, facial recognition software measures the facial geometry of a person, such as the distance between their nose, mouth, ears and jaw.

Step two

These values are then matched with similar photos across the internet — including social media giants such as Facebook, Twitter and YouTube.

Step three

This means that the 600 law enforcement agencies in the U.S. using this technology can recognise a face by comparing it with three billion publicly available images of people ‘scraped’ from all across the internet, allowing them to identify a suspect or bystander.
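The three steps above can be sketched in miniature. In a real system the ‘faceprint’ is produced by a deep neural network; the hand-made measurement vectors, example URLs and match threshold below are purely illustrative.

```python
import math

# Step 1: reduce a face to a numeric "faceprint". A real system derives this
# with a neural network; here we just normalise a list of hypothetical
# geometry measurements (eye spacing, nose-to-mouth distance, etc.).
def face_vector(measurements):
    norm = math.sqrt(sum(x * x for x in measurements))
    return [x / norm for x in measurements]  # unit-length vector

# Step 2: an index of previously scraped faces, each tagged with its source URL.
index = {
    "https://example.com/profile/alice": face_vector([0.42, 0.31, 0.55, 0.27]),
    "https://example.com/profile/bob": face_vector([0.12, 0.66, 0.23, 0.48]),
}

def cosine(a, b):
    # Both inputs are unit vectors, so the dot product is the cosine similarity.
    return sum(x * y for x, y in zip(a, b))

# Step 3: compare a query face against every indexed face and report the
# closest match, or None if nothing is similar enough.
def identify(query, threshold=0.95):
    best_url, best_score = None, -1.0
    for url, vec in index.items():
        score = cosine(query, vec)
        if score > best_score:
            best_url, best_score = url, score
    return best_url if best_score >= threshold else None

# A slightly noisy view of "alice" still matches her profile link.
print(identify(face_vector([0.43, 0.30, 0.56, 0.26])))
```

Scaling this linear scan to billions of faces is what makes Clearview's index valuable: the comparison itself is cheap, so the power lies almost entirely in the size of the scraped database.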

In the presentation to investors last year, obtained by The Washington Post, Clearview brass pleaded for $50 million in additional funding to boost the new undertaking.   

The infusion of funds would allow the company to reach its goal of 100 billion photos, while also building new products, expanding its international sales team, and stepping up lobbying of government policymakers to ‘develop favorable regulation,’ The Post reported. 

At the time of the presentation, its data collection system was ingesting 1.5 billion images a month, the company said. 

Clearview added that the improved database would help organizations using its tech better monitor ‘gig economy’ workers, and that it is currently researching a number of new technologies that could identify someone based on how they walk, detect their location from a photo, or even scan subjects’ fingerprints from afar.

In March 2020, Clearview was sued by the American Civil Liberties Union, which contended the company illegally stockpiled images of three billion people scraped from internet sites without their knowledge or permission.

For many worried about privacy, news of the stockpile raised concerns that  surveillance used in authoritarian countries like China could happen in the US and other democracies. 

In December, Tech Times reported that Clearview had been called out by multiple privacy watchdogs in countries across the globe for alleged privacy violations.

European nations like the United Kingdom, France, Italy, Greece and Austria have all expressed disapproval of Clearview’s method of extracting information from public websites, saying it violates European privacy law.

Clearview AI CEO Hoan Ton-That has said his company collects only publicly available photos from the open internet that are accessible ‘from any computer anywhere in the world.’ He said its database cannot be used for surveillance

Clearview AI, founded in 2016 as a facial recognition firm, is currently collecting 1.5 billion images of people a month, the company said in the December report

The collection of images – approximately 14 photos for each of the 7 billion people on the planet, scraped from social media and other sources – would vastly bolster the company’s surveillance system, already the most elaborate of its kind

In the presentation to investors last year, Clearview brass pleaded for funding for the undertaking, to the tune of $50 million. The company was sued by the American Civil Liberties Union in March 2020, contending it illegally stockpiled images of 3 billion people scraped from internet sites without their knowledge or permission

Canadian provinces from Quebec to British Columbia have requested the company take down the images obtained without subjects’ permission. 

Various law enforcement agencies have also expressed concern regarding Clearview’s collection of personal information, with the NYPD turning down a partnership with Clearview in April after a 90-day free trial of its facial recognition software.

Police officers say that Clearview offers several advantages over other facial recognition tools. For one, its database of faces is much larger. Also, its algorithm doesn’t require people to be looking straight at the camera; it can even identify a partial view of a face — under a hat or behind large sunglasses [File photo]

The NYPD decided against using the app, citing potential security risks and potential for abuse, sources said.

At least seven states and nearly two dozen cities have limited government use of Clearview’s technology amid fears over civil rights violations, racial bias and invasion of privacy. 

Social media sites including Facebook and Twitter urged the company to delete the photos that it has collected. 

Ton-That refused and pointed out the company gathers only publicly available photos from the open internet that are accessible ‘from any computer anywhere in the world’. 

He asserted that its database cannot be used for surveillance. 

The ACLU filed its case in Illinois in May 2020, with the backing of a consortium of Chicago-based rights groups.

After the suit was filed, authorities said Clearview had halted sales of its facial recognition technology to US-based private firms. 

Illinois was the first state in the U.S. to regulate the collection of biometric data, with the introduction in 2008 of the Biometric Privacy Act (BIPA).

BIPA requires companies that collect, capture or obtain an Illinois resident’s biometric identifier — such as a fingerprint, faceprint, or iris scan — to first notify that individual and obtain their written consent. 

The ACLU said its lawsuit was ‘the first to force any face recognition surveillance company to answer directly to groups representing survivors of domestic violence and sexual assault, undocumented immigrants, and other vulnerable communities uniquely harmed by face recognition surveillance.’ 

In the court documents, filed in Cook County, Illinois, on Thursday, the ACLU team claim that the facial recognition technology provided by Clearview puts vulnerable people at risk. 

‘Given the immutability of our biometric information and the difficulty of completely hiding our faces in public, face recognition poses severe risks to our security and privacy,’ they claim. 

‘The capture and storage of faceprints leaves people vulnerable to data breaches and identity theft. 

‘It can also lead to unwanted tracking and invasive surveillance by making it possible to instantaneously identify everyone at a protest or political rally, a house of worship, a domestic violence shelter, an Alcoholics Anonymous meeting, and more. 

‘And, because the common link is an individual’s face, a faceprint can also be used to aggregate countless additional facts about them, gathered from social media and professional profiles, photos posted by others, and government IDs.’

Nathan Freed Wessler, senior staff attorney with the ACLU’s Speech, Privacy, and Technology Project, described Clearview’s technology as ‘menacing’.

He said it could be used to track people at political rallies, protests, and religious gatherings, among other uses. 

The coalition is asking a judge to order Clearview to delete the images, and to notify ‘all persons’ in writing and obtain their written consent before capturing their biometric identifiers. 

Tor Ekeland, an attorney for the company, described the lawsuit as ‘absurd’ and a violation of the First Amendment, which protects freedom of speech, religion, assembly and protest.

‘Clearview AI is a search engine that uses only publicly available images accessible on the internet,’ he said.  

What is BIPA?

  • BIPA stands for the Biometric Privacy Act, a law that came into effect in Illinois in 2008.
  • BIPA regulates how ‘private entities’ collect, use, and share ‘biometric information’ and ‘biometric identifiers’ (collectively, ‘biometric data’), and imposes certain security requirements.
  • BIPA prohibits private entities from obtaining biometric data without informed written consent.
  • It also prohibits private entities in possession of biometric data from selling, leasing, trading or otherwise profiting from biometric data.
  • Any person aggrieved is entitled to recover ‘for each violation’ compensation of $1,000 or actual damages (whichever is greater) for negligent violations; and up to $5,000 or actual damages (whichever is greater) for intentional or reckless violations.
  • BIPA has been challenged several times in Illinois Supreme Court, but remains on the books.
  • Cases filed in State court have led to significant settlements, some well into the hundreds of thousands, the National Law Review reported.
  • Washington and Texas have since passed similar statutes, although Illinois’s is still considered the most stringent. 

 Source: National Law Review
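BIPA’s per-violation damages rule can be expressed as a small function — the greater of the statutory amount or actual damages, with the statutory amount depending on whether the violation was negligent or intentional/reckless. The helper name below is hypothetical and this is an illustration of the arithmetic, not legal advice.

```python
# Illustrative only: recovery per BIPA violation is the greater of the
# statutory amount ($1,000 negligent, up to $5,000 intentional/reckless)
# or the person's actual damages.
def bipa_recovery(actual_damages, intentional=False):
    statutory = 5000 if intentional else 1000
    return max(statutory, actual_damages)

print(bipa_recovery(250))                     # negligent, statutory floor wins: 1000
print(bipa_recovery(8000, intentional=True))  # actual damages exceed the floor: 8000
```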

 ‘It is absurd that the ACLU wants to censor which search engines people can use to access public information on the internet. The First Amendment forbids this.’ 

Clearview AI was founded in 2016 by Hoan Ton-That, an Australian tech entrepreneur and one-time model. 

 Ton-That co-founded the company with Richard Schwartz, an aide to Rudy Giuliani when he was mayor of New York.

It is backed financially by Peter Thiel, a venture capitalist who co-founded PayPal and was an early investor in Facebook.  

Ton-That describes his company as ‘creating the next generation of image search technology’, and in January the New York Times reported that Clearview AI had assembled a database of three billion images of Americans, culled from social media sites.

The paper published an expose of the company, in which Ton-That described how he had come up with a ‘state-of-the-art neural net’ to convert all the images into mathematical formulas, or vectors, based on facial geometry – taking measurements such as how far apart a person’s eyes are.

Clearview created a directory of the images, so that when a user uploads a photo of a face into Clearview’s system, it converts the face into a vector.

The app then shows all the scraped photos stored in that vector’s ‘neighborhood’, along with the links to the sites from which those images came. 
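The directory-and-‘neighborhood’ lookup described above can be sketched as a linear scan over stored vectors: every scraped photo whose vector lies close to the query is returned along with its source link. Real deployments use approximate nearest-neighbour indexes over billions of entries; the vectors, URLs and radius here are made up.

```python
import math

# A toy "directory" of scraped photos: each entry pairs a face vector with
# the URL it was scraped from. A production system would hold billions of
# entries in an approximate-nearest-neighbour index; a linear scan shows
# the idea.
directory = [
    ([0.90, 0.10, 0.30], "https://example.com/photos/1"),
    ([0.88, 0.12, 0.31], "https://example.com/photos/2"),
    ([0.20, 0.80, 0.50], "https://example.com/photos/3"),
]

def neighborhood(query, radius=0.1):
    """Return source links for all stored vectors within `radius` of the query,
    closest first."""
    hits = []
    for vec, url in directory:
        dist = math.dist(query, vec)  # Euclidean distance (Python 3.8+)
        if dist <= radius:
            hits.append((dist, url))
    return [url for _, url in sorted(hits)]

# A query vector near the first two photos returns both links, closest first.
print(neighborhood([0.89, 0.11, 0.30]))
```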

Amid the backlash from the January article, Clearview insisted that it had created a valuable policing tool, which it said was not available to the public.

‘Clearview exists to help law enforcement agencies solve the toughest cases, and our technology comes with strict guidelines and safeguards to ensure investigators use it for its intended purpose only,’ the company said.

 

Clearview insisted the app had ‘built-in safeguards to ensure these trained professionals only use it for its intended purpose’.

However, in February BuzzFeed reported that Clearview’s technology was being used by private companies including Macy’s, Walmart, Best Buy and the NBA, and even a sovereign wealth fund in the United Arab Emirates.

The New Jersey attorney general has banned state law enforcement from using Clearview’s system, and in 2020 the Vermont attorney general sued. 
