
Facial Recognition Hearing to Give Congress a View of Creepy Tech


There’s a good chance your face is in a criminal investigative database somewhere, and the FBI has made it clear that it wants access to every database, a prospect that deeply troubles privacy advocates.

Image: Facial recognition template

An androgynous base image and an experimental analysis of the same face. National Science Foundation

More than 400 million pictures of Americans’ faces are archived in local, state and federal law enforcement facial recognition networks, the federal Government Accountability Office reported last year.

Those pictures include the faces of about half of all U.S. adults, Georgetown University Law School’s Center on Privacy & Technology estimates.

The networks are largely unregulated and subject to ethnic and gender bias, according to experts, including a photo technologist for the FBI itself. The databases are culled from police mugshots, driver’s licenses, passports, visas, security video and other sources — taking in millions of Americans who aren’t even suspected of a crime.

Related: Facial Recognition Technology Raises Privacy Concerns

“The FBI, in particular, and others are doing everything [they] can to build out facial recognition with the goal, essentially, of having everybody’s face in their database,” Rep. Jason Chaffetz, R-Utah, chairman of the House Committee on Oversight and Government Reform, said at a Washington, D.C., conference last month.

Which is why Chaffetz’s committee is holding a hearing Wednesday morning to investigate the use of facial recognition technology by law enforcement agencies, particularly the FBI.

What Facial Recognition Is — and Isn’t

TV cop shows depict technologists zooming in on a face in grainy video, pressing a few buttons and instantly pulling up a full dossier on that person. Those scenes greatly exaggerate the current capabilities of facial recognition technology.

In a paper published in the December 2012 edition of the journal IEEE Transactions on Information Forensics and Security, four authors — including a senior photography technologist for the FBI — reported that facial recognition systems are less accurate in distinguishing identities among African-Americans, women and younger people.

The FBI system, in particular, “is not designed to give no for an answer,” Alvaro Bedoya, executive director of the Georgetown Privacy Center, said last month on the public radio podcast Criminal Injustice.

“No matter what, it will return a list of faces. And so, in these systems that are designed to not tell you no for an answer, when they miss the right suspect, they’re still going to give you a list of potential suspects that look like the candidate image,” he said. “And those innocent people will predominantly be African-Americans, women and young people.”
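
To make Bedoya’s point concrete, here is a minimal, hypothetical sketch of a ranked-retrieval matcher (the names and embeddings are invented for illustration; this is not the FBI’s actual system). Because it simply ranks every gallery face and returns the top k, it has no way to answer “no match”:

```python
import numpy as np

def top_k_candidates(probe, gallery, k=5):
    """Return indices of the k gallery faces most similar to the probe.

    Illustrative sketch: there is no minimum-score threshold, so the
    function always returns k "candidates," even when no gallery face
    is a genuine match.
    """
    # Normalize rows so the dot product equals cosine similarity.
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    probe = probe / np.linalg.norm(probe)
    scores = gallery @ probe            # one similarity score per gallery face
    # Rank all entries, best first, and keep the top k regardless of
    # how weak the best score actually is.
    return np.argsort(scores)[::-1][:k]

rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 128))   # 1,000 hypothetical face embeddings
probe = rng.normal(size=128)             # a probe unrelated to any of them
print(top_k_candidates(probe, gallery))  # still names five "suspects"
```

Adding a minimum-similarity threshold before returning results is what would let such a system decline to name anyone.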

The GAO report found that facial recognition systems frequently return “prime candidate” profiles based on just one or two photos, offering only one or two angles.

The federal government’s own guidelines, set out by the National Institute of Standards and Technology, suggest using at least five images to determine a credible match. And if a subject is wearing “accessories that occlude facial features” — eyebrow studs or rings through the nose, for example — images should be obtained both with and without them.

IMAGE: NIST facial recognition template

National Institute of Standards and Technology guidelines suggest using multiple clear images and angles to generate a credible facial match. NIST
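
One common way to act on that multi-image guidance, sketched here with invented names as an illustration rather than NIST’s reference code, is to fuse the embeddings of several enrollment photos into a single averaged template before matching:

```python
import numpy as np

def build_template(embeddings):
    """Fuse several enrollment photos of one subject into one template.

    `embeddings` is an (n, d) array, one row per photo -- for example,
    the five or more images the NIST guidelines suggest. Averaging and
    re-normalizing smooths out differences in pose, lighting and
    accessories that any single photo would bake into the template.
    """
    template = np.asarray(embeddings, dtype=float).mean(axis=0)
    return template / np.linalg.norm(template)
```

Averaging is the simplest fusion strategy; systems can also keep each photo’s embedding separate and fuse the match scores instead, but either way more images give the matcher more to work with than a single angle.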

In a letter to the Justice Department in October, a coalition of civil liberties and privacy groups contended that “such inaccuracies raise the risk that, absent appropriate safeguards, innocent African-Americans may mistakenly be placed on a suspect list or investigated for a crime solely because a flawed algorithm failed to identify the correct suspect.”

‘Technology Will Not Wait’

And yet, according to the GAO report, there is little independent testing for errors. Two of the major companies providing such systems, in fact, have said they don’t run such tests internally, either.

Internal FBI documents obtained in a Freedom of Information Act lawsuit by the nonprofit Electronic Privacy Information Center indicate that the FBI’s own database, called the Next Generation Identification Interstate Photo System, or NGI-IPS, had an acceptable margin of error of 20 percent — that is, a 1-in-5 chance of “recognizing” the wrong person.

And research published in the October 2015 issue of the scientific journal PLOS ONE by researchers at the universities of Sydney and New South Wales in Australia found that the humans who interpret such data introduce an additional error margin approaching 30 percent.

Related: China Tries to Wipe Out Toilet Paper Theft With Face Scanner

Even so, the FBI is working to expand the number of state and local law enforcement agencies whose databases it can tap into. It already has agreements with 16 states allowing investigators to cross-check faces without court warrants, creating what the Georgetown Privacy Center called a “virtual perpetual lineup.”

