Facial Recognition Software Needs Test for Racial Bias, Civil Liberties Groups Say

One in two American adults is in a law enforcement facial recognition database, which has the potential to disproportionately affect people of color, according to a report released Tuesday by the Center for Privacy and Technology at the Georgetown University law school.

In response to the report, a coalition of 52 civil liberties groups wrote a letter Tuesday to the Justice Department, expressing their concern that facial recognition systems disproportionately affect communities of color.

“Face recognition systems are powerful—but they can also be biased,” the letter stated.

The report stated that many police departments do not realize their facial recognition systems carry a bias: disproportionately high numbers of African-American arrests feed the mug shot databases that drive the software. A study co-authored by the FBI also found that face recognition may be 5 percent to 10 percent less accurate on African Americans than on Caucasians.

Despite these findings, facial recognition systems are not tested for racially biased error rates, either by independent researchers or by the companies that market the software.

The civil liberties groups recommended that the DOJ expand its investigations into police departments to examine whether facial recognition software is racially biased, and that it work with the FBI to determine whether the software has a disparate impact on communities of color.

The report found that the Baltimore Police Department, which arrests African Americans at twice the rate of their share of the state population, uses facial recognition software that has never been audited for misuse. Only nine of the 52 law enforcement agencies included in the study reported any auditing requirement.

“The FBI has yet to conduct even one audit of its own face recognition systems, and continues to disclaim responsibility for assessing the accuracy of the partner state and federal systems that it uses on a daily basis. In other words, the FBI is leading by bad example, and many jurisdictions are following,” the letter stated.

One in four police departments can run facial recognition searches, either through their own network or through another agency’s. At least five major police departments, including those in Chicago, Dallas, and Los Angeles, use real-time face monitoring software on footage from street cameras.

“Face recognition technology is rapidly being interconnected with everyday police activities, impacting virtually every jurisdiction in America,” the letter stated. “Yet, the safeguards to ensure this technology is being used fairly and responsibly appear to be virtually nonexistent.”
