Supporters, Skeptics Testify Before New Commission
President Biden’s nominee for U.S. attorney joined advocates with privacy concerns Friday in urging a state commission to back a bill that would regulate government use of facial recognition technology.
As part of a broad policing reform and accountability law, state legislators last year approved a ban on almost all law enforcement use of facial recognition systems, only allowing police to ask the Registry of Motor Vehicles to perform a search with a warrant or in certain emergency situations.
The restrictions were among the sections of the bill where Gov. Charlie Baker flagged concerns, threatening a veto. The bill signed into law on Dec. 31, 2020, compromised on facial recognition, allowing police to conduct searches to assist with criminal cases or to mitigate a “substantial risk of harm” after submitting a written request to the RMV, Massachusetts State Police, or the Federal Bureau of Investigation. It also created a commission to make recommendations for additional regulations by the end of 2021.
That commission, led by Judiciary Committee chairs Sen. Jamie Eldridge and Rep. Michael Day, took public testimony Friday over videoconference, hearing from elected officials and advocates in favor of strong guardrails and industry representatives who touted benefits of the technology.
Suffolk District Attorney Rachael Rollins, tapped by Biden earlier this week to become the state’s next top federal prosecutor, called facial recognition technology “subjective at best and unfortunately error-prone at worst.”
“As a result of some research, we’ve seen that facial recognition technology at times can produce an inaccurate result of up to 30 percent,” she said. “That is not a margin of error. I would argue that that is a gulf of potential failure. Facial recognition is a far cry from DNA or fingerprints.”
Rollins voiced support for bills (H 135, S 47) before the Judiciary Committee that aim to regulate in what circumstances government agencies could use face recognition.
Rahsaan Hall of the ACLU of Massachusetts and Arline Isaacson of the Massachusetts Gay and Lesbian Political Caucus also specifically mentioned those bills, which Hall said would “fill the gaps” in current law and provide due process and other protections.
“The legislation is almost identical to the language agreed to by the House and Senate in the initial conference report from last year’s police reform legislation that we all fought so hard for,” Hall said. “Face surveillance can be used in limited, tightly regulated circumstances to advance legitimate police investigations, but the existing law doesn’t sufficiently protect racial justice, due process, privacy or First Amendment rights.”
Hoan Ton-That, the founder and CEO of facial recognition software company Clearview AI, said his product is used by more than 3,100 law enforcement agencies around the United States and involves a “bias-free algorithm” that “can accurately find any face out of over 3 billion images it has collected from the public internet.” He described it as “much more reliable and accurate than the human eye.”
David Ray, chief operating officer and general counsel at Colorado-based Rank One Computing, also described today’s automated facial recognition technology as “more accurate than the human eyewitness,” and said it has been “highly effective at preventing and solving crime.”
Ton-That told the panel of a case where Clearview AI’s software helped track down a child rapist who had been selling abuse videos of a 6-year-old girl and said it was also “essential to the quick and instant identification of the Capitol rioters” on Jan. 6.
“Any ban on facial recognition would be devastating for victims of child rape and human trafficking,” he said. “Likewise, limiting the dataset that law enforcement can search against just to DMV photos or mugshots will prevent victims of child exploitation from being rescued, as children are not in DMV databases.”
Hall, who testified later, asked that policymakers not use “worst-case scenarios and fear-mongering of murderers and rapists and the extremely disturbing incidents of the insurrection to justify the unregulated or even moderately regulated use of this technology to surveil historically marginalized communities that are already over-policed.”
Isaacson said widespread use of facial recognition and surveillance could have “huge” ramifications for closeted members of the LGBTQ community.
“Their lives could be irrevocably harmed by government surveillance, if their face is being scanned, for example, every time they enter a gay bar or a club that caters to LGBTQs or a doctor who deals only with trans patients or if they’re spending the night with someone of the wrong gender,” she said. “It’s important to note, also that our concerns about privacy intrusions from this technology are not limited to the ‘big brother’ kinds of government surveillance — though we very much worry about that — but there’s also the concerns about ‘little brother’ surveillance by government employees, the people who say, ‘Gee, I wonder if he’s gay. You think she might be a lesbian? Let’s check it out.'”