San Francisco’s proposed rule forbids the use of facial-recognition technology by the city’s 53 departments — including the San Francisco Police Department, which doesn’t currently use such technology but did test it out between 2013 and 2017. However, the ordinance carves out an exception for federally controlled facilities at San Francisco International Airport and the Port of San Francisco. The ordinance doesn’t prevent businesses or residents from using facial recognition or surveillance technology in general — such as on their own security cameras. And it also doesn’t do anything to limit police from, say, using footage from a person’s Nest camera to assist in a criminal case.
“We all support good policing but none of us want to live in a police state,” San Francisco Supervisor Aaron Peskin, who introduced the bill earlier this year, told CNN Business ahead of the vote.
In San Francisco, Peskin is concerned that the technology is “so fundamentally invasive” that it shouldn’t be used.
“I think San Francisco has a responsibility to speak up on things that are affecting the entire globe, that are happening in our front yard,” he said.
Early days for facial recognition laws
Facial recognition has improved dramatically in recent years due to the popularity of a powerful form of machine learning called deep learning. In a typical system, an algorithm extracts a numerical representation of a face's features and compares it with labeled faces in a database to find the closest match.
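That matching step can be sketched in a few lines of code. The toy Python below is an illustration only, not any vendor's actual system: it assumes faces have already been converted into embedding vectors (real systems use deep networks to produce vectors with hundreds of dimensions), and the names, vectors, and 0.8 threshold are invented for the example.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors (1.0 = identical direction).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_face(probe, database, threshold=0.8):
    # Compare a probe embedding against every labeled embedding in the
    # database; return the best-scoring label only if it clears the threshold.
    best_label, best_score = None, -1.0
    for label, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None

# Toy 3-dimensional "embeddings" for two enrolled faces.
db = {"alice": [0.9, 0.1, 0.0], "bob": [0.0, 0.8, 0.6]}
print(match_face([0.85, 0.15, 0.05], db))  # probe closest to alice's vector
```

The threshold is where accuracy concerns enter: set it too low and the system produces false matches; set it too high and it fails to recognize enrolled faces, and error rates can differ across demographic groups depending on the training data.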
Yet AI researchers and civil rights groups such as the American Civil Liberties Union are particularly concerned about accuracy and bias in facial-recognition systems. In particular, the systems tend to be less accurate at correctly recognizing people of color and women. One reason is that the datasets used to train the software may be disproportionately male and white.
The ACLU is one of many civil-rights groups supporting the ordinance. Matt Cagle, a technology and civil liberties attorney at the ACLU of Northern California, said the raft of issues posed by facial-recognition systems mean the city’s proposed legislation would prevent harm to community members. He also expects that, if passed, it will prompt other cities to follow suit.
“We should be able to live our lives without every movement of ours being tracked and monitored by the government,” he told CNN Business.
In the Bay Area alone, Berkeley, Oakland, Palo Alto and Santa Clara County (of which Palo Alto is a part) have passed their own surveillance-technology laws. Oakland is also considering whether to ban the use of facial-recognition technology.
How surveillance could be harder in San Francisco
Under the proposed San Francisco law, any city department that wants to use surveillance technology or services (the police department buying new license-plate readers, for example) must first get approval from the Board of Supervisors. That process would include submitting information about the technology and how it will be used, and presenting it at a public hearing. Any city department that already uses surveillance tech would need to tell the board how it is being used.
“This ordinance is really about making sure that, when there are surveillance programs, the community has a voice and a seat at the table,” Cagle said.
The ordinance also states that the city will need to report to the Board of Supervisors each year on whether surveillance equipment and services are being used in the ways for which they were approved, and include details like what data was kept, shared or erased.
In a statement, the San Francisco Police Department said it “looks forward” to working with the city’s supervisors, the ACLU, and others to develop laws that speak to tech-related privacy worries “while balancing the public safety concerns of our growing, international city.”
“In accordance with the legislation, we are in the process of auditing our technologies and related policies,” the statement said.
The vocal opposition
Some residents and local groups have been vocally opposed to the surveillance ordinance. Frank Noto, president of Stop Crime SF, a group focused on crime prevention, said his organization recognizes the privacy and civil-liberties concerns that may have prompted the ordinance's introduction. But it sees the measure as flawed legislation, largely because it requires the police department to get the city's approval before it can obtain surveillance technology.
And while Stop Crime SF sees the faults in existing facial-recognition technology, it’s also concerned about prohibiting its use entirely. Noto suggested a moratorium on using it — say, for two years — might be a better option.
“The idea of banning it forever doesn’t make sense to us,” he said.