The resolution asked that Amazon’s board stop selling its Rekognition software to governments unless a third-party evaluation determines the tool “does not cause or contribute to actual or potential violations of civil and human rights.” Its failure was announced at the company’s annual meeting in its hometown of Seattle, along with that of a similar proposal asking Amazon to enlist an outside group to study the risks of using Rekognition.
“Even if perfectly accurate, face surveillance changes the balance of power between government and individuals. None of us can change our face print,” said Shankar Narayan, director of the Technology and Liberty Project at the American Civil Liberties Union of Washington, who presented the resolution asking Amazon not to sell Rekognition to government agencies.
Speaking in advance of the meeting, Sister Pat Mahoney — a member of the Sisters of St. Joseph of Brentwood, New York, which invests in Amazon and supported the proposals — told CNN Business that her goal was to raise awareness about facial recognition.
“We’re hoping for consciousness raising and an awareness in a broader base of shareholders,” she said.
And while the resolutions were voted down, she said she will continue working to address underlying issues surrounding the use and sale of facial-recognition technology.
There are currently no federal laws addressing how facial-recognition systems can be used, but a handful of states and local governments have passed, or are considering, their own rules for facial recognition and other surveillance technologies — San Francisco, for instance, banned the use of facial-recognition technology by city government just last week. And concerns about the technology are mounting on Capitol Hill.
AI researchers and civil rights groups such as the American Civil Liberties Union are particularly worried about accuracy and bias in facial-recognition systems; there are concerns that they are less effective at correctly recognizing people of color and women. One reason is that the datasets used to train the software may be disproportionately male and white.
“But we have to be very cautious because even if you make accurate facial-recognition systems, they will be abused without regulations,” she said.
Lydia DePillis contributed reporting.