
Portland City Council votes to ban facial recognition technologies in public places

The Portland, Oregon City Council today unanimously voted to adopt two of the strongest bans on facial recognition technology in the U.S. One prohibits the use of facial recognition by city bureaus, including the Portland Police Bureau, while the other bans all private use in places of “public accommodation,” like parks and buildings. The ordinances originally contained an amendment that would have allowed airlines in partnership with U.S. Customs and Border Protection to collect facial recognition data on travelers at Portland International Airport, but the proposals voted on today make exemptions only for Portland public schools.

The ban on Portland government agencies’ use of facial recognition technology takes effect immediately, while the ban on private use takes effect in January 2021. The state of Oregon has already banned police from using body cameras with facial recognition technology.

In the wake of the Black Lives Matter movement, an increasing number of cities and states have expressed concerns about facial recognition technology and its applications. Oakland and San Francisco, California and Somerville, Massachusetts are among the cities where law enforcement is prohibited from using facial recognition. In Illinois, companies must get consent before collecting biometric information of any kind, including face images. New York recently passed a moratorium on the use of biometric identification in schools until 2022, and lawmakers in Massachusetts are considering a pause on government use of any biometric surveillance system within the commonwealth.

As OneZero’s Kate Kaye notes, the newly adopted pair of Portland ordinances ban the use of facial recognition at stores, banks, restaurants, public transit stations, homeless shelters, doctors’ offices, rental properties, retirement homes, and a variety of other types of businesses. They allow people to sue noncompliant private and government entities for damages, and they establish a new chapter of city code sharply constraining the use of facial recognition by private entities. They also require each city bureau to provide an assessment ensuring they’re not using facial recognition for any purpose.

The bans fall short of preventing facial recognition use in private clubs, places of worship, and households, and they don’t limit the technology’s deployment at workplaces like factories or office buildings (excepting publicly accessible lobbies within those workplaces). In addition, government staff will still be permitted to use facial recognition to unlock a phone, tag someone in social media, and obscure faces in law enforcement images released to the public. But in spite of the exemption for Portland public schools, the ordinances do cover private schools such as nursery schools as well as elementary, secondary, undergraduate, and post-graduate schools.

“With these concerning reports of state surveillance of Black Lives Matter activists and the use of facial recognition technology to aid in the surveillance, it is especially important that Portland prohibits its bureaus from using this technology,” City Commissioner Jo Ann Hardesty said in a statement. “Facial recognition tech, with its gender and racial bias and inaccuracies, is an intrusion on Portlanders’ privacy. No one should have something as private as their face photographed, stored, and sold to third parties for a profit. No one should be unfairly thrust into the criminal justice system because the tech algorithm misidentified an innocent person.”

Amazon was among the technology vendors that sought to block or weaken the city’s legislation. According to OneZero, the company paid lobbyists $24,000 to contact and meet with key Portland councilmember staffers and mayoral staffers. Amazon reportedly wanted to influence language in the draft, including how the term “facial recognition” was defined.

Beyond Amazon, some Portland businesses urged councilmembers ahead of the vote to consider a temporary ban on specific uses of facial recognition software rather than a blanket ban on the technology. For instance, officials from the Jacksons convenience store chain said they used the technology at three stores in the city to protect employees and customers from people who had threatened clerks or shoplifted.

“Talking to some businesses that we work with as well as the broader business community, there are definitely some who would be opposed to the city restricting their ability to use that technology,” Technology Association of Oregon President Skip Newberry told Oregon Live. “It can range from security of sites or critical infrastructure to people coming into a store and it being used to provide an experience tailored to that individual.”

Numerous studies and VentureBeat’s own analyses of public benchmark data have shown facial recognition algorithms are susceptible to bias. One issue is that the data sets used to train the algorithms skew white and male. IBM found that 81% of people in the three face-image collections most widely cited in academic studies have lighter-colored skin. Academics have found that photographic technology and techniques can also favor lighter skin, including everything from sepia-tinged film to low-contrast digital cameras.
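The dataset skew IBM identified amounts to a simple demographic audit: tally the annotated skin tones in a face-image collection and compare their shares. A minimal sketch in Python, using hypothetical per-image annotations (the 81/19 split mirrors the figure cited above; real audits use finer-grained labels such as Fitzpatrick skin types):

```python
from collections import Counter

# Hypothetical skin-tone annotations for a 1,000-image face collection,
# standing in for metadata from a real benchmark dataset.
annotations = ["lighter"] * 810 + ["darker"] * 190

counts = Counter(annotations)
total = sum(counts.values())
shares = {tone: count / total for tone, count in counts.items()}

print(shares)  # {'lighter': 0.81, 'darker': 0.19}
```

An audit like this only surfaces representation imbalance; measuring whether that imbalance translates into unequal error rates requires evaluating a trained model per demographic group.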

The algorithms are often misused in the field, as well, which tends to amplify their underlying biases. A report from Georgetown Law’s Center on Privacy and Technology details how police feed facial recognition software flawed data, including composite sketches and pictures of celebrities who share physical features with suspects. The New York Police Department and others reportedly edit photos with blur effects and 3D modelers to make them more conducive to algorithmic face searches.

Amazon, IBM, and Microsoft have self-imposed moratoriums on the sale of facial recognition systems. But some vendors, like Rank One Computing and Los Angeles-based TrueFace, are aiming to fill the gap with customers including the City of Detroit and the U.S. Air Force.


VentureBeat
