The use of facial recognition software to police quarantined travelers is being rolled out in Australia in a way that many consider intrusive.

The two largest states in that country, New South Wales (Sydney) and Victoria (Melbourne), have instituted a system designed to enforce their pandemic quarantine rules and ensure that people are staying within their homes as required after recent travel.

Travelers can either stay in hotel rooms during the quarantine period, where they can be monitored easily, or go home if they agree to this system.

Under the system, the quarantined person takes a selfie at their designated quarantine address within 15 minutes of receiving a randomized check-in request. The software seeks to verify the selfie by geolocation and against a stored facial signature. If the check-in cannot be verified, police may follow up by visiting the location.
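
Pieced together from that description, the check-in logic amounts to three tests: the response arrived within the 15-minute window, the selfie was taken at or near the registered address, and the face matches the enrolled signature. The sketch below illustrates that flow; the 15-minute window comes from the reporting, while the geofence radius, function names, and face-matching hook are illustrative assumptions, not details of Genvis's actual system.

```python
# A minimal sketch of the check-in flow described above. The 15-minute
# window is from the article; the geofence radius, the names used here,
# and the injected face_match hook are illustrative assumptions.
import math
import time
from dataclasses import dataclass
from typing import Callable, Tuple

CHECK_IN_WINDOW_SECONDS = 15 * 60   # traveler must respond within 15 minutes
GEOFENCE_RADIUS_METRES = 100.0      # assumed tolerance around the home address

@dataclass
class CheckInRequest:
    traveler_id: str
    sent_at: float                  # Unix timestamp when the randomized prompt was pushed

def haversine_metres(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    """Great-circle distance between two (lat, lon) points, in metres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * math.asin(math.sqrt(h))

def verify_check_in(
    request: CheckInRequest,
    received_at: float,
    selfie_gps: Tuple[float, float],
    home_gps: Tuple[float, float],
    face_match: Callable[[], bool],
) -> bool:
    """True if all three tests pass; a failure would be queued for a
    possible police visit to the quarantine address."""
    on_time = (received_at - request.sent_at) <= CHECK_IN_WINDOW_SECONDS
    at_home = haversine_metres(selfie_gps, home_gps) <= GEOFENCE_RADIUS_METRES
    return on_time and at_home and face_match()

# Example: a prompt sent five minutes ago, answered from (roughly) the
# registered address, with the facial-signature comparison stubbed out.
ok = verify_check_in(
    CheckInRequest("traveler-42", sent_at=time.time() - 300),
    received_at=time.time(),
    selfie_gps=(-33.8688, 151.2093),
    home_gps=(-33.8689, 151.2094),
    face_match=lambda: True,
)
```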

Reuters quoted a critic of the program, Dr. Toby Walsh, a professor of Artificial Intelligence at the University of NSW. “I’m troubled not just by the use here but by the fact this is an example of the creeping use of this sort of technology in our lives,” he said.  

He also suggested that the system may not be secure and could be hacked to give false location reports.

But that does not resolve the broader questions. Can facial signatures, once in the possession of the state, be used for purposes other than confirming compliance with quarantine rules? Might they, for example, be checked against witness descriptions in outstanding criminal investigations?

Edward Santow, a former Australian Human Rights Commissioner, said, “The law should prevent a system for monitoring quarantine being used for other purposes… If something goes wrong with this technology, the risk of harm is high.”

Genvis, the software start-up behind the technology, is headquartered in Perth, Western Australia; it developed the software in 2020 in cooperation with the Western Australian Police.

Let Freedom Ring – The Debate Over Facial Recognition

The news from NSW and Victoria has further stoked ongoing debates in the rest of the world about the Orwellian implications of facial recognition technology in the hands of the authorities. What does it do to freedom?  

For example, people around the world take for granted that their driver’s licenses come with photos, and that those photos are maintained by the relevant motor vehicle authorities. They may not realize how easily those photos can be used by the authorities.

It recently became a matter of public controversy in the U.S. state of Massachusetts that police can run a state-wide facial recognition search in connection with their criminal investigations just by emailing a photo to the Massachusetts Registry of Motor Vehicles (RMV).

Massachusetts enacted a new law creating some guardrails for this practice, although reformers were not satisfied with the compromise bill that resulted.

The executive director of the American Civil Liberties Union of Massachusetts, Carol Rose, said during that controversy that any information about the use of facial recognition software by police ought to be provided to defense counsel. 

“That’s really important, because so often this technology is resulting in false positives, and particularly against people of color because face surveillance technology has a very hard time recognizing black and brown faces.”

There has been no serious move toward the use of facial recognition technology in connection with Covid lockdowns in the U.S. Even people who caution that such a development may happen speak of it only in speculative terms.

Black Faces Matter

Those racial tensions, and the suspicion to which Rose referred, are a big part of the reason such a move likely won’t happen. There is a strong suspicion that such software was written mostly by white coders (almost certainly true, as a demographic matter), and that it becomes less accurate, and thus more likely to do an injustice, the darker the skin of the person in the photo.

Limits on the use of the software will surely spread.