Microsoft has announced it will provide the New South Wales (NSW) Police Force in Australia with its facial recognition technology to speed up the state's analysis of surveillance footage.
Under the state police's older systems, CCTV footage and other forms of investigative evidence were stored on local servers and required time-consuming manual review by officers.
The new system involves sending footage to the “cloud”—in this case, Microsoft’s own servers—to identify suspects using artificial intelligence (AI) and machine learning (ML).
According to Microsoft, in one case NSW Police collected 14,000 pieces of CCTV footage for a murder and assault investigation; analysis that would normally take weeks or months was completed in just five hours.
“Detectives were able to then within days piece together the time sequence of events, movements and interactions of the person of interest as well as overlay this onto a geospatial platform, visualising the data for detectives and aiding in the preparation of the brief of evidence for Courts,” Microsoft said in a press release.
Gordon Dunsford, Chief Information Technology Officer for NSW Police, said that the process served to accelerate investigations, freeing officers to do more frontline police work.
“Using computer vision, it can search to recognise objects, vehicles, locations, even a backpack someone has on their back or a tie a gentleman is wearing,” Dunsford said. “It’s significantly sped up investigations and has helped police to get a result in a fraction of the time.”
The computer vision system draws on data gathered from CCTV, police body cams, laptops, mobile devices, and dash cams.
News of the Australian deal comes after Microsoft, along with Amazon and IBM, confirmed that it would not sell facial recognition technology to police in the United States until strong federal regulation governing its use had been enacted.
The Australian Human Rights Commission (AHRC) released its 2021 Human Rights and Technology Final Report last week, recommending a moratorium on facial recognition and other biometric technologies until federal and state governments introduce regulatory legislation.
“Australian law should provide stronger, clearer and more targeted human rights protections regarding the development and use of biometric technologies, including facial recognition,” the report stated. “Until these protections are in place, the Commission recommends a moratorium on the use of biometric technologies, including facial recognition, in high-risk areas.”
In particular, the report highlighted risks posed to individuals' right to privacy, as well as the risk of racial bias, which it said could increase the likelihood of injustice and human rights infringements.
“This necessarily affects individual privacy and can fuel harmful surveillance. In addition, certain biometric technologies are prone to high error rates, especially for particular racial and other groups,” the report said.
Pushing back against these concerns, Microsoft said that the new deal did, in fact, align with its commitments because NSW Police did not purchase the technology; instead, all facial recognition analysis will be carried out by Microsoft on its own servers.
Samantha Floreani, Program Lead for Digital Rights Watch, a non-profit charity that advocates for the digital rights of Australians, said that Microsoft's decision was dishonest and contradicted its own commitment.
“Several major tech companies recognised the danger of facial recognition technology and withdrew the sales of their technology to law enforcement in 2020—Microsoft being one of them,” Floreani told The Epoch Times. “To see them turn around and partner with NSW Police shows that commitment was disingenuous.”
Floreani said that NSW Police's decision to proceed ahead of the AHRC report showed "disregard for community safety as well as human rights law."
“The Microsoft Press Release emphasises that it has been ‘designed with ethics front and centre’ which flies in the face of growing international consensus that it is not possible to use technology such as facial recognition or large scale image classification ethically in the context of policing.
“These technologies equip law enforcement with an untenable amount of power, with very little room for transparency or accountability,” Floreani said.
The Epoch Times