Microsoft has announced it will provide the New South Wales (NSW) Police Force in Australia with its object recognition technology to speed up the state’s surveillance footage analysis.
Under the state police’s older systems, CCTV footage, along with other forms of evidence required in investigations, was stored on local servers and had to be reviewed manually by police, a time-consuming process.
The new system involves sending footage to the “cloud”—in this case, Microsoft’s own servers—to identify objects linked to suspects using Azure Computer Vision, which utilises artificial intelligence (AI) and machine learning (ML).
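Microsoft has not published details of the NSW Police integration, but as a rough illustration, object detection with Azure Computer Vision is commonly invoked through its Python SDK along the lines sketched below. The endpoint, key and frame-extraction step are hypothetical placeholders, not details of the police system.

```python
# Illustrative sketch only: calling object detection via the Azure Computer
# Vision Python SDK. The endpoint, key and file path are placeholder
# assumptions, not details of the NSW Police deployment.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # hypothetical
KEY = "<your-key>"  # hypothetical

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# A still frame exported from CCTV footage (placeholder path).
with open("frame_0001.jpg", "rb") as image_stream:
    result = client.detect_objects_in_stream(image_stream)

# Each detected object carries a label, a confidence score and a bounding box,
# which is what allows footage to be searched for items such as backpacks or
# vehicles rather than reviewed frame by frame.
for obj in result.objects:
    box = obj.rectangle
    print(f"{obj.object_property}: {obj.confidence:.2f} "
          f"at x={box.x}, y={box.y}, w={box.w}, h={box.h}")
```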
Gordon Dunsford, Chief Information Technology Officer for NSW Police, said that the process served to accelerate investigations, freeing officers to do more frontline police work.
“Using computer vision, it can search to recognise objects, vehicles, locations, even a backpack someone has on their back or a tie a gentleman is wearing,” Dunsford said. “It’s significantly sped up investigations and has helped police to get a result in a fraction of the time.”
According to Microsoft, in one case NSW Police collected 14,000 pieces of CCTV footage for a murder and assault investigation and analysed it in just five hours, a task that would normally take weeks or months.
Australian Human Rights Commission Recommends Banning Facial Recognition
The Australian Human Rights Commission (AHRC) released its 2021 Human Rights and Technology Final Report last week, recommending that the government ban facial recognition and other biometric technologies until federal and state governments introduce regulatory legislation.

“Australian law should provide stronger, clearer and more targeted human rights protections regarding the development and use of biometric technologies, including facial recognition,” the report stated. “Until these protections are in place, the Commission recommends a moratorium on the use of biometric technologies, including facial recognition, in high-risk areas.”
In particular, the report highlighted risks posed to individuals’ right to privacy, as well as the potential for racial bias, which it said could increase the risk of injustice and human rights infringements.
“This necessarily affects individual privacy and can fuel harmful surveillance. In addition, certain biometric technologies are prone to high error rates, especially for particular racial and other groups,” the report said.
Microsoft said the solution had been designed with “ethics front and centre” and did not utilise real-time facial recognition technology.