04 October 2022
Amidst growing concerns about privacy and mass surveillance, a new report published by the University of Technology Sydney (UTS) Human Technology Institute outlines a model law for facial recognition technology.
The report, Facial Recognition Technology: towards a model law, recommends that facial recognition technology be formally assessed for potential harms before deployment and, in certain circumstances, prohibited. It also recommends a public register of deployed facial recognition technology.
Facial recognition technology (FRT) includes any computer system or device with embedded functionality that uses biometric data to verify a person’s identity, to identify an individual or to analyse a person’s characteristics. Personal information involved in the use of this technology is regulated under the Privacy Act 1988 (Cth) (Privacy Act), under which biometric templates and biometric information are considered ‘sensitive information’ and subject to additional protections.
However, the report suggests that the Privacy Act does not adequately address the unique human rights challenges posed by the development of FRT, including risks to freedom of association and expression and the right to assembly.
The model law proposes a novel risk-based framework that would require developers and organisations seeking to use FRT (called ‘deployers’) to complete an assessment of the potential harms, including potential human rights risks. This ‘Facial Recognition Impact Assessment’ would be registered and publicly available, and could be challenged by the regulator or interested parties.
A Facial Recognition Impact Assessment would consider various factors relating to the proposed use of FRT, including spatial context, functionality, performance and whether individuals can provide free and informed consent. An FRT developer or deployer would also consider whether the technology produces outputs that lead to a decision with legal or similarly significant effect, including whether that decision is wholly or partially automated. This factor draws heavily on concepts found in the European Union’s General Data Protection Regulation.
The report includes guidance on the potential level of risk associated with these factors. For example, the use of facial recognition in public spaces is considered higher risk than in private spaces, and use of FRT in a workplace would also raise the risk rating, because individuals have a reduced ability to enter, move and act freely in that environment.
Based on this risk assessment, the model law sets out a cumulative set of legal requirements, restrictions and prohibitions that apply according to the relevant risk rating.
There are three proposed categories of risk: base-level, elevated and high, with stricter legal constraints imposed as the level of risk increases. In addition, FRT applications used in Australia would still need to comply with the Privacy Act. Non-compliance with these legal requirements would attract civil penalties, and injunctions could be granted against unauthorised use of FRT.
The report also contemplates specific legal rules, including a ‘face warrant scheme’ for police and other security agencies. Under the proposed regime, a judge or independent authority would consider applications by law enforcement for repeated or routine use of FRT involving members of the public who are not suspected of having committed a crime.
These potential obligations would have wide-reaching implications for organisations developing FRT and for any entity deploying the technology, across both the government and private sectors. The report calls for the Attorney-General to commit to national facial recognition reform based on the model law and proposes to assign regulatory responsibility to the Office of the Australian Information Commissioner.
We expect that the Australian Government would engage in industry and public consultation before introducing a bill based on the model law. It is also likely that some of these issues will be addressed by the ongoing review of the Privacy Act.
This publication is introductory in nature. Its content is current at the date of publication. It does not constitute legal advice and should not be relied upon as such. You should always obtain legal advice based on your specific circumstances before taking any action relating to matters covered by this publication. Some information may have been obtained from external sources, and we cannot guarantee the accuracy or currency of any such information.