New privacy code: gold-plated gift or a banana skin?
22 January 2025
Analysis: Gehan Gunasekara argues the proposed Biometric Processing Privacy Code poses risks for both businesses and individuals.

After an exhaustive period of consultation spanning almost two years, the Privacy Commissioner, in the week before Christmas, released the draft version of the Biometric Processing Privacy Code he intends to issue under the Privacy Act.
Biometric information, collected through the likes of facial recognition technology, is personal information covered by the Privacy Act but is undoubtedly in need of more specific protection due to its sensitivity. A stolen credit card can be replaced but not someone’s face. In addition, biometric information is essentially a part of us, the most intimate type of personal information that exists.
It is important to note that, unlike in many other countries, codes issued under the Privacy Act are binding and mandatory in relation to the matters they cover. They are not just guidelines or suggestions. The proposed code aims to develop specific rules for biometrics to ensure compliance with the Act’s 13 privacy principles.

Contravention of rules in a code can lead not only to enforcement action by the commissioner against an agency, but also to complaints by individuals that can result in damages awards by the Human Rights Review Tribunal extending to hundreds of thousands of dollars.
Let’s begin with the Privacy Code’s positives. It helpfully defines biometric information to encompass biometric characteristics. These include physical features, such as fingerprints, iris patterns and voice, as well as behavioural aspects such as an individual’s gait, keystroke pattern or handwriting style.
Importantly, biometric processing extends not only to identifying individuals but also to categorising them by inferring some aspect of their personality, health or mental state, as well as their sex, age, race or other demographic characteristics. However, biological material (for example blood or body fluids) and genetic material are excluded, as these are covered by the existing Health Code.
The risk to individuals, once biometric information is obtained, includes subsequent access to it by law enforcement authorities under permitted exceptions in the code.
Unlike an earlier draft, which prohibited its collection altogether, Rule 10 now prohibits only the use of certain types of biometric information, such as that relating to a person’s health, mood or emotion, as well as matters covered by the Human Rights Act’s rules preventing discrimination because of race, sexual orientation and so on.
There is a carve-out for health information when an individual specifically authorises it, which would apply to wearable devices, and for health and safety purposes.
There is a vague requirement in Rule 3 for notification to be ‘clear and conspicuous’, unlike the earlier draft’s definitions of ‘accessible’ and ‘conspicuous’ notices that were designed to prohibit ‘bundling’ through small print buried in terms and conditions.
The most contentious aspect of the code is the proportionality and risk assessment that Rule 1 requires agencies to make before collecting biometric information. Agencies must weigh the privacy risks to individuals against their own legitimate interests, which must outweigh those risks – whereas the Privacy Act simply requires a legitimate purpose to exist.
The code is closer to the European Union’s General Data Protection Regulation (GDPR), which allows processing of personal data for the legitimate interests of agencies only where these are not outweighed by the privacy risks to individuals. Privacy risks are defined in the code to include bias and discrimination, downstream expansion of the purposes for which the information is used, profiling and surveillance.
There is no requirement, though, for the risk assessment to be publicised, notified to the commissioner or made available on request to affected individuals. Likewise, there is no requirement for subsequent audits. However, where a complaint or investigation later reveals that the proportionality and risk assessment was not diligently conducted in the first place, significant liability for agencies can result, especially as class actions are permitted under the Privacy Act.
The code might be all privacy advocates can expect this year, but whether it is a gold-plated gift or a banana skin remains to be seen.
Gehan Gunasekara is an associate professor in commercial law at the University of Auckland Business School and is convenor of the Surveillance Working Group of the Privacy Foundation.
This article reflects the opinion of the author and not necessarily the views of Waipapa Taumata Rau University of Auckland.
This article was first published on Newsroom as New privacy code: gold-plated gift or a banana skin?, 22 January 2025.
Media contact
Margo White | Research communications editor
Mob 021 926 408
Email margo.white@auckland.ac.nz