In an earlier facial recognition technology update, we noted the Privacy Commissioner's inquiry into Foodstuffs' trial of facial recognition technology (FRT) in 25 of its North Island supermarkets. The Privacy Commissioner has now released the inquiry report, finding that the live FRT model trialled by Foodstuffs complied with the Privacy Act 2020 (Act), and it makes for interesting and instructive reading.
The Privacy Commissioner's view on FRT
Biometric information, such as an FRT image, is personal information and is regulated under the Act. In our previous biometrics updates (Consultation open on draft Biometric Processing Privacy Code, and Biometrics Code to go ahead, second consultation period open) we described how the Privacy Commissioner has developed a draft Biometrics Code (Code) intended to regulate the collection and use of biometric information, including FRT. The Privacy Commissioner has now completed public consultations and has indicated that the new Code will be released mid-2025.
The introduction of the Code and the Foodstuffs inquiry together demonstrate the recognition that biometric data is often sensitive, and signal the Privacy Commissioner's move towards greater scrutiny of organisations using such technology. While the Privacy Commissioner has recognised the benefits of FRT in preventing and detecting crime, the technology also raises major privacy concerns, including unnecessary or unfair collection of customers' information, misidentification, technical bias and the potential for use as a surveillance tool.
The Privacy Commissioner's inquiry
The Privacy Commissioner found that Foodstuffs' FRT operating model, as implemented and updated during the trial, complied with the Act. Several privacy safeguards and modifications were significant in reaching this conclusion. These include:
- Purpose limitation: The use of FRT was strictly confined to identifying individuals involved in recent serious harmful behaviour (eg violence, threatening behaviour, or high-value theft) within stores. Broader uses, and sharing across stores, were expressly prohibited.
- Immediate deletion of images: Images of individuals not matching the watchlist were automatically deleted almost instantaneously. Over 225m faces were scanned during the trial and 99% were deleted within one minute. Matched images were deleted by midnight the same day if not acted upon.
- Watchlist controls: Each store managed its own watchlist, which was not shared across the Foodstuffs network. Strict criteria governed additions to the watchlists, and vulnerable groups (ie children, the elderly, and those with known mental health conditions) were explicitly excluded.
- Transparency and notice: Clear signage was used at store entry points, supported by online information and trained staff able to respond to customer queries.
- Testing, audit, and continuous improvement: The operating model was adapted during the trial in response to incidents of misidentification, including raising the operational match threshold for staff intervention.
- Security protections: Robust physical and IT security controls such as dedicated security rooms and authorisations ensured limited access to FRT information.
Key takeaways
The Privacy Commissioner emphasised that this finding is not a blanket endorsement for the use of FRT generally. The report highlights a number of key considerations for businesses considering the use of FRT and biometric technology generally:
- Strong justification needed: Users should clearly define the problem they aim to solve with FRT, ensure the issue is serious, and consider whether less intrusive alternatives would be sufficient.
- Thorough planning: Before deploying FRT, businesses must carefully weigh its effectiveness and proportionality, undertake privacy impact assessments and mitigate potential risks.
- Watchlist management: Define clear, justified, and consistent criteria for including someone on a watchlist, ensure high image quality, exclude vulnerable groups (such as children) and train staff thoroughly.
- Rigorous system design: Select fit-for-purpose technology, set appropriate accuracy thresholds to minimise false matches, and keep the operating system (including cameras, data storage, and deletion processes) well managed and secure.
- Transparency and communication: Clearly inform customers (eg via signage and online information) that FRT is in use, explain its purpose, and make it easy for individuals to make queries or complaints.
- Human oversight: Ensure alerts generated by FRT are checked by trained staff and that any action taken is justified and proportionate. Decisions to intervene should involve clear, objective criteria.
- Access and security: Limit access to FRT systems and data, ensure data security, and retain data only as long as necessary.
- Complaint and redress mechanisms: Implement clear procedures for handling customer access requests, correction, complaints, and removal from watchlists.
- Ongoing monitoring and review: Regularly audit and review FRT operations for accuracy, unintended outcomes and to confirm privacy safeguards remain effective.
- Context-dependent decision-making: FRT is highly intrusive and not suitable for all retail environments. Each business needs its own strong justification and must evaluate whether FRT is really necessary and appropriate for its specific circumstances.
One thing is clear: the inquiry's findings and the anticipated Biometric Processing Privacy Code now serve as both a roadmap and a cautionary note. While FRT can be used in New Zealand with the right parameters, compliance will require careful and ongoing diligence.
If you have any questions about the use of FRT or biometric technology generally, please get in touch with one of our team.