Safety by design
13 February 2026
We would like to acknowledge the contribution of Ian Ung to this article.
The Administrative Review Tribunal’s (ART) decision in Bunnings Group Limited and Privacy Commissioner (Guidance and Appeals Panel) [2026] ARTA 130 (Bunnings) is the latest addition to a growing body of technology and privacy jurisprudence, highlighting the importance of data protection and safety by design.
Technology has developed at an exponential rate, particularly over the past few decades. From society’s ever-increasing reliance on computers and the Internet, to the popularisation of social media, biometrics and artificial intelligence (AI), the pace of technological advancement has significantly outstripped the development of the laws required to regulate such technologies and keep the community safe. It therefore comes as no surprise that there have been increasing calls for safety and protection to be embedded in design.
Sparke Helmore’s analysis of the Bunnings decision is set out here and here. Kelly Matheson’s articles underscore the importance of embedding privacy-by-design in the early stages of planning, and of maintaining well-documented evidence to support decision-making. Protections need to be built into technology from the outset, so that products or services do not have to be retrospectively altered to address privacy or other safety risks that later come to light.
It is important for decision-makers to proactively consider contemporary expectations of technology management and the risks to vulnerable members of the community, even in the absence of existing laws. The Federal Government has flagged potentially significant regulatory reforms that could affect how technology businesses operate, including (non-exhaustively):
- the Children’s Online Privacy Code
- further amendments to Australia’s privacy laws
- reforms of Australia’s electronic surveillance framework, and
- potential regulatory approaches to AI.
It should be noted that the Australian Government has demonstrated a willingness to introduce legislation to manage public safety with seismic impact on digital service providers (including first-of-its-kind legislation). For example:
- on 29 November 2024, Federal Parliament passed the Online Safety Amendment (Social Media Minimum Age) Bill 2024 to amend the Online Safety Act 2021 (Cth) to establish a minimum age for social media use; and
- on 24 November 2024, Federal Parliament passed the Cyber Security Bill 2024, which introduced amongst other things, mandatory security standards for smart devices.
Turning back to Bunnings, the ART’s decision (like the Privacy Commissioner’s decision before it) was made in the context of technological advancements and the proliferation of facial recognition technology (FRT) in public spaces, including retail settings. The use of FRT is not specifically regulated by any Commonwealth, State or Territory laws – instead, general privacy laws apply.[1]
The Privacy Act 1988 (Cth) provides for the protection of the privacy of individuals. Schedule 1 sets out the Australian Privacy Principles (APPs), a set of principles regulating the handling of personal information. The APPs are principle-based, technologically neutral and general in nature.
This is consistent with modern drafting practice, which avoids being overly prescriptive or imposing rigid obligations on the regulated community. Consideration must nonetheless be given to principles of reasonableness, necessity and proportionality. For example, in determining whether personal information could be collected, Bunnings was required to consider whether:
- the information was reasonably necessary for one or more of its functions or activities
- there were reasons to suspect that unlawful activities of a serious nature were being or may have been engaged in, and
- they reasonably believed that the collection, use or disclosure was necessary in order to take appropriate action.
In determining that APP 3 had been complied with, Bunnings was able to adduce evidence demonstrating:
- Suitability: whether FRT was an effective response to repeat offending
- Alternatives: whether less privacy intrusive options were available, and
- Proportionality: whether the privacy intrusion was justified by the benefits gained.
Technological design based on 'reasonableness', 'necessity' and 'proportionality' is not a novel concept.
- Section 63D of the Online Safety Act 2021 (Cth) requires a provider of an age-restricted social media platform to take reasonable steps to prevent age-restricted users from having accounts with age-restricted social media platforms.
- The Federal Government has agreed-in-principle to amend the Privacy Act to require that the collection, use and disclosure of personal information must be fair and reasonable in the circumstances.[2]
- The Federal Government has agreed that electronic surveillance should only be authorised where it is necessary for, and proportionate to, the purposes of an investigation.[3]
Given the principle-based approach of new legislation, particularly as it relates to technology, businesses should turn their minds to principles of reasonableness, necessity and proportionality as they relate to public safety, including matters of privacy, when designing and implementing new technologies. The risk of not doing so is that they may be required to retrofit products or services to keep pace with future changes to the laws governing technology.
[1] Depending on the circumstances of the collection, it may be possible that specific secrecy regimes apply to the collection, use or disclosure of facial imagery, but that would need to be considered on a case-by-case basis.
[2] See Government response to Proposal 12.3 at page 27 of the Government Response | Privacy Act Review Report (2023).
[3] See Government response to Recommendation 80 at page 24 of the Commonwealth Government response to the Comprehensive Review of the Legal Framework of the National Intelligence Community (2020).