A number of governments around the world, including the Australian Federal Government, have become increasingly concerned about online harm to young persons. Throughout Europe and in some US states, there is now a series of online protections for children, including in relation to data collection, use and disclosure.

In this paper I consider recent developments under the Privacy Act 1988 (Cth) (Privacy Act) in relation to children’s data and how overseas developments in this area may provide guidance for schools that hold significant amounts of children’s personal information. I look in particular at the proposed Children’s Online Privacy Code (Code) and consider the standards adopted in the UK under the Age-Appropriate Design Code (UK Children’s Code).

Recent Privacy Act amendments

With the growth of education technology (Ed Tech) in schools, there is now a plethora of children’s data being collected, used, disclosed and stored by a variety of persons, including education technology companies and schools. Data is collected for a range of operations, including enrolment and general administration, teaching, and the distribution and receipt of student homework and examinations. This wide range of uses can also lead to unintended uses of that data. For example, a child’s browsing history can be used to target the sale of particular products or the promotion of nearby events to the child. A child’s school grades and search history could be made available to a potential employer considering the child’s job application.

Governments have become concerned and are now taking action. The Privacy and Other Legislation Amendment Act 2024 (Cth) (POLA Act) introduced several changes to the Privacy Act, including changes focusing on children. This has implications for private schools around Australia, many of which may be subject to the Privacy Act. In contrast, except for the ACT, state and territory schools will often be subject to the varying state and territory privacy legislation. In any case, all schools need to consider whether they are handling children’s data and personal information in accordance with increasingly complex privacy and data laws.

By the end of 2026, the Australian Information Commissioner is required to have developed a Code which sets out how one or more of the Australian Privacy Principles (APPs) are to be applied or complied with in relation to the privacy of children.[1] The Code is proposed to apply to social media services, relevant electronic services and designated internet services accessed by children. The Code can also specify an APP entity or class of entities that are to be bound by the Code.[2]

It is expected that the Code will build upon the existing APPs and specify how the APPs should apply to children’s personal information. In practice, the Code may impose additional requirements on entities. This will directly impact private schools, and designers and distributors of Ed Tech. Even if the Code does not specifically apply to these entities, it will provide useful guidance on how entities dealing with children’s personal information should do so. For example, as part of their due diligence before using Ed Tech, a school should ask whether the Ed Tech complies with the Code.

The United Kingdom’s Children’s Code

While the proposed content of the Code is not known, it is expected the Office of the Australian Information Commissioner (OAIC) will consider other examples of children’s codes that are in place, such as the UK Children’s Code.[3]

The UK Children’s Code was established under s 123 of the Data Protection Act 2018 (UK) and came into force in 2020 to help organisations understand what they must do to comply with the UK General Data Protection Regulation (UK GDPR). The UK Children’s Code consists of 15 standards that set out how online services likely to be accessed by children should apply data protection by design principles to protect children’s personal information.

The 15 standards in the UK Children’s Code are summarised below:

  • The first standard provides that:
    • ‘The best interests of the child should be a primary consideration when you design and develop online services likely to be accessed by a child.’[4]
    • This standard has been interpreted to mean that those designing online services must consider the needs of child users when designing any service that will process the user’s personal data.
  • The second standard relates to undertaking a Data Protection Impact Assessment (DPIA) (like an Australian Privacy Impact Assessment) to assess and mitigate risks to the rights and freedoms of children who are likely to access the online service.
  • The third standard requires entities to adopt an ‘age-appropriate application’, which involves taking a risk-based approach to the age and needs of individual users.
  • It can be expected that the Code to be developed by the OAIC will also adopt a risk-based process, noting that risk-based processes have been implemented by Australian governments in relation to data protection more generally, such as in relation to the use of artificial intelligence and cyber security.
  • The fourth standard relates to transparency. Entities must provide clear explanations of their data use, policies and community standards to their users, written with consideration for the developmental stage of their audience.
    • This standard, including its requirement for ‘bite-sized’ explanations, provides useful guidance for Australian schools preparing their privacy policies or notices.
  • The fifth standard prohibits detrimental and unlawful uses of children’s data.
  • The sixth standard requires entities to adhere to and enforce their published community standards and policies.
  • The seventh standard requires that default settings must be ‘high privacy’ for children.
  • The eighth standard recommends that entities collect and retain ‘only the minimum’ amount of data necessary to provide their service, and that they give children choices over which elements they wish to activate.
  • The ninth standard prohibits the sharing of children’s personal information with third parties without a compelling reason, taking into account the best interests of the child.
  • The tenth standard requires that default geolocation settings must be off for children, and that a clear notification be provided when geolocation tracking is active.
  • The eleventh standard requires that the child be informed if there are parental controls on (such as if the online service allows a parent to monitor the child’s activity or track their location).
  • Under the twelfth standard, profiling must be off by default. Profiling is any form of automated processing of personal data to analyse or predict aspects relating to the person, such as their age, preferences, interests, health, economic situation, behaviour, and location or movements.
  • The thirteenth standard prohibits nudge techniques that encourage children to provide unnecessary personal data or to turn off privacy protections.
  • The fourteenth standard addresses connected toys and devices.
  • The fifteenth standard requires entities to provide prominent and accessible tools to help children exercise their data rights (such as the rights to rectification and erasure) and report concerns.

While there are no guarantees about what will or will not be in the Australian Code to be developed by the OAIC, the above standards, together with other European guidance, offer a useful starting point for Australian private schools.

It is important to note that there have been further legislative amendments in the UK, with the Data (Use and Access) Act 2025 coming into force on 19 June 2025. These amendments require that, when designing and developing processing systems, controllers whose services are likely to be accessed by children must have ‘particular regard to children’s higher protection matters’. These matters are defined to include:

  • age-appropriate presentation of privacy information
  • prominent default privacy settings
  • profiling limitations
  • geolocation off by default, and
  • prevention of nudge techniques that encourage data sharing.[5]

The UK’s Information Commissioner’s Office is currently in the process of reviewing the UK Children’s Code to see if any changes are required as a result of these legislative amendments. Given the current environment, and the scope of the amendments, it is unlikely any changes will be made that weaken the current UK Children’s Code.

Implications for Australian private schools

The UK Children’s Code implements a range of risk-based concepts to assist compliance with Recital 38 of the UK GDPR, which provides:

’Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child…’

While there is no guarantee that the UK Children’s Code will be replicated in the Australian Code to be developed by the OAIC, the underlying concepts in the UK Children’s Code resonate across borders, noting that a range of digital platforms are hoovering up children’s personal data at an ever-increasing scale.

The standards in the UK Children’s Code provide useful guidance for Australian schools in relation to how they should draft their privacy policies and notices, and how they should use Ed Tech.

It is not uncommon for privacy policies (required by APP 1) to be long and unwieldy, adopting a ‘tick the box’ approach to cover off on legislative requirements without regard to practical implications. They can be difficult for parents to read, let alone children. Schools should think about how they can draft policies and notices that are age appropriate and easy for persons of all ages to read and understand.

Schools are increasingly using Ed Tech, but do they fully understand how the Ed Tech works in relation to the collection and use of children’s (and teachers’) data? By not investigating and questioning such uses, a school may find itself implicated in questionable data practices, contrary to APPs 3, 4, 5, 6, 7 and 8.

Further, as schools are collecting significant amounts of data, they should consider whether all of that data is necessary and how long they need to keep it, and take steps to ensure the data is accurate and up to date, as per APPs 10 and 13.

Finally, as schools will hold large quantities of sensitive data, they should ensure they have appropriate security policies in place to protect that data from unlawful or accidental disclosure, as per APP 11.

In anticipation of the Code to be developed by the OAIC, schools should have regard to the standards in the UK Children’s Code to consider what additional privacy elements they will need to address. After all, even if the considerations do not become legislative requirements, it can be expected that parents will want to know what measures schools are putting in place to protect their child’s data.

 

[1] Privacy Act s 26GC(3). Children are defined as persons under the age of 18 (s 6(1)).

[2] Privacy Act s 26GC(5).

[3] Other countries that have codes or are developing codes include Ireland, Sweden and Indonesia; in the United States, codes have been passed in California and Maryland and proposed in Hawaii, Illinois, New Mexico, Nevada and Oregon. Note that in California, an industry lobbying group, NetChoice, which represents Amazon, Meta, Google, TikTok and Snapchat among others, has challenged the Californian code.

[4] This concept comes from Article 3 of the United Nations Convention on the Rights of the Child (UNCRC).

[5] Data Use and Access Act 2025 (UK) s 81, which amends UK GDPR article 25(1).
