
On 1 October 2025, the Office of the Australian Information Commissioner (OAIC) updated Part 3 (processing and deciding on requests for access) of the Freedom of Information (FOI) Guidelines. This follows the release of a consultation draft on 12 May 2025 seeking comments from interested stakeholders on the content, practical implications, readability, and accessibility of the updates.

The update inserts two paragraphs addressing the possibility of FOI requests being made entirely by, or with the assistance of, artificial intelligence (AI):

3.22      The rise in the use of Artificial Intelligence (AI) brings with it the potential for FOI requests to be made without human intervention. As noted above, the FOI Act does not prevent the use of a pseudonym and an FOI request is not invalid on this basis. It follows that an FOI request is not invalid on the basis that the identity of the requestor is not apparent.

3.23      However, the legally enforceable right of access in s 11 of the FOI Act is conferred on ‘every person’. Section 2C of the Acts Interpretation Act 1901 extends the definition of ‘person’ to include ‘a body politic or corporate as well as an individual’. Without a person or corporate entity making an FOI request (even if assisted by AI), there is no right of access under the FOI Act, and such a request will not be a valid request under the FOI Act. To reduce the possibility of ‘bot’ generated FOI requests, agencies may consider publishing an online FOI request form that includes technology that can identify whether the user is a robot. However, as noted at [3.21] above, the FOI Act does not require any particular form to be completed to make a valid request, or for people to identify themselves. Agencies should be open to receiving FOI requests from people in ways other than by using an online form.

Further, as summarised in our recent article on the Freedom of Information Amendment Bill 2025 (FOI Amendment Bill), Schedule 2 will, if passed, introduce a legislated requirement that an FOI request cannot be made anonymously or under a pseudonym, and that a person must declare when making an FOI request on behalf of a third party.

Both the changes to the Guidelines and the proposed changes to the FOI Act recognise that AI is being increasingly used in the FOI space, whether that be in submitting FOI requests directly to the Commonwealth, or in assisting applicants to draft request scopes or submissions when a matter reaches internal or external review stage.

So what’s the current state of play? Can you refuse a request if you suspect it’s been generated by AI?

The short answer is: unless your agency can properly conclude (not merely suspect) that a request has been submitted solely by an AI bot, the request will need to be processed.

As confirmed by the recent updates to the Guidelines:

  1. The definition of person adopted by the FOI Act extends to both individuals and bodies politic or corporate, meaning that an FOI request can be made by an individual (either on their own behalf or on behalf of someone else), multiple individuals, or even an organisation or group.
  2. There is nothing in the current legislation or FOI Guidelines that prohibits applicants from using AI to assist them in making FOI requests, or in corresponding with FOI teams or the OAIC on their request.
  3. The current legislation does not prohibit requests being made on an anonymous basis or under a pseudonym (such as through the website Right to Know). This means an agency cannot refuse a request on validity grounds just because it is not certain of an applicant’s identity. Note, this will likely change if the FOI Amendment Bill passes.
  4. Agencies may wish to implement technology that can help determine whether an actual person is in fact submitting an FOI request, such as CAPTCHA forms or other similarly protective mechanisms. If, following this or a similar process, it is identified that an FOI request has been made solely by AI (and is wholly unconnected to any person or corporate entity), it would be open to an agency to treat it as an invalid request. However, based on our experience to date, we understand that current AI capabilities will usually require some level of human interaction. This means it may be particularly difficult in the current environment for an agency to be fully satisfied that a request has been made solely by AI, such that a refusal on validity grounds is warranted.
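For agencies considering the kind of bot-screening described in point 4, a server-side check might look roughly like the sketch below. This is a minimal Python illustration assuming Google's reCAPTCHA `siteverify` endpoint (the secret key is a placeholder); the function names and routing logic are hypothetical. The routing deliberately flags a failed check for human review rather than refusing it outright, consistent with the Guidelines' reminder that a valid FOI request need not come through any particular form.

```python
import json
import urllib.parse
import urllib.request

# Google's server-side reCAPTCHA verification endpoint.
RECAPTCHA_VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"


def verify_captcha_token(secret_key: str, token: str) -> bool:
    """Ask the verification service whether the CAPTCHA token submitted
    with the online FOI form appears to come from a human."""
    data = urllib.parse.urlencode(
        {"secret": secret_key, "response": token}
    ).encode()
    with urllib.request.urlopen(RECAPTCHA_VERIFY_URL, data=data) as resp:
        return bool(json.load(resp).get("success"))


def route_foi_request(captcha_passed: bool) -> str:
    """Decide how to route an FOI request received via the online form.

    A failed check is flagged for human review rather than auto-refused:
    automation alone does not make a request invalid, and the FOI Act
    does not require any particular form for a valid request.
    """
    return "process" if captcha_passed else "flag_for_human_review"
```

In practice a failed check would be escalated to the FOI team for a manual validity assessment, not treated as grounds for automatic refusal.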

So, whilst it is relatively unlikely that agencies will be refusing large numbers of AI-generated requests on invalidity grounds, we do think there are some practical steps you can take when dealing with requests that appear to have been influenced by AI.

Practical tips

  1. Remember that until the FOI Amendment Bill is passed, FOI requests can be made anonymously or under a pseudonym. However, as recognised at [3.19] of the FOI Guidelines, agencies may request further evidence of identity to the extent that it may be relevant to the assessment of exemption provisions.

Examples of requests that may give rise to a request for further identity documents include where an FOI decision maker is considering:

  • the application of the conditional exemption provisions under ss 47G (business information) or 47F (personal information)
  • the application of secrecy provisions, in particular those with exceptions that authorise the disclosure of protected information to authorised individuals, or where the exception in s 38(2) of the FOI Act may apply
  • whether information being processed under an FOI request has previously been disclosed to a particular person or organisation, which might be relevant in considering whether the disclosure of information is unreasonable in the context of a conditional exemption provision (see for example our case note on Linda Poulton and Department of Climate Change, Energy, the Environment and Water here), or
  • processing a request for amendment or annotation of documents under Part V of the FOI Act.
  2. If your agency establishes that identity information would be useful in assessing any of the scenarios outlined above, remember that an applicant may decline to provide this information. If they do, this will likely influence your access decision. An example of this may be to refuse a request in full in accordance with s 47F and/or by applying s 26(2), where a person is seeking access to another individual’s personal information and has not provided evidence of their identity and authority.
  3. Be careful not to over-collect personal information in the context of FOI processing. Remember that your agency is also required to comply with the Privacy Act and the Australian Privacy Principles. In simple terms, FOI teams should not collect, use or disclose any personal information beyond what is reasonably necessary to conduct the FOI process. If you aren’t sure, seek advice to ensure that you are acting in accordance with your agency’s privacy policy and relevant privacy and secrecy laws.

Key takeaways

As AI tools become more accessible to the general public, government agencies may start to see more FOI requests that appear to have been either partially or completely generated and submitted by AI.

Noting the OAIC’s new guidance at [3.22] and [3.23] of Part 3, at this stage it appears that the large majority of FOI requests made by or with the assistance of AI will be considered valid requests and will need to be processed.

However, this space is changing at a rapid pace and, if passed, the FOI Amendment Bill will take the next step in ensuring that requests are made by individuals or organisations that clearly meet the definition of ‘person’ as intended by the legislation.

If you need advice or assistance on the processing of FOI requests, or your FOI team’s obligations under privacy law, please reach out to our team and we would be happy to assist.
