The Renter Data Reckoning: What the OAIC's Ruling Against 2Apply Means for RentTech
The Privacy Commissioner steps in to protect renters' privacy rights.
For years, renting a home in Australia has meant handing over an extraordinary amount of personal information to platforms you did not choose, for purposes that were never fully explained, administered by companies that occupy a strange middle ground between landlord, agent, and technology provider. With its determination against IRE Pty Ltd, the Office of the Australian Information Commissioner (OAIC) has put the entire rental technology sector on notice: that era is ending.
The decision, handed down by Privacy Commissioner Carly Kind on 1 April 2026, is one of the most significant privacy rulings to affect the Australian property industry in recent memory. It is worth reading carefully, not just for what it says about 2Apply, but for what it signals about the regulatory direction of travel for anyone operating a platform that sits between vulnerable individuals and access to something essential.
Who Is IRE Pty Ltd, and What Is 2Apply?
IRE Pty Ltd, trading as InspectRealEstate and owned by parent company Reapit Holdings, is a Brisbane-based property management software company. Its flagship consumer product, 2Apply, is the online rental application platform that real estate agents use to receive and process tenancy applications.
The scale of 2Apply is significant. As of March 2025, the platform had processed over 8.5 million rental applications from Australians across the country. A 2023 CHOICE survey found that 37% of renters had used 2Apply, making it the most widely used RentTech platform in Australia. When the Commissioner describes this as a sector capable of affecting "many millions of individuals," that is not hyperbole.
The platform works like this: a real estate agent lists a property and provides a link to a 2Apply form. An applicant creates a profile, fills in their details, and submits. The form pre-fills across multiple applications, which sounds convenient, and is, until you look closely at what information the form is actually asking for.
What Was IRE Actually Collecting?
The 2Apply default form collected a remarkable range of personal information. The Commissioner's determination lists the full default dataset in Attachment A, and it makes for striking reading.
Beyond the basics you would expect (name, contact details, proof of identity, proof of income), the form routinely asked applicants for:
About the applicant personally
Gender, smoker status, student status, retirement status, and bankruptcy status.
About their family and living situation
The names and ages of dependants, two full years of living history including reasons for leaving each address, whether they owned or intended to own other property, and whether they were currently applying for other rentals.
About their financial vulnerability
Whether they were applying for bond or rent assistance, meaning a direct flag of whether someone was receiving social security support.
About their citizenship
Citizenship status and visa expiry details.
About their car
Vehicle type, registration number, make and model.
About who to contact if something goes wrong
A full emergency contact, collected from every applicant, including unsuccessful ones.
For identity verification, the form asked for 60 points of documentation, with granular details such as the number of names listed on a Medicare card and the card's colour. For income, the form required a minimum of two years of employment history, far exceeding what banks typically ask for when assessing a personal loan.
All of this was collected from every applicant for every property, regardless of whether they were ultimately offered a tenancy.
The Two Legal Breaches
The Commissioner found IRE breached the Privacy Act in two distinct ways, each of which deserves careful attention because they operate at different levels of the problem.
Breach One: Collecting More Than Was Reasonably Necessary (APP 3.2)
The first breach concerns what was collected. Australian Privacy Principle 3.2 prohibits APP entities from collecting personal information unless it is "reasonably necessary" for one or more of their functions or activities.
The Commissioner identified IRE's primary function as facilitating the processing of complete tenancy applications. That purpose, properly understood, requires information that establishes three things about an applicant: their identity and contact details, their ability to pay rent, and their likelihood of looking after the property. That is it.
Tested against this standard, a long list of the default form's fields fail. Gender does not tell a real estate agent whether someone can pay rent. Bankruptcy status does not, where proof of income is otherwise provided. Citizenship status does not establish either financial capacity or likelihood to maintain a property. The names and ages of dependants do not either, and they raise discrimination risks given laws that prohibit denying housing on the basis of parental status.
The Commissioner was particularly pointed about one argument IRE advanced: that collecting all this data was reasonably necessary for the function of "improving its service offerings." The Commissioner dismissed this as circular reasoning. You cannot justify collecting personal information by pointing to a function that itself only exists because you decided to collect that information.
Notably, IRE had agreed during the investigation to voluntarily cease collecting several of the problematic fields before the final determination was issued. The Commissioner acknowledged this cooperation. But the finding of breach was made nonetheless because the collection had occurred throughout the relevant period from March 2020 to March 2025.
Breach Two: Collecting Information by Unfair Means (APP 3.5)
The second breach is in some ways more interesting, because it introduces legal concepts that have not previously been applied in an Australian privacy determination.
APP 3.5 requires that personal information be collected only by lawful and fair means. Where APP 3.2 asks what was collected, APP 3.5 asks how. The Commissioner found that IRE failed on the how as well, for reasons that go to the structure of the platform and the circumstances in which renters interact with it.
The core of the APP 3.5 finding rests on three reinforcing factors.
The power imbalance is real and legally relevant. The Commissioner took direct account of the rental crisis as a legal fact. In a market with a shortage of supply, sharply rising rents, and intense competition, renters are not in a position to negotiate. They cannot choose which platform to use (that decision is made by the real estate agent). They cannot refuse to engage with a platform and expect equal treatment. They are, as the Commissioner put it, at a disadvantage and "more vulnerable to unfair practices." Housing is not a discretionary consumer product. It is, as the Commissioner observed, recognised as a human right in international law. This context matters to the legal assessment of fairness.
Excessive collection is itself a form of unfairness. The Commissioner characterised the breadth of IRE's collection not only as a breach of the necessity test in APP 3.2, but as a form of unreasonably intrusive conduct relevant to APP 3.5. Collecting far more information than is needed from people who have no real choice about providing it is not a neutral act. It is an exercise of power that the law, properly interpreted, does not permit.
Online Choice Architecture can make collection unfair. This is the genuinely novel element of the determination. For the first time in an Australian privacy ruling, the Commissioner has formally considered a platform's "Online Choice Architecture" as part of an APP 3.5 assessment.
Online Choice Architecture: The New Legal Frontier
Online Choice Architecture refers to the way information is presented and choices are structured on digital platforms, and the effect that design has on user behaviour. The Commissioner was clear that this concept is not inherently problematic. Well-designed defaults can genuinely benefit users. But it becomes legally relevant when the design works against users' interests and pressures them into decisions they would not otherwise make.
The Commissioner identified three specific techniques used in the 2Apply form.
Confirmshaming
The use of emotive language to make a user feel guilty for not doing something that benefits the platform. The 2Apply form told applicants that providing more information would "help speed up your application process" and warned that not providing information "may affect whether you are considered as a suitable tenant for the property." The Commissioner accepted that these statements may not have been technically false. But their effect was to frame personal information disclosure as a marker of suitability, pressuring applicants to provide more than they otherwise would have.
Biased framing
The presentation of choices in a way that emphasises benefits to the platform while obscuring downsides to the user. Related to confirmshaming in this context, the 2Apply form consistently structured choices around the commercial logic that completeness is better for everyone, without acknowledging the privacy costs of providing extensive personal information.
Bundled consent
The practice of requiring agreement to multiple purposes in a single request. The Commissioner found that IRE required applicants to consent to receiving direct marketing from IRE as a condition of submitting their application. There was no opt out at the point of collection. The only alternative was not to submit at all, which in the context of housing, for most people, is not a realistic choice.
The significance of this analysis cannot be overstated. Australian regulators have increasingly focused on dark patterns and harmful design in digital markets. The Commissioner's determination formally embeds Online Choice Architecture analysis into the privacy fairness assessment. This means that how a platform is designed, not just what data it collects or how it is stored, is now a live question under the Privacy Act.
What IRE Must Do Now
The practical remedies are substantial.
Within 60 days of the determination (by approximately 1 June 2026), IRE must stop collecting the information found to be unnecessary, and must appoint an independent privacy expert to review the platform's practices. The review must cover whether remaining data fields are truly necessary (assessed at each stage of the application process), whether the form's design is fair, and whether data retention practices are appropriate across different outcomes (successful tenancies, unsuccessful applications, and tenancies managed through IRE's other products).
Within six months, the independent reviewer must produce a written report. That report must be provided to the OAIC within 14 days of receipt.
Within 12 months, IRE must report back to the OAIC on what steps it has taken in response to the recommendations.
IRE also faces a prohibition on repeating the conduct found to constitute a privacy interference.
What This Means for the RentTech Industry
The Commissioner was explicit: this determination is intended to apply beyond IRE. The OAIC will provide the determination to real estate peak bodies. The Commissioner expects other RentTech providers to adapt their practices in light of the findings.
This creates a clear compliance obligation for every platform that facilitates rental applications in Australia. Snug, tApp, Ignite, OurProperty, Tenant Options, rent.com.au, and any other operator that sits between renters and access to housing should be asking themselves the same questions the Commissioner asked of IRE.
What information do we actually need, for what specific purpose, at what stage of the application process? Can we do this with less? Is our form designed to support users' genuine interests, or does it use design techniques that pressure people into disclosing more than they would voluntarily choose to? Do we collect information from all applicants that is really only necessary from some?
The determination also has implications for real estate agencies themselves. Agents who configure and use these platforms bear their own obligations under the Privacy Act with respect to the information they request and receive. The Commissioner noted that agents can customise 2Apply forms to remove fields. They also bear responsibility for what they choose to ask for.
The Broader Policy Signal
There is a larger story embedded in this determination about the relationship between privacy law and market power.
Australian privacy law has not traditionally done much with the concept of power imbalance. APP 3.5 has generated relatively little case law. The Commissioner's decision to bring the rental crisis, the absence of legislated housing rights, and the structural position of RentTech platforms into the fairness analysis represents a meaningful expansion of how privacy law is being applied.
This is consistent with broader international trends. Regulators in the UK, EU, and elsewhere have increasingly approached data protection and consumer protection as connected rather than separate concerns. The idea that collecting personal information from people who have no meaningful ability to refuse is not "fair", even if they technically consented, is becoming a settled principle in many jurisdictions. Australia is now joining that conversation in a meaningful way.
For RentTech operators, the message is clear: the regulatory environment has changed. The question is not whether you will face scrutiny, but whether you will be ready when you do.
Practical Takeaways
For RentTech platforms and real estate businesses, this determination suggests several immediate priorities.
Audit your default data collection against the three category test: identity and contact details, ability to pay rent, and likelihood of maintaining the property. Anything that does not clearly serve one of those purposes is at risk.
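For platforms wanting to operationalise that audit, one approach is to map every default form field to the purpose it arguably serves and flag anything that maps to none. The sketch below is purely illustrative: the field names and category mappings are hypothetical, not drawn from 2Apply or the determination, and any real audit would need a privacy professional's judgment field by field.

```python
# Illustrative field audit against the three purposes the Commissioner
# accepted as legitimate: identity/contact, ability to pay, and
# likelihood of looking after the property. Mappings are hypothetical.

ALLOWED_PURPOSES = {"identity", "ability_to_pay", "property_care"}

# Each field maps to the purpose(s) it arguably serves. An empty set
# means the field serves none of the accepted purposes.
FIELD_PURPOSES = {
    "full_name": {"identity"},
    "contact_email": {"identity"},
    "proof_of_income": {"ability_to_pay"},
    "rental_history_references": {"property_care"},
    "gender": set(),
    "smoker_status": set(),
    "citizenship_status": set(),
    "dependants_names_ages": set(),
}

def audit_fields(field_purposes):
    """Return fields whose purposes fall outside the allowed set."""
    return sorted(
        field for field, purposes in field_purposes.items()
        if not (purposes & ALLOWED_PURPOSES)
    )

at_risk = audit_fields(FIELD_PURPOSES)
print(at_risk)
# → ['citizenship_status', 'dependants_names_ages', 'gender', 'smoker_status']
```

The point of structuring the audit this way is that every field must positively justify itself against a named purpose; absence of harm is not enough.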
Assess the timing of collection. The Commissioner specifically flagged that what is necessary varies across stages of the application process. Collecting extensive data upfront from all applicants, when some of it is only relevant to successful applicants, is not a defensible practice.
Review your form design. Online Choice Architecture is now part of the legal fairness analysis. If your platform uses language that shames users for not disclosing more, bundles consent to marketing with application submission, or frames data sharing as being in the applicant's interest when it serves the platform's commercial interests, that design is legally risky.
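Some of those design risks can be caught mechanically in a form-configuration check before release. The validator below is a minimal sketch under assumed configuration keys (none of these names come from 2Apply or any real platform); it encodes two of the Commissioner's concerns: marketing consent must not be a condition of submission, and opt-ins must not be pre-selected.

```python
# Hypothetical form-configuration validator. The keys
# "marketing_opt_in_required" and "marketing_opt_in_default" are
# illustrative assumptions, not a real platform's schema.

def validate_form_config(config):
    """Return a list of design problems found in a form configuration."""
    errors = []
    # Bundled consent: submission must not depend on marketing consent.
    if config.get("marketing_opt_in_required", False):
        errors.append("marketing consent must not be a condition of submission")
    # Pre-ticked boxes: marketing opt-in must default to off.
    if config.get("marketing_opt_in_default", False):
        errors.append("marketing opt-in must not be pre-selected")
    return errors

print(validate_form_config({"marketing_opt_in_required": True}))
# → ['marketing consent must not be a condition of submission']
```

A check like this cannot assess confirmshaming language, which is a qualitative judgment, but it can make the structural unbundling of consent a testable property of the form rather than a policy aspiration.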
Check your retention practices. The Commissioner flagged data retention as a live issue, and the independent review IRE must commission specifically covers retention across different tenancy outcomes. If you are retaining unsuccessful applicants' data indefinitely, or holding detailed personal information for years after a tenancy ends, you should review that approach.
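One way to make outcome-sensitive retention concrete is to attach a retention rule to each application outcome and check records against it on a schedule. The periods below are placeholders only: the determination does not prescribe retention periods, and what is appropriate is exactly what IRE's independent review must assess.

```python
# Sketch of outcome-based retention rules. The day counts are
# placeholders, not figures from the determination.
from datetime import date, timedelta

RETENTION_DAYS = {
    "unsuccessful_application": 90,     # placeholder: delete soon after outcome
    "successful_tenancy_active": None,  # retain while the tenancy is on foot
    "tenancy_ended": 365,               # placeholder: limited post-tenancy period
}

def is_due_for_deletion(outcome, outcome_date, today):
    """True if the retention period for this record's outcome has expired."""
    days = RETENTION_DAYS[outcome]
    if days is None:
        return False  # no fixed period while the tenancy continues
    return today > outcome_date + timedelta(days=days)

print(is_due_for_deletion("unsuccessful_application",
                          date(2026, 1, 1), today=date(2026, 6, 1)))
# → True
```

The design choice worth noting is that retention becomes a function of outcome, not a single platform-wide setting, which mirrors the scope the Commissioner set for the independent review.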
Engage proactively. IRE's cooperation with the OAIC was noted positively in the determination. Regulatory goodwill is not nothing, particularly as the OAIC becomes more active in commissioner initiated investigations.
The 2Apply determination is a significant moment for privacy regulation in Australia. It confirms that digital platforms cannot claim the protections of "mere conduit" status when they actively design the systems through which personal information flows. It confirms that fairness under Australian privacy law is sensitive to context, power, and design, not just to the presence or absence of consent. And it confirms that the rental sector is squarely in the OAIC's sights.
For anyone operating in this space, the time to act is now, not when the investigation notice arrives.