UNSW audit finds school‑approved apps send children’s data within seconds, often defying policies

School‑endorsed educational apps used by Australian students are transmitting children’s data within seconds of being opened, often in ways researchers said contradict the apps’ own privacy policies. An audit by University of New South Wales (UNSW) researchers of around 200 Android titles found underage users face significant privacy and security risks as classroom tools increasingly rely on third‑party technology.
The apps were drawn from school recommendation lists and state Department of Education websites. Findings were detailed in the paper Analysing Privacy Risks in Children’s Educational Apps in Australia by Dr Rahat Masood, with co‑authors Sicheng Jin, Jung‑Sook Lee and Hye‑Young (Helen) Paik.
The team reported that many apps collected sensitive information and passed it to external parties, while presenting privacy policies so complex that very few parents could reasonably understand them. The analysis showed 89.3% of apps began transmitting data to third parties before any user interaction.
Opening an app alone was enough to send device identifiers, location metadata and other sensitive information to analytics platforms and advertising networks. “Even if you are not interacting with the app – you just open it and that’s it – it is still transferring lots of data,” Masood said, describing telemetry that automatically sends tracker‑related identifiers to remote servers.
Researchers found 83.6% of apps transmitted persistent identifiers—unique codes that can track a device across sessions and across different apps. More than two‑thirds (67.9%) contained at least one embedded tracker or analytics tool, including services such as Google’s Firebase, Facebook and Unity Analytics.
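The kind of check the audit describes can be illustrated with a minimal sketch: given a log of outbound network requests captured in the seconds after an app launches, flag any that go to known tracker or analytics hosts. The domain list and log format below are illustrative assumptions for the three services named above, not the study's actual methodology or endpoints.

```python
# Illustrative tracker domains for the services named in the study.
# These hostnames are assumptions for demonstration, not a vetted list.
TRACKER_DOMAINS = {
    "firebase.googleapis.com",   # Google Firebase analytics
    "graph.facebook.com",        # Facebook SDK telemetry
    "cdp.cloud.unity3d.com",     # Unity Analytics
}

def flag_tracker_requests(requests):
    """Return the requests whose destination host is a known tracker."""
    return [r for r in requests if r["host"] in TRACKER_DOMAINS]

# Hypothetical traffic captured in the first seconds after app launch,
# before any user interaction ("t" is seconds since the app opened).
captured = [
    {"host": "firebase.googleapis.com", "path": "/v1/log", "t": 1.2},
    {"host": "example-edu-app.com", "path": "/lesson/1", "t": 2.0},
    {"host": "graph.facebook.com", "path": "/activities", "t": 2.4},
]

flagged = flag_tracker_requests(captured)
print(len(flagged))  # → 2 (two pre-interaction tracker calls)
```

In this toy log, two of the three requests fire before the child has touched the screen and go to third‑party analytics hosts rather than the app's own server, which is the pattern the researchers reported in 89.3% of the apps tested.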
“None of these are needed to actually run the educational app,” Masood said. Branding aimed at parents offered little protection. Apps marketed to young children with labels such as “Kids”, “Preschool” or “ABC” were no safer than general‑audience education apps and in some cases showed worse alignment between stated privacy commitments and actual behaviour.
The study called this "the illusion of safety": child‑centric branding cultivates trust without delivering meaningful safeguards. Seventy‑six percent of child‑targeted apps exhibited at least one form of policy distortion (a mismatch between what the privacy policy claims and what the app actually does), compared with 67% of general educational titles.
The team also reviewed the apps’ privacy policies and found just 3% were fairly easy to read; 97% required university‑level literacy or higher. “Nobody will understand these terminologies and jargon,” Masood said, adding that readability and comprehension scores were “very bad”.
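Readability findings like these typically rest on standard formulas. The sketch below applies one common metric, the Flesch Reading Ease score, to illustrative counts for a dense policy document; the input numbers are invented for demonstration and are not figures from the study.

```python
def flesch_reading_ease(words, sentences, syllables):
    """Flesch Reading Ease: 0-100 scale, higher = easier to read.
    Scores around 30 or below correspond to university-level text."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

# Hypothetical counts for a legalistic privacy policy: long sentences
# and heavily polysyllabic vocabulary drag the score down.
score = flesch_reading_ease(words=1000, sentences=40, syllables=1900)
print(round(score, 1))  # well under 30: "very difficult" / graduate level
```

On this scale, the 97% of policies the team classed as requiring university‑level literacy would score in this bottom band, while the "fairly easy" 3% would land around 70 or above.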
The legal text often failed to reflect real‑world data practices: only about a quarter of the apps examined (roughly 50) were fully consistent with their stated policies. Masood said the research was intended to assess whether Australian authorities and education departments are aware of the security and privacy risks as teaching moves online and comes to depend on technology suppliers.
The authors argue the patterns they observed leave children exposed, despite the appearance of safety and compliance.
