Apple users in Singapore may soon have to prove they are adults before downloading certain apps, as the tech giant moves to comply with upcoming government regulations aimed at protecting minors online.
In a blog post to developers on Tuesday, February 24, Apple said that from February 24, 2026, users in Singapore, Australia and Brazil will be prevented from downloading apps rated 18+ unless they have been verified as adults "through reasonable methods". The verification will be carried out automatically by the App Store.
However, Apple did not elaborate on what these "reasonable methods" entail and did not respond to media queries seeking clarification.
The company said developers can tap its updated Declared Age Range application programming interface (API) to determine a user's age band. Developers may still be separately responsible for ensuring that users meet age requirements.
The move comes a month before a new requirement by the Infocomm Media Development Authority (IMDA) kicks in. By March 31, 2026, all major app stores operating in Singapore must have measures in place to block users under 18 from accessing apps deemed unsuitable for their age.
The rule falls under IMDA's Code of Practice for Online Safety for App Distribution Services. It is designed to set safeguards at the point where users download apps, a critical control point amid growing concerns about children being exposed to harmful content, including sexual material, graphic violence, cyberbullying and self-harm themes.
Apps such as the game Grand Theft Auto and dating platform Tinder carry an 18+ rating on Apple's App Store due to mature themes, sexual content and profanities.
Under the new framework, major app store operators, including Apple, Google, Huawei, Samsung and Microsoft, are responsible for ensuring compliance. By placing the onus on platform operators, authorities effectively create a single point of accountability for managing problematic apps, including those that may contain extremist or child abuse material.
The rules mirror earlier efforts to regulate social media platforms. In 2023, Singapore implemented a Code of Practice for Online Safety requiring social media companies to provide child safety tools, such as restricted account settings and parental controls. Amendments to the Broadcasting Act that year also empowered regulators to fine errant platforms up to S$1 million or block access to services that fail to comply.
Google has already begun rolling out its own age assurance measures in Singapore. Earlier in February, it introduced safeguards across products such as Gemini, Google Maps, Google Play, Search and YouTube.
These include disabling the location timeline feature on Google Maps for younger users, restricting access to adult-rated apps on Google Play, turning on SafeSearch filters by default and activating digital well-being tools on YouTube.
To estimate users' ages, Google uses machine learning models that analyse signals linked to an account, such as search queries and the types of videos watched on YouTube.
As the March 31 deadline approaches, app stores face mounting pressure to show that stronger digital guardrails are in place, particularly for younger users navigating an increasingly complex online world.