UK demands nude-blocking phones while India pushes for platform age checks

The UK proposes asking Apple and Google to install system-wide nudity blocks that lift only after age verification, while India focuses on platform accountability, using its DPDP Act to mandate verifiable parental consent and prohibit the tracking of children.

Punam Singh

The debate over protecting children from harmful online content has intensified globally, leading to distinct regulatory paths. While the UK is pushing device manufacturers like Apple and Google to build mandatory, system-wide nudity-blocking software into smartphones, India focuses on increasing the accountability of online platforms and enforcing verifiable parental consent through its data protection laws.


UK’s device-level mandate

The UK government is moving forward with a radical new strategy that places the responsibility for content filtering directly onto smartphone hardware and software makers. It formally plans to ask Apple and Google to embed advanced nudity-detection software directly into their mobile operating systems. This proposed measure, part of a forthcoming Home Office strategy, would require all device users to verify their age before they can capture, share, or view explicit images.

Home Office officials aim to stop any display of nudity on screen unless the user proves they are an adult. To lift the restriction, users must undergo age verification, likely involving methods such as biometric checks or official identification scans.

While officials initially explored making these controls mandatory for all devices sold in the UK, the current plan focuses on encouraging the tech companies to adopt the system voluntarily. The proposal targets mobile phones first but could eventually expand to desktop computers.


Building on existing, but insufficient safety tools

This move follows the passage of the UK’s Online Safety Act (OSA) 2023, which mandates greater protection for children online. The government frames the request as a critical step in its broader strategy to reduce violence against women and children, addressing issues like unsolicited images, commonly known as cyber-flashing.

Tech companies already offer some safeguards. Apple, for example, provides Communication Safety features that detect nude photos sent to children in apps like Messages. Crucially, these existing safeguards depend on parental activation and do not prevent teenagers from dismissing the warning and viewing the image. The government's new proposal goes much further, demanding a permanent, system-wide block that only certified adults can disable.

India’s platform-centric protocols

India is also actively strengthening its legal framework to protect minors online, but its approach focuses primarily on increasing the accountability of online platforms and implementing verifiable parental consent under its data protection and IT laws.

India’s strategy centres on two major pieces of legislation, the DPDP Act and the IT Rules 2021, which place the burden on intermediaries like social media companies, streaming services, and data processors, rather than on device manufacturers.

The Digital Personal Data Protection (DPDP) Act, 2023 defines anyone under 18 as a child. It requires all companies that process personal data to obtain verifiable parental consent before processing a child’s personal data. The act specifically prohibits companies from engaging in behavioural monitoring, tracking, or targeted advertising directed at children.

The IT Rules 2021 require intermediaries to remove or disable access to content that is obscene, pornographic, or harmful to minors. Critically, platforms must remove content depicting non-consensual nudity or sexual acts within 24 hours of receiving a complaint. Failure to observe these legal obligations can cost an intermediary its "safe harbour" protection, making it liable for third-party content.

India’s preference for digital ID

The Indian Supreme Court has also weighed in, suggesting that platforms should implement Aadhaar-based age verification to restrict access to potentially obscene online content. This proposal highlights the government’s potential preference for using a national, verifiable digital identity system for age-gating, contrasting with the UK’s focus on hardware-based image filtering.

Ultimately, while both nations seek to protect children, the UK wants a technical 'kill switch' built into every device, with potential privacy implications, whereas India enforces protection by making platforms strictly liable and mandating verifiable consent for children’s data.