India AI Platforms Non-Compliant with Data Protection Act

Something I contributed to recently was just published, and it matters. The Advanced Study Institute of Asia at SGT University released a compliance report assessing 14 AI platforms widely used by minors in India, evaluating them against the Digital Personal Data Protection Act, 2023. I was brought in as a reviewer, specifically on the Trust and Safety compliance lens.

The findings are difficult to ignore. Across 14 platforms and 14 DPDP criteria, 71% of all assessments were found outright non-compliant; only 13% achieved even relative compliance. Instagram, Canva, and xAI Grok scored 100% non-compliance across all criteria assessed. ChatGPT and Perplexity were not far behind.

The most pervasive failure is deceptively simple: the DPDP Act defines a child as anyone under 18, yet every platform reviewed sets 13 as its minimum age. That five-year gap is not a technicality. It affects millions of Indian teenagers who use these tools daily in schools, for homework, and for learning, with essentially no meaningful data protection in place. Parental consent mechanisms are either absent or rely on self-declaration, which anyone can bypass in seconds. Behavioural tracking continues unchecked. Grievance redressal is inaccessible or non-functional on most platforms.

India has a compliance deadline approaching. Platforms have time to fix this, but the window is not open indefinitely.

The full report is published by the Centre for Law and Critical Emerging Technologies at ASIA, authored by Shivani Singh and Sonal Lalwani and reviewed by Neeti Goutam and me. If you work in Trust and Safety, policy, EdTech, or data governance, it is worth a read. https://lnkd.in/g-TYc6TS
