Across the world, governments are confronting a difficult question: how should societies protect children in an age where digital platforms shape attention, behaviour and even identity? What began as a parental concern has now evolved into a serious policy debate, with countries like Australia exploring age-based restrictions on social media use.
India is now entering this conversation. Several states, including Karnataka, have indicated interest in restricting social media access for younger users, reflecting growing unease about cyberbullying, addictive engagement patterns and the psychological impact of algorithm-driven platforms.
The concern is valid. The solution, however, risks being misplaced. Blanket bans on social media for children may appear decisive, but in India they are unlikely to work, and may even distract from the real regulatory challenge.
The first reason is practical. Age-based restrictions are notoriously difficult to enforce. Even in technologically advanced countries, young users routinely bypass controls through virtual private networks, false age declarations or shared family devices.
In India, with its scale, diversity of access and uneven digital literacy, enforcement would be even more complex. Such measures risk becoming symbolic rather than effective.
The second reason is structural. The COVID-19 pandemic fundamentally altered the digital experience of an entire generation. Between 2020 and 2022, millions of Indian students relied on smartphones and online platforms as their primary gateway to education.
For this “pandemic generation,” the internet is not merely a source of entertainment. It is an environment for learning, collaboration and social interaction. Attempting to sharply separate children from digital platforms ignores this irreversible shift.
There is also a distinctly Indian dimension that policy debates often overlook. A significant amount of informal learning today, particularly in areas such as classical music, dance, yoga, languages and cultural traditions, flows through digital and social platforms. For many young learners, access to teachers, performances and peer communities is mediated through these channels.
At the same time, schools and colleges routinely use messaging platforms and digital groups to share assignments, schedules and academic updates. While parental oversight may exist, device-based participation has become embedded in everyday educational practice. Any regulatory approach that assumes children can simply be disconnected from these networks risks ignoring this reality.
A third, often overlooked dimension is the role of social media as an informal knowledge network. Students today use digital platforms to access tutorials, participate in academic communities, learn new skills and engage with ideas beyond the classroom. For many in smaller towns and rural areas, these platforms provide exposure that formal systems cannot easily match. A regulatory approach that treats all digital platforms as harmful risks undermining these opportunities.
At the same time, the risks are real, and they are not accidental. Many platforms are built on behavioural design architectures intended to maximise engagement: infinite scrolling, autoplay features and algorithmic recommendation loops. These systems can be particularly powerful, and potentially harmful, when applied to younger users.
This points to a fundamental policy misalignment. The current debate focuses on restricting children’s access, when the more effective lever lies in regulating platform behaviour.
India already has a foundation to build upon. The Digital Personal Data Protection Act, 2023, introduces obligations around the processing of children’s data and places responsibility on large platforms. The next step is to extend this into a broader framework of platform accountability: age-appropriate design standards, restrictions on behavioural advertising targeted at minors, and greater transparency in algorithmic systems.
However, the regulatory challenge is rapidly evolving. The rise of artificial intelligence introduces a new grey area that current policy discussions have barely begun to address.
Students are increasingly using conversational AI platforms to clarify concepts, explore ideas and support learning. These systems are interactive, but they do not function like traditional social media platforms driven by engagement metrics and advertising incentives. Treating them under the same regulatory umbrella risks conflating fundamentally different categories of digital services.
This is where future policy will need greater precision. The distinction between “engagement platforms” and “knowledge platforms” will become critical. Without it, well-intentioned regulations could inadvertently restrict tools that enhance learning while failing to address systems that drive harmful engagement patterns.
India, however, has a unique opportunity to approach this challenge differently.
Its evolving digital public infrastructure, built around identity, authentication and consent, offers the possibility of privacy-preserving age verification at scale. If implemented with appropriate safeguards, such systems could enable graduated access for younger users without resorting to blunt restrictions.
More importantly, India can move towards a model that combines platform accountability with what may be called an “AI handler” approach: systems or institutional mechanisms that ensure transparency about how digital content is generated, curated and recommended. As digital ecosystems become more complex, such layers of oversight may become essential for maintaining trust, particularly for younger users.
The debate on children and social media is ultimately not about whether young people should be online. That question has already been answered by reality. The issue is whether the digital environments shaping them are designed for learning and well-being, or for maximising attention and advertising revenue.
Blanket bans offer the comfort of simplicity. But in a country as large, diverse and digitally integrated as India, they are unlikely to succeed. Smarter regulation, focused on platform design, accountability and technological nuance, offers a more difficult path, but it is the one far more likely to work.
India has the scale, the digital architecture and the policy opportunity to lead this shift. The choice is between symbolic regulation and systemic reform. Only one of them will endure.
(The author is Former Signal Officer-in-Chief of the Indian Army and National Cyber Security Coordinator (NCSC) of India; Views expressed are personal)