Accountability as Publishers, Not Bans, Is the Solution to Social Media Harm

Protecting young users and democratic discourse requires platform accountability, not age bans that are hard to enforce and easy to evade.

By T.K. Arun

T.K. Arun, a former editor of The Economic Times, is a columnist known for incisive analysis of economic and policy matters.

January 31, 2026 at 5:10 AM IST

Plans to ban social media access for young people are misguided. There is no disputing the harm young people sustain from consuming an overdose of social media. But the right solution is not to restrict social media users; it is to hold the social media platforms responsible for the harm they do.

In India, Andhra Pradesh and Goa are toying with legislation to ban children from social media. Australia has passed a law, and begun enforcing it, that denies social media access to anyone yet to turn 16. France has set the age threshold at 15 and is expected to enforce the rule sometime this year. Egypt, Malaysia, Denmark and Norway have similar laws in the works.

Many would have watched ‘Adolescence’, the limited series on Netflix, and empathised with the parents of a barely pubescent boy as they watched, in bewildered sorrow and horror, their son’s arrest for the crime of battering a girl from his school to death, and the unravelling of their faith in his innocence and in the adequacy of their own parenting. ‘Adolescence’ is a fictional testament to the all-too-real capacity of social media to stereotype, segregate, isolate, shame and, in general, bully young people to a point beyond which violent retribution appears to be the only salve for mangled self-esteem.

The objectification of women has been part and parcel of value systems across time and space. Social media urges young girls to ‘max out’ their strong points and work on every individual part of their corporeal being, bringing it into ever closer alignment with what the world of social media anoints as societal norms. ‘Looksmaxxing’ is spreading to young men, too.

If, in North Indian tradition, young women are expected to perform ‘solah singar’ — 16 specified ornamentations — to make themselves beautiful, it would clearly be wrong to blame social media for all the pressure on young girls to look pretty. However, social media algorithms and agglomerations can exaggerate, amplify and focus the pressure to match a normative ideal of beauty, layering on constant comparison with others, as well as shame for failing to measure up.

It is eminently desirable to shield young minds from such pressures. But banning social media would be the wrong way to go about it. For one, enforcement would be patchy: how does one verify age online without compromising privacy? Further, to proscribe some sites would be to drive young people to darker corners of the web, where they would be exposed to far worse influences than anything they see on mainstream social media.

Rather than filter out users, social media sites should filter harmful content and algorithms out of themselves. There are two parts to getting this done.

One is to bring platforms on par with publications in terms of legal accountability for what they carry. The current myth is that platforms are not responsible for the content users generate and post on them, that platforms are passive passthroughs enabling users to interact with one another. This claim is spurious. For one, the platforms do screen out some things, such as child pornography; some filter out all nudity. For another, they use algorithms to assess users’ tastes and feed them content that caters to those tastes. In other words, the platforms do exercise some kind of editorial function.

In the past, it could be argued that a platform would have to deploy armies of moderators to screen the zillions of bits of content that users upload. That excuse no longer holds: platforms can employ AI-enabled filters. If the law mandates that sites abide by the conditions of responsibility and accountability that a regular publisher of a physical journal is obliged to accept and follow, these social media platforms would figure out how to create the algorithms needed to comply with the mandate.
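To illustrate how accessible a first pass at such filtering has become, here is a minimal sketch in Python, assuming the Hugging Face transformers library and the publicly available unitary/toxic-bert classifier; the model choice, threshold and function names are illustrative assumptions, and a platform at scale would layer many models, languages and human review on top of this.

```python
# A minimal sketch of an AI-enabled content filter, not a production design.
# Assumes the Hugging Face `transformers` library and the publicly available
# unitary/toxic-bert model; the threshold is an illustrative assumption.
from transformers import pipeline

classifier = pipeline("text-classification", model="unitary/toxic-bert")

def should_block(post_text: str, threshold: float = 0.8) -> bool:
    """Flag a post whose top label clears the threshold. For this model,
    every label ('toxic', 'insult', 'threat', ...) denotes a kind of abuse,
    so a high top score means the post is likely harmful."""
    result = classifier(post_text, truncation=True)[0]
    return result["score"] >= threshold

# Publisher-style screening: check before publication, not after complaints.
for post in ("Have a lovely day!", "You are worthless and everyone hates you."):
    print(post, "->", "blocked" if should_block(post) else "allowed")
```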

The second part is vital not only for protecting young people from online harm, but for protecting democracy itself from being deprived of a basic prerequisite — a commonly shared information space. Social media algorithms seek to keep people glued to their sites, so as to feed them a steady stream of revenue-generating ads, by serving them different shades of the content they have consumed, liked and spent time on before. This practice also prevents the user from coming into contact with news or opinion that disputes or challenges the assumptions that shaped their prior preferences.
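A toy sketch of that loop makes the mechanism concrete (every name here is invented for illustration; real recommenders rank with learned embeddings and engagement predictions, but the self-reinforcing logic is the same):

```python
# A toy engagement-maximising ranker: it counts which topics the user
# engaged with before and pushes more of the same to the top of the feed.
from collections import Counter

def rank_feed(candidates, engagement_history):
    """Order candidate posts by the user's past engagement with their topic."""
    affinity = Counter(post["topic"] for post in engagement_history)
    return sorted(candidates, key=lambda post: affinity[post["topic"]], reverse=True)

history = [{"topic": "outrage"}] * 9 + [{"topic": "science"}]
candidates = [{"id": 2, "topic": "science"}, {"id": 1, "topic": "outrage"}]
print(rank_feed(candidates, history))
# -> the 'outrage' post ranks first; each further click deepens the skew
```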

The net result is to divide the information space into separate spheres, destroying any shared discourse. A shared discourse is the basis of democratic choice and decision-making. The tale of a bunch of blind men touching different parts of an elephant and concluding that an elephant is a vine, a tree stump, a barrel or a smooth, horizontal protrusion comes in handy to illustrate the point. If the narrator of each partial account is trapped in an echo-chamber where all others agree with that account and pooh-pooh any other, there is no way for any of them to put together the whole truth.

This means that social media platforms must abjure such algorithms. Newspapers did not lack ads because they carried diverse pieces of fact and opinion; nor will social media platforms. To make sure that social media platforms do not cheat on their commitment to give up such diversity-killing algorithms, these algorithms must be audited in real time. The platforms can figure out how this can be accomplished without compromising their proprietary control over their systems.
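What might a real-time audit measure? One possibility, offered purely as a sketch: track the diversity of topics each user's feed actually serves, and flag feeds that collapse into a single note. The entropy metric and the compliance floor below are illustrative assumptions, not an established regulatory standard.

```python
# Sketch of a feed-diversity audit: Shannon entropy of the topic mix a
# user was shown. Higher entropy means a more varied information diet.
import math
from collections import Counter

def feed_entropy(served_topics):
    counts = Counter(served_topics)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def audit(served_topics, floor=1.5):
    """Flag any feed whose topic diversity falls below the agreed floor."""
    score = feed_entropy(served_topics)
    return {"entropy": round(score, 2), "compliant": score >= floor}

print(audit(["outrage"] * 9 + ["science"]))             # {'entropy': 0.47, 'compliant': False}
print(audit(["politics", "science", "sport", "arts"]))  # {'entropy': 2.0, 'compliant': True}
```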

Make the platforms behave. As for insulating children from harm, there is no real substitute for sound parenting. How elders conduct themselves, how much time children spend in front of any screen, how parents interact with children, how children interact with other children, what values they transmit and imbibe in those interactions, how much confidence and impulse control the parents help their children build — these matter more than what sites children visit.

Parents should not be encouraged to outsource their guilt over not finding the time to parent their children by dumping all responsibility on the government and on the companies that make and supply the things on which children spend time, whether social media, videogames or television programming. At the same time, the companies that engage young people’s time and attention must be made to pull their weight as well, instead of taking shelter behind Section 230 of America’s Communications Decency Act of 1996, which shielded internet companies from responsibility for user-generated content.