Social media firms have a ‘duty of care’
Social media platforms gained mass popularity in the 2010s. They made information easy to access by offering everyone a space to connect with friends and family and to share and consume content. Social movements leveraged these online tools to drive political and social revolutions. Charismatic leaders deployed innovative social media campaigns to acquire power. It appeared that there was much to celebrate.

Over the same period, social media was also upending established information ecosystems. Traditional news media lost their gate-keeping power over news and information. With time, the adverse effects of these platforms surfaced: malicious users and organisations learned to leverage online tools to sow division, fear, and confusion, and to undermine democratic processes.
The response of social media platforms, especially in developing countries, has been weak. Platforms claim that social and political ills originate in society and merely manifest online. This equivocation is facilitated by the legal immunity platforms enjoy as “intermediaries”, that is, entities that facilitate the posting and sharing of content by third-party users without exercising editorial control. In this argument, a social media platform is held on a par with a regular ISP, which only carries packets of data from one point to another over the internet and has no cognisance of what the data contains.

The argument is untenable. Platforms are not merely aware of their users and content; they straddle the continuum from distributor to publisher.
As private corporations, platforms have the right to decide what content they will host and distribute. Accordingly, all platforms have extensive terms of service and content guidelines. Further, platforms moderate content, which implies both that they are aware of what they host and that distribution is a choice. They are also known to “deplatform” users to signal their political distance from those users.
Platforms have also become increasingly interventionist with content. To maximise engagement and to retain and grow their user base, platforms push new content into user feeds. This “selection” of content to amplify is a key editorial function, even though it is performed by an algorithm rather than a human being. The amplification of user content has a political and commercial impact proportional to the increased distribution, as is evident from the mainstreaming of individuals and narratives that may otherwise have remained on the fringes of public consciousness. Amplification is thus an intervention in the socio-economic and political processes of society, and platforms must be held responsible for it.
Finally, platforms have begun paying for original content to increase user stickiness, completing their transition into media companies. Facebook, for instance, has pledged $1 billion for creator content through 2022. YouTube has established a $100-million creator fund. TikTok has earmarked a $200-million “creator fund” for creators in the United States. Instagram and Snapchat, too, offer financial incentives for original content.
It is clear that social media platforms play an active political role in our public discourse, rather than merely providing dumb, passive technological infrastructure for user interactions. Initially, platforms embraced their impact on societies and political systems by positioning themselves as harbingers of democracy and pro-people movements, especially during the Arab Spring.
Today, however, platforms are increasingly aligned with the powerful rather than with ordinary people in areas of conflict; organised political entities have mobilised online (in India, through dedicated political party “IT cells”), overwhelming individuals and less-resourced people’s movements. Platforms have been found compliant with government requests to take down dissident views and speech. Most importantly, platforms have become interventionist repositories of political power. They must be held accountable for their choices. It is no longer tenable to argue that platforms bear no responsibility for the content they host. Instead, platforms, now de facto media companies, have a “duty of care” in proportion to the harm posed by the content they host, and liability linked to their distribution choices.
Ruchi Gupta is executive director of the Future of India Foundation. This article is based on the Foundation’s report, Politics of Disinformation. The views expressed are personal.