Social Media and Political Neutrality

- Part 2

By Abhishek Venkatesh

Sharing the onus of accountability

The broader question of regulating social media is well outside the scope of this piece. That said, some measures can push these platforms toward the greater political equidistance discussed in Part 1. Whether the concern is neutrality or commerce, the burden of any such regulation must be shared. Placing a greater burden on governments to regulate may encourage tendencies to curtail free speech, or harm the Big Tech economy and the businesses that depend on it. On the other hand, entrusting platforms with complete responsibility may not serve the public interest, as the Cambridge Analytica scandal showed.

The increasing use of these platforms to shape politics and opinion, alongside complex, proprietary algorithms, already makes this task difficult. In this regard, transparency and collaboration are the biggest tools for making platforms more politically neutral. Both measures not only put governments and tech giants on the same page but also reinforce public trust in governance. They provide a foundation for addressing the functioning of social media platforms and lend a credible mechanism for oversight. Most importantly, they create an environment where people can be treated as rational individuals, not political pawns living in echo chambers. The following sections briefly explore suggestions for a regulatory framework to address the political neutrality of platforms.

What can platforms do?

As a starting point, the political undertones of any deviations from guidelines can be assessed through cross-party reviews or panels of political observers, who can then establish the existence of a trend beyond reasonable doubt. Further, a 'consultative censure' of public leaders, involving a dialogue between the leader concerned and the platform, would go a long way toward upholding mutually acceptable standards of political expression.

Transparency initiatives between Big Tech and academia can go a long way in ensuring a government-free approach to building neutrality. In 2019, Facebook instituted the Data Transparency Advisory Group (DTAG), comprising eminent US academics from various fields. Another visible example is the Social Science One project, in which a collaboration between Facebook and researchers allowed greater insight into the role of social media in elections. Such initiatives are a good move toward social accountability and credible oversight of algorithmic developments.

Lastly, neutrality must also be reflected in the organization of the platforms themselves. One major concern about Facebook's constitution of the DTAG is its centralized functioning and the absence of similar groups across regions. It is also unclear how communications flow from regional content review operations to the Group. A further organizational issue lies in the design of metrics. A key observation from Facebook's transparency report is the absence of 'prevalence' metrics for hate speech, bullying, and organized hate. While Facebook makes clear that these metrics are still evolving, a consultative approach to designing them would affirm neutrality and better guard against bias that may arise in the future.

What can governments do?

Personal Data and Non-Personal Data can be extremely useful, but invasive, tools for political advertising. Given their use in everything from meta-analyses of voting patterns across constituencies to directly targeted appeals, political advertising must be built on robust data protection and privacy frameworks.

In this regard, better oversight by the Competition Commission of India is vital. It can help curb near-monopolies over the data of a billion users, which may otherwise confer asymmetric power to influence the political identities of citizens. The Commission must work alongside the companies and the proposed Data Protection Authorities (under the Personal Data Protection Bill and the Report on Non-Personal Data) to push for greater transparency and open-sourcing of algorithms. There must also be oversight of political advertising standards and mechanisms, in consultation with the Advertising Standards Council of India and the Election Commission of India.

Further, classifying intermediaries (Bansal, 2020) would reduce ambiguity in their roles. There is a need to create categories of content-based and service-based intermediaries, building on the evolving definitions, scope, and liabilities that courts have largely shaped so far. This would reduce the case-by-case approach, which runs from the Avnish Bajaj case (Sharat Babu Digumatri vs Government of NCT of Delhi, 2005) to the Madras High Court's TikTok-related orders (Software Freedom Law Center, 2019), and in the process outline intermediaries' accountability.

Lastly, governments must reduce their reliance on ad hoc arrangements for communicating with social media giants in favour of a more stable dialogic process. This builds trust and greater predictability in public policy, with the vision of a connected social future going hand in hand with regulatory frameworks.

In conclusion

Political neutrality may, for some, sound like refusing to take a stand; but in the ecosystem social media creates, it is necessary that people chart their own path toward their political convictions. Any explicit influence on the part of platforms threatens the legitimacy of such opinions and of how they interact. Neutrality is, therefore, an act of utmost responsibility, one that truly affirms a commitment to free speech. The purpose thus far has not been to assign a Leviathan-like character to Big Tech. It is an indispensable manifestation of globalisation, but the regulatory gaps risk antagonising both governments and these tech giants. Any burden of neutrality must be shared and must be progressive, much like our commitment to evolving democratic ideals.

Sources: 

  1. Bansal, S. (2020, August 24). Content regulation lapses cast doubts on Facebook’s biz model. Livemint. Retrieved from https://www.livemint.com/companies/people/-content-regulation-lapses-cast-doubts-on-facebook-s-biz-model-11598232566696.html

  2. Sharat Babu Digumatri vs Government of NCT of Delhi, 3 CompLJ 364 Del (High Court of Delhi 2005).