Key Features of the Rules
- Social media intermediaries with registered users in India above a notified threshold are classified as significant social media intermediaries (SSMIs). SSMIs are required to observe certain additional due diligence, such as appointing certain personnel for compliance, enabling identification of the first originator of information on their platforms under certain conditions, and deploying technology-based measures on a best-effort basis to identify certain types of content.
- The Rules prescribe a framework for the regulation of content by online publishers of news and current affairs content, and curated audio-visual content.
- All intermediaries are required to provide a grievance redressal mechanism for resolving complaints from users or victims. A three-tier grievance redressal mechanism with varying levels of self-regulation has been prescribed for publishers.
Key Issues and Analysis
- The Rules may be going beyond the powers delegated under the Act in certain cases, such as where they provide for the regulation of significant social media intermediaries and online publishers, and require certain intermediaries to identify the first originator of the information.
- Grounds for restricting online content are overbroad and may affect freedom of speech.
- There are no procedural safeguards for requests by law enforcement agencies for information in the possession of intermediaries.
- Requiring messaging services to enable the identification of the first originator of information on their platforms may adversely affect the privacy of individuals.
Intermediaries are entities that store or transmit data on behalf of other persons, and include telecom and internet service providers, online marketplaces, search engines, and social media sites.[1] The Information Technology Act, 2000 (IT Act) was amended in 2008 to provide an exemption to intermediaries from liability for any third party information.[2] Following this, the IT (Intermediary Guidelines) Rules, 2011 were framed under the IT Act to specify the due diligence requirements for intermediaries to claim such exemption.[3] The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 were notified on February 25, 2021, to replace the 2011 Rules.[4] Key additions under the 2021 Rules include additional due diligence requirements for certain social media intermediaries, and a framework for regulating the content of online publishers of news and current affairs, and curated audio-visual content. The Ministry of Electronics and Information Technology noted that the changes were necessitated by widespread concerns around: (i) prevalence of child pornography and content depicting sexual violence, (ii) spread of fake news, (iii) misuse of social media, (iv) content regulation in case of online publishers including OTT platforms and news portals, (v) lack of transparency and accountability from digital platforms, and (vi) rights of users of digital media platforms.[5],[6],[7],[8] The validity of the 2021 Rules has been challenged in various High Courts.[9],[10]
KEY FEATURES
- Due diligence by intermediaries: Under the IT Act, an intermediary is not liable for the third-party information that it holds or transmits. However, to claim such exemption, it must adhere to the due diligence requirements under the IT Act and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (which replace the earlier 2011 Rules). Under the 2011 Rules, the requirements included: (i) specifying, in service agreements, the categories of content that users are not allowed to upload or share, (ii) taking down content within 36 hours of receiving a court or government order, (iii) assisting law enforcement agencies, (iv) retaining blocked content and associated records for 90 days, and (v) providing a grievance redressal mechanism for users and affected persons, and designating a grievance officer. The 2021 Rules retain these requirements, while: (i) modifying the categories of content that users are not allowed to upload or share, and (ii) prescribing stricter timelines for the above requirements.
- Significant social media intermediaries: The 2021 Rules define social media intermediaries as intermediaries which primarily or solely enable online interaction between two or more users. Intermediaries with registered users above a notified threshold will be classified as significant social media intermediaries (SSMIs). The additional due diligence requirements for SSMIs include:
Personnel: An SSMI must appoint: (i) a chief compliance officer for ensuring compliance with the Rules and the Act, (ii) a nodal person for coordination with law enforcement agencies, and (iii) a grievance officer, all of whom should reside in India.
Identifying the first originator of information: An SSMI, which primarily provides messaging services, must enable the identification of the first originator of information within India on its platform. This may be required by an order of a Court or the competent authority under the IT Act. Such orders will be issued on specified grounds including prevention, detection, and investigation of certain offences such as those relating to national security, public order, and sexual violence. Such orders will not be issued if the originator could be identified by less intrusive means.
Technology-based measures: SSMIs will endeavour to deploy technology-based measures to identify: (i) content depicting child sexual abuse and rape, or (ii) information that is identical to information previously blocked upon a court or government order. Such measures must: (i) be proportionate to the interests of free speech and privacy of users, and (ii) be subject to human oversight and periodic review.
User-centric requirements: SSMIs must provide users with: (i) a voluntary identity verification mechanism, (ii) a mechanism to check the status of grievances, (iii) an explanation if no action is taken on a complaint, and (iv) a notice where the SSMI blocks the user’s content of its own accord, along with a dispute resolution mechanism.
- Digital Media Publishers: The 2021 Rules prescribe certain requirements for online publishers of: (i) news and current affairs content which include online papers, news portals, aggregators and agencies; and (ii) curated audio-visual content, which is defined as a curated catalogue of audio-visual content (excluding news and current affairs) which is owned by, licensed by, or contracted to be transmitted by publishers and available on demand. The Rules institute a three-tier structure for regulating these publishers: (i) self-regulation by publishers, (ii) self-regulation by associations of publishers, and (iii) oversight by the central government.
- Code of Ethics: For publishers of news and current affairs, the following existing codes will apply: (i) norms of journalistic conduct formulated by the Press Council of India, and (ii) the programme code under the Cable Television Networks Regulation Act, 1995. For online publishers of curated content, the Rules themselves prescribe a code of ethics. This code requires publishers to: (i) classify content in specified age-appropriate categories, restrict children’s access to age-inappropriate content, and implement an age verification mechanism, (ii) exercise due discretion in featuring content affecting the sovereignty and integrity of India or national security, or content likely to disturb public order, (iii) take into consideration India’s multiple races and religions before featuring their beliefs and practices, and (iv) make content more accessible to persons with disabilities.
- Grievance redressal: Any person aggrieved by the content of a publisher may file a complaint with the publisher, who must address it within 15 days. If the person is not satisfied with the resolution, or the complaint is not addressed within the specified time, the person may escalate the complaint to the association of publishers, which must also address the complaint within 15 days. The complaint will be considered by an inter-departmental committee constituted by the Ministry of Information and Broadcasting if: (i) escalated by the complainant or the association under certain conditions, or (ii) referred by the Ministry itself.
- Oversight by Ministry: The Ministry of Information and Broadcasting will: (i) publish a charter for self-regulating bodies, including Codes of Practices, (ii) issue appropriate advisories and orders to publishers, and (iii) have powers to block content on an emergency basis (subject to review by the inter-departmental committee). Any directions for blocking content will be reviewed by a committee headed by the Cabinet Secretary.
KEY ISSUES AND ANALYSIS
Regulation of online intermediaries
Intermediaries include a vast array of entities that facilitate the flow of data on the internet. These include telecom service providers, internet service providers, search engines, online marketplaces, payment sites, cyber cafes, messaging services, and social media sites. While many intermediaries are mere conduits or storage providers that are unaware of the content being transmitted or stored on their platforms, other intermediaries may be aware of the user-generated content on their platforms. This raises the question of the extent to which intermediaries should be held liable for the user-generated content on their platforms.
In some jurisdictions, such as the European Union and India, intermediaries are regulated through the safe harbour model. Under this model, intermediaries are granted immunity from liability for illegal user-generated content, provided they comply with certain requirements.[11],[12],[13] Intermediaries remain immune from liability unless they are aware of the illegality and fail to act adequately to stop it.13 They are subject to ‘duties of care’ and ‘notice and take down’ obligations to remove illegal content.13
In recent years, some online platforms have gained a central role in enabling access to, and the exchange and sharing of, information at scale.[14] Many online platforms have expanded their role from mere hosts of information to entities governing how content is displayed and shared online, undertaking significant actions in the areas of moderation, curation, and recommendation.14 There are growing concerns around the misuse of these platforms for the proliferation of illegal or harmful content such as child sex abuse material, content provoking terrorism, misinformation, hate speech, and voter manipulation.5,6,7,8,14 This has raised questions on the role and responsibility of platforms in preventing the diffusion of such content, and in its detection and subsequent removal.14
Some platforms have been self-regulating the publication of such content. However, this has raised concerns about arbitrary actions taken by these platforms, which could affect freedom of speech and expression.6 These developments pose an important challenge for the regulatory framework for intermediaries: finding the correct balance between enhancing the role of platforms and governments in detection, moderation, and curation, and protecting individuals’ rights.14 The 2021 Rules may address some of these issues. Implications of certain provisions under the Rules are discussed in the following sections.
The Rules may be going beyond the powers delegated under the Act
The central government has framed the 2021 Rules as per the following rule-making powers under the Act: (i) carrying out provisions of the Act, (ii) specifying the safeguards or procedures for blocking information for access by the public, and (iii) specifying the due diligence to be observed by intermediaries for exemption from liability for third-party information. The 2021 Rules define new types of entities, state their obligations, and prescribe a new regulatory framework for some of these entities. This may be going beyond the powers delegated to the Executive under the Act. In various judgements, the Supreme Court has held that Rules cannot alter the scope, provisions, or principles of the enabling Act.[15],[16],[17] Such instances are discussed below.
Distinct obligations for new classes of intermediaries: The Act defines an intermediary and states its obligations. These include: (i) taking down content upon a court or government order, (ii) retaining certain information, (iii) providing information and assistance to law enforcement agencies in certain conditions, and (iv) observing due diligence to be exempt from intermediary liability. The Rules define two new classes of intermediaries: (i) social media intermediaries and (ii) significant social media intermediaries (SSMIs). The Rules also specify the additional due diligence to be observed by SSMIs. These include: (i) appointing certain personnel, (ii) identifying the first originator of information (where SSMIs primarily provide messaging services), and (iii) deploying technology-based measures to pro-actively identify certain types of information on a best-effort basis. The Rules also empower the central government to: (i) determine the threshold for classification as SSMIs, and (ii) require any other intermediary to comply with the additional due diligence requirements for SSMIs. Defining new types of intermediaries, and empowering the government to specify thresholds under these definitions and cast obligations on select entities, may be going beyond the powers delegated to the government under the Act. Provisions such as the definition of new entities and their obligations may have to be specified in the parent Act.
Identification of the first originator of information: The Rules require SSMIs, which provide a service primarily or solely in the nature of messaging, to enable the identification of the first originator of information within India on its platform. This rule has no related provision under the parent Act. The Rules also prescribe certain details such as: (i) information on the first originator can be required only by a government or court order, (ii) the grounds on which such orders can be passed, and (iii) not issuing such an order if less intrusive means to obtain the information are available. It may be questioned whether this amounts to instituting legislative policy, and hence, is required to be provided in the parent Act.
Regulation of online publishers: The Rules prescribe a regulatory framework for online publishers of news and current affairs and curated audio-visual content (such as films, series, and podcasts). Regulation of such publishers may be beyond the scope of the IT Act (see the discussion on the framework for regulation of content of online publishers below).
Certain grounds for restricting content may affect freedom of speech
The Constitution allows for certain reasonable restrictions with respect to freedom of speech and expression on grounds such as national security, public order, decency, and morality.[18] The IT Act prohibits uploading or sharing content which is obscene, sexually explicit, relates to child sex abuse, or violates a person’s privacy.[19] The 2021 Rules specify certain additional restrictions on the types of information users of intermediary platforms can create, upload, or share. These include: (i) “harmful to child”, (ii) “insulting on the basis of gender”, and (iii) “knowingly and intentionally communicates any information which is patently false or misleading in nature but may reasonably be perceived as a fact”. Some of these restrictions are subjective and overbroad, and may adversely affect the freedom of speech and expression of users of intermediary platforms.
The Supreme Court (2015) has held that a restriction on speech, in order to be reasonable, must be narrowly tailored so as to restrict only what is absolutely necessary.[20] It also held that speech can be limited on the grounds under the Constitution only when it reaches the level of incitement. Other forms of speech, even if offensive or unpopular, remain protected under the Constitution.
The Rules require the intermediaries to make these restrictions part of their service agreement with users. This implies that users must exercise prior restraint, and intermediaries may interpret and decide upon the lawfulness of content on these grounds. Such overbroad grounds under the Rules may not give a person clarity on what is restricted and may create a ‘chilling effect’ on their freedom of speech and expression. This may also lead to over-compliance from intermediaries as their exemption from liability is contingent upon observing due diligence.
While examining the 2011 Rules on intermediary guidelines, the Lok Sabha Committee on Subordinate Legislation (2013) had observed that to remove any ambiguity, the definitions of the grounds used in the Rules should be incorporated in the Rules, if the definitions exist in other laws.[21] If not defined in other laws, such grounds should be defined and incorporated in the Rules to ensure that no new category of crimes or offences is created through delegated legislation.21 The 2021 Rules do not provide definitions or references for the terms listed above and hence, may cause ambiguity regarding the interpretation of these terms.
Procedure for information requests from government agencies lacks safeguards
The Rules require intermediaries to provide information under their control or possession upon request by a government agency. A government agency lawfully authorised for investigative, protective, or cybersecurity activities may place such a request. The request may be placed for the verification of identity, the prevention, detection, investigation, or prosecution of offences under any law, or for cybersecurity incidents. However, the Rules do not state any procedural safeguards or requirements for such actions.
An earlier set of Rules, notified in 2009, specifies the procedure and safeguards subject to which interception, monitoring, or decryption of information held by intermediaries may be undertaken.[22] These state that such orders must be issued by the union or state home secretary (with exceptions in case of unavoidable circumstances and remote regions), and be subject to review by a committee (headed by the Cabinet Secretary or the state’s Chief Secretary). Further, the authority issuing such orders should first consider alternate means of acquiring the information.22
Further, the 2021 Rules do not restrict the extent or type of information that may be sought. For example, the information sought may be personal data of individuals, such as details about their interactions with others. Such powers, without adequate safeguards such as those in the 2009 Rules, may adversely affect the privacy of individuals.
Enabling traceability may adversely affect the privacy of individuals
The Rules require significant social media intermediaries, which provide services primarily or solely in the nature of messaging, to enable the identification of the first originator of information within India (commonly referred to as traceability). The Rules state that: (i) such identification may be required by a court order or an order passed by a competent authority under the 2009 Rules (union or state home secretary), (ii) an order for identification will be passed only for specified purposes, including prevention, detection, and investigation of offences related to the sovereignty and security of the state, public order, and sexual violence (rape, sexually explicit material, or child sex abuse material), and (iii) no such order will be passed if less intrusive means are effective for the required identification.
Enabling such identification may lower the degree of privacy of communication for all users. Identifying the first originator of information on a messaging platform will require the service provider to permanently store certain additional information: (i) which users exchanged a message, and (ii) the exact message or certain details which uniquely describe a message, so that the information in question may be matched against it. This will be required for every message exchanged over the service provider’s platform to enable tracing the first originator of any message. Note that permanently storing such details about a message is not a technological necessity for providing messaging services over the internet. The Rules also do not specify how far back in time a messaging service may be required to search to determine the first originator. Overall, this requirement will lead to the retention of more personal data by messaging services, which goes against the principle of data minimisation. Data minimisation means limiting data collection to what is necessary to fulfil a specific purpose of data processing, and has been recognised as an important principle for the protection of personal privacy.[23],[24]
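To illustrate the kind of additional data retention this implies, the sketch below shows one possible (hypothetical) approach a messaging service could take: storing, for every message, a content fingerprint (here, a SHA-256 hash) together with the sender and a timestamp, and later matching a message in question against this index to find the earliest known sender. The Rules do not prescribe any particular mechanism, and all names in the sketch are illustrative; on end-to-end encrypted services, such fingerprints would have to be generated on users’ devices, which is part of the privacy concern discussed above.

```python
# Illustrative sketch only: the Rules do not prescribe a mechanism for traceability.
# This assumes a hypothetical in-memory store to show why identifying the first
# originator implies retaining extra metadata for every message ever exchanged.
import hashlib
import time

# Maps a message fingerprint to a list of (timestamp, sender_id) records.
message_index: dict[str, list[tuple[float, str]]] = {}

def fingerprint(message_text: str) -> str:
    """Derive a stable identifier from the message content (SHA-256 of the text)."""
    return hashlib.sha256(message_text.encode("utf-8")).hexdigest()

def record_message(sender_id: str, message_text: str) -> None:
    """Record who sent which message, and when; called for every message exchanged."""
    message_index.setdefault(fingerprint(message_text), []).append(
        (time.time(), sender_id)
    )

def first_originator(message_text: str) -> str | None:
    """Given the message in question, return the earliest known sender, if any."""
    records = message_index.get(fingerprint(message_text))
    if not records:
        return None
    return min(records)[1]  # record with the earliest timestamp

# Example usage (hypothetical identifiers):
record_message("user_a", "hello world")
record_message("user_b", "hello world")   # a forwarded copy of the same message
print(first_originator("hello world"))    # -> "user_a"
```

The point of the sketch is that the index must grow with every message exchanged and be retained indefinitely, since the Rules set no backward time limit; this is why the requirement sits uneasily with the data minimisation principle noted above.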
The Supreme Court (2017) has held that any infringement of the right to privacy should be proportionate to the need for such interference.[25] Traceability is required to prevent, detect, and investigate specified offences. For enabling traceability for a few messages that may be required for investigative purposes, the degree of privacy of communication of all users of online messaging services will need to be permanently lowered. Hence, the question is whether this action could be considered proportionate to the objective.
Note that a case related to the issue of traceability is currently pending before the Supreme Court.[26]
Framework for regulation of content of online publishers
Content on conventional media, including print, TV, film, and radio, is regulated under specific laws as well as licence agreements (in the case of TV and radio).[27],[28],[29],[30] These regulations seek to ensure that community standards are reflected in content easily accessible by the public. They also seek to restrict access to certain content based on its age-appropriateness or if it may be deemed unlawful.[31] Economic costs and certain licence requirements for some of these operations mean that their numbers are few. In the past few years, the internet has become a more mainstream medium for the publication of news as well as entertainment content. The regulatory framework for content on digital media may not be similar to that for conventional media, as there are certain challenges in terms of: (i) defining who is a publisher, since individuals and businesses publishing online may not be regulated in the same manner, (ii) the volume of content to regulate, and (iii) enforcement (the cross-border nature of the internet means that publishers need not have a physical presence in India). The 2021 Rules under the IT Act prescribe a framework for regulation of content by online publishers of news and current affairs and curated audio-visual content (such as films, series, and podcasts). Certain issues with these Rules are discussed below.
Regulation of online publishers under the 2021 Rules may be beyond the scope of the parent Act
The framework provides norms and an oversight mechanism for the regulation of content of online publishers. The press note by the central government on the 2021 Rules noted that online publishers are digital platforms which are governed by the IT Act.6 The IT Act is aimed at providing legal recognition for transactions carried out by means of electronic data interchange and other means of electronic communication, and at facilitating electronic filing of documents.[32] The Act prohibits cybercrimes, including the publication of specified content such as sexually explicit content, child sex abuse material, and content violating others’ privacy.
The Press Council Act, 1978 and the Press and Registration of Books Act, 1867 (for news in print), the Cable Television Networks (Regulation) Act, 1995 (for television broadcast of news and audio-visual content), and the Cinematograph Act, 1952 (for films) are specific laws regulating the publication of similar content through other media.27,28,29,30 Regulation of content of these classes of publishers deals with questions of freedom of the press and freedom of artistic expression. It may be questioned whether regulation of online publishers is envisaged under the IT Act and hence, whether the 2021 Rules exceed the scope of the Act in this regard.
Oversight mechanism for digital news media lacks the independence accorded to print news
The oversight mechanism for content regulation in the case of news in print is under the Press Council of India (PCI), which is an independent statutory body. One of the main objectives of the PCI is to uphold the freedom of the press. The Council consists of a chairman and 28 other members, including working journalists, persons from the management of newspapers, members of Parliament, and domain experts. The Chairman is selected by a committee comprising the Speaker of the Lok Sabha, the Chairman of the Rajya Sabha, and a member elected by the PCI. Key functions of the PCI include: (i) adjudicating upon complaints of violation of standards, and (ii) issuing directions upon violation of the code of conduct, including admonishing, warning, and censuring. For similar functions in the case of digital news media, the oversight mechanism will be under the Ministry of Information and Broadcasting. Thus, the oversight mechanism for digital news is not through an independent statutory body, unlike that for print publications.
Note that the content of TV news is regulated under the Cable Television Networks (Regulation) Act, 1995 (CTN Act).29 The CTN Act empowers the central government to prescribe the programme code and advertising code to be followed by publishers. The central government may prohibit the transmission of a programme in the public interest on certain specified grounds if it violates these codes. A three-tier self-regulation mechanism for TV broadcasters, similar to that for online publishers, was prescribed under the CTN Act in June 2021.[33]
The procedure for emergency blocking of content of online publishers lacks certain safeguards
As per the Rules, the Secretary of the Ministry of Information and Broadcasting may pass an order for blocking the content of an online publisher in case of emergency. Such orders may be passed on certain specified grounds, including national security and public order, without giving the publisher an opportunity of hearing. Such an order will be examined by the inter-departmental committee for its recommendation on the confirmation or revocation of the order. The Rules do not give the publisher an opportunity of hearing at any stage of this process. This is in contrast with the process for examining violations of the code of ethics, under which the concerned publisher is allowed to appear and submit its reply and clarifications before the committee.
Definition of social media intermediary may be too broad
The Rules define a social media intermediary as an intermediary which primarily or solely enables interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information using its services. Nearly any intermediary that enables interaction among its users could fall within this definition, including email service providers, e-commerce platforms, video conferencing platforms, and internet telephony service providers.
[1]. Section 2 (1) (w), The Information Technology Act, 2000.
[2]. Section 79, The Information Technology Act, 2000.
[3]. The Information Technology (Intermediary Guidelines) Rules, 2011.
[4]. The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
[5]. Official Debates, Rajya Sabha, July 26, 2018.
[6]. “Government notifies Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules 2021”, Press Information Bureau, Ministry of Electronics and Information Technology, February 25, 2021.
[7]. Suo Moto Writ Petition No. 3 of 2015, Supreme Court of India, December 11, 2018.
[8]. “Report of the Adhoc Committee of the Rajya Sabha to Study the Alarming Issue of Child Pornography on Social Media and its Effect on Children and Society at Large”, February 3, 2020.
[9]. W.P. (Civil) No. 6272 of 2021, Kerala High Court.
[10]. W.P. (Civil) No 3125 of 2021, Delhi High Court.
[11]. Article 13-15, Directive 2000/31/EC of The European Parliament And of the Council.
[12]. Section 79, The Information Technology Act, 2000.
[13]. “Reform of the EU Liability Regime for Online Intermediaries”, European Parliamentary Research Service, May 2020.
[14]. “Liability of online platforms”, European Parliamentary Research Service, February 2021.
[15]. Agricultural Market Committee vs Shalimar Chemical Works Ltd, 1997 Supp(1) SCR 164, May 7, 1997.
[16]. State of Karnataka v Ganesh Kamath, 1983 SCR (2) 665, March 31, 1983.
[17]. Kerala State Electricity Board vs Indian Aluminium Company, 1976 SCR (1) 552, September 1, 1975.
[18]. Article 19, The Constitution of India.
[19]. Section 67, 67A, and 67B, The Information Technology Act, 2000.
[20]. Shreya Singhal vs Union of India, Writ Petition (Criminal) No. 167 Of 2012, Supreme Court of India, March 24, 2015.
[21]. 31st Report of the Committee on Subordinate Legislation of Lok Sabha on Rules under the IT Act, 2000, March 2013.
[22]. The Information Technology (Procedure and Safeguards for Interception, Monitoring, and Decryption of Information) Rules, 2009 under the Information Technology Act, 2000.
[23]. White Paper of the Committee of Experts on Data Protection Framework for India under the Chairmanship of Justice B.N. Shrikrishna.
[24]. Article 5, General Data Protection Regulation of European Union.
[25]. Justice K.S.Puttswamy (Retd) vs Union of India, W.P.(Civil) No 494 of 2012, Supreme Court of India, August 24, 2017.
[26]. Facebook Inc vs Antony Clement Rubin, Diary No 32478/2019, Admitted on January 30, 2020, Supreme Court of India.
[27]. The Press Council Act, 1978.
[28]. The Press and Registration of Books Act, 1867.
[29]. The Cable Television Networks (Regulation) Act, 1995.
[30]. The Cinematograph Act, 1952.
[31]. “The Challenge of managing digital content”, International Telecommunications Union, August 23, 2017.
[32]. Introduction to the Information Technology Act, 2000.
[33]. The Cable Television Networks (Amendment) Rules, 2021 issued under the Cable Television Networks (Regulation) Act, 1995.
DISCLAIMER: This document is being furnished to you for your information. You may choose to reproduce or redistribute this report for non-commercial purposes in part or in full to any other person with due acknowledgement of PRS Legislative Research (“PRS”). The opinions expressed herein are entirely those of the author(s). PRS makes every effort to use reliable and comprehensive information, but PRS does not represent that the contents of the report are accurate or complete. PRS is an independent, not-for-profit group. This document has been prepared without regard to the objectives or opinions of those who may receive it.