Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules 2021
Background
- In 2021, citing directions of the Supreme Court and concerns raised in Parliament about the abuse of social media, the government framed the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules 2021 in exercise of powers under the Information Technology Act, 2000.
- The rules apply to publishers of news and current affairs (including websites, portals and YouTube news channels), publishers of online curated content (commonly called OTT platforms, which stream content such as web series and films) and social media intermediaries (which primarily enable online interaction between two or more users).
- To encourage innovation and enable the growth of new social media intermediaries without subjecting smaller platforms to significant compliance requirements, the Rules distinguish between social media intermediaries and significant social media intermediaries. This distinction is based on the number of users on the platform.
- The government is empowered to notify the user-base threshold that distinguishes social media intermediaries from significant social media intermediaries. The Rules require significant social media intermediaries to follow certain additional due diligence.
- Part II of the rules relates to social media intermediaries and would be administered by the Ministry of Electronics & Information Technology. Part III relates to digital news publishers and OTT platforms and would be administered by the Ministry of Information & Broadcasting.
What are the key proposals that the guidelines make for social media intermediaries?
- Section 79 of the Information Technology Act 2000 provides a “safe harbour” to intermediaries that host user-generated content, and exempts them from liability for the actions of users if they adhere to government-prescribed guidelines.
- The new guidelines prescribe due diligence to be followed by intermediaries, failing which the safe harbour provisions would cease to apply to platforms such as Twitter, Facebook, YouTube, and WhatsApp.
- They also prescribe a grievance redressal mechanism by mandating that the intermediaries, including social media platforms, should establish a mechanism for receiving and resolving complaints from users.
- Intermediaries shall remove or disable access within 24 hours of receipt of a complaint to content that exposes the private areas of an individual, shows such individual in full or partial nudity or in a sexual act, or is in the nature of impersonation, including morphed images. Such a complaint can be filed either by the individual or by any other person on his/her behalf.
Do the guidelines lay down rules for the removal of content from social media?
- In essence, the rules lay down 10 categories of content that the social media platform should not host.
- These include content that “threatens the unity, integrity, defence, security or sovereignty of India, friendly relations with foreign States, or public order, or causes incitement to the commission of any cognizable offence or prevents investigation of any offence or is insulting any foreign States”; “is defamatory, obscene, pornographic, paedophilic, invasive of another’s privacy, including bodily privacy; insulting or harassing on the basis of gender; libellous, racially or ethnically objectionable; relating or encouraging money laundering or gambling, or otherwise inconsistent with or contrary to the laws of India”, etc.
- The rules stipulate that upon receiving information from a court or the appropriate government agency that the platform is hosting prohibited content, it should remove the said content within 36 hours.
Additional due diligence to be observed by significant social media intermediary
- A significant social media intermediary shall appoint a Chief Compliance Officer who shall be responsible for ensuring compliance with the rules.
- A significant social media intermediary shall appoint a nodal contact person for 24×7 coordination with law enforcement agencies.
- Further, these platforms will need to publish a monthly compliance report mentioning the details of complaints received and action taken on the complaints, as well as details of contents removed proactively by the significant social media intermediary.
- Significant social media intermediaries that provide messaging as a primary service must enable the identification of the first originator of information on their platform, which must be disclosed if required by an order of a court or the government. Such an order will be passed for specified purposes, including the investigation of offences related to the sovereignty and security of the state, public order, or sexual violence.
Rules for Digital news media & OTT platforms
- The rules establish a three-tier grievance redressal framework for news publishers and OTT platforms on the digital media.
- The first tier of the regulatory mechanism is grievance redressal by the publisher itself; the second tier involves a self-regulatory body headed by a retired judge of a High Court or the Supreme Court. The third tier comprises an inter-ministerial committee headed by a joint secretary-level officer from the Ministry of Information and Broadcasting.
- The rules also establish a Code of Ethics for digital news media and OTT platforms. For publishers of news and current affairs, the following existing codes will apply: (i) norms of journalistic conduct formulated by the Press Council of India, and (ii) programme code under the Cable Television Networks Regulation Act, 1995.
- For online publishers of curated content, the Rules prescribe a code of ethics. This code requires publishers to: (i) classify content into specified age-appropriate categories, restrict children's access to age-inappropriate content, and implement an age verification mechanism, (ii) exercise due discretion when featuring content that affects the sovereignty and integrity of India or national security, or that is likely to disturb public order, (iii) take into consideration India's multiple races and religions before featuring their beliefs and practices, and (iv) make content more accessible to disabled persons.
- For OTT service providers such as YouTube, Netflix, etc., the government has prescribed self-classification of content into five categories based on age suitability which includes “U”, “U/A 7+”, “U/A 13+”, “U/A 16+” and “A”.
Why in News?
- The Supreme Court is set to hear petitions related to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules, 2021).
- The matters include the transfer petition filed by the Central Government seeking to transfer to the Supreme Court the petitions filed in different High Courts challenging the IT Rules.