
MEITY’s Action Against Child Sexual Abuse Material (CSAM)

In a significant move aimed at safeguarding online spaces for Indian users, the Ministry of Electronics and Information Technology (MEITY) issued notices on Friday, October 6, to three major social media intermediaries: X, YouTube, and Telegram. The notices serve as a stern warning to remove Child Sexual Abuse Material (CSAM) from their platforms operating within the Indian internet domain.

Union Minister for Electronics & IT, Rajeev Chandrasekhar, stated, “We have sent notices to X, YouTube, and Telegram to ensure there are no Child Sexual Abuse Materials on their platforms.”

The Government’s Initiative for Online Safety

The ministry has not only demanded the removal of existing CSAM but has also called upon these social media giants to proactively implement stringent measures, including content moderation algorithms and effective reporting mechanisms, to prevent the dissemination of CSAM in the future.

Notices Issued to Social Media Giants

The ministry emphasized that failure to comply with these requirements would be considered a breach of Rule 3(1)(b) and Rule 4(4) of the IT Rules, 2021. Furthermore, any delay in adhering to the notices could result in the withdrawal of the platforms’ safe harbor protection under Section 79 of the IT Act, which currently shields intermediaries from legal liability arising from content posted by users on their platforms.

Minister Chandrasekhar reiterated, “The Government is determined to establish a safe and trusted internet environment in accordance with the IT rules. These rules impose strict expectations on social media intermediaries, requiring them not to allow criminal or harmful posts on their platforms. Failure to act swiftly will result in the withdrawal of their safe harbor protection under Section 79 of the IT Act, with subsequent legal consequences under Indian law.”

Importantly, the Information Technology (IT) Act, 2000 provides the legal framework for addressing pornographic content, including CSAM. Sections 66E, 67, 67A, and 67B of the IT Act impose stringent penalties and fines for the online transmission of obscene or pornographic content; Section 67B in particular criminalizes publishing or transmitting material depicting children in sexually explicit acts.

Section 79 of the IT Act currently shields social media intermediaries from legal liability associated with the content posted by their users. However, the recent notices from MEITY signal the government’s commitment to holding these platforms accountable for the safety and security of Indian internet users, especially vulnerable groups such as children and adolescents.

FAQ: MEITY’s Action Against Social Media Giants for CSAM

Q1: What prompted MEITY to issue notices to social media intermediaries regarding CSAM?
Ans: MEITY took this action to ensure the removal of Child Sexual Abuse Material (CSAM) from social media platforms operating within the Indian internet domain and to promote online safety.

Q2: What measures has MEITY called for from these social media giants?
Ans: MEITY has urged these platforms to implement stringent measures, including content moderation algorithms and effective reporting mechanisms, to prevent the dissemination of CSAM.

Q3: What consequences do social media platforms face if they fail to comply with MEITY’s notices?
Ans: Non-compliance with the notices will be considered a breach of IT Rules, 2021, and could result in the withdrawal of their safe harbor protection under Section 79 of the IT Act, exposing them to legal liabilities.

Q4: How does the IT Act address the issue of pornographic content, including CSAM?
Ans: The Information Technology (IT) Act, 2000, contains sections (66E, 67, 67A, and 67B) that impose strict penalties and fines for the online transmission of obscene or pornographic content.

Q5: What is the significance of Section 79 of the IT Act for social media intermediaries?
Ans: Section 79 currently shields social media intermediaries from legal liability related to user-generated content on their platforms. However, the recent notices indicate the government’s intent to ensure accountability and safety online.