India Cuts Deepfake Takedown Time to Three Hours Under New IT Rules

India notifies IT Rules Amendment 2026, cutting deepfake takedown timelines to three hours and mandating labelling, traceability and stricter platform accountability.

In a significant regulatory shift aimed at curbing the rapid spread of deepfakes and synthetic deception, the Government of India has notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026, introducing a sharply reduced three-hour compliance window for the removal of unlawful content upon official notice. The amendments, issued by the Ministry of Electronics and Information Technology, come into force on February 20, 2026, and are designed to respond to the accelerating misuse of AI-generated content across digital platforms.

For the first time, the rules formally define “synthetically generated information” as audio, visual or audio-visual content that is artificially or algorithmically created, modified or altered in a way that appears real and is likely to be perceived as indistinguishable from a real person or event. This includes deepfake videos, AI voice cloning, altered images and fabricated recordings capable of misleading users. At the same time, the framework draws a clear boundary: routine edits such as brightness correction, compression, transcription, subtitling or accessibility features do not qualify as synthetic information, provided they do not materially distort the original content.

The most consequential operational change appears under Rule 3(1)(d). When an intermediary receives “actual knowledge” of unlawful content through a court order or a reasoned intimation from an authorised government officer, it must remove or disable access to the specified content within three hours. This replaces the earlier 36-hour timeline and significantly tightens compliance expectations. The amendment clarifies that such intimation must come from an officer authorised in writing, and in cases involving police administration, officers not below the rank of Deputy Inspector General may be designated to issue notices.

Grievance redressal timelines have also been compressed. General grievances must now be resolved within seven days, down from 15. Certain urgent takedown grievances must be addressed within 36 hours. Complaints relating to nudity, sexual content, morphed imagery or impersonation must be acted upon within two hours, a move aimed particularly at tackling non-consensual intimate imagery and AI-morphed content where harm can escalate quickly.
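Purely as an illustration, the tiered deadlines above amount to a simple lookup from grievance category to resolution window. The category names and structure below are hypothetical labels for this sketch, not terms taken from the rules:

```python
from datetime import datetime, timedelta

# Illustrative mapping of grievance categories to resolution windows
# under the amended rules; the category keys are hypothetical labels.
RESOLUTION_WINDOWS = {
    "general": timedelta(days=7),             # down from 15 days
    "urgent_takedown": timedelta(hours=36),   # certain urgent grievances
    "intimate_or_impersonation": timedelta(hours=2),  # NCII, morphed imagery
}

def resolution_deadline(category: str, received_at: datetime) -> datetime:
    """Return the latest time by which a grievance must be resolved."""
    return received_at + RESOLUTION_WINDOWS[category]

# A complaint about morphed imagery received at 9:00 must be acted on by 11:00.
received = datetime(2026, 2, 20, 9, 0)
print(resolution_deadline("intimate_or_impersonation", received))
# → 2026-02-20 11:00:00
```

The two-hour window for intimate-imagery complaints is the binding constraint in practice, since it leaves little room for manual review queues.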

A new Rule 3(3) introduces specific due diligence obligations for platforms that enable or facilitate synthetic media creation. Intermediaries offering AI video generators, voice cloning tools or similar technologies must deploy reasonable technical measures to prevent users from generating unlawful synthetic content. Prohibited categories include child sexual abuse material, non-consensual intimate imagery, forged electronic records, content facilitating explosives or arms preparation, and deceptive impersonation or fabricated events presented as real.

The rules also mandate transparency. Permitted synthetic content must be clearly and prominently labelled. Visual content must carry visible disclosures, audio must include a prefixed announcement, and permanent metadata or provenance markers with unique identifiers must be embedded. Platforms must not enable removal or suppression of such labels.

Significant Social Media Intermediaries face additional obligations under a new Rule 4(1A). Before allowing publication, these platforms must obtain user declarations on whether content is synthetically generated, deploy technical measures to verify those declarations and prominently label confirmed synthetic content. If a platform knowingly permits or promotes unlawful synthetic content, it may be deemed to have failed due diligence under the IT Rules.
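The declaration-then-verification workflow for large platforms can be sketched as a small decision table. This is an assumption-laden illustration: the rules require declarations, verification and labelling, but do not prescribe this exact logic, and the function and variable names are invented:

```python
def moderate_upload(declared_synthetic: bool, detector_flags_synthetic: bool) -> str:
    """Illustrative pre-publication check for a large platform.

    Hypothetical decision table: label content whenever either the user's
    declaration or the platform's technical verification says it is
    synthetic, so an undeclared deepfake still gets labelled.
    """
    if declared_synthetic or detector_flags_synthetic:
        return "publish_with_label"
    return "publish_unlabelled"

# An honest declaration and a detector catch both lead to a labelled post.
print(moderate_upload(declared_synthetic=True, detector_flags_synthetic=False))
# → publish_with_label
```

Taking the union of the declaration and the detector result reflects the rule's intent: a false user declaration does not excuse the platform if its own verification flags the content.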

The amendments also clarify that removal or disabling of content in compliance with the rules, including through automated tools, does not by itself jeopardise intermediary protection under Section 79 of the IT Act. The policy direction is explicit: speed, traceability and accountability are now central pillars of India’s digital governance framework. As synthetic media continues to blur the line between fact and fabrication, the compliance burden on platforms has moved from reactive moderation to proactive, technology-backed oversight.