Regulating Artificial Intelligence-generated content: Proposed amendments to the Intermediary Rules, 2021
On October 22, 2025, the Ministry of Electronics and Information Technology (“MeitY”) published, for stakeholder comments, proposed amendments (“Proposed Amendments”) to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“Intermediary Rules”), issued under the Information Technology Act, 2000 (“IT Act”). Comments on the Proposed Amendments can be submitted until November 6, 2025.
The Proposed Amendments aim to reduce the harm that may be caused by the use of generative Artificial Intelligence (“AI”) tools and the resulting proliferation of Synthetically Generated Information (“SGI”). They seek to regulate synthetically generated content, such as deepfakes and content created or modified using generative AI technology, and are intended to bolster the due diligence obligations of intermediaries, including Significant Social Media Intermediaries (“SSMIs”).
Key aspects of the Proposed Amendments
Some of the key provisions of the Proposed Amendments are as follows:
- Definition of SGI – The Proposed Amendments define SGI as “any information which is artificially or algorithmically created, generated, modified or altered using a computer resource, in a manner that such information reasonably appears to be authentic or true”. The proposed definition appears to be wide enough to cover all forms of content created, modified or altered using any AI technology, including generative AI, which is commonly used to create deepfakes.
- Clarification on safe harbour under the IT Act – The Proposed Amendments clarify that intermediaries will not lose the safe harbour granted to them under Section 79(2) of the IT Act, as long as they remove or disable access to SGI on the basis of reasonable efforts or grievances reported by users.
- Due diligence requirements for intermediaries – The Proposed Amendments require intermediaries offering a computer resource that enables, facilitates or permits the generation, creation, modification or alteration of SGI, such as generative AI outputs, to label or embed SGI with permanent unique metadata or identifiers. The metadata or identifiers must be appropriately displayed or made audible within the SGI, and the intermediaries must not, in any case, allow any modification, suppression or removal of such label, metadata or identifier. The following specifications have been set out with regard to the metadata/identifier to be displayed in relation to SGI:
- in the case of visual content, the metadata/identifier must cover at least 10% of the surface area of the visual display; or
- in the case of audio content, the metadata/identifier must be audible during the initial 10% of its duration.
These labels, metadata and identifiers are intended to allow users to immediately identify information as SGI that has been created, generated, modified or altered using the computer resource of the intermediary. A simplified, purely illustrative sketch of how such a label and identifier might be applied in practice is set out below.
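By way of illustration only, the snippet below sketches one hypothetical way a provider might size a visible label to cover at least 10% of an image's surface area and attach a unique identifier as embedded metadata. It assumes the Python imaging library Pillow and a PNG text chunk purely for demonstration; the Proposed Amendments do not prescribe any particular technology, format or placement, and a plain metadata chunk would not by itself satisfy the requirement that the identifier cannot be modified, suppressed or removed.

```python
# Illustrative sketch only: a hypothetical way to size a visible SGI label to
# cover at least 10% of an image's surface area and embed a unique identifier
# as PNG metadata. The Proposed Amendments do not prescribe any specific
# format, library or placement; all names here are assumptions.
from PIL import Image, ImageDraw
from PIL.PngImagePlugin import PngInfo
import uuid

MIN_COVERAGE = 0.10  # at least 10% of the visual display's surface area


def label_synthetic_image(in_path: str, out_path: str) -> None:
    img = Image.open(in_path).convert("RGB")
    width, height = img.size

    # Size a full-width banner so that banner area >= 10% of total image area.
    banner_height = max(1, int(MIN_COVERAGE * height + 0.999))
    assert width * banner_height >= MIN_COVERAGE * width * height

    draw = ImageDraw.Draw(img)
    draw.rectangle([(0, height - banner_height), (width, height)], fill="black")
    draw.text((10, height - banner_height + 5),
              "Synthetically Generated Information (SGI)", fill="white")

    # Embed a unique identifier as metadata (here, a simple PNG text chunk;
    # a tamper-resistant scheme would be needed for true permanence).
    metadata = PngInfo()
    metadata.add_text("sgi_identifier", str(uuid.uuid4()))
    metadata.add_text("sgi_label", "synthetically-generated")
    img.save(out_path, format="PNG", pnginfo=metadata)
```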
- Additional due diligence for SSMIs – The Proposed Amendments set out the following additional requirements for SSMIs that enable the displaying, uploading or publishing of any information on their computer resource:
- obtaining a declaration from users on whether such information is SGI;
- deploying reasonable and appropriate technical measures, including automated tools or other suitable mechanisms, to verify the accuracy of such user declarations; and
- where the user declaration or such technical measures confirm that any information is SGI, ensuring that it is clearly and prominently displayed with a label or notice stating that such content is SGI.
An SSMI will be deemed to have failed to exercise due diligence where it knowingly permits or promotes SGI, or fails to comply with the above requirements relating to user declarations of SGI. SSMIs will accordingly be accountable for deploying reasonable and proportionate technical measures to verify the correctness of user declarations and for ensuring that no SGI is published without such a declaration or label. A simplified, purely illustrative sketch of such a workflow is set out below.
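Again purely by way of illustration, the sketch below shows a hypothetical publishing gate on an SSMI's platform that combines the user's SGI declaration with an automated check and labels confirmed SGI before it is displayed. The function names and the detector stub are assumptions for demonstration; the Proposed Amendments do not mandate any specific technical mechanism.

```python
# Illustrative sketch only: a hypothetical SSMI-side publishing gate that
# combines the user's SGI declaration with an automated check and labels
# confirmed SGI before publication. The detector and labelling functions are
# stubs; no specific mechanism is prescribed by the Proposed Amendments.
from dataclasses import dataclass


@dataclass
class Upload:
    content: bytes
    user_declared_sgi: bool  # declaration collected from the user at upload time


def automated_sgi_check(content: bytes) -> bool:
    """Placeholder for a 'reasonable and appropriate technical measure'
    (for example, a synthetic-media detector). Always returns False here."""
    return False


def apply_sgi_label(content: bytes) -> bytes:
    """Placeholder for clearly and prominently labelling content as SGI."""
    return b"[SGI-LABELLED]" + content


def publish(upload: Upload) -> bytes:
    is_sgi = upload.user_declared_sgi or automated_sgi_check(upload.content)
    if is_sgi:
        # Confirmed SGI must carry a clear and prominent label or notice.
        return apply_sgi_label(upload.content)
    # Not identified as SGI: publish as-is.
    return upload.content
```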
Conclusion
The Proposed Amendments introduce clear accountability for intermediaries and SSMIs in managing SGI through labelling, metadata traceability and other transparency measures. They preserve safe harbour for intermediaries acting in good faith, while requiring SSMIs to obtain user declarations of SGI and to label synthetic content using reasonable verification mechanisms. Overall, the changes set out in the Proposed Amendments aim to help users identify authentic information, foster trust, and advance India’s vision of an open, safe and accountable internet that balances innovation with user rights.
Whilst MeitY has touched upon similar topics in the past, including through the advisory issued on March 15, 2024, in relation to the use of AI models/LLMs/generative AI, software(s) or algorithm(s), the Proposed Amendments appear to be an attempt by the Government of India to address deepfakes through legislation.
This Prism is prepared by:
- Sajai Singh
- Sankalp Inuganti
- Saurav Kumar
For more details, please contact [email protected].