Summary: ‘Working Paper On Generative AI And Copyright – Part I: One Nation One License One Payment’ (Department For Promotion Of Industry And Internal Trade)

A DPIIT-appointed committee submitted its Working Paper on Generative AI and Copyright – Part I in December 2025, investigating the legal and policy implications of using copyrighted works as training data for generative artificial intelligence (GenAI). It examines India’s existing Copyright Act, 1957, highlighting the absence of a text-and-data mining (TDM) exception under Indian law. The Paper further explores global regulatory approaches and synthesises stakeholder consultations to assess competing models for access to training data and remuneration of right-holders. It is limited to input-side concerns relating to the use of copyrighted content in AI training; Part II will address authorship and protection of AI-generated outputs.

The Paper identifies a fundamental conflict between the scale and diversity of data required to develop GenAI systems and the risk that unlicensed ingestion of copyrighted works could undermine the sustainability of creative industries. It raises two questions central to the issue: (1) whether training AI systems involves the exercise of any of the exclusive rights granted to copyright holders under Section 14 of the Copyright Act, 1957; and (2) whether the “fair dealing” exception under Section 52(1)(a) of the Act can be interpreted to cover the training of AI systems.

The Paper further pinpoints the sharply divergent views of stakeholders: (1) AI developers and technology firms advocate exceptions or simplified access regimes, while (2) authors, publishers, and other right-holders emphasise licensing mechanisms to preserve control and remuneration. The Paper undertakes a comparative review of international approaches, including blanket TDM exceptions, opt-out regimes, voluntary and collective licensing systems, and statutory compensation models, as well as ongoing litigation in various jurisdictions. It concludes that none of these approaches, adopted in isolation, adequately addresses India’s legal structure and creative ecosystem.

The Paper explains that acts intrinsic to AI training, such as reproduction, storage, adaptation, and communication of works, presently fall within the exclusive rights under the Copyright Act, 1957, with no exception permitting TDM. The committee proposes a hybrid statutory framework described as “One Nation, One License, One Payment”, under which AI developers would be granted a mandatory blanket licence to use lawfully accessed copyrighted works for training purposes without requiring individual authorisation. Creators would receive statutory remuneration, with royalties collected and distributed by a single government-designated, non-profit body comprising existing copyright societies and collective management organisations, the Copyright Remuneration Collective for AI Training (CRCAT). The framework envisages transparent royalty pooling; distribution of royalties to both members and non-members of CRCAT, subject to registration for the purpose of receiving AI-training royalties; and institutional mechanisms for rate-setting, judicial review, and dispute resolution. The Paper also acknowledges implementation challenges in the distribution of royalties, highlighting the need to ensure transparency and accountability.

Disclaimer: Views, opinions, interpretations are solely those of the author, not of the firm (ALG India Law Offices LLP) nor reflective thereof. Author submissions are not checked for plagiarism or any other aspect before being posted.

Copyright: ALG India Law Offices LLP

Summary: Draft Information Technology (Intermediary Guidelines And Digital Media Ethics Code) Amendment Rules, 2025

The Ministry of Electronics and Information Technology has released the Draft Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2025 (“Rules”), introducing a framework to identify and regulate deepfakes, misinformation, and other unlawful or deceptive material. The Rules introduce the concept of “synthetically generated information” and impose labelling, identification, consent verification, and due diligence requirements on intermediaries, particularly Significant Social Media Intermediaries (“SSMIs”). These measures aim to ensure an open, safe, trusted, and accountable Internet for citizens availing Internet services and to maintain transparency in the digital ecosystem.

According to the Rules, synthetically generated information refers to content that is artificially or algorithmically created, generated, modified, or altered through computer resources in a manner that appears reasonably authentic or true. The Rules also provide statutory protection to intermediaries acting in good faith, clarifying that the removal or disabling of access to synthetically generated content based on reasonable efforts or user grievances will not affect the safe-harbour exemption available under Section 79 of the Information Technology Act, 2000.

Additionally, the Rules introduce due diligence requirements for intermediaries that provide computer resources enabling, permitting, or facilitating the creation or modification of synthetically generated information. Such intermediaries are required to ensure that all synthetically generated content is permanently labelled or embedded with a visible or audible metadata identifier in a prominent manner. The identifier must cover at least ten percent of the surface area of any visual display or, in the case of audio content, the initial ten percent of its duration. Intermediaries are prohibited from modifying, suppressing, or removing such labels or identifiers. 
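
The ten percent thresholds described above lend themselves to a simple compliance check. The sketch below is the author’s illustration only and is not drawn from the Rules; the function names, inputs, and the assumption of a rectangular label are hypothetical, and actual compliance would also turn on how “surface area” and prominence are interpreted and enforced.

# Illustrative sketch only: checks the ten percent thresholds described above.
# Function names, parameters, and the rectangular-label assumption are
# hypothetical and are not taken from the draft Rules.

def visual_label_meets_threshold(display_width_px: int, display_height_px: int,
                                 label_width_px: int, label_height_px: int) -> bool:
    """Return True if a rectangular label covers at least 10% of the visual display."""
    display_area = display_width_px * display_height_px
    label_area = label_width_px * label_height_px
    return label_area >= 0.10 * display_area


def audio_identifier_meets_threshold(total_duration_sec: float,
                                     identifier_end_sec: float) -> bool:
    """Return True if an audible identifier, starting at the beginning of the audio,
    runs through at least the initial 10% of its total duration."""
    return identifier_end_sec >= 0.10 * total_duration_sec


if __name__ == "__main__":
    # A 1920x1080 frame with a 640x360 label covers roughly 11.1% of the display.
    print(visual_label_meets_threshold(1920, 1080, 640, 360))   # True
    # A 60-second clip whose audible identifier ends at 5 seconds covers only ~8.3%.
    print(audio_identifier_meets_threshold(60.0, 5.0))          # False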

Some additional obligations have also been prescribed for SSMIs. These platforms are required to obtain user declarations indicating whether uploaded information is synthetically generated, verify such declarations through reasonable and proportionate technical measures, and ensure that synthetically generated information is clearly and prominently displayed with a visible label or notice. SSMIs must also take reasonable measures to confirm that no synthetically generated information is published without the necessary declaration or label, and act promptly upon becoming aware that such content has been displayed or published without compliance. An intermediary that knowingly allows or fails to address the dissemination of undeclared synthetic content will be considered to have failed to exercise due diligence under the Rules.

Disclaimer: Views, opinions, interpretations are solely those of the author, not of the firm (ALG India Law Offices LLP) nor reflective thereof. Author submissions are not checked for plagiarism or any other aspect before being posted.

Copyright: ALG India Law Offices LLP

Summary: Draft Promotion And Regulation Of Online Gaming Rules, 2025 

The Draft Promotion and Regulation of Online Gaming Rules, 2025 (“Rules”) seek to implement key provisions of the Promotion and Regulation of Online Gaming Act, 2025 (“Act”) and introduce measures to promote the structured growth of legitimate e-sports and social gaming. They establish a framework for determining whether an online game qualifies as an online money game. Additionally, the Rules outline procedures for the recognition, classification, and registration of legitimate e-sports and online social games, and require the maintenance of a National Online Social Games and E-sports Registry. They also seek to ensure a transparent, digital, and accountable regulatory structure, supported by a robust grievance redressal mechanism to protect users.

A central feature of the Rules is the establishment of the Online Gaming Authority of India (“Authority”). The Authority possesses the powers of a civil court and is entrusted with key regulatory responsibilities, including assessing whether an online game constitutes an online money game, online social game, or e-sport; recognising and registering eligible games; maintaining the National Online Social Games and E-sports Registry; issuing directions and codes of practice; and imposing penalties for non-compliance.

Under the proposed Rules, online game service providers are required to apply digitally for registration of online social games or e-sports, furnishing details on user safety features, age suitability, and the non-involvement of monetary stakes. The Authority may also determine the classification of a game suo motu and prohibit its operation or advertisement if it is found non-compliant. Upon registration of an online social game or e-sport, the Authority shall issue a certificate of registration with a unique registration number to the online game service provider; the registration may be cancelled, or an inquiry initiated, by the Authority on receipt of any complaint.

Further, the Rules mandate a structured mechanism for the redressal of grievances from any user in relation to the online social games or e-sports offered by a registered service provider. Registered service providers must establish internal grievance systems, while unresolved complaints may be escalated to the Grievance Appellate Committee under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, and subsequently to the Authority, within specified timelines. The Authority may issue remedial directions, enforce compliance, and impose penalties.

Disclaimer: Views, opinions, interpretations are solely those of the author, not of the firm (ALG India Law Offices LLP) nor reflective thereof. Author submissions are not checked for plagiarism or any other aspect before being posted.

Copyright: ALG India Law Offices LLP

Summary: The Promotion And Regulation Of Online Gaming Act, 2025

On August 22, 2025, the President assented to the Promotion and Regulation of Online Gaming Act, 2025 (“Act”), marking a significant legislative milestone in India’s digital governance landscape. The Act aims to strike a balance between fostering innovation in the online gaming sector and curbing the harmful effects of online money gaming. It reflects a comprehensive and forward-looking approach to regulating one of the fastest-growing segments of the digital economy.

The Act comprises 20 Sections organized across 6 Chapters, each addressing specific aspects of regulation, enforcement, and definitions related to online gaming in India. It establishes a comprehensive national-level legal framework aimed at ensuring uniformity in the regulation and enforcement of online gaming laws across states, thereby resolving cross-border and inter-state inconsistencies. The Act applies to online gaming services offered within India, as well as those operated from outside India but accessible to users within the country.

The Act imposes a clear prohibition on “online money games” [Section 2(1)(g)], defined as any online game that involves monetary or equivalent stakes, regardless of whether it is skill-based or chance-based. In contrast, the Act encourages the development of “e-sports” [Section 2(1)(c)], which are characterized by two distinct features: (i) outcomes determined purely by skill, and (ii) the absence of any monetary or similar wagering.

The central government may either set up a new authority or assign these functions to an existing one, with powers to assess whether an online game qualifies as an online money game, and to recognise, categorise, and register online games. The government will define the authority’s structure and specify the terms and conditions for appointments.

The Act, however, sets out strict penalties for violations. Under Section 5, individuals who offer or participate in online money gaming services may face imprisonment of up to three years and/or a fine of up to INR 1 crore. Section 6 penalizes the advertisement of such services with imprisonment of up to two years and/or a fine of up to INR 50 lakh. Further, Section 7 criminalizes any financial transactions related to online money gaming services, carrying penalties of up to three years in prison and/or a fine of up to INR 1 crore.

Significantly, offences under Sections 5 and 7 are classified as cognizable and non-bailable under Section 10, granting authorities powers of arrest without warrant and limiting bail. The Act thus takes a firm stance against monetary wagering in online gaming, while supporting skill-based, non-wagering digital competition.

Disclaimer: Views, opinions, interpretations are solely those of the author, not of the firm (ALG India Law Offices LLP) nor reflective thereof. Author submissions are not checked for plagiarism or any other aspect before being posted.

Copyright: ALG India Law Offices LLP

Comment: Are Automated Copyright Takedown Mechanisms Being Used To Silence Critique?

The sheer volume of uploads and activity on social media platforms necessitates real-time action against infringers. To manage this, social media platforms employ mechanisms such as automated takedowns for copyright violations. Concerns arise that powerful entities may misuse such mechanisms to stifle critique or dissent. However, the author believes that automated copyright takedown mechanisms do not silence critique or dissent and are merely mechanisms for enforcing copyright in a rapidly moving digital space.

In digital spaces, manual enforcement cannot keep pace with the velocity of traffic. The general criticism that these takedowns target dissent or satire assumes that all instances of criticism are lawful. Exceptions such as fair use operate as a defence, a shield rather than an enforceable right. Thus, if a video or other material contains copyrighted content beyond the contours of fair use, the copyright holder has the right to seek its immediate removal. What may appear, in general understanding, to be suppression may in fact be a legitimate claim of copyright infringement. Further, these platforms provide redressal mechanisms in the form of counter-notifications in the case of invalid claims. While imperfect, these mechanisms provide the only feasible means of protecting the rights of copyright holders at such scale.

Moreover, there is a growing ecosystem of checks and balances built into these systems. Most platforms now allow counter-notices, appeals, and, in many cases, revenue-sharing arrangements where original content and derivative critique coexist. The burden of contesting a takedown is not as high as critics suggest, and small creators have had increasing success challenging wrongful takedowns. Alleging systemic misuse without engaging with the available remedies dilutes the credibility of the free-speech argument. Conflating copyright enforcement with censorship also ignores the realities faced by smaller creators and rights-holders. Creators such as designers, independent filmmakers, and musicians rely on automated takedowns to guard against large-scale, commercial misuse of their work.

While misuse cannot be ruled out entirely in any system, the digital reality demands a solution that can keep pace with its rapid evolution. Automated enforcement is thus the necessary counterpart to wide-scale infringement. Weakening these systems in the name of protecting critics would disproportionately harm the very creative economy that copyright law is meant to sustain.

Disclaimer: Views, opinions, interpretations are solely those of the author, not of the firm (ALG India Law Offices LLP) nor reflective thereof. Author submissions are not checked for plagiarism or any other aspect before being posted.

Copyright: ALG India Law Offices LLP
