India | Japan | Italy | Spain | France | Germany | UAE
Dear Reader,
The Legal Department at UJA is delighted to share legal insights through the Legal Chronicle, keeping readers aware of recent updates and developments across various aspects of the law. Our goal is to enable our readers to develop a sense of familiarity with the complexities of Indian as well as international law.
In this edition of Legal Chronicle, we focus on Data Breach Response in India, providing insight into the CERT-In Directions under the IT Act and the Digital Personal Data Protection (DPDP) Act, 2023. With the rising frequency of cyber incidents and data breaches, it is crucial for organizations to understand their legal obligations. While the DPDP Act establishes comprehensive personal data protection rules, its operational mechanisms for breach reporting are yet to come into effect. Until then, the CERT-In Directions remain the primary framework, mandating immediate reporting, preservation of logs and technical cooperation. This edition also highlights practical compliance steps, including incident detection, forensic investigation and preparation for future DPDP requirements, ensuring that organizations are both legally compliant today and prepared for tomorrow.
We hope that this edition engages our readers and delivers the wealth of legal knowledge it intends to share. In case you have any feedback or would like us to include any information to make this issue more informative, please feel free to write to us at legal@uja.in.
Deepfakes are AI-generated videos or images that convincingly mimic real individuals, while misinformation refers to false or misleading information that spreads intentionally or unintentionally. With the rapid advancement of AI tools and the widespread use of digital platforms, both deepfakes and misinformation are becoming increasingly prevalent, posing risks to individuals, brands and society at large. Corporations, whether as creators, hosts or passive enablers of such content, may face legal and reputational consequences if they fail to address these risks. This article will explore how Indian law and global frameworks tackle corporate liability in cases involving deepfakes and misinformation.
Deepfakes refer to AI-generated synthetic media that realistically imitate real individuals, making it increasingly difficult to distinguish between authentic and fabricated content. Misinformation denotes false or inaccurate information shared without harmful intent, whereas disinformation is deliberately created to deceive or manipulate. Recent instances, such as political deepfakes influencing public opinion and fake brand endorsements circulating online, highlight the growing misuse of such technologies. These developments pose significant risks to individuals’ reputations, corporate credibility and overall public trust in digital media, emphasizing the urgent need for stronger regulatory and ethical frameworks.
Corporations can be connected to deepfakes and misinformation both directly and indirectly. Direct involvement arises when a company’s technology, tools or employees are used to create or disseminate manipulated content. Indirect involvement occurs when corporations, particularly digital platforms or intermediaries, host, promote or fail to remove such content despite being aware of it. Additionally, many companies become victims of deepfakes or misinformation that damage their brand reputation, mislead customers or affect stakeholder trust. In all scenarios, corporate accountability and proactive governance play a crucial role in mitigating the legal, ethical and reputational risks associated with such incidents.
Information Technology Act, 2000 (IT Act)
IT Rules, 2021 (Intermediary Guidelines and Digital Media Ethics Code)
Digital Personal Data Protection (DPDP) Act, 2023
Bharatiya Nyaya Sanhita (BNS), 2023
Key Provisions under IT Rules, 2021 (Rule 3(1)(b))
User Awareness Obligations
Accountability in Content Removal
Grievance Redressal
Grievance Appellate Committees (GACs) Mechanism
Effective corporate governance and internal compliance mechanisms are critical in managing risks associated with deepfakes, misinformation and other digital threats. Companies are expected to establish clear policies, oversight structures and accountability frameworks to ensure ethical use of technology, prevent misuse and respond promptly to incidents. Internal compliance measures, including risk assessments, employee training, reporting channels and audit mechanisms, help organizations meet legal obligations under Indian laws like the IT Act, DPDP Act, POSH Act and other relevant regulations. Strong governance not only mitigates legal and reputational risks but also reinforces trust among stakeholders and promotes responsible corporate conduct.
Corporate liability in cases of deepfakes and misinformation depends on the company’s role, awareness and response to the content. Companies may be held accountable if they create, distribute or host harmful content, fail to remove it, or neglect due diligence obligations under laws such as the IT Act, the DPDP Act (once its provisions come into effect) and the CERT-In Directions. Liability may extend to directors, officers and intermediaries, emphasizing the importance of robust internal controls, monitoring mechanisms and timely compliance to mitigate legal, financial and reputational risks.
Deepfakes and misinformation present significant challenges for individuals, corporations and society at large. Ensuring corporate accountability through robust governance, internal compliance and adherence to legal frameworks both in India and globally is essential. By adopting proactive risk mitigation strategies, ethical practices and timely response mechanisms, organizations can safeguard their reputation, protect stakeholders and navigate the evolving digital landscape responsibly.
This document is intended to provide general information and is not intended to be substituted for any legal or professional advice. This document is meant exclusively for informational purposes and not for advertising or solicitation. UJA has made significant efforts to ensure that the information contained in this document is accurate and reliable. However, the information herein is provided “as is” without warranty of any kind. UJA hereby disclaims all responsibility and liability, whether stated or implied, for the accuracy, validity, adequacy, reliability or completeness of any information provided under this document. In no event shall UJA be held liable for any losses or damages whatsoever incurred as a result of using this document.
UJA’s team specializes in offering a wide range of legal solutions, ensuring comprehensive support for both businesses and individuals.
Our Comprehensive Services Include:
UJA supports businesses in navigating complex regulations, global markets and GI laws. Operating across France, Germany, Japan, Spain and more, we specialize in market entry and expansion, offering tailored solutions for growth. With over 29 years of experience and a team of 170+ experts, we have helped more than 1,000 clients, from SMEs to MNCs, achieve their goals. Headquartered in Pune, we have offices across India – Bengaluru, Gurugram and Mumbai – and international offices in Japan, Italy and France, with representation in Germany, Spain & the UAE.