Deepfake Technology in India: Legal and Policy Landscape

Author: Iklavya Dev

The Casualty Called Truth:

When we gauge the merits of a democratic society through the lens of a marketplace of ideas, where freedom of expression is held in the highest regard, technologies such as deepfakes shake the very foundation upon which this marketplace is built. Ideas and thoughts hold merit only for as long as there is consensus amongst the stakeholders of this market that an assumption of veracity and accuracy underlies the expression of those thoughts and ideas. By contrast, the dissent of Justice Oliver Wendell Holmes Jr. in Abrams v. United States, which breathed life into the ‘marketplace of ideas’ postulation, held that the best test of truth is the power of a thought to get itself accepted in the competition of the market; falsehood, like a bad product, is to be driven out by better ones rather than suppressed. In true Hamletian fashion, whether the marketplace of ideas can sustain the outrageous weight and dumping of falsehood, that is the question.

Rehearsing Reality: The Deepfake Effect

The term ‘deepfake’ is a portmanteau of ‘deep learning’ and ‘fake’. Deep learning is a form of machine learning that uses multi-layered neural networks to learn from extremely large datasets. In the case of deepfakes, deep learning typically manifests in the form of Generative Adversarial Networks (GANs), in which two competing neural networks are trained against each other: a generator that creates synthetic content, and a discriminator that tries to tell the synthetic content apart from real data. Each round of this contest forces the generator to produce ever more convincing fabrications.
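The adversarial dynamic described above can be reduced to a toy sketch. The following is a minimal, illustrative example in plain Python, not a real deepfake system: all names and numbers are hypothetical. Here the "generator" is a two-parameter linear map from noise to a single number, the "discriminator" is a single logistic unit, and "real" data clusters around the value 4.0. Actual GANs use deep networks over images or audio; only the adversarial structure carries over.

```python
import math
import random

random.seed(0)

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

# Hypothetical toy parameters: generator g(z) = a*z + b maps noise to a
# sample; discriminator d(x) = sigmoid(w*x + c) scores "how real" x looks.
a, b = 0.1, 0.0          # generator parameters
w, c = 0.1, 0.0          # discriminator parameters
lr = 0.05                # learning rate for both players

def generate(z):
    return a * z + b

def discriminate(x):
    return sigmoid(w * x + c)

for step in range(2000):
    real = 4.0 + random.gauss(0, 0.1)   # a genuine sample, near 4.0
    z = random.gauss(0, 1)
    fake = generate(z)

    # Discriminator update: push d(real) -> 1 and d(fake) -> 0.
    # Gradients below are the hand-derived derivatives of the binary
    # cross-entropy loss with respect to w and c.
    s_real, s_fake = discriminate(real), discriminate(fake)
    grad_w = (s_real - 1.0) * real + s_fake * fake
    grad_c = (s_real - 1.0) + s_fake
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator update: push d(fake) -> 1, i.e. fool the discriminator.
    s_fake = discriminate(generate(z))
    grad_a = (s_fake - 1.0) * w * z
    grad_b = (s_fake - 1.0) * w
    a -= lr * grad_a
    b -= lr * grad_b

print("generator now maps zero-noise to:", generate(0.0))
```

As training proceeds, the generator's output drifts from its starting point near 0 toward the "real" cluster around 4.0, because the only way to raise the discriminator's score on fakes is to make them resemble the real samples. This is the mechanism, writ small, by which GAN-produced faces and voices become indistinguishable from authentic ones.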

The malicious use of deepfakes is prominently visible on social media, in news headlines, and on the screens of netizens. Because generative software is so easy to access and its usage is predominantly pornographic, the impact of deepfakes is highly gendered and targeted. Digital sexual harassment of women and electoral manipulation have emerged as the most profound harms of the technology, with financial fraud also finding its space in the chaos. At the same time, practitioners and learners of medicine, history, and education enjoy genuine benefits: immersive, hyper-realistic simulations allow surgical operations to be rehearsed, historical events to be re-enacted, and audio-visual media to be experienced in novel ways. In essence, the underlying technology is neutral; any policy or statutory regulation must instead be directed at the intent, or the negligence, with which its harmful uses are allowed to become normative.

Policy Mirrors and Legal Smoke:

At the risk of messily entangling two complicated concepts, let us pair the ‘marketplace of ideas’ analogy with UCL psychoanalyst Peter Fonagy’s theory of ‘epistemic trust’. In this context, epistemic trust refers to the baseline confidence consumers place in the truthfulness of information circulating in the marketplace of ideas, which is the premise on which market forces can meaningfully operate. Deepfakes systematically erode this epistemic trust, marking a striking shift from ordinary misinformation. They corrupt the evidentiary cues that initiate the brain’s information-registration process, such as audio-visual and biometric likeness. When such psycho-biological indicators of reality can no longer be taken as accurate reflections of the truth, the erosion manifests in two directions: false negatives, where genuine evidence is dismissed as untrue, and false positives, where fabricated evidence passes as truth. This destabilises the ground on which legal responsibility and free speech stand, and it is this that mandates special legislation to address these novel concerns.

Upon examining the existing laws, it becomes apparent that primary regulation is sourced from multiple provisions of the Bharatiya Nyaya Sanhita and the Bharatiya Sakshya Adhiniyam, which address criminality and the admissibility of digital evidence respectively. The Information Technology Act deals with voyeurism, obscenity, and associated offences, which are highly prevalent in the illegal usage of deepfakes. Sections 6 and 33 of the Digital Personal Data Protection Act, 2023 bear on non-consensual deepfakes, since consent is required for the processing of the personal data that adversarial neural networks need in order to generate them. The secondary source of regulation is delegated legislation: governmental institutions such as the Election Commission of India and NITI Aayog release guidelines and policy frameworks, while the judiciary publishes white papers and shapes regulation through its judgments. While there exists a web of overlapping statutes, a clearly visible trend emerges: parliamentary codification is insufficient, and delegated legislation and judicial activism are left to play catch-up.

The Art of Faking the Truth:

Three key challenges hinder the applicability and enforcement of these laws, despite their evident breadth, which covers everything from personality rights to privacy protection, from data fiduciaries to electoral malpractice. Firstly, establishing mens rea is a formidable barrier to conviction, as accused individuals tend to defend their content on grounds of satire, parody, and similar forms of expression. In any scenario, there is scope for under-enforcement because of the high evidentiary standards needed to balance penal provisions against constitutional rights, along with the weight given to Blackstone’s ratio. Any precedent that disturbs this balance risks a massive chilling effect on digital expression, where assessing the reasonability of digital content becomes synonymous with self-censorship amongst citizens.

Secondly, even if mens rea can be proven, attribution of the offence remains a massive loophole in most cybercrimes, as modern digital architecture allows operation through private networks, pseudonymous profiles, and encrypted communication beyond the reach of investigative agencies. In the absence of technological capabilities symmetric with those of offenders, judicial relief remains etched in the sections of the criminal codes while prosecution remains aspirational. Thirdly, even where traceability ceases to be a problem, the globalised nature of social media platforms and deepfake proliferation gives the remaining limitations a jurisdictional character. International law has no harmonised standard for deepfake regulation, and mutual legal assistance and extradition arrangements require active and willing diplomatic channels. Moreover, transnational prosecution typically presupposes that the country of origin regulates synthetic content in a comparable manner, which is rarely the case. This allows deepfake creators to act with near-total impunity despite the observable and substantive harm caused to Indian citizens.

Way Forward:

If, as Shakespeare’s Antonio has it, one may ‘hold the world but as the world… a stage where every man must play a part’, deepfakes play the part of a man who might not even be playing. The technology has, of course, risen to occupy a special place in the minds of policymakers because of the competing legal interests it engages: personality rights, the right to expression, and the digital evidentiary procedures for criminal offences enshrined in existing legislation. In essence, deepfakes can reshape the public persona of an individual, diminishing the constitutional right to a life imagined and manifested on one’s own terms. Any legislation constructed to fill the gaps in, or substitute for, the existing legal frameworks must be grounded in constitutional proportionality and informed by personality-rights jurisprudence. Such a deepfake-centric statute would represent not an expansion of state control over speech, but a necessary recalibration of legal responsibility in an era where identity itself has become technologically replicable.
