
Facial Recognition Laws in India

By Neha Patil – Intern at Netlawgic Legal


In the post-COVID era, the use of facial recognition technology has surged. Facial recognition technology has delivered significant gains in criminal tracing and crime prevention for law enforcement organisations, but it also poses serious risks of privacy violations and data misuse.

Is facial recognition data identified as “sensitive personal data”?

We must first establish that facial recognition data qualifies as ‘sensitive personal data’ under the Indian Information Technology Act, 2000 in order to seek protection under the various legislations and provisions discussed below.

Under Rule 3 of the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, personal data is defined as any information about a natural person that may be used, directly or indirectly, to identify that person, and biometric information is expressly listed as sensitive personal data. Facial recognition data can therefore be classified as sensitive personal data under this definition.

Fundamental Rights

The nine-judge bench in the landmark case of Justice KS Puttaswamy v Union of India (2017) held that Article 21 protects the ‘Right to Privacy’ (read alongside Articles 14 and 19).

It was held that the ‘Right to Privacy’ is not only an integral part of the right to life and personal liberty under Article 21 but is also a part of the freedoms protected by Part III of the Constitution. The court observed that informational privacy is a facet of the Right to Privacy, and urged the government to create regulations that strike a balance between the privacy rights of citizens and the security needs of the state.

The Indian Information Technology Act, 2000

 Section 43A – Compensation

This section holds bodies corporate accountable for negligence in collecting and handling sensitive personal data. It makes such entities liable to pay compensation if they fail to implement and maintain reasonable security practices and procedures. A ‘body corporate’ under this section means any company, and includes firms and other associations engaged in commercial or professional activities. Such provisions would apply, for example, to service providers like Quora and Google that collect user information.

“Reasonable security practices and procedures” refers to measures designed to protect sensitive data from unauthorised access, damage, use, modification, or disclosure. These practices may be specified in an agreement between the parties, in a law for the time being in force, or in procedures prescribed by the government.

As a result, if a corporation fails to follow these standards and your facial recognition data is exposed, it may be held liable for damages, with the amount of compensation determined by the adjudicating authority.

Section 72A – Imprisonment or/and Fine

Disclosure of “personal information” without the person’s consent can result in imprisonment for up to three years, a fine of up to five lakh rupees, or both.

As a result, if your facial recognition data is disclosed without your permission, the persons responsible may face a fine, imprisonment, or both.

Consumer Protection Act of 2019

The Consumer Protection Act, 2019 provides redress to anyone who has suffered harm as a result of using a product. Its provisions can also be invoked when service providers disclose sensitive or personal information that users supplied in order to obtain services.

Individuals whose personal information has been misused are entitled to remedies under Section 2(47)(ix) of the Consumer Protection Act, 2019. If such personal data is disclosed, the manufacturer can be held liable under the product liability provisions and ordered to pay compensation as determined by the consumer forum. The 2019 Act also recognises “mental agony and emotional distress” as categories of harm that the use of goods can cause.

As a result, if a service provider leaks private information (such as your facial recognition data) and the disclosure causes physical, mental, or emotional harm, consumers can seek protection and compensation under this Act.


Judicial precedents

In Monroy v. Shutterfly, Inc. (2017), the court construed biometric data broadly, covering everything not explicitly excluded from the statute’s scope, including face mapping derived from photographs as well as image-based fingerprint and retina capture. Citizens who want to defend their ‘Right to Privacy’ against facial recognition technologies now have more options.[1]

In In re Facebook Biometric Information Privacy Litigation (2019), the court found that Facebook’s use of facial recognition software to suggest tags on images violated the Biometric Information Privacy Act, since Facebook had not obtained users’ consent before scanning their faces. The users’ privacy was therefore violated.[2]

The French Administrative Court of Marseille ordered the withdrawal of biometric facial recognition technology (FRT) from two high schools. The case is notable as the first application of the General Data Protection Regulation (GDPR) to AI biometric technology in France, and it clarified when facial recognition is necessary, and when it is unnecessary and improper, in a democratic society where basic rights are respected.[3]

However, in Rivera v. Google Inc. (2019)[4], the court held that Google Photos did not breach the BIPA rules when it scanned photos to create face templates. The plaintiffs had not suffered any “financial, bodily, or emotional injury apart from feeling offended”, so the court refused to grant them a remedy. It further held that the users’ right to privacy was not violated because the only parties with access to the data were Google and the user.

Facial recognition in law enforcement

The National Crime Records Bureau (NCRB) released a public Request for Proposal (RFP) in 2019 to create a countrywide Automated Facial Recognition System (AFRS). In response to a legal notice demanding the recall and cancellation of this RFP, the NCRB recalled it but did not cancel it; the RFP was simply replaced with a revised version in June 2020. The arbitrariness with which this project is being pursued has been one of the most concerning features of both RFPs. There is currently no comprehensive legislation in India that authorises, regulates, or determines the evidentiary value of automated facial recognition technologies (AFRTs) in domestic law enforcement and the criminal justice system.

The use of AFRTs in Indian law enforcement raises constitutional and legal concerns. The principles for responsible AI published by NITI Aayog in 2021, which express the government’s commitment to safe and ethical AI use, explicitly endorse the idea of “constitutional morality”.[5] This requires that any deployment of AI in India must, among other things, protect people’s constitutional rights and freedoms.[6]

Incidents of abuse:

1. S. Q. Masood, a social activist from Hyderabad, filed a PIL claiming that the deployment of FRT is not backed by law, is unnecessary and disproportionate, and is being carried out without any safeguards in place to prevent misuse.


In May 2021, while heading home from work, Mr. Masood was stopped by 8-10 police officers in Hyderabad and told to remove his mask, despite the fact that the pandemic was still ongoing and Hyderabad was reporting a substantial number of cases. He was asked to remove the mask because the officers intended to photograph him, and when he refused, they took his picture anyway.

The implementation of facial recognition technology has been challenged by the Petitioner on the following grounds:

  • According to the PIL, the Supreme Court held in K.S. Puttaswamy v. Union of India & Ors (2017) 10 SCC 1 that the government can restrict the right to privacy only when the restriction is backed by law. No law in Telangana authorises the administration to employ FRT; its use therefore violates the right to privacy.
  • There are no procedural safeguards in place to guarantee that the power to deploy FRT is exercised fairly, reasonably, and justly. In R (Bridges) v South Wales Police[7], the Court of Appeal of England held the police’s deployment of FRT unlawful because decisions as to who should be targeted and where FRT should be used were left to the discretion of individual officers (see from paragraph 90).
  • The deployment of FRT in Telangana is not targeted, specific, or narrow; rather, it is used for mass surveillance and violates the fundamental rights of the residents of the State of Telangana.

After hearing Mr. Manoj Reddy, who represented Mr. Masood in court, the bench issued a notice to the Telangana State Government.[8]

2. In a major privacy breach involving a facial recognition app used by the Tamil Nadu police at its Madurai city branch, the names and images of thousands of “suspected criminals” were exposed to the public on the internet. Cyber security researchers discovered and reported the leak.


Copseye, developed by Madurai-based startup Geomeo Informatics, allows police to photograph anyone suspected of illegal activities.

The photographs are then sent to the police department’s centralised criminal database, where they are checked against prior criminal records. A match allows the police to investigate the ‘suspects’.

Personal Data Protection Bill 2019

Following the Supreme Court’s judgment in Justice KS Puttaswamy v Union of India, the government established a committee led by Justice BN Srikrishna to draft legislation protecting individuals’ data privacy. The resulting Personal Data Protection Bill, 2019 is the culmination of that work and has yet to be passed by Parliament.

The bill aims to regulate personal data processing not only by the government but also by private businesses (for example, companies, corporations, and firms). It is based on the General Data Protection Regulations of the European Union.

Section 3(8) of the bill defines biometric data, and the definition expressly includes facial images. Section 3(35) classifies biometric data as sensitive personal data. The bill also bars a data fiduciary from processing such biometric data as is notified by the Central Government, unless the processing is permitted by law.

Position in other Countries


The Body Camera Accountability Act currently prohibits California law enforcement from using facial and other biometric surveillance technology in officer-worn body cameras against the public in the state.[9]

European Union General Data Protection Regulation

The processing of biometric data, such as fingerprints, retinal scans, and facial recognition data, for the purpose of uniquely identifying a person is prohibited under Article 9 unless one of the enumerated exceptions, such as explicit consent, applies.[10]

It also guarantees citizens the right to be informed about the data-gathering procedure and the data controller’s contact information, along with the rights to rectification, restriction of processing, data portability, protection against unauthorised data acquisition, and erasure (the right to be forgotten).

Biometric Privacy Act in the United States of America

Several states in the United States have enacted laws regulating the collection of biometric information. Illinois, for example, passed the Biometric Information Privacy Act in 2008, a set of regulations that make it illegal for private entities to acquire, use, or store biometric data without the user’s consent. Texas has likewise created and codified regulations governing the acquisition of biometric data, and Washington has signed into law House Bill 1493, which establishes rules for firms that collect and analyse biometric data for commercial purposes.


Facial recognition technology has been responsible for false arrests, intrusive surveillance, and protest crackdowns around the world. Its use is now banned in 13 jurisdictions across the United States, including San Francisco and Boston, and European regulators are likewise reconsidering the widespread use of facial recognition systems in public settings.

Indians deserve the same care and consideration before being subjected to a seriously flawed technology with no protections in place. Until there has been public consultation, legislation putting safeguards in place, and proper proof that facial recognition genuinely works, the government should withdraw its bids and focus on passing the privacy law it promised Indians in the first place.


  1. K. (2022, January 3). Telangana HC issues notice in challenge to FRT. Internet Freedom Foundation. Retrieved January 6, 2022.
  2. Chandrashekhar, A. (2019, September 6). Facial recognition app used by Madurai Police left data of individuals unsecured. The Economic Times. Retrieved January 6, 2022.
  3. Part2-Responsible-AI-12082021.pdf
  4. DGN Paper 16.cdr

[1] Monroy v. Shutterfly, Inc.

[5] Pratap Bhanu Mehta, What is constitutional morality

[6] Facial recognition in law enforcement is the litmus test for India’s commitment to “Responsible AI for All” | ORF

[8] Petition in public interest by Mr. Masood against the illegal deployment of FRT in the State of Telangana

[9] The Body Camera Accountability Act (AB 1215) | ACLU of Northern CA

[10] Art. 9 GDPR – Processing of special categories of personal data – General Data Protection Regulation (GDPR)
