Drones: It is important to address the privacy issues
The Unmanned Aerial Vehicles (UAV) Regulations, 2018, pay lip service to the privacy of citizens.
The regulatory approach to civilian drones has taken a 180-degree turn over the past five years. From a ban in 2014, the ministry of civil aviation (MCA) has led the transition to a regulatory framework that provides operational clarity. Implementation challenges notwithstanding, Digital Sky – a platform that addresses many compliance requirements through technology – is an innovative step towards tackling capacity constraints.

Yet, important privacy concerns remain unaddressed, or under-addressed at best. The Unmanned Aerial Vehicles (UAV) Regulations, 2018, pay lip service to privacy. While basic technology safeguards such as geo-fencing and detect-and-avoid systems have been identified to tackle safety concerns, equivalent privacy safeguards find no mention in the Guidance Manual for Remotely Piloted Aircraft Systems (RPAS). The guidance manual merely cautions RPA operators to ensure that privacy norms are not compromised. The public tender for Digital Sky identifies data privacy as fundamental to the platform's design, but only insofar as it does not sacrifice utility. A more recent drone ecosystem policy roadmap, released by the MCA in January 2019, references privacy-by-design but offers little guidance on operational details.
Therefore, we need to both classify the nature of the privacy risks involved and identify possible solutions. There are two broad types of risk. The first has to do with the spatially intrusive character of drone operations, especially in the civilian context. Because of their small size, relatively low-altitude operations, and potential ubiquity, there is a strong likelihood that the technology becomes a tool for both one-off snooping and mass surveillance. The Electronic Privacy Information Center in Washington D.C. has warned against the deployment of high-definition cameras, heat and motion sensors, and automated facial recognition technologies on drones.
When drone operators cross the line, the standard legal response is to empower aggrieved persons through the tort of privacy. But Indian jurisprudence is weak on this front. Therefore, the trend in some US states – North Carolina, Arkansas, and California, among others – of tailoring criminal responses to aerial harassment and voyeurism could be imported. The guiding principle should be to evaluate the mental state that accompanied the act of intrusion before punishing violators. This helps avoid excessive criminalisation.
Apart from the risks posed by private operators, drones could also become surveillance tools for State actors. Current conversations on facial recognition technology are instructive. At least three cities in the United States have recently banned such technology for law enforcement purposes. By contrast, the National Crime Records Bureau in India has sought bids to build an automated system that relies on such technology. These parallel developments underscore the need for debate on the ethical and legal ramifications of mass data gathering using drones.
The biggest gap in India in tackling this concern is, again, jurisprudential. The Supreme Court emphasises the proportionality of State action when legitimate aims such as crime prevention are pursued, but offers little clarity on what proportionality entails. Additionally, most of the court's precedents deal with situations where individuals are singled out for exceptional treatment. These differ from instances where technological systems with an already low privacy baseline affect all citizens. The court's insistence that the potential for abuse cannot vitiate State power hinders a novel exploration of the systemic erosion of rights. The opportunity to lay down new standards was sadly lost when the court practically upheld the excessive seeding and linking of Aadhaar numbers and databases last year. The onus, therefore, is on courts and policymakers to address these spatial privacy concerns and flesh out new doctrine.
The second type of risk stems from the data gathered using such technology: it can be combined with other data sets to reveal personal information. This “big data” informational privacy risk is not unique to drones, but it is amplified by the granular data that the technology can gather. Standard legal responses, most commonly the requirement of informed consent before personal data is gathered, work poorly in the face of ubiquitous data gathering, a point noted by the Srikrishna Committee that drafted the personal data protection bill currently pending before Parliament.
Therefore, privacy self-management must be customised to the context of civilian drone use. One response could be to integrate a notice dashboard within Digital Sky. Through such a dashboard, individuals could access information about the geographic locations and purposes of drone operations, the sensing and data-gathering technologies on board the unmanned system, the kinds of data potentially captured, and technical specifications relating to the granularity and accuracy of the data collected and processed. This would make it possible to evaluate whether data operations are proportionate to their stated purposes.
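To illustrate, and purely as a sketch rather than any part of the Digital Sky specification, a single notice entry on such a dashboard could be modelled along the following lines. The field names and the simple proportionality check are hypothetical assumptions introduced here for illustration only.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical data model for one public notice entry on a drone-operations
# dashboard. Field names are illustrative assumptions; they are not drawn
# from the Digital Sky platform or the UAV Regulations, 2018.
@dataclass
class DroneOperationNotice:
    operator_id: str                  # registered operator identifier
    operation_area: str               # geographic area of operation (e.g. a locality or geo-fence)
    purpose: str                      # stated purpose, e.g. "agricultural survey"
    sensors_onboard: List[str] = field(default_factory=list)   # sensing technologies on the aircraft
    data_categories: List[str] = field(default_factory=list)   # kinds of data potentially captured
    spatial_resolution_cm: float = 0.0  # granularity of imagery, in centimetres per pixel
    retention_days: int = 0             # how long the collected data is kept

    def proportionality_flags(self) -> List[str]:
        """Return simple red flags where collection appears to exceed the stated purpose."""
        flags = []
        sensors = [s.lower() for s in self.sensors_onboard]
        if "facial recognition" in sensors and "security" not in self.purpose.lower():
            flags.append("facial recognition onboard without a stated security purpose")
        if self.retention_days > 90:
            flags.append("data retained for longer than 90 days")
        return flags
```

A dashboard built on records of this kind would let an affected resident see, at a glance, what is being collected over their neighbourhood and question operations whose data gathering appears disproportionate to the declared purpose.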
The availability of new technology cannot become a pretext to subvert rights sacrosanct to our being as individuals and citizens. Privacy is one such right, a fact recognised by the Supreme Court through a unanimous nine-judge bench verdict in 2017. Therefore, a duty is cast upon the State to guarantee that technological applications like drones do not cause a widespread erosion of this right. Drone regulatory conversations must account for this possibility and fulfil the positive obligation to safeguard individual privacy, notwithstanding any utility concerns.
Ananth Padmanabhan is a fellow at the Centre for Policy Research, New Delhi
The views expressed are personal