The DNA bill will cement a disturbing link between tech and policing
It risks entrenching systemic problems of unequal access to justice and socio-economic status, leading to the persecution of a disproportionate number of disadvantaged people.
Recently, the parliamentary committee on science and technology submitted its report on the DNA Technology (Use and Application) Regulation bill with a set of recommendations.
The purpose of the bill is to regulate the use of DNA information for establishing the identity of people. Indices are to be created and maintained by national and regional “DNA banks”, covering criminals, undertrials, and missing and deceased persons. The bill allows various operations to be performed on such genetic material. A common application of DNA technology is to create profiles of people from nails, hair, swabs and so on. Bodily types will be compared, categorised, homogenised and excluded, along with the inferences drawn from these divisions.
These profiles are then meant to guide law enforcement in investigations. Although the technology has been used (without proper regulation) under criminal procedure codes, the bill will institutionalise its use within the justice system with the maintenance of databases.
Experts believe that the bill leaves ample room for misuse and that its consent provisions are weak. A more fundamental concern is that DNA technology for identification derives from antiquated and discredited methods. Scientists confirm that much of DNA analysis involving statistical modelling algorithms embeds the judgments of the people who create these tools. This means that collected DNA samples are used to statistically construct composites of racial, ethnic and other “types” of people. These methods, in their composition of types, in the inferences drawn, and in the mathematical fact of computing averages to arrive at estimates of types, can give a scientific varnish to existing social and cultural bias.
Every new advance in technology does not automatically ensure justice delivery, especially when our criminal justice system is one of the main instruments of State repression. We have seen activists, students and journalists face police excesses. Where CCTV evidence contradicts official accounts, it has been ignored, or destroyed outright in cases where it might implicate the police. Imagine the same spaces being classified under the bill as “crime scenes” and the DNA of persons from these sites included in indices maintained by the State. This is what the bill will facilitate as standard procedure.
Another question is that of openness. Denial of access to DNA laboratory records is already affecting the ability of individuals to defend themselves, as highlighted by the work of Project 39A. With a new system indexing the DNA profiles of undertrials, criminals, and missing and deceased persons, it becomes all the more important to consider the openness of the algorithmic techniques these methods use. With no shield in the form of data protection and privacy laws, and no cross-dialogue with anti-discrimination laws like the SC/ST (Prevention of Atrocities) Act, we are potentially moving towards automating, invisibilising and legitimising biases that already exist in society, all in the name of technology.
The DNA profiling bill follows a long list of bills introduced without a data protection law in place. In any case, what protections would a data protection bill offer? Possibly not many, since government use of data for law enforcement is already grounds for wide exemption in the last known draft of the data protection bill. A data protection bill would therefore not allay concerns about biological surveillance and an imminent algorithmic turn in criminology.
The DNA bill is not just about regulating a scientific method. It cements a relationship between technology and policing in a direction that privileges discrimination. It does not consider the lessons of the last decade in how automated classification systems sustain caste, class and ethnic anxieties, and will substitute the complicated navigation of personal identity with genetic determinism.
Nayantara Ranganathan is a researcher and lawyer interested in the politics and culture of technologies
The views expressed are personal