How can we ensure that consent is truly informed and freely given in situations marked by power imbalances, cultural obstacles, and emotional pressures? Scholars Georgios Glouftsios, Stefania Milan, and Gianclaudio Malgieri addressed this question within the framework of “Data Dilemmas 2021”.
Biometrics imply that the knowledge of the human body reveals something about the human self. These technologies do not just seek to establish the identity of an individual, but they also provide authorities with access to digital files that contain various administrative information about them. Such information can reveal, for example, when and where a migrant applied for asylum, when and where he or she applied for a visa, and so on. Georgios Glouftsios’ work as a postdoctoral researcher at the School of International Studies of the University of Trento has led him to the conclusion that data extracted from bodies and their links with such administrative information are used to make migrants “controllable subjects”.
Referring to Simone Browne’s book (Dark Matters: On the Surveillance of Blackness), Glouftsios explains that this kind of obsession with controlling bodies has a long history that can be traced back to the transatlantic slave trade. As Browne shows, the branding of slaves with hot irons functioned as an early biometric technology: it was a practice through which enslaved people were marked as commodities to be bought, sold, and traded. The brand, in that context, denoted the relation between the black body and the “owner”. Slave branding was a rationalising act: by turning black bodies into commodities, it enabled their dehumanisation and fitted them into a system of exploitation for profit.
Today, biometric identification technologies used for border and migration management are of course something else. However, Glouftsios believes that the operational logic is quite similar: non-Europeans are othered as digital codes give them new meaning and identity. Codes are used to categorise them as “migrants” and “asylum seekers”; they are treated as inherently suspect subjects, not on the basis of past conduct, but of their nationality and non-white, non-European background.
In fact, the countries whose citizens are required to apply for a visa before travelling to the EU, and thus get registered in the Visa Information System, are mainly in the global south. As part of the visa application process, these people are required to provide bodily evidence in the form of biometric data, which is registered in pan-European databases used by state authorities to control their movements. In the context of migration management, such control of movements takes the form of traceability (the state’s ability to follow people bureaucratically, first through identification and then through the digital traces they leave) and containment (with the purpose of slowing down, intercepting, and redirecting migrants).
There are several biometric databases in Europe. EURODAC (European Dactyloscopy) registers the fingerprints of asylum seekers – but also of people caught crossing borders irregularly – and helps determine which member state is responsible for examining an asylum application. The Visa Information System (VIS) enables the consulates of EU member states in countries whose citizens need visas to create digital files for people applying before they travel to the EU. These files are then consulted to assess whether applicants intend to remain in the member states after the expiry of their visas, to evaluate the security risks they may pose, and to check the validity and authenticity of visas at border crossing points. The forthcoming Entry/Exit System will be able to calculate the duration of the authorised stay of all migrants in the EU, and to detect those who no longer have the right to stay.
Other biometric databases are used by law enforcement agencies and are not strictly related to border security and migration management. One of them is the Schengen Information System (SIS II), which collects alerts related to individuals convicted of a criminal offence in one or more member states, as well as to suspects for whom there are grounds to believe that they have been, or will be, involved in criminal or terrorism-related activities.
These databases turn people into actual “objects of control”. But many migrants, rightly, rebel and refuse to provide biometric data, sometimes enacting forms of dissent that include self-directed acts of violence, such as burning their own fingerprints. In these cases, consent is of course difficult to obtain, or it is forced through the use of “best practices”. One of these is counselling, aimed at persuading migrants to give their fingerprints voluntarily. If counselling fails, however, forms of coercion are sometimes employed. Another violent “solution” has been the detention of migrants until their fingerprints heal, as documented by Amnesty International.
Another issue related to the collection of migrants’ data is highlighted by Stefania Milan, Professor of New Media and Digital Culture at the University of Amsterdam and principal investigator of the DATACTIVE project. During the Covid-19 crisis, migrants tended not to report to hospitals, thus becoming invisible to society and to the healthcare system. Some of those infected did not get tested or seek help, and so never appeared in the official case counts. These gaps fed racist narratives claiming that people of African descent “are immune to the virus”. As quantification becomes ever more important, data and numbers become central to public narratives. The pandemic has made clear how much data power and power imbalances matter in data production. But the problem is also data poverty, which leads to a condition of invisibility for vulnerable people.
Alternative forms of consent
The GDPR defines vulnerable people as those who are not legally competent to give consent to the use of their personal data, but it explicitly refers only to children. However, it introduced the idea that vulnerable people can also be those who might suffer adverse consequences if their personal data were to become public. According to Gianclaudio Malgieri, Professor of Law and Technology at the EDHEC Business School in Lille, there are two kinds of vulnerability. The first is vulnerability in relation to data processing itself, and concerns people who are data illiterate or have limited cognitive capabilities. The second is vulnerability to the effects of data processing, which can consist of discrimination, manipulation, stigmatisation, or limitations of basic freedoms. Vulnerability is an ambiguous concept that carries a high risk of stigmatisation: if people are already part of a minority or in a disadvantaged position, labelling them “vulnerable” can stigmatise them even further.
In general, vulnerable people usually have a stronger need to hand over their data (to hospitals, reception centres, etc.), but at the same time the risk of manipulation or discrimination is higher for them. If vulnerabilities involve data illiteracy or a restricted freedom to say “no”, then the solution is to move away from traditional forms of consent. The consent of incapacitated people can be obtained in non-conventional ways, for example with the help of a psychologist or a mediator (although, as we have seen, this can sometimes become a form of coercion), but also by asking for it at different points in time, or not only in writing. Another solution is to avoid consent as a legal basis in the first place and to rely on legitimate interest or public interest instead. This requires, however, a concrete balancing of interests with the data subject.
Another creative solution, proposed by the EDPS (European Data Protection Supervisor) in 2019, is to make sure that people’s real privacy expectations are known, so that consent becomes just a safeguard for understanding their views. This, however, only applies to subjects who are vulnerable in understanding data collection processes, not to those vulnerable to the effects of data processing (discrimination, manipulation, control, etc.), such as migrants. In these latter cases, consent remains key.
According to Milan’s research, vulnerable people also include those whose work potentially exposes them to threats of various kinds (activists, for example), or even those who are not vulnerable today but might become so in the future. The goal of her DATACTIVE project is to treat them as “skilled learners”, people who know a great deal about given practices, instead of as mere “data sources”. Such a change in the relationship with sources can help address the problem of power imbalances.