Facial Recognition Software – Step Forward for Technology, Step Backwards for Privacy Rights

As technology advances, certain achievements may give rise to concerns in other areas, such as privacy and personal data protection. Although facial recognition technology has been around for a while, an increasing number of countries have begun deploying it in response to the ongoing pandemic. Its use, however, is not limited to special circumstances such as health emergencies; it extends well beyond crises, from preventing terrorist activities to tracking class attendance at school. While this technology should be welcomed as its possibilities are further explored, legislators have legitimate reason to stay alert, approach its regulation carefully, and refrain from curtailing the privacy rights of citizens.

Global Phenomenon

China was among the first to implement this kind of technology, mainly under the pretext of fighting crime and preventing terrorism. However, the vague nature of Chinese privacy and personal data protection laws allowed the technology to be used for purposes far more commercial than the noble pursuit of eradicating crime. For example, in the case of Guo Bin vs. Hangzhou Zoo, Mr. Guo was required, on the basis of the Zoo's unilateral decision, to provide a scan of his facial features in order to access its premises under his previously paid subscription. Fortunately, the Chinese Law on the Protection of Consumers' Rights and Interests requires that the collection and use of consumers' personal data be conducted in accordance with the principles of legality, justification, and necessity, and subject to the consent of the data subject, which provided grounds for Mr. Guo's lawsuit against Hangzhou Zoo. Similarly, when the Beijing Metro proposed introducing facial recognition software, a group of local jurists deemed the proposal excessive and unnecessary for its stated purpose, and the Metro postponed its application.

Sadly, the most vulnerable among us have had far less success in protecting their rights. One university introduced facial recognition software to monitor the attendance and behavior of students in classrooms. Due to the limited scope of Chinese data protection laws, this course of action did not breach any laws, as classrooms were declared a “public space”.

China’s example was soon followed by countries such as Japan, Malaysia, Bangladesh, Singapore, Moldova, Russia, Australia, and the United States, to name a few. The trend became apparent in the wake of the ongoing COVID-19 pandemic, as facial recognition is one of the means by which governments track the movement of citizens and enforce quarantine measures.

In most jurisdictions, the legality of facial recognition software cannot be questioned in the absence of broader legislative acts akin to those of the European Union. The United States Congress has considered the issue and reached a bipartisan consensus that the privacy concerns must be addressed through new policy proposals, but no such policy has been enacted at the federal level to this day. Privacy and personal data protection rights remain under the jurisdiction of individual states, and the level of protection varies from one state to the next.

Can This Happen in the European Union?

Fortunately for European citizens, the provisions of the General Data Protection Regulation (“GDPR”) offer a broader scope and more detailed protective measures than any of its global counterparts. Under Article 9 of the GDPR, biometric data (e.g., facial features) is classified as a special category of data. Lawful processing of such data requires the prior consent of the data subject, apart from a few explicitly prescribed exceptions. Unlike the Chinese university, the municipality of Skelleftea in Sweden was fined approximately EUR 20,000 by the Swedish Data Protection Authority for introducing facial recognition software to track attendance at a school. Interestingly, the municipality was not fined because consent was lacking, but because the consent obtained was considered void due to the imbalance of power between the school and the data subjects. In other words, the Swedish Data Protection Authority drew a line between “consent”, which may be given under duress or coercion and is thus void, and “meaningful consent”, which is given freely and is consequently valid.

This case, however, does not constitute a precedent on the application of the GDPR to facial recognition software within the European Union. The European Commission’s executive vice president for digital affairs stated in an interview that the Commission would further investigate concerns regarding automated facial recognition, leaving the member states to regulate the issue in the meantime.

Facial Recognition in Serbia

It seems that no official Serbian authority, including the Commissioner for Information of Public Importance and Personal Data Protection (the “Commissioner”), has considered the issue systematically or publicly to this day. However, as the Serbian Data Protection Act (“SDPA”) is modeled after the European Union’s GDPR and stipulates in Article 4, paragraph 1, item 15 that a picture of a data subject’s face constitutes biometric data, processing of such data by facial recognition software falls under the requirements for lawful processing prescribed in Article 17 of the SDPA.

It may also be expected that the Serbian Commissioner will closely monitor policies and practices applied in other jurisdictions, especially in the European Union. As technology progresses, new policy proposals and regulations will be drafted accordingly.

This text is for informational purposes only and should not be considered legal advice. Should you require any additional information, feel free to contact us.

By Katarina Zivkovic, Senior Associate, and Dragan Martin, Associate, Samardzic, Oreski & Grbovic