The recent decision no. 35/2022, issued by the Hellenic Data Protection Authority (HDPA) on July 13, 2022 (the Decision), marked a record-high EUR 20 million fine against US company Clearview AI Inc. (Clearview). The Decision adds further pressure on Clearview, on top of the relevant decisions of other data protection authorities (DPAs) – French, Italian, and British – while a similar decision from the Austrian regulator is expected soon, all in response to a series of complaints filed by an alliance of non-profit privacy organizations.
The complaints essentially challenged Clearview's business practice of scraping selfies and photos from public social media accounts and adding them to its facial recognition database of about 10 billion facial images, with Clearview aiming to reach 100 billion images within the next few years.
The reasoning behind the HDPA's Decision mirrors that of the other DPAs and is quite straightforward: Clearview uses its software to monitor the behavior of individuals in Greece, irrespective of the fact that the company is based in the US and does not offer services in Greece or the EU.
In this respect, the regulator identified a series of core infringements related to, inter alia, the principles of lawfulness, fairness, and transparency under the GDPR, ruling that collecting images for a biometric search engine is unlawful where the data subjects' prior explicit consent has not been obtained.
Specifically, the HDPA found that Clearview failed to (1) establish the lawfulness of its processing of personal data, including special categories of personal data, as it lacked any of the required legal bases; (2) provide appropriate information to data subjects (users) regarding the processing of their data; (3) respond to data subjects' access requests; and (4) appoint an EU representative, as the GDPR requires of controllers not established in the EU.
In light of the above, the HDPA ordered Clearview to delete not only all images of individuals in Greece collected to date in the course of its business activity, but also the biometric information needed to search for and identify a specific face.
In other words, the Decision essentially puts an end to Clearview's intrusive business model on Greek territory. Given that similar decisions from other DPAs have already been issued or are still pending, we may soon be talking about the cessation of the company's entire business (as it now stands) across most EU member states.
In the meantime, if Clearview complies with all these orders to delete individuals' data and stop processing it, it will be unable to keep its AI models updated with fresh biometric data, meaning that the usefulness of its product will gradually degrade.
Consistent with the rest of its reasoning, the HDPA's ruling also follows the other DPAs' rulings on one further point: none of these decisions has – so far – ordered the destruction of Clearview's algorithm, despite concluding that it was trained on unlawfully collected personal data.
In this respect, much discussion has centered on whether the GDPR empowers oversight bodies to order the deletion of AI models trained on improperly obtained data – and not just the deletion of the data itself, as appears to have happened so far in the Clearview case.
Nevertheless, incoming EU AI legislation may empower competent regulators to go further. In particular, the (draft) Artificial Intelligence Act empowers market surveillance authorities to “take all appropriate corrective actions” to bring an AI system into compliance – including withdrawing it from the market, depending on the nature of the risk it poses.
By Michalis Kosmopoulos, Partner, and Panagiotis Tampoureas, Senior Associate, Drakopoulos
This article was originally published in Issue 9.10 of the CEE Legal Matters Magazine.