The Forgotten Privacy-by-Design Will Not Forget You

Although the General Data Protection Regulation 2016/679 (the GDPR) has been in force for more than a year, the concept of data protection by design (Art. 25) is still largely underestimated and insufficiently implemented in software products and their development processes in Lithuania. Developers of data-rich technologies still disregard or misinterpret this duty despite its business benefits. This is especially true of new technological products which strive for a steady, continuous increase in user numbers but lose their grip on user privacy along the way.

The concept of privacy-by-design was originally coined by Dr. Ann Cavoukian, the former Information and Privacy Commissioner of Ontario, Canada. It is based on seven “foundational principles”: 1. Proactive not reactive (i.e., preventative not remedial); 2. Privacy as a default setting; 3. Privacy embedded into design; 4. Full functionality (i.e., positive-sum, not zero-sum); 5. End-to-end security (i.e., full lifecycle security); 6. Visibility and transparency (keep it open); 7. Respect for user privacy (keep it user-centric). Embedding these principles requires a privacy-centric approach, creativity, and knowledge of business and project management. In Lithuania, issues with implementing privacy-by-design in software products exist in both the public and business sectors.
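To make the second principle concrete, consider how “privacy as a default setting” translates into application code. The sketch below is a hypothetical illustration (the class, field names, and retention period are our assumptions, not drawn from any specific product): every setting that shares or retains personal data starts in its most protective state, and the user must actively opt in to anything less.

```python
from dataclasses import dataclass


@dataclass
class PrivacySettings:
    """Hypothetical per-user privacy settings.

    Privacy-by-default means each data-sharing flag starts in its
    most protective state; nothing is switched on behind the user's back.
    """
    share_usage_analytics: bool = False  # opt-in, never opt-out
    personalized_ads: bool = False       # no profiling by default
    public_profile: bool = False         # profile hidden by default
    data_retention_days: int = 30        # minimal retention by default


def new_user_settings() -> PrivacySettings:
    # A fresh account receives the strictest defaults with no hidden opt-ins;
    # any relaxation must come from an explicit user action later.
    return PrivacySettings()
```

The point of the pattern is that the protective state requires no action from the data subject: a developer who forgets to build a consent screen ships a private product, not a leaky one.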

Lithuania’s mandatory public procurement procedures do not currently require authorities to demand GDPR-compliant solutions in the documentation for IT tool and software development acquisitions. Paradoxically, however, public bodies use such software for big data processing involving all Lithuanian citizens. Privacy issues such as poor identity management (IDM) and insecure access to personal data sets in the IT tools of public registries, university healthcare institutions, and public online service providers have recently been escalated in the media. Lithuanian public procurement law still contains no special rules obliging public authorities to require privacy-friendly IT solutions. This means that software developers can currently deliver products that are not privacy-friendly, leaving the public bodies that order them at risk. The absence of clear rules and checklists on what must be done in Lithuanian procurement procedures creates a systemic privacy risk. Lithuania has the legislative ecosystem to embed privacy into its legal acts: in 2018, the country’s data protection authority even issued a recommendation that public authorities coordinate all legislative initiatives relating to personal data processing with it.

The political focus on privacy matters in public procurement thus seems to be the only missing condition. Until it materializes, public bodies can put privacy-related questions to software developers with the help of privacy experts and the national data protection inspectorate. This matters because privacy impact assessments for large development projects may well have to be carried out before development begins. Unfortunately, the lack of any common practice in this area allows software developers to deliver software with poor privacy protections.

Lithuania’s public procurement law is not the only example of weak protection of personal data at the national level. Weak user identification is currently a particularly acute issue for mobile fintech apps that have missed out on privacy-by-design and resorted to relaxed customer identification. Back in 2017, Lithuania’s anti-money-laundering law removed rigid identity verification requirements for payment instruments with a non-reloadable maximum monthly transaction limit of EUR 150. As a result, identity fraud and fake app accounts are now on the rise. Attempting to implement privacy-by-design once the technology is already mature and no longer susceptible to easy change is problematic. Consequently, stakeholders find themselves handling complicated, high-risk personal data breach situations that require notifying data protection authorities and affected data subjects and demonstrating that their identification processes meet market standards. This is no easy task given the demanding identification standard suggested by the European Commission in its April 26, 2018 report, “Study on eID and digital on-boarding: mapping and analysis of existing on-boarding bank practices across the EU,” which requires (among other things) the application of fraud prevention measures.
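The threshold rule described above can be sketched as a simple check. This is a minimal illustration of the described regime, not legal advice or a real compliance engine: the function name and structure are our assumptions; only the EUR 150 non-reloadable monthly limit comes from the text.

```python
# Monthly transaction limit below which the 2017 amendment (as described
# above) permitted relaxed identity verification for non-reloadable
# payment instruments.
SIMPLIFIED_REGIME_LIMIT_EUR = 150


def requires_full_identity_verification(monthly_limit_eur: float,
                                        reloadable: bool) -> bool:
    """Return True when the simplified regime does not apply and full
    ("rigid") identity verification of the customer is required.

    Hypothetical sketch: simplified verification covered only
    non-reloadable instruments capped at EUR 150 per month.
    """
    return reloadable or monthly_limit_eur > SIMPLIFIED_REGIME_LIMIT_EUR
```

Under this sketch, a reloadable instrument or any monthly limit above EUR 150 falls back to full verification; the fraud the article describes exploits exactly the accounts that stay under the threshold.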

Implementing privacy-by-design in software development processes would bring tangible benefits to both the public and private sectors: it helps to create and deploy appropriate, compliant, and secure tools for data processing; it gives data subjects control over privacy settings; it makes software more transparent and user-friendly; it helps public and private organizations build the trust of customers and citizens; and it gives software developers a competitive advantage in the market.

A privacy-centric approach is now a must from the outset of every data-rich technology. A mistake in privacy design creates reputational exposure and/or degrades the user experience. Lithuanians know their privacy rights well and will neither forgive nor forget lapses in their privacy while using a system.

By Liudas Karnickas, Partner, Karnickas & Partners, and Laura Juodakyte, Partner, L. Juodakyte Law Firm, and Karolis Aulosevicius, Indep. Privacy Technical Expert

This Article was originally published in Issue 6.8 of the CEE Legal Matters Magazine. If you would like to receive a hard copy of the magazine, you can subscribe here.