What Should Employers Expect from the EU AI Act?

AI is increasingly revolutionizing the way businesses handle recruitment, hiring, management, and employee monitoring. AI solutions are already adept at personalizing employee experiences, such as benefits and training, streamlining HR processes throughout the employment lifecycle, boosting efficiency, and significantly reducing administrative burdens. Additionally, AI provides critical workforce insights, facilitating data-driven decision-making and management. However, these advancements also bring potential risks related to discrimination, protection of privacy, and other fundamental rights that employers must carefully manage.

The EU AI Act entered into force on 1 August 2024 and will generally apply 24 months after its entry into force (with several exceptions), so it is crucial to define the role of employers in the AI chain and clarify the obligations they must fulfil. While the Act includes provisions on extraterritorial application that could affect AI developers and model creators in Serbia, the question arises: can it also apply to Serbian employers? Specifically, as deployers of AI systems, some of which may be classified as high-risk, can Serbian employers be subject to the obligations that apply to EU-based employers under the EU AI Act?

Applicability of the EU AI Act

The EU AI Act does contain provisions allowing its extraterritorial application, meaning it can apply to entities in Serbia, a non-EU country, under certain circumstances. For instance, AI system providers based in Serbia could fall under the EU AI Act if their systems are used within the EU. However, the Act does not apply to Serbian employers who act as deployers (users) of AI systems for their own needs within Serbia.

For employers in the EU, the EU AI Act introduces strict requirements related to high-risk AI systems. These are typically systems that monitor employee performance, profile individuals, or make automated decisions regarding working conditions, promotions, or terminations. Many tools used in employment processes fall into this high-risk category, especially those involved in profiling employees. Additionally, certain AI tools are strictly prohibited, such as AI systems for emotion detection or biometric categorization, which lead to the collection of sensitive data.

The EU AI Act outlines conditions under which high-risk systems, as defined in Annex III, could claim exemption from the high-risk classification. However, these exceptions do not apply to AI systems used for profiling employees, which are always classified as high-risk.

Consequently, employers in the EU are required to conduct Data Protection Impact Assessments (DPIA) for high-risk systems and adhere to strict implementation procedures. Upcoming guidelines from the European Commission, expected by mid-2025, will provide concrete examples of high-risk AI uses, helping employers navigate the new regulatory framework.

It’s important to note that while the EU AI Act mandates DPIAs for high-risk systems, local data protection regulators in EU member states also adopt their own lists of cases in which DPIAs are mandatory. These lists may not align perfectly with Annex III of the Act, potentially expanding the scope of AI-related activities requiring a DPIA. This is also relevant for Serbia: the country does not yet have AI-specific legislation, but a working group is currently drafting new AI regulations, and it remains to be seen how these developments will align with EU standards.

Intersection with GDPR and local privacy regulations

The EU AI Act specifies that deployers of high-risk AI systems must conduct a Data Protection Impact Assessment (DPIA) as regulated under the General Data Protection Regulation (GDPR).

However, the scope of high-risk systems caught by the EU AI Act may be broader than the circumstances under the GDPR for which a DPIA is necessary (including the circumstances triggering a DPIA under privacy regulations in Serbia, such as the Commissioner’s decision on the list of processing activities requiring a DPIA).

In addition to DPIA, the EU AI Act requires employers to conduct Fundamental Rights Impact Assessments (FRIA) in certain cases and notify employee representatives prior to deploying high-risk AI systems in the workplace.

Under the EU AI Act, employers are also responsible for ensuring that their workforce has an adequate level of AI literacy to handle these systems. AI literacy refers to the skills, knowledge and understanding that allow providers, deployers and affected persons to make an informed deployment of AI systems and to gain awareness of the opportunities and risks of AI and the possible harm it can cause. The AI literacy requirement applies from 2 February 2025, and this grace period leaves employers time to comply with it.

What about Serbian employers?

Currently, Serbian employers who use AI systems for their internal needs are not subject to the obligations of the EU AI Act. A working group has been established and is drafting new AI legislation in Serbia, with the law expected to be completed within a year. Until then, the use of AI systems by Serbian employers will remain governed by the Data Protection Act (DPA).

Having said that, besides ensuring that employees are duly notified of the processing of their data and that a legal basis for the processing is in place (which can be tricky where employees' data is concerned), the use of any AI system that would fall under the high-risk category of the EU AI Act also requires a DPIA under the Serbian DPA. Namely, under the Serbian DPA, a DPIA is required, among other things, when personal data processing is conducted using new technologies (such as AI) and poses a potential risk to the rights and freedoms of the individuals whose data is being processed. For instance, it is explicitly required in cases of automated processing or profiling where a decision based on such processing significantly affects an individual's life. According to the Commissioner's decision, a DPIA and the DPO's opinion are mandatory for specific processing activities (such as the use of employee monitoring tools based on biometric data).

Thus, employers acting as controllers that process personal data using AI systems which could qualify as high-risk under the EU AI Act are obligated under the DPA to, inter alia: (i) conduct a DPIA, (ii) seek the opinion of the Data Protection Officer (DPO), and (iii) request prior consultation with the Commissioner for Data Protection before any processing takes place.

Consequences of non-compliance

Under the EU AI Act, employers using high-risk AI systems face significant penalties for non-compliance. However, these penalties do not apply to Serbian employers using AI systems solely for their internal operations, as the Act's extraterritorial provisions do not extend to such cases. The penalties relevant to Serbian employers will be those prescribed by the future national law on AI or, for any privacy implications, those under the DPA.

Conclusion

Although the EU AI Act does not currently apply to employers in Serbia, the Stabilization and Association Agreement obliges Serbia, among other things, to align its legislation with that of the EU. Serbian employers should therefore closely monitor the development of this new legal area in the EU so that they are prepared for obligations that might apply to them in the same or a similar scope. In the meantime, employers should continue to comply with the DPA, review the AI systems they currently use to confirm they are fully compliant with the DPA, keep records of the AI systems used by their employees, and adopt internal rules and policies governing the compliant use and application of AI systems by their employees, all in order to prevent possible liability in that regard.

By Marija Vlajkovic, Partner, and Andrija Saric and Marija Lukic, Associates, Schoenherr