CMS Partners Daniela Kroemer, Dragana Bajic, Tomasz Sancewicz, and Katalin Horvath discuss the growing role of AI in HR across CEE and delve into its current applications, regulatory challenges, and the broader impact on workplace dynamics.
CEELM: How is AI being used in HR right now, and where is it having the biggest impact?
Horvath: The main areas where AI is used in HR are recruitment, onboarding of new hires, performance assessment and prediction, monitoring of employees and their IT devices, termination processes, HR administration, and career path planning.
Sancewicz: Indeed, AI is widely used in recruitment, especially for tasks like reviewing CVs and assessing candidates’ skills based on the information in them. In some cases, AI even runs the first round of interviews through chatbots, which speeds up the process and, ultimately, saves money.
Bajic: Beyond hiring, AI is also being used for things like training and onboarding. Some AI systems can conduct interviews using facial recognition and motion detection, assessing and ranking candidates before they even get to talk to a human. There are also AI-driven games that assess skills, which are quite trendy in recruitment right now. Looking ahead, we might see platforms analyzing individual employee preferences to personalize their work experience throughout their careers.
Horvath: AI can be a powerful tool in HR, but of course, there are cases where a fully AI-driven system does not make the right decision. One of our clients used a fully automated AI-based hiring process to hire a candidate who, on paper, had all the appropriate skills but ultimately did not fit the team, and team members were upset about the lack of human interaction during the recruitment process. AI can avoid the biases and discrimination that exist today if it is trained appropriately, but the human touch remains a necessity.
Kroemer: We’re seeing more AI that implicitly monitors the workforce, such as quality control systems, which raises significant data privacy concerns. AI can also be biased, which is tricky and requires great care. Because AI is built on existing data drawn from an often biased and structurally discriminatory world, most data will carry a bias. My advice is to assume AI has a bias and then continuously test against it.
Sancewicz: I agree. We’re at the start of a long road, but even now, it feels like we’re getting dangerously close to over-automating everything. It’s kind of eerie. Speed is great, but there’s a real fear that we’re losing the human element in all this.
CEELM: Is AI in HR regulated locally, or does it fall under EU rules?
Kroemer: Right now, we don’t have a specific national AI act, but we do have strong legislation on workforce monitoring and data privacy. I wonder whether that will stay the same or be amended to follow the EU’s AI regulations.
Horvath: The upcoming EU AI Act will definitely cover AI applications in HR. From next year, emotion recognition systems used to analyze employee emotions will be banned except in specific medical or safety contexts. AI systems used in recruitment, termination, performance assessment, employee monitoring, and promotions will be classified as high-risk, with special obligations attached. There are also jurisdiction-specific labor law requirements on the use of AI in workplaces. For example, in Hungary, if an employer wants to introduce a new technology, including AI or a new type of data processing, it needs to consult the works council and obtain its non-binding opinion. There are also special rules on the monitoring of employees via tech solutions, including AI.
Bajic: Serbia isn’t part of the EU, but the AI Act will have extraterritorial effects and impact software developers in Serbia. We have a working group drafting an AI law, but it’s unclear what direction it will take. I believe that the core regulations we need to worry about are already there, such as the GDPR. Until specific issues arise in practice, we should focus on applying data privacy rules more appropriately.
Sancewicz: Similarly, in Poland, we don’t have specific laws targeting AI yet, but we do have the GDPR and rules around employee monitoring. Interestingly, trade unions here are starting to push for more transparency around the algorithms employers use, demanding access to them.
CEELM: What are the main risks for employees and employers when AI is involved in hiring or management decisions?
Horvath: Employers using AI have to comply with various laws, including the EU AI Act, especially when using prohibited or high-risk AI systems. For the latter, employers must ensure human oversight, verify that the input data are appropriate and free of bias, make sure the output (the AI’s decision) is not discriminatory, report AI incidents to the AI provider, and follow the provider’s instructions for use. AI operations should also be logged, and if something goes wrong, employers are required to report it. Local labor laws raise liability issues as well.
Kroemer: You need to look at it from all angles to ensure compliance, especially with data privacy. Just because the AI you are using complies with an AI act doesn’t mean you are automatically compliant overall. The GDPR, for example, comes with big penalties, and if you’re using AI without works council approval in Austria, employees can take you to court to stop its use.
Bajic: I’d also highlight that one should always treat AI implementation as a work in progress. HR should be ready to tweak the AI system together with developers. There’s flexibility to adjust AI systems as new information, data, and unwanted results surface.
Sancewicz: Legal matters aside, employers should ask themselves whether they want to rely solely on AI and whether they want to become an automated, inhuman organization, or maintain a human face.
CEELM: When HR teams are choosing an AI-powered software provider, what should they consider?
Horvath: A good contract is crucial when it comes to liability. Sometimes AI providers want ownership of customer data to develop the AI system, and a good contract can prevent many disputes. HR teams also need a basic understanding of AI to fully grasp the risks involved.
Kroemer: Be very critical about biases and discrimination, not just when buying but also when using AI. Do not trust it blindly. Oxford Internet Institute Professor Sandra Wachter compared AI to an overly enthusiastic, sloppy employee – you get results quickly, but they may be flawed. The key is to remain vigilant.
Bajic: It’s crucial that HR is involved in strategic decisions about implementing AI. HR should know where this might be a real help with minimum risks and which processes to focus on. My advice is to limit the use of AI systems to specific processes – implement it step by step, not across all HR processes.
CEELM: What do you see as future developments in AI for HR?
Kroemer: I believe that AI offers a huge window of opportunity for companies. If you’re aware of the risks and have a clear vision, AI can be a booster. It can handle repetitive tasks, allowing employees to be more productive and focus on more positive, interesting things. Additionally, training people will be a big thing in the next 20-30 years.
Sancewicz: Once AI becomes truly intelligent, it will be a real game-changer. When AI can predict human behavior and apply that to strategic decisions, it may change how organizations work.
Bajic: AI systems will be great tools for tech-savvy HR people, improving the total employee experience, but you also need employees who are ready for new models of work and management, including career paths shaped by automated decision-making.
Horvath: AI won’t steal jobs. The next stage will be cooperation between human employees and AI, and together there will be no boundaries. As a future-facing, tech-savvy lawyer, I see a bright future for AI in HR.
This article was originally published in Issue 11.10 of the CEE Legal Matters Magazine.