Facial Recognition and Minor Offences: New Hungarian Law Raises EU Compliance Questions

In March 2025, the Hungarian Parliament enacted legislation expanding the use of facial recognition technologies in minor offence procedures. The law is part of a broader effort to strengthen child protection measures, but it has sparked considerable legal debate over potential conflicts with European data protection law, including the General Data Protection Regulation (GDPR) and the EU Artificial Intelligence Act (AI Act), which is being phased in.

New legal provisions and amendments

The amendment affects several key statutes, most notably the 2012 act on minor offences and the 2015 act governing facial image analysis systems and their registries. As of 15 April 2025, authorities such as courts, administrative bodies handling minor offences and investigative agencies are authorised to use facial recognition technology to identify individuals suspected of committing minor offences, particularly where the suspect's identity is unknown. Previously, the technology could be used only in connection with offences punishable by detention; the recent legislation extends its applicability to the detection of all offenders.

To facilitate this, Hungary's central registry of facial profiles, originally designed for more serious public security purposes, has been repurposed to cover objectives such as the prevention, detection and interruption of minor infractions, as well as ensuring that perpetrators are held accountable.

Sparse justification and transparency concerns

Despite the significant legal and ethical implications of this change, the explanatory memorandum accompanying the legislation was notably brief. It merely states that the updated provisions allow biometric identification to be used for suspects in minor offence cases, but it does not elaborate on any risk assessment or indicate how the legislator evaluated the legal and data protection ramifications.

GDPR and AI Act compliance in question

Under the GDPR, biometric data processed for the purpose of uniquely identifying a natural person, such as facial images run through a recognition system, constitute a special category of personal data subject to strict processing conditions. Article 9 of the GDPR permits such processing only in exceptional circumstances, typically on the basis of explicit consent or a substantial public interest grounded in Union or Member State law. Applying such intrusive technology to relatively low-level offences therefore raises serious necessity and proportionality concerns. In the absence of robust safeguards or a clear legal justification, the use of biometric surveillance in these cases may not meet the high threshold required under EU law.

The EU Artificial Intelligence Act further complicates the legal landscape. The Act treats real-time remote biometric identification in publicly accessible spaces as a prohibited practice for law enforcement purposes, save for narrow exceptions (for example, imminent threats to life or the investigation of serious crimes), and classifies other remote biometric identification systems as high-risk. Such uses typically require prior judicial authorisation and strict oversight. Hungary's expansion of facial recognition to minor offences could be seen as inconsistent with the aims of this regulation. The European Commission has already expressed concerns and indicated that it will monitor the situation closely.

Conclusion

Hungary’s recent legislative changes mark a notable step in the use of facial recognition technologies in administrative and enforcement contexts. As this area of law continues to evolve, there remains some uncertainty around how such national measures align with existing and forthcoming EU regulations, particularly regarding data protection and artificial intelligence.

Further legislative or regulatory clarification may help ensure a consistent and practical framework for the responsible use of biometric technologies. Clear procedural rules and alignment with broader EU standards will be key in supporting lawful implementation while safeguarding individual rights.

By Reka Fulop, Associate, KCG Partners