{"id":10352,"date":"2026-04-28T13:20:33","date_gmt":"2026-04-28T12:20:33","modified":"2026-04-28T13:20:36","modified_gmt":"2026-04-28T12:20:36","slug":"eu-ai-act-regulation-2026-what-businesses-and-individuals-need-to-know-now","status":"publish","type":"post","link":"https:\/\/www.madarassy-legal.com\/en\/eu-ai-act-regulation-2026-what-businesses-and-individuals-need-to-know-now\/","title":{"rendered":"EU AI Act Regulation 2026: What Businesses and Individuals Need to Know Now"},"content":{"rendered":"If your company uses artificial intelligence \u2014 whether a simple chatbot, AI-powered recruitment or an automated customer service system \u2014 the EU AI Act regulation (Hungarian: EU AI Act szab\u00e1lyoz\u00e1s) applies to you. This is not just a matter for large corporations: it directly affects Hungarian SMEs, employees and consumers alike. In this practical guide, we explain what obligations you need to prepare for by August 2026 and how a lawyer can help.\n\n\n\nWhy should you act now on EU AI Act regulation?\n\n\n\nA three-phase timeline: what is already mandatory?\n\n\n\nThe EU AI Act (Regulation (EU) 2024\/1689 of the European Parliament and of the Council) entered into force on 1 August 2024, with obligations phased in over three stages. The ban on prohibited AI practices and the AI literacy obligation have been in effect since 2 February 2025. Rules for general-purpose AI models (GPAI) and the sanctions framework were activated on 2 August 2025. The deadline for full compliance regarding high-risk AI systems is 2 August 2026.\n\n\n\nThis means part of the preparation period has already passed. Any business that has not yet reviewed its systems against the prohibited practices rules is already behind schedule.\n\n\n\nHow prepared are Hungarian companies?\n\n\n\nDeloitte&#8217;s 2025 Hungarian AI Survey paints a concerning picture: 38% of companies cannot even determine whether their systems fall under the AI Act. 
Only 16% have a compliance framework in place, and 21% have taken no AI literacy steps whatsoever \u2014 despite this being a legal requirement since February 2025. In our legal practice, we find that most Hungarian SMEs do not even realise they are affected.\n\n\n\nWhat does the AI Act regulate and who does it apply to?\n\n\n\nDevelopers, deployers and importers all fall within scope\n\n\n\nThe AI Act does not apply solely to AI developers. Its scope extends to every organisation that develops, distributes, imports or deploys an AI system within the EU \u2014 regardless of whether the developer is based inside or outside the Union. If a Hungarian business uses an American AI tool whose output affects individuals in the EU, the regulation imposes obligations on both parties. The framework takes a risk-based approach, classifying AI systems across four levels from minimal to prohibited, with obligations scaled accordingly.\n\n\n\nWhich AI systems does the EU AI Act prohibit?\n\n\n\nEight prohibited practices \u2014 and fines reaching billions\n\n\n\nArticle 5 of the AI Act lists eight AI practices whose development, distribution and deployment are prohibited. These include subliminal manipulation, exploitation of vulnerabilities, social scoring, emotion recognition in the workplace and in education, and mass facial recognition in public spaces. If a Hungarian company uses personalised content to influence customer decisions in ways they cannot perceive, this may already breach the prohibition.\n\n\n\nThese rules have been in effect since 2 February 2025, and the sanctions framework has been active since August 2025. 
Under the Hungarian implementing legislation (Act LXXV of 2025), the maximum fine for prohibited AI practices can reach HUF 13.3 billion (approximately EUR 34 million).\n\n\n\nHow does EU AI Act compliance affect Hungarian businesses?\n\n\n\nHigh-risk systems in everyday business operations\n\n\n\nThe &#8220;high-risk&#8221; category covers far more companies than one might expect. It includes any AI system used for credit scoring, insurance risk assessment, recruitment, performance evaluation or determining access to public services. A mid-sized company that uses AI to screen CVs will be required to maintain detailed technical documentation, a risk management system and human oversight from August 2026.\n\n\n\nAI literacy: the mandatory training that few know about\n\n\n\nArticle 4 of the AI Act requires every organisation deploying AI to ensure adequate AI literacy (Hungarian: AI literacy k\u00f6telezetts\u00e9g) among its staff \u2014 with verifiable records. This has been mandatory since 2 February 2025, yet 21% of Hungarian companies have taken no action at all. In practice, this means organising training sessions, internal knowledge bases or e-learning programmes. In our experience, most Hungarian businesses are entirely unaware of this obligation.\n\n\n\nGDPR and the AI Act: the dual compliance challenge\n\n\n\nThe AI Act does not replace the GDPR \u2014 it complements it. Where an AI system processes personal data, all GDPR principles remain fully applicable. The greatest practical risk is &#8220;Shadow AI&#8221;: staff copying customer data and internal reports into AI prompts that no one can subsequently audit. 
For businesses, practical steps on AI regulation (Hungarian: AI szab\u00e1lyoz\u00e1s gyakorlati teend\u0151k v\u00e1llalkoz\u00e1soknak) should therefore always begin by strengthening data handling discipline.\n\n\n\nHow does the AI Act impact employees and individuals?\n\n\n\nEmployee rights: when must you disclose AI use?\n\n\n\nThe AI Act&#8217;s impact on employees and individuals is direct and tangible. If an employer uses an AI system to screen job applications, evaluate performance or decide on promotions, it must inform the affected person \u2014 including how the AI contributed to the decision. Automatically generated logs must be retained for at least six months. Under Hungarian labour law (Section 11\/A of the Labour Code), AI qualifies as a technological tool, meaning all restrictions on workplace technology use apply.\n\n\n\nConsumer protection: chatbots, credit scoring, automated decisions\n\n\n\nWhen operating a chatbot, businesses must clearly indicate to users that they are interacting with AI. AI-based credit scoring or insurance risk assessment falls into the high-risk category, requiring non-discrimination safeguards and human oversight. Amazon scrapped an AI recruitment tool in 2018 after it was found to discriminate against women \u2014 the AI Act is designed to prevent precisely such outcomes.\n\n\n\nHungarian AI legislation: what new authorities are being established?\n\n\n\nA new institutional framework under Act LXXV of 2025\n\n\n\nThe Hungarian AI Act (Act LXXV of 2025) and its implementing Government Decree (344\/2025 (X. 31.)) established the domestic institutional framework. The National Accreditation Authority serves as the AI notifying body, while a nationally competent AI market surveillance authority oversees lawful deployment. The Government has set up a single-window system so that businesses deal with one authority only. 
The law also created the Hungarian Artificial Intelligence Council, chaired by the Government Commissioner for AI, which issues guidance to support consistent legal application.\n\n\n\nWhat practical steps on AI regulation should businesses take?\n\n\n\nThree steps to compliance\n\n\n\nThe first step is a comprehensive AI audit: catalogue every AI tool and use case within your organisation, including the &#8220;invisible&#8221; ones \u2014 such as the marketing team using ChatGPT for copywriting. Second, carry out a risk classification and prepare an internal AI policy: which tools are permitted, what data must never enter an AI system, and how human oversight is ensured. Third, for high-risk systems, establish technical documentation, logging and staff AI literacy training.\n\n\n\nDo you need legal advice from an AI Act lawyer for compliance?\n\n\n\nWhen should you engage legal counsel?\n\n\n\nLegal advice from an AI Act lawyer (Hungarian: AI Act \u00fcgyv\u00e9d jogi tan\u00e1csad\u00e1s megfelel\u00e9s) is particularly warranted when your organisation deploys high-risk AI, when data protection questions arise at the intersection of AI and the GDPR, or when you contract with AI suppliers. Risk classification, regulatory registration and drafting internal policies require legal expertise. Those who get their AI use in order early not only avoid fines \u2014 increasingly, B2B partners expect verifiable AI compliance, making it a genuine competitive advantage.\n\n\n\nEU AI Act regulation is not a threat \u2014 it is an opportunity\n\n\n\nPreparing today is cheaper than paying fines tomorrow\n\n\n\nThe purpose of EU AI Act regulation is not to hinder innovation but to create a framework for safe and transparent AI use. 
Those who act now will prepare more affordably and with greater confidence than those scrambling before the August 2026 deadline.\n\n\n\n2 February 2025: Prohibited AI practices ban + AI literacy obligation\n2 August 2025: GPAI rules + sanctions framework + designation of national authorities\n2 August 2026: Full AI Act application, high-risk system compliance\n2 August 2027: High-risk AI systems embedded in regulated products\n\n\n\nNot sure whether the AI Act applies to your company, or unsure what steps to take? Get in touch with us.\n\n\n\nWith over 20 years of experience in data protection, technology law and corporate compliance, Madarassy Law Firm helps clients navigate the practical implementation of the AI Act. Contact us at www.madarassy-legal.com.","protected":false},"excerpt":{"rendered":"If your company uses artificial intelligence \u2014 whether a simple chatbot, AI-powered recruitment or an automated customer service system \u2014 the EU AI Act regulation (Hungarian: EU AI Act szab\u00e1lyoz\u00e1s) applies to you. This is not just a matter for large corporations: it directly affects Hungarian SMEs, employees and consumers alike. 
In this practical guide, [&hellip;]","protected":false},"author":2,"featured_media":10350,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"rop_custom_images_group":[],"rop_custom_messages_group":[],"rop_publish_now":"initial","rop_publish_now_accounts":[],"rop_publish_now_history":[],"rop_publish_now_status":"pending","footnotes":""},"categories":[63],"tags":[],"class_list":["post-10352","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-data-protection"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.madarassy-legal.com\/en\/wp-json\/wp\/v2\/posts\/10352","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.madarassy-legal.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.madarassy-legal.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.madarassy-legal.com\/en\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.madarassy-legal.com\/en\/wp-json\/wp\/v2\/comments?post=10352"}],"version-history":[{"count":7,"href":"https:\/\/www.madarassy-legal.com\/en\/wp-json\/wp\/v2\/posts\/10352\/revisions"}],"predecessor-version":[{"id":10444,"href":"https:\/\/www.madarassy-legal.com\/en\/wp-json\/wp\/v2\/posts\/10352\/revisions\/10444"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.madarassy-legal.com\/en\/wp-json\/wp\/v2\/media\/10350"}],"wp:attachment":[{"href":"https:\/\/www.madarassy-legal.com\/en\/wp-json\/wp\/v2\/media?parent=10352"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.madarassy-legal.com\/en\/wp-json\/wp\/v2\/categories?post=10352"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.madarassy-legal.com\/en\/wp-json\/wp\/v2\/tags?post=10352"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}