
Artificial Intelligence Law
Practical advice for AI developers, deployers, and operators in Luxembourg
The EU AI Act (Regulation (EU) 2024/1689) entered into force on 1 August 2024, with its obligations applying in phases from 2 February 2025. For the first time, there is a binding, comprehensive legal framework for artificial intelligence in the European Union — one that applies to any business that develops AI systems for the EU market, deploys AI in its operations, or procures AI tools for use in regulated contexts. The compliance calendar is pressing and the penalties for non-compliance — up to 35 million euros or 7% of global annual turnover for the most serious violations — are not theoretical.
In Luxembourg, the CNPD has been designated as the national competent authority for AI Act supervision, building directly on its existing role in data protection. If your business already has a CNPD relationship from GDPR compliance, your AI Act obligations sit within the same regulatory ecosystem — which is why integrated advice from lawyers who cover both frameworks matters.
Jurisconsul has been advising on digital regulation since well before the AI Act existed. Erwin Sotiri has published analysis of the Act's scope and has lectured on AI regulation. We do not just know the text of the regulation — we know how it is being interpreted in practice and how it will be supervised in Luxembourg.
AI System Classification
The starting point for any AI Act compliance programme is classification. Your AI systems may be prohibited outright, subject to the most rigorous high-risk obligations, subject only to limited transparency requirements, or — for the majority of AI uses — largely unaffected by prescriptive obligations. Getting the classification right determines everything that follows. We work through your systems and use cases against the Act's risk framework and give you a clear picture of your actual exposure.
High-Risk AI System Compliance
Annex III of the Act lists the high-risk categories: biometric identification, critical infrastructure management, education and vocational training, employment and worker management, essential private and public services, law enforcement, migration and asylum, and the administration of justice and democratic processes. If your AI system operates in any of these areas, you face prescriptive obligations on technical documentation, conformity assessment, quality management, post-market monitoring, and human oversight. We build compliance programmes proportionate to your system and your business.
GPAI Model Obligations
Providers of general-purpose AI models — including large language models and foundation models — face their own distinct obligations under the Act: transparency documentation, copyright policy, and, for models posing systemic risk, adversarial testing and incident reporting to the European AI Office. We advise GPAI model providers on their specific requirements and on the interaction between the Act and EU copyright law for training data.
AI Procurement and Vendor Contracts
The Act allocates obligations differently between providers and deployers. If you are procuring AI systems from third-party vendors, the contracts governing those relationships need to reflect the Act's requirements — who is responsible for what, what information the provider must give you, and how liability is allocated when something goes wrong. We review and draft AI procurement agreements that protect your position as a deployer.
AI and Intellectual Property
AI raises specific IP questions that the law is still working through: copyright ownership of AI-generated outputs, rights in training data, and the boundaries of the text and data mining exceptions in the EU Copyright Directive (Directive (EU) 2019/790), to which the AI Act's copyright obligations for model providers refer. We advise on these questions in the context of both Luxembourg and EU law, drawing on our established IP practice.
CNPD Interaction on AI Matters
We represent clients before the CNPD on AI Act matters, including regulatory enquiries, incident reporting obligations, and any investigation or enforcement proceedings. Our work before the CNPD in data protection matters provides continuity across the overlapping regulatory frameworks.
