DOINA ANGHELESCU ATTORNEY OFFICE
CHALLENGES FACED BY NON-EU PROVIDERS IN COMPLYING WITH THE EU AI ACT
 
Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence, known as the EU AI Act, entered into force on 1 August 2024 and will be implemented in phases from February 2025 to August 2027.
By 2 August 2026, most of its key provisions will be in effect and will apply to any entity doing business in the EU markets, because the Act is intended to govern the production, development, distribution, use and deployment of any AI system.
The AI Act aims to ensure that AI systems in the EU are safe and respect fundamental rights and values, and for this purpose it stipulates prohibited practices and obligations for general-purpose AI (GPAI) models. By enforcing this Regulation, EU officials aim to turn Europe into a place where only trustworthy AI models will be developed and used.
However, the AI Act also aims to foster investment and innovation in AI, to enhance governance and enforcement, and to encourage a single EU market for AI.
Just like the GDPR, the EU AI Act has extraterritorial impact and will apply to businesses outside the EU. Businesses with no presence in the EU will be subject to the EU AI Act if certain conditions are met, for example when a provider places (directly or indirectly) an AI system or a GPAI model on the EU market.
For example, a non-EU car manufacturer falls within scope if it sells cars in the EU and those cars use an AI-enabled assisted braking system.
What should you do if you want to do business in the EU? You need to start with an assessment of your existing or in-development technology and systems and identify the risk associated with the type of AI system you are dealing with. You need to know whether your technology falls under one of the four risk categories: Unacceptable Risk, High Risk, Limited Risk, Minimal Risk.
Then you need to develop a compliance plan and to comply with the AI Act implementation calendar:
  • By February 2, 2025: providers and deployers of AI systems must take measures to ensure a sufficient level of AI literacy among their staff (Article 4), and the prohibited AI practices go into effect (Article 5), so "unacceptable risk" AI systems are banned from the EU market.
An unacceptable AI system is one that infringes the fundamental values of the EU: for example, a system that manipulates human behaviour and causes harm, a system used for social scoring or rating carried out by public organisations, or a system used to predict the criminal propensity of individuals based on a score.
  • By May 2, 2025: the EU codes of good practice for GPAI models must be ready. The codes shall determine whether GPAI models pose systemic risk and help developers, distributors, and deployers comply with AI Act requirements (Article 56).
  • By August 2, 2025: obligations on providers of GPAI models go into effect (Articles 53 and 55). These obligations are:
    • providing technical documentation
    • making publicly available detailed summaries about the content used for training
    • showing compliance with EU copyright law.
 
  • By August 2, 2026: obligations on high-risk AI systems listed in Annex III become applicable.
AI systems that are used as a component of a product, or that are themselves the product, where such products are regulated by the EU product safety legislation listed in Annex I of the Act, may be considered high-risk AI systems. In addition, AI systems in the areas listed in Annex III of the Act may be considered high-risk AI systems:
  • Biometrics
  • Critical infrastructure
  • Education and vocational training
  • Employment, workers management and access to self-employment
  • Access to and enjoyment of essential private services and essential public services and benefits
  • Law enforcement
  • Migration, asylum and border control management
  • Administration of justice and democratic processes
 
What are the requirements for providers of High Risk AI Systems?
  1. Risk Management and Data Governance (assessing and mitigating risks associated with the AI systems)
  2. Technical Documentation and Record-Keeping  
  3. Transparency and Human Oversight  
  4. Accuracy, Robustness, and Cybersecurity
  5. Conformity Assessment and CE Marking
  6. Appointing an authorised representative in the EU
 
What are the requirements for importers of High Risk AI Systems?
  1. Verification of Compliance (the relevant conformity assessment has been carried out for the high-risk AI system, the technical documentation is complete, and the system bears the required CE marking).
  2. Documentation and Record-Keeping (keeping copies of the certificate issued by the notified body, the instructions for use, and the EU declaration of conformity for 10 years after the system has been placed on the market).
  3. Contact Information: importers must indicate their name, registered trade name or trademark, and contact address on the high-risk AI system and its packaging or accompanying documentation.
  4. Storage & Transport Conditions: importers must ensure that storage and transport conditions do not jeopardize the system's compliance with the Act.
  5. Cooperation with Authorities
What are the requirements for distributors of High Risk AI Systems?
  1. Verification of Compliance   
  2. Non-Compliance Actions: if a distributor believes that a high-risk AI system does not meet the required standards, they must not sell it until it is brought into conformity.  
  3. Storage and Transport Conditions: distributors must ensure that storage and transport conditions do not jeopardize the system's compliance with the Act.
  4. Corrective Actions: if a distributor finds that a high-risk AI system they have sold is non-compliant, they must take corrective actions to bring it into conformity, withdraw it, or recall it. They must also ensure that the provider or importer takes the necessary actions.
  5. Cooperation with Authorities
 
On top of these obligations, do not forget to comply with local regulations. Each country will put in place specific guidelines and recommendations relating to the use and deployment of AI systems, and it is important to ensure that you also know and understand the local law requirements.
In another article we will elaborate on the compliance costs that should be expected.

This material is written by Doina Anghelescu Law Firm and is protected by copyright according to the legislation in force. 