Introduction to the AI Act for Medical Device Manufacturers

AI Act Insights for medical device manufacturers.

The AI Act will have a far-reaching impact on both current and future software development – particularly in the medical sector, where AI is one of the main drivers of innovation.

The AI Act is a comprehensive, horizontal (i.e. cross-sectoral) EU legal framework that aims to regulate all aspects of the development and use of artificial intelligence (AI). Companies developing medical devices and software will need to familiarize themselves with the new regulations to avoid delays and additional costs.

More regulation for more safety

The AI Act complements the already strict requirements of the MDR. While the MDR focuses on the physical safety and well-being of patients and users, AI and machine learning raise new concerns. The AI Act addresses these issues, in particular ethical questions and societal risks such as threats to fundamental rights, and thus adds a further layer of safety.

Similar to the MDR, the AI Act is divided into risk-based categories:

Prohibited AI: Applications that are considered an unacceptable risk to justice and freedom, such as social scoring, manipulation or surveillance in public spaces, are prohibited as of February 2025.

High-risk: AI systems with significant risks to safety or fundamental rights, in areas such as health, education and law enforcement. AI-assisted medical devices often fall into this category due to their potential direct impact on human health.

Limited risk: AI systems that pose fewer risks but still require transparency and supervision, such as chatbots that offer nutritional advice.

Minimal risk: AI systems with negligible impact, such as spam filters, which remain largely unregulated.

The four risk-based categories of the AI Act.

The MDR classifies medical devices into risk classes (Class I to Class III) based on their intended use and potential impact on patient health. For example, a medical device may remain Class I despite direct contact with the patient, while an AI-based diagnostic tool may fall into a higher class (Class IIa or IIb) due to its complexity and the consequences of its performance.

Act early despite unclear guidelines

Manufacturers of medical devices must prepare for the implementation of the AI Act and take necessary measures early.

With the AI Act having entered into force in August 2024, manufacturers must now prepare for its implementation and take the necessary measures early. One particular uncertainty is that the Commission will not publish its detailed guidance until Q4 2025, which is relatively late given the August 2026 deadline for full compliance.

It is therefore advisable to start thinking about:

  • how data will be used,
  • how AI systems will be trained,
  • how this should be documented.
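The three points above amount to keeping traceable, machine-readable records for every dataset used in development. As a purely illustrative sketch (all field names below are our own assumptions, not terms defined in the AI Act), such a provenance record could look like this:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DatasetRecord:
    """Illustrative provenance record for a training dataset."""
    name: str
    version: str
    collected_on: date
    source: str                  # e.g. clinical site, public registry
    intended_use: str            # which model or feature the data trains
    known_limitations: list[str] = field(default_factory=list)
    preprocessing_steps: list[str] = field(default_factory=list)

# Hypothetical example entry for an ECG training set
record = DatasetRecord(
    name="ecg-training-set",
    version="1.2",
    collected_on=date(2024, 5, 1),
    source="multi-site clinical study",
    intended_use="arrhythmia classifier training",
    known_limitations=["adult patients only"],
    preprocessing_steps=["resampling to 250 Hz", "baseline-wander removal"],
)
print(f"{record.name} v{record.version}: {record.intended_use}")
```

Keeping such records alongside the technical documentation makes it easier to demonstrate later that the datasets used for training met the requirements in force at the time of conformity assessment.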

For example, manufacturers who want to have their high-risk AI systems approved in two years’ time need to ensure now that the datasets used will meet future requirements. Otherwise, there is a risk of non-compliance, which could lead to significant delays.

As we at Corscience have always placed great emphasis on patient safety and health, we have in recent years begun implementing artificial intelligence in many areas of our organisation, such as software development, automated testing and AI-based post-market support, and are using it to improve our systems. We recommend this approach to all our customers and manufacturers. To facilitate the regulatory implementation in your company, the next part of ‘AI Act – Insights’ will give you tips on how to prepare for the next steps.
