The EU AI Act is here: requirements for healthcare organizations

Understanding the AI Act: Implications for Healthcare Organizations

The Artificial Intelligence Act (AI Act) has now been officially published after extensive discussions, voting, and amendments. The legislation enters into force on 1 August 2024 and applies to all AI systems placed on the market or put into service in the European Union, including those used in healthcare. It is crucial to understand that this law affects not only AI vendors but also imposes obligations on healthcare organizations.

Protecting citizens and promoting trustworthy AI

The AI Act is meant to protect EU citizens from the potential harmful effects of AI systems. Some AI practices are prohibited outright by the law, such as manipulative systems or facial recognition databases built by untargeted scraping of facial images from the internet or CCTV footage. AI systems classified as 'high risk' have to meet specific requirements set out in the Act to ensure their safe and responsible use.


"

AI Act, Article 1

The purpose of this Regulation is to improve the functioning of the internal market and promote the uptake of human-centric and trustworthy artificial intelligence (AI), while ensuring a high level of protection of health, safety, fundamental rights enshrined in the Charter, including democracy, the rule of law and environmental protection, against the harmful effects of AI systems in the Union and supporting innovation.

"


High-risk AI systems in healthcare

You may be familiar with the Medical Device Regulation (MDR), which lays down rules for all medical devices in the European Union. Vendors have to comply with these rules before their products can be sold and clinically used in Europe. (If you want to know more, you can check out this easy-to-read blog post about the MDR.) All medical software that currently requires CE marking under the MDR and utilizes AI is considered a high-risk AI system under the AI Act.

This has consequences for both the manufacturer and the user of the AI system. While the MDR primarily holds manufacturers responsible, the AI Act also places obligations on users, referred to as 'deployers' in the Act.

Responsibilities of deployers


"

AI Act, Article 3(4)

‘deployer’ means a natural or legal person, public authority, agency or other body using an AI system under its authority except where the AI system is used in the course of a personal non-professional activity;

"


Deployers must ensure, to their best extent, that all people affected are sufficiently informed and trained to use the AI systems responsibly, a concept known as AI literacy. The level of AI literacy required depends on the context and role. For clinical users, this may involve understanding how to interpret the AI system’s output. Other responsibilities for healthcare organizations include monitoring AI solutions and record-keeping, which may necessitate IT personnel or medical physicists becoming AI literate. Management and leadership roles also need to enhance their AI literacy as they are often involved in implementation decisions and governance.

Ensuring AI literacy


"

AI Act, Article 4

AI literacy

Providers and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training and the context the AI systems are to be used in, and considering the persons or groups of persons on whom the AI systems are to be used.

"


As AI becomes increasingly prevalent in software products, healthcare organizations will inevitably encounter the obligations set out in the AI Act. Becoming familiar with the fast-moving and growing field of healthcare AI was of course already a good idea. Now, with the AI Act, it is more important than ever to get everybody up to speed, whether they are involved in the procurement, implementation, or actual use of AI-based medical software.

Enhancing AI literacy in your organization

There are numerous online courses available to learn about AI. We previously wrote a blog on Health AI Register listing multiple online self-paced courses.

However, you may want something more applicable to healthcare, to your organization, and to the different groups of people within it. Through our advisory branch Romion Health, we offer a modular program that we can tailor to your time and needs.

Check out the curriculum here.

By staying informed and proactive, healthcare organizations can meet the demands of the AI Act and ensure the responsible adoption of AI, ultimately enhancing patient care and safety.


**Bonus tip from our clinical writer Noa Antonissen**

There is a dedicated ChatGPT environment in which you can ask questions about the legal text. Very convenient if you want to check some details quickly!