
Expansion of AI Regulation Reaches into European Healthcare Sectors

Scrutinising the possible impact of the EU's sector-neutral AI Act on Europe's public health and patient care services


The European Union (EU) is on the brink of a significant step forward in regulating Artificial Intelligence (AI) systems with the pending final approval of the EU AI Act by the Council of the EU. This regulation, approved by the European Parliament in March 2024, aims to ensure the development and use of AI systems that are trustworthy, inclusive, and respectful of individuals' health, safety, and fundamental rights.

At the heart of the EU AI Act is the requirement for high-risk AI systems to have a robust risk management system. This system must identify and analyse known and reasonably foreseeable risks to health, safety, or fundamental rights. Persons affected by decisions made with these high-risk AI systems have the right to receive clear explanations regarding the AI system's contribution to the decision and its potential adverse impacts.

The regulation also follows the ethics guidelines for trustworthy AI, which emphasise diversity, non-discrimination, and fairness. It prohibits the use of AI systems that materially distort human behaviour and thereby cause significant harm to physical or psychological health, or to financial interests, with an exception for lawful practices in the field of healthcare.

The use of biometric data to draw inferences about individuals raises concerns around discrimination and the infringement of rights. The regulation nevertheless does not ban this practice for AI systems placed on the market strictly for medical or safety reasons.

High-risk AI systems, as defined by the regulation, undergo a specific assessment to determine their potential adverse impact on health, safety, and fundamental rights. These systems include autonomous robots in manufacturing and personal care, advanced diagnostic systems in healthcare, and other AI systems that operate in complex environments to ensure safety and accuracy in critical tasks.

The EU AI Act prioritises inclusive and diverse design and development of AI systems, including access for persons with disabilities. It also requires that AI systems should be developed and used in a way that promotes equal access, gender equality, and cultural diversity, while avoiding discriminatory impacts and unfair biases.

Under the EU AI Act, compliance with accessibility requirements should be achieved by design. Instructions for use of high-risk AI systems should cover potential risks to health, safety, and fundamental rights arising from foreseeable uses and reasonably foreseeable misuses.

The regulation places obligations upon the European Artificial Intelligence Office and Member States to ensure diversity and inclusion (D&I) aims are met. It also delineates a boundary between its regulatory scope and AI-driven medical practices within Member States to ensure it does not impede treatments at the national level.

AI systems intended to detect the emotional state of individuals in workplace and education settings are forbidden under the EU AI Act. The regulation also prohibits the placing on the market, putting into service, or using an AI system that exploits vulnerabilities of a person or a specific group of persons due to their age, disability, or a specific social or economic situation.

The EU AI Act is significant for life sciences and healthcare companies as it applies to various aspects of their operations, including production, clinical studies, post-marketing activities, and day-to-day tasks like recruitment and legal affairs.

The next article in the series on AI will focus on the regulation's risk-based approach, with a focus on (bio)pharmaceuticals, medical devices, and in vitro diagnostics. Stay tuned for more insights on how the EU AI Act is shaping the future of AI in life sciences and healthcare.
