Roles and responsibilities

The organization's management holds overall responsibility for ensuring compliance with, among other things, requirements for quality assurance of the health service, cf. the Regulations on management and quality improvement in the health and care services [143]. The AI Act requires providers and deployers to have a quality management system in place for high-risk AI systems, cf. Article 17 [144]. As the AI Act will enter into force gradually, the organization must prepare to comply with the new regulations. Existing management systems should therefore be extended to include quality management for AI systems. The AI management system standard ISO/IEC 42001 can serve as a useful tool for leadership [145].

The operation and use of AI systems may require the creation of new roles and responsibilities to ensure safe and effective implementation. If the organization has acquired a high-risk AI system, several areas must be addressed in accordance with the AI Act:

  • The AI system must, like medical devices, be used in accordance with the supplier's instructions for use [146]
  • Human oversight
  • Transparency and information to users of the system (applies to all AI systems)
  • Monitoring the AI system in operation, including management of logs and deviations
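Monitoring in operation typically means recording each AI result in a structured log and flagging entries that need follow-up. A minimal sketch of such deviation logging is shown below; the function name, fields, and confidence threshold are illustrative assumptions, not requirements from the AI Act or any specific system:

```python
import json
import logging
from datetime import datetime, timezone

# Illustrative sketch: record each AI result as a structured log entry
# and flag deviations (here: low model confidence) for follow-up.
logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ai-monitoring")

CONFIDENCE_THRESHOLD = 0.7  # assumed local acceptance criterion

def record_ai_result(case_id: str, output: str, confidence: float) -> dict:
    """Build a structured log entry and mark it as a deviation if needed."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "case_id": case_id,
        "output": output,
        "confidence": confidence,
        "deviation": confidence < CONFIDENCE_THRESHOLD,
    }
    log.info(json.dumps(entry))
    return entry

entry = record_ai_result("case-001", "finding: nodule", 0.55)
print(entry["deviation"])  # low confidence -> flagged as deviation
```

Structured (e.g. JSON) log entries like these make it easier to audit system behaviour afterwards and to feed deviation statistics back into quality management.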

Human oversight is described in Article 14 of the AI Act as a requirement for high-risk AI systems. It mandates that such systems be designed and developed in a way that allows humans to effectively oversee them. This includes understanding the capabilities and limitations of the AI system, being able to override its outputs, monitoring performance, detecting and managing deviations, and understanding how its use can lead to automation bias [147] [148]. This role must be assigned to a person with the necessary competence, support, and authority.

  • At Vestre Viken Hospital Trust, for example, designated AI physicians (AI radiologists) have been appointed with specific expertise in the AI system used in clinical practice.

In larger organizations, it may be appropriate to establish an interdisciplinary group with both operational and strategic responsibility to ensure effective system management and governance. Such a structure provides a forum for discussing system performance, identifying risks and areas for improvement, and a clear framework for decision-making and communication. An example of a group staffed with broad expertise and clear responsibilities:

  • Daily operations: the operations manager, systems manager, and IT personnel play key roles in ensuring continuous and stable operation, with technical responsibility for performance, infrastructure, and security. They coordinate technical updates, log maintenance, and integration with other systems.
  • Domain expertise for relevance: clinical experts and/or super-users are essential to ensure that the AI system works as expected. They act as the link between technical and clinical teams by translating clinical needs into technical requirements, and they can provide training and support to other users of the AI system and report deviations or limitations.
  • Compliance and risk: professionals in privacy, information security, and legal affairs are more central to overall governance than to day-to-day operations. They ensure patient trust and compliance with legislation and agreements.
  • Information management: responsible for keeping the organization's own information in order. Good information management ensures that the data used by the AI system is accurate, relevant, and of high quality, which helps manage and avoid data bias and data drift.
  • Strategic management: the system owner has overall responsibility for ensuring that the system delivers value and complies with the organization's goals and applicable regulations.
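The data-drift concern mentioned under information management can be made concrete with a very simple statistical check. The sketch below compares the mean of a recent feature sample against a reference sample and flags drift when the shift exceeds a chosen number of standard deviations; the feature, samples, and threshold are all illustrative assumptions:

```python
from statistics import mean, stdev

# Illustrative data-drift check: flag drift when the recent mean
# deviates from the reference mean by more than z_limit reference
# standard deviations. Real monitoring would use richer tests
# (e.g. population stability index) per feature.

def drifted(reference: list[float], recent: list[float], z_limit: float = 2.0) -> bool:
    """Return True if the recent sample mean has drifted from the reference."""
    ref_mean = mean(reference)
    ref_sd = stdev(reference)
    if ref_sd == 0:
        return mean(recent) != ref_mean
    z = abs(mean(recent) - ref_mean) / ref_sd
    return z > z_limit

# Hypothetical example: patient age distribution shifts markedly upwards.
reference_ages = [52.0, 60.0, 55.0, 58.0, 61.0, 54.0]
recent_ages = [75.0, 78.0, 80.0, 77.0]

print(drifted(reference_ages, recent_ages))  # -> True
```

Routine checks like this give the interdisciplinary group an objective trigger for reviewing whether the AI system still performs as validated.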

See also the RACI matrix in Phase 1 and AI fact sheet 0.

[146] For example, if the organization makes significant changes to the AI system or does not use it in accordance with the instructions for use, it will assume the role of manufacturer and be obliged to comply with the requirements imposed on manufacturers under both the Medical Devices Regulation and the AI Act, cf. Article 16 of the MDR and Article 25 of the AI Act.

[147] Article 14 of the AI Act: Human oversight (eur-lex.europa.eu)

[148] Automation bias: the tendency to automatically trust or over-rely on the output of an AI system that provides decision support.

Last update: 23 May 2025