Do you want to have an impact on moral issues raised by the deployment of new digital health technologies by healthcare providers and patients? Do you want to be a part of a leading European school and build your own team as professor in Digital Health Ethics? Then we might have an opportunity for you.
Erasmus School of Health Policy & Management (ESHPM), part of Erasmus University Rotterdam (EUR), is a leading European school in health policy and management, with a strong presence in health economics and in health services and systems research.
In partnership with Erasmus Medical Centre and Erasmus School of Philosophy we are now recruiting a new:
Professor of Health Ethics in a Technological Society (1.0 FTE)

The position is created through endowed funding from the Royal Dutch Medical Association (KNMG).
Job description

Digital Health Ethics
The specific focus of this Chair is on the particular moral issues raised by the deployment of new digital health technologies by healthcare providers, physicians in particular, and (future) recipients of care ("patients"). This holds for both the individual doctor-patient relationship and the doctor-community relationship. The Royal Dutch Medical Association (KNMG) has acknowledged these growing moral issues and their impact on care delivery in transitioning health systems. The KNMG has therefore initiated this chair, to be appointed by Erasmus University Rotterdam.
Digital health technology raises moral questions in relation to each of the identified physician roles. Leading up to 2040, the medical practitioner will become increasingly familiar with eHealth and digital care, AI, and robots. This raises important questions of responsibility, for example: can practitioners still be fully responsible (and liable!) for their actions if digital health technology becomes an important and partly independent actor? The physician as interpreter (as "human interface") must coach patients, or communities, by interpreting digital health information, misinformation and disinformation about care and prevention. But where is the line between (neutral) coaching and influencing or nudging? And how does the physician remain the "supervisor" of AI (chatbots, predictive algorithms such as PacMed Critical – Discharge) and robots?
Digital health technology also raises questions in relation to patients. If medical technology does indeed become a more or less independent actor, what are the moral implications for patients? Where does their responsibility begin, and where does it end? And, of course, patients are in danger of being exposed to misinformation and disinformation. To what extent are patients themselves responsible for interpreting health information? New digital health technologies can help people live healthier lives, assist patients in shared decision-making, strengthen their autonomy, and ensure access to better healthcare. But will that be true for everyone? With the integration of advanced digital health technologies into care and prevention, moral questions arise regarding accountability and oversight. Who is responsible if a technology fails or causes harm? Ensuring clear lines of responsibility and accountability is essential for ethical decision-making and patient safety. Professional codes of conduct, regulatory frameworks, and independent oversight can help maintain responsible digital health practices.
Aim of the Chair
- Promoting research, education, and knowledge transfer in digital health ethics, by addressing the moral challenges faced by physicians and patients in their respective roles at different levels;
- More specifically: exploring the moral aspects of digital health linked to the roles of the physician and the patient, and integrating them into education, practice, and policy. This means the Chair will focus on the education and practices of those who are (or will be) active in healthcare not only as physicians, but also as managers, directors, governors, and healthcare policymakers;
- Building connections and collaborating with experts at various schools and campuses within the framework of the following guiding Erasmus Initiatives: Dynamics for Inclusive Prosperity, Smarter Choices for Better Health, Vital Cities and Citizens, and the Societal Impact of AI;
- Closely cooperating with the Chair of Law, Healthcare Technology, and Medicine (Amsterdam University Medical Center);
- Combining impact-driven research and teaching (academic excellence) with societal interaction and engagement;
- Building a research group, including Ph.D. candidates, and actively seeking the necessary funding for research;
- While safeguarding scientific independence, establishing connections with the KNMG. This collaboration can take various forms, such as contributing to the development of policies and guidelines for doctors regarding the ethical aspects of digital health technology; advising on concrete ethical questions about the use of digital health technology by practicing doctors; and regular, structural consultation between the Chair and the ethics department of the KNMG.