From 2019 to 2022, I was part of a research project at CSIRO investigating how stakeholders understood ethical responsibility for AI-designed surgical tools that would be attached to surgical robots. So far this project has produced three papers: a theoretical paper examining how ethical responsibility should be determined for physical products designed by generative AI (and in particular, evolutionary algorithms); a paper examining stakeholders’ views on ethical responsibility for AI-designed surgical tools; and a paper examining stakeholders’ views on the ethical risks of using AI-designed surgical tools.
This project was also my first work using qualitative research methods.
The official project page can be found here: Responsibility for Bespoke 3D Printed Surgical Robots (CSIRO).
I was also interviewed for the CSIRO Responsible Innovation Future Science Platform blog about this project: Who Bears Responsibility When AI Systems Go Wrong? (CSIRO).
References
2023
Ethical risks of AI-designed products: bespoke surgical tools as a case study
David M. Douglas, Justine Lacey, and David Howard
AI and Ethics, Jun 2023
An emerging use of machine learning (ML) is creating products optimised using computational design for individual users and produced using 3D printing. One potential application is bespoke surgical tools optimised for specific patients. While optimised tool designs benefit patients and surgeons, there is the risk that computational design may also create unexpected designs that are unsuitable for use with potentially harmful consequences. We interviewed potential stakeholders to identify both established and unique technical risks associated with the use of computational design for surgical tool design and applied ethical risk analysis (eRA) to identify how stakeholders might be exposed to ethical risk within this process. The main findings of this research are twofold. First, distinguishing between unique and established risks for new medical technologies helps identify where existing methods of risk mitigation may be applicable to a surgical innovation, and where new means of mitigating risks may be needed. Second, the value of distinguishing between technical and ethical risks in such a system is that it identifies the key responsibilities for managing these risks and allows for any potential interdependencies between stakeholders in managing these risks to be made explicit. The approach demonstrated in this paper may be applied to understanding the implications of new AI and ML applications in healthcare and other high consequence domains.
2022
Ethical responsibility and computational design: bespoke surgical tools as an instructive case study
David M. Douglas, Justine Lacey, and David Howard
Ethics and Information Technology, Feb 2022
Computational design uses artificial intelligence (AI) to optimise designs towards user-determined goals. When combined with 3D printing, it is possible to develop and construct physical products in a wide range of geometries and materials and encapsulating a range of functionality, with minimal input from human designers. One potential application is the development of bespoke surgical tools, whereby computational design optimises a tool’s morphology for a specific patient’s anatomy and the requirements of the surgical procedure to improve surgical outcomes. This emerging application of AI and 3D printing provides an opportunity to examine whether new technologies affect the ethical responsibilities of those operating in high-consequence domains such as healthcare. This research draws on stakeholder interviews to identify how a range of different professions involved in the design, production, and adoption of computationally designed surgical tools identify and attribute responsibility within the different stages of a computationally designed tool’s development and deployment. Those interviewed included surgeons and radiologists, fabricators experienced with 3D printing, computational designers, healthcare regulators, bioethicists, and patient advocates. Based on our findings, we identify additional responsibilities that surround the process of creating and using these tools. Additionally, the responsibilities of most professional stakeholders are not limited to individual stages of the tool design and deployment process, and the close collaboration between stakeholders at various stages of the process suggests that collective ethical responsibility may be appropriate in these cases. The role responsibilities of the stakeholders involved in developing the process to create computationally designed tools also change as the technology moves from research and development (R&D) to approved use.