United Kingdom: Doctors and public health experts have warned that artificial intelligence (AI) could harm human health and may even pose an existential threat to humanity, calling for a halt to the development of artificial general intelligence until it is regulated.
In an article published in the journal BMJ Global Health, health professionals from the UK, US, Australia, Costa Rica, and Malaysia stated that AI could harm the health of millions through its effects on the social determinants of health: the control and manipulation of people, the use of lethal autonomous weapons, and the mental health consequences of mass unemployment should AI-based systems displace large numbers of workers.
Although AI has the potential to revolutionise healthcare by improving the diagnosis of diseases and extending care to more people, its risks in medicine and healthcare include errors that could harm patients, data privacy and security issues, and uses of AI that could worsen social and health inequalities.
One example of harm they cited was an AI-driven pulse oximeter that overestimated blood oxygen levels in patients with darker skin, resulting in the undertreatment of their hypoxia. The authors warned that the development of AI must be regulated to prevent such harms and to guard against the broader existential threat to humanity.
Threats also arise from the job losses expected to accompany the widespread deployment of AI, with estimates ranging from tens of millions to hundreds of millions of jobs lost over the coming decade.
“While there would be many benefits from ending work that is repetitive, dangerous and unpleasant, we already know that unemployment is strongly associated with adverse health outcomes and behaviour,” the group noted.
“Furthermore, we do not know how society will respond psychologically and emotionally to a world where work is unavailable or unnecessary, nor are we thinking much about the policies and strategies that would be needed to break the association between unemployment and ill health,” the statement added.