Legislative Council - Fifty-Fifth Parliament, First Session (55-1)
Thursday, 6 February 2025

Medical Diagnosis, Artificial Intelligence

The Hon. F. PANGALLO (14:59): I seek leave to make a brief explanation before asking the Attorney-General, representing the Minister for Health in the other place, a question about the use of ChatGPT in medical diagnosis.

Leave granted.

The Hon. F. PANGALLO: There will also be a legal question in there for the Attorney. This morning I was contacted by a constituent who yesterday attended a GP appointment with her sick mother. Since December, her mother has been in and out of both her local GP's office and the Flinders Hospital emergency department, being misdiagnosed on multiple occasions.

During yesterday's consultation the constituent observed the GP entering her mother's ongoing symptoms into ChatGPT and engaging in a back-and-forth exchange to determine a diagnosis and appropriate treatment plan. When questioned, the GP confirmed they were using ChatGPT to assist in making clinical decisions. This is quite concerning, and you have to wonder at the accuracy of the diagnosis and any consequences that could follow.

I say this because a well-known person I know tested ChatGPT's accuracy by requesting a CV about themselves, and it returned numerous inaccuracies. My questions to the minister are:

1. Given the well-known risk that these AI applications produce incorrect information, does the minister believe it is appropriate for doctors to rely on AI-driven applications like ChatGPT for medical diagnosis and treatment advice?

2. What guidance, if any, has been issued by the government to medical professionals?

3. Regarding the use of AI tools like ChatGPT in patient care, what steps are being taken by the government to safeguard patients?

And one for the Attorney-General:

4. What guidance, if any, has the Attorney-General and the government issued to public servants and Crown law about the use of AI technology?

The Hon. K.J. MAHER (Minister for Aboriginal Affairs, Attorney-General, Minister for Industrial Relations and Public Sector, Special Minister of State) (15:01): I thank the honourable member for his question; it is a very important question. Certainly, when people are getting treatment for themselves and their loved ones it can be a very difficult and distressing time. When things don't go as they should, that only compounds what can be some of the most traumatic parts of people's lives: when loved ones are undergoing serious medical treatments and receiving diagnoses.

The interaction of new technologies with how we do everything in life is a field that is changing very rapidly. We are seeing, with machine learning and iterative artificial intelligence, exceptionally dramatic changes in many fields, and I think that will only increase. It is something that, as technology changes, governments and we in this chamber as legislators necessarily need to grapple with.

We saw only in the last week a new appointment to the executive of the South Australian government, with the Member for Florey, Michael Brown, appointed as parliamentary secretary with particular responsibility for looking at issues around artificial intelligence. I think that shows how seriously we, as the South Australian government, are taking how new technology is applied to what we do as governments and how we interact in society generally. In my area, we recently released a discussion paper looking at whether a person should have some sort of right of ownership of their likeness and their image, in terms of how artificial intelligence is used to create what are commonly known as deepfakes. So, yes, these are areas that are developing.

Certainly, in relation to this particular technology's use in health settings, I will be happy to take that on notice for the honourable member, refer it to the health minister and bring back a reply.