More people are asking generative AI questions about their health. But the wrong answer can be risky

More people are turning to generative artificial intelligence (AI) to help them in their everyday and professional lives. ChatGPT is one of the most popular and widely available generative AI tools. It gives tailored, plausible answers to any question, for free.

Generative AI tools have great potential to help people learn about their health. But the answers are not always correct. Relying solely on ChatGPT for health advice can be risky and may lead to unnecessary care.

Generative AI is still a relatively new technology and is changing rapidly. Our new study provides the first Australian data on who is using ChatGPT to answer health questions, and for what purposes.

The results can help people use this new technology for their health, and identify the new skills needed to use it safely – in other words, to build "AI health literacy".

Who uses ChatGPT for health? What are they asking about?

In June 2024, we asked a nationally representative sample of more than 2,000 Australians whether they had used ChatGPT to answer health questions.

One in ten (9.9%) had asked ChatGPT a health question in the first half of 2024.

On average, they reported trusting ChatGPT "somewhat" (3.1 out of 5).

We also found the proportion of people using ChatGPT for health was higher among people with low health literacy, and among those who were born in a non-English-speaking country or spoke another language at home.

This suggests ChatGPT may be supporting people who find it difficult to engage with traditional forms of health information in Australia.

One in ten Australians asked ChatGPT a health question in the first half of last year.
Kampus Productions/Pexels

The most common questions people asked ChatGPT related to:

  • learning about a health condition (48%)
  • finding out what symptoms mean (37%)
  • asking about actions to take (36%)
  • understanding medical terms (35%).

Over half (61%) asked at least one question that would usually require clinical advice. We classified these questions as "higher risk". Asking ChatGPT what your symptoms mean can give you a general idea, but it cannot replace clinical advice.

People who were born in a non-English-speaking country, or who spoke another language at home, were more likely to ask these higher-risk questions.

Why does it matter?

The number of people using generative AI for health information is likely to grow. In our study, 39% of people who had not yet used ChatGPT for health said they would consider doing so in the next six months.

The total number of people using generative AI tools for health information is likely even higher once you account for other tools, such as Google Gemini, Microsoft Copilot and Meta AI.

Notably, our study showed people from culturally and linguistically diverse communities may be using ChatGPT for health information more often.

If they are asking ChatGPT to translate health information, this adds another layer of complexity. Generative AI tools are generally less accurate in languages other than English.

We need investment in services (both human and machine) to ensure that speaking another language is not a barrier to high-quality health information.

What does "AI health literacy" look like?

Generative AI is here to stay, presenting both opportunities and risks for people who use it for health information.

On the one hand, this technology appeals to people who already face significant barriers to accessing health care and health information. One of its key benefits is its ability to provide health information instantly, in a form that is easy to understand.

A recent review of research found generative AI tools are increasingly able to answer general health questions in plain language, although they were less accurate on complex health topics.

This has clear benefits, because most health information is written at a level that is too complex for the general population, including during the pandemic.

On the other hand, people are turning to general-purpose AI tools for health advice. This is riskier for questions that require clinical judgement and a broader understanding of the patient.

There have already been case studies showing the dangers of using general-purpose AI tools to decide whether or not to go to hospital.

Where else can you get this information?

We need to help people think critically about the questions they ask AI tools, and connect them with the appropriate services that can answer these higher-risk questions.

Organisations such as Healthdirect provide a free national helpline where you can speak with a registered nurse about whether to go to hospital or see a doctor. Healthdirect also provides an online symptom checker tool to help you work out your next steps.

While many Australian health agencies are developing policies for artificial intelligence, most focus on how health services and staff engage with this technology.

We urgently need to equip our community with AI health literacy skills. This need will grow as more people use AI tools, and it will also change as AI evolves.
