In Focus

From code to cure: Could ChatGPT be your future doctor?


In a tech-savvy world where patients often turn to Google and social media forums for advice about their health-related problems, it’s likely that ChatGPT will be inundated with medical queries in the coming years

Since its launch in November 2022, ChatGPT has taken the world by storm, sparking discussion across social media platforms, web forums, and the media about its potential to disrupt various industries, including healthcare. And now that OpenAI has released GPT-4, a more powerful version of its language models that can even accept visual inputs, the big question is whether the healthcare industry is ready for this chatbot driven by artificial intelligence.

“In a normal scenario, when a patient needs clinical advice, they choose a doctor based on their own assessment, knowledge, and word of mouth, and the accountability for treatment lies with the doctor. ChatGPT, on the other hand, is a data repository drawn from thousands of doctors and millions of patients from around the world. It may be an intelligent database, but it is a doctor without any face or accountability,” explains Sudhir Bahl, Founder and Chairman of LifeForce Health Systems.
  
For physicians, this development is highly relevant: AI algorithms could help identify patterns and abnormalities not visible to the human eye when analyzing images such as X-rays, MRIs, and CT scans. This is just one example of how AI could speed up the diagnostic process, leading to earlier intervention, more accurate diagnoses, and improved outcomes for patients.

That said, according to Sudhir Bahl, a senior healthcare executive with over three decades of management and entrepreneurial experience, ChatGPT can never replace human healthcare professionals. 

“The physician will of course be required to conduct a sanity check and make the final decision. More so, because AI can give you a very accurate diagnosis, but it will never tell you how it reached this diagnosis. Thus, in a supportive role, ChatGPT can assist doctors in making more informed decisions, as well as support their education and continuous learning,” he adds.

Reshaping patient care

In a tech-savvy world where patients often turn to Google and social media forums for advice about their health-related problems, it’s likely that ChatGPT will be inundated with medical queries in the coming years.

“Chatbots will make patient education easier because currently when a patient googles any information, all the content they access may not be clinically right or authentic, whereas ChatGPT in future may be able to ensure the validity and authenticity of the clinical information to some extent,” Bahl points out.
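To give a sense of what such a guarded patient-education query might look like in practice, here is a minimal, illustrative sketch. It is not drawn from the article or from any cited deployment; the model name, the prompt wording, and the use of the OpenAI Python client are assumptions made purely for illustration.

```python
# Illustrative sketch only: a patient-education query with explicit guardrails.
# Assumes the OpenAI Python SDK (v1+) and an OPENAI_API_KEY in the environment;
# the model name and prompt wording are placeholders, not a vendor recommendation.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a patient-education assistant. Answer in plain language, "
    "refer to well-established medical guidance where possible, state clearly "
    "when you are unsure, and always advise consulting a qualified doctor "
    "before acting on the information."
)

def patient_education_answer(question: str) -> str:
    """Return a guarded, plain-language answer to a patient's question."""
    response = client.chat.completions.create(
        model="gpt-4",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
        temperature=0.2,  # keep answers conservative and repeatable
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(patient_education_answer("What does a high HbA1c result mean?"))
```

The point of the sketch is the system prompt: the guardrails that Bahl describes (validity, authenticity, deferring to a clinician) would have to be designed in deliberately rather than assumed from the model itself.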

Other medical possibilities 

While many physicians are enthusiastic about using ChatGPT for low-risk tasks, medical researchers can also benefit from the chatbot. It can be used for tasks such as summarizing text, so long as the user knows that the bot may not be 100 percent correct and can generate biased results. However, the journal Nature has stated in its recent guidelines that no AI program can be credited as an author because “any attribution of authorship carries with it accountability for the work, and AI tools cannot take such accountability.”

“ChatGPT is a tool that can assist a medical researcher in understanding the nuances of medical language; however, it cannot verify the authenticity of, or identify the intellectual property rights associated with, certain medical processes,” says Dr Tejinder Kataria, Chairperson, Radiation Oncology, Cancer Center, Medanta-The Medicity, Gurugram.

Meanwhile, a European working group on artificial intelligence (WG-AI) looked at the capabilities of ChatGPT as an aid in medical and cancer diagnostics and treatment. The group generated 10 laboratory results for ChatGPT to interpret and concluded that the chatbot could interpret each test independently but could not provide a consolidated report or an interpretation of clinical significance. ChatGPT’s responses were superficial, and the AI engine could not make meaningful suggestions for follow-up tests or examinations based on deviations from the set reference limits.

“ChatGPT in its current form is not trained for the interpretation of medical tests or for drawing clinical conclusions from laboratory tests or case records. It may be considered a tool for identifying deviations in laboratory tests on a test-by-test basis, rather than giving a diagnostic interpretation, in its present avatar,” added Dr Kataria.

While ChatGPT offers many benefits, its use in medicine also poses several technological risks, including security, privacy, and algorithmic bias. It also raises ethical concerns about the interplay between machines and humans.

“Other than the ethical, privacy and legal concerns, a few other drawbacks associated with ChatGPT, I would say, are limited contextual understanding and inadequate clinical judgement,” Bahl says.

However, weighing the advantages and disadvantages associated with the new-age chatbot, future generative AI applications are still expected to empower patients and transform healthcare for the better. These tools could help patients receive timely care, especially in remote villages and underserved areas where there is a shortage of physicians and hospitals.

“ASHA [Accredited Social Health Activist] workers in India are doing great work and are bridging the healthcare gap in places both rural and urban. If equipped with the relevant information and accurate guidelines on how to use ChatGPT, they might play an even bigger role in helping their communities better,” says Bahl.

For now, as technologies like ChatGPT continue to evolve and become more sophisticated, the healthcare industry is likely to see an increase in the adoption of similar tools, not least because, as large data repositories, these chatbots can be used for medical information and patient education.

