A doctor recently shared her experience dealing with a stubborn foreign male patient who refused to follow proper medical procedures because he was overly reliant on recommendations from the artificial intelligence (AI) application, ChatGPT.
The incident occurred when the patient visited her clinic requesting specific injections and antibiotics.
In a video shared on her TikTok account, Dr. Ruusydina said the patient refused to pay the consultation fee, claiming he already knew what treatment he needed.
“The patient insisted on getting the same medication and injection recommended by the ChatGPT application.
“However, as a doctor, I need to know the medical history and symptoms before providing any treatment,” she said.
The examination revealed that the patient had been experiencing discharge from his private parts for several months, symptoms most likely related to a gonorrhoea infection, a sexually transmitted disease.
The doctor explained that if it was indeed gonorrhoea, the patient’s sexual partners would also need to receive treatment to prevent reinfection.
“I asked, ‘This might be gonorrhoea, which is a sexually transmitted infection. Do you have a regular partner?’ Then he answered, ‘No, no, sister. I’m not married yet.’
“I don’t care whether you’re married or not; I just want to know how many partners you have. By that time, it was already late and the clinic was about to close,” she shared.
She added that the patient stubbornly insisted on taking only the antibiotics recommended by ChatGPT, even rejecting her suggestion of an alternative medication with broader treatment coverage.
“He still wanted to follow what ChatGPT recommended, as if those of us who studied medicine for years don’t know better. AI can indeed help, but it’s not the ultimate reference for medical treatment,” the doctor said.
She emphasized that AI tools like ChatGPT should serve as a learning aid or basic guidance, not as an absolute reference to the point of rejecting professional medical advice.
“Reading journals, referring to books, that’s the real foundation. AI can help, but it still needs clinical validation. If you follow it blindly, you can endanger yourself,” she stressed.