The doctor’s waiting room has always been a place of anticipation, but the dynamics of consultations are changing dramatically. Patients arriving at appointments armed with researched information are nothing new, but the emergence of artificial intelligence (AI) tools such as ChatGPT is transforming the doctor-patient interaction. Patients’ confident presentation of this information can leave physicians feeling that their expertise is being challenged.
A doctor recalled a specific case in which a patient visited his clinic reporting dizziness and described her symptoms with unusual precision: "It’s not vertigo, but more like a presyncope feeling." She then suggested that a tilt table test might be useful for diagnosis. Curious, he asked whether she worked in the healthcare sector. She replied that she had consulted ChatGPT, which recommended the test.
While patients have long brought newspaper clippings, internet research, or advice from friends and relatives to consultations, this particular encounter was different. The patient’s tone and level of detail conveyed competence, and the confidence with which she presented the information subtly challenged his clinical judgment and treatment plans.
Clinical Practice and the AI Challenge
It is not surprising that large language models (LLMs), such as ChatGPT, are appealing. Recent studies have confirmed their remarkable strengths in logical reasoning and interpersonal communication. However, a direct comparison between LLMs and physicians is unfair. Clinicians often face immense pressure, including constrained consultation times, overflowing inboxes, and a healthcare system that demands productivity and efficiency. Even skilled professionals struggle to perform optimally under adverse conditions.
In contrast, generative AI operates under none of these constraints. This imbalance creates an unrealistic benchmark, yet it is today’s reality. Patients want clear answers, and more importantly, they want to feel heard, understood, and reassured.
Despite the capabilities of generative AI, patients still visit doctors. Although these tools deliver confidently worded suggestions, they inevitably conclude: “Consult a healthcare professional.” Ultimate responsibility, including legal liability, for diagnoses, prescriptions, and sick notes remains with physicians.
In practice, this means dealing with requests such as a tilt table test for intermittent dizziness, a procedure that is not uncommon but often inappropriate. The doctor noted: “I find myself explaining concepts such as overdiagnosis, false positives, or other risks of unnecessary testing. At best, the patient understands the ideas, which may not resonate when one is experiencing symptoms. At worst, I sound dismissive.” He added that there is no function that tells ChatGPT that clinicians lack routine access to certain tests or that echocardiogram appointments are delayed because of staff shortages. “I have to carry those constraints into the examination room while still trying to preserve trust,” he emphasized.
There is also a concern about a new kind of paternalism creeping in. The old line, “They probably WebMD’d it and think they have cancer,” has morphed into “They probably ChatGPT’d it and are going to tell us what to order.” This attitude often reflects defensiveness from clinicians rather than genuine engagement, and it carries an implicit message: “We still know best.” He concluded that this attitude “risks eroding the sacred and fragile trust between clinicians and patients.”
Patient Advocacy
One patient told him plainly, “This is how I can advocate for myself better.” The word “advocate” struck him, capturing the effort required to persuade someone with more authority. Although clinicians still control access to tests, referrals, and treatment plans, the term “advocate” conveys a sense of preparing for a “fight.”
When patients feel unheard, gathering knowledge becomes a strategy to be taken seriously. In such situations, the usual approach of explaining false-positive test results, overdiagnosis, and test characteristics is often ineffective. From the patient’s perspective, this sounds more like: “I still know more than you, no matter what tool you used, and I’m going to overwhelm you with things you don’t understand.”
The Evolving Role of the Physician
The role of physicians is constantly evolving. The shift from the “physician as authority” to the “physician as advisor” is accelerating. Patients increasingly present with expectations shaped by non-evidence-based sources that are often misaligned with clinical reality. As he observed, “They arm themselves with knowledge to be heard.” This creates a professional duty to respond with understanding rather than resistance.
The physician’s approach centers on emotional acknowledgment before clinical discussion: “I say, ‘We’ll discuss diagnostic options together. But first, I want to express my sympathy. I can hardly imagine how you feel. I want to tackle this with you and develop a plan.’” He emphasized, “This acknowledgment was the real door opener.”
A Global Trend and Solutions
What began as a US trend has now spread worldwide, with patients increasingly arriving at consultations armed with medical knowledge from tools like ChatGPT rather than just “Dr Google.” In forum discussions, physicians from various disciplines and health systems have shared their experiences, reporting that pre-informed patients are now the norm rather than the exception.
Inquiries often focus on specific laboratory values, such as vitamin D or hormone levels. In gynecologic consultations, internet research on menstrual disorders has become a routine part of patient interactions, with an overwhelming range of answers available online. A gynecologist shared, “The answers range from, ‘It’s normal; it can happen’ to ‘You won’t live long.’”
How should doctors respond to this trend? Opinions are clear: openness, education, and transparency are essential and, ideally, should be delivered in a structured manner. A specialist in gynecology and obstetrics commented: “Get the patients on board; educate them. In writing! Each and every one of them. Once it’s put into words, it’s no longer extra work. Invest time in educating patients to correct misleading promises made by health insurance companies and politicians.”
The presence of digitally informed patients is increasingly seen not only as a challenge but also as an opportunity. Conversations with these patients can be constructive, though they can also generate unrealistic demands or heated debates. A professional, calm, and explanatory approach therefore remains crucial, and at times a dose of humor can help. As an internal medicine specialist added, “The term ‘online consultation’ takes on a whole new meaning.”