Healthcare AI July 7, 2025

We Have a New Type of Patient: What Clinicians Should Know About AI Tools

"Health is wealth."
People have always been curious about their health, often searching the web for answers. Sometimes the information they find is right, and often it is wrong, so it was easy to say, "Don't believe the internet."

With the evolution and widespread adoption of generative AI tools behind popular chat interfaces such as ChatGPT and Gemini, it is no longer valid or wise to simply say, "Don't believe the AI."

So, as a clinician, what should you know about AI tools?

Clinicians make judgment calls based on data: from clerking (what the patient said), from investigations (tests and observation), and from context (experience and training).

AI tools are built on algorithms that learn and find patterns across billions of data points. In effect, they are trained to be "context machines," drawing on data from many sources: medical literature, treatment protocols, research, and the experiences of diverse populations across countries.

When patients query these tools, it is fundamentally different from an independent web search. AI tools can still be wrong: they are tools, not professionals, and patients are not experts at crafting prompts or interpreting results. Even so, they can generate more nuanced, contextualized responses based on the information a patient provides.

Here's what's particularly interesting about context: AI tools often have access to more patient-reported information than clinicians do. They get a kind of "360-degree" overview that includes random symptoms patients assume are unconnected, aspects of their lives they consider irrelevant, and details they find too embarrassing to share with a human.

This willingness to share with AI is also notable: while people were once assumed to be hesitant, current behavior shows they are not. Discussion threads on Reddit and on X (formerly Twitter) highlight this trend of users sharing details with AI and finding answers.
They are happy to upload their entire medical history, volunteer information they might never tell a human, and they find the accessibility genuinely useful: they do not feel they are taking up limited clinic time, and they can ask unlimited questions about themselves or their loved ones.
This shift creates a new dynamic in your consultation room. Your patients may arrive with AI-generated insights, differential diagnoses, or treatment questions that are surprisingly sophisticated. They might also come with misconceptions wrapped in convincing medical language.

[Image: AI Patient Insights]

The key is recognizing that these AI-informed patients aren't trying to replace your expertise; they're seeking validation, clarification, or deeper understanding of information they've already processed.

Is this just another "AI will take your job" article? No.
This is about providing context on what your patients now know, why you should prepare for the kinds of questions they may bring to you, and why now is the time to upskill as a clinician.
Rather than dismissing AI-generated patient insights, consider them as additional data points in your clinical assessment. The patient who comes in saying "I discussed my symptoms with ChatGPT and it suggested I might have X" is giving you valuable information about their thought process, concerns, and the research they've already done.

If you're curious about how AI is shaping healthcare, explore my LinkedIn Learning series on AI for Healthcare (free with LinkedIn Premium).

Stay ahead — your patients already are.