From Dr. Google to ChatGPT: How AI Can Help or Hinder Your Healthcare

It’s 3 a.m. and you can’t sleep. The pain you’ve been feeling all week seems worse at night, and so does your worry about what might be causing it. Frustrated, you grab your phone. But instead of heading to Google like you might have in the past, you open your go-to AI chatbot app, the one that’s always ready and happy to help you think things through and give you some answers.

For about 20 years, people have been turning to the internet to figure out what might be wrong with them or their loved ones. A quick symptom search usually pulled up a long (and often scary) list of possibilities. It was the first time patients could step outside the doctor’s office and try to better understand their own health. But it also brought a lot of “Dr. Google” anxiety and confusion, along with misinformation, particularly on social media.

Today, instead of scrolling through search results, you can ask ChatGPT, Google Gemini, Claude or any of the other AI chatbots questions, get explanations, and have what feels like a conversation. It’s more personal and more interactive, yet it can be disturbingly sycophantic and misleading (or worse) if not used properly.

While groups like the Ontario Medical Association and research institutions such as Sunnybrook Health Sciences Centre caution Canadian patients against relying on AI for medical advice, some health professionals see this as an opportunity to rethink the patient/clinician relationship.

According to an article by clinical psychologist Harvey Liberman, PhD, “generative AI tools don’t just summarize; they simulate conversation. They let people organize thoughts, explore outcomes, and rehearse how they’ll describe what they’re feeling. The result isn’t a diagnosis. It’s a draft.”

Where AI Falls Short

Shifting the patient/doctor relationship towards one where patients feel empowered to think critically about the healthcare they are receiving is nothing less than a sea change in how people interact with their medical practitioners. And with more than 6 million Canadians without a family doctor, AI is increasingly becoming part of that empowerment, for better or for worse. There are, however, some real limits and risks that people need to keep in mind.

✅ It is not advised to use an AI chatbot to diagnose health issues you’re facing. A recent study led by researchers at the University of Waterloo asked ChatGPT-4 (a large language model, or LLM) a series of open-ended medical questions. The findings were striking: only 31 per cent of ChatGPT’s responses were found to be totally correct, and just 34 per cent were considered clear.

“The danger is that people trying to self-diagnose will get reassuring news and dismiss a serious problem, or be told something is very bad when it’s really nothing to worry about,” explained Troy Zada, a doctoral student at Waterloo and a researcher on the study.

✅ The next limitation just might start with the human at the keyboard. If you don’t ask the right questions, or frame your discussion with enough context, you may get incorrect or misleading answers. See the cheat sheet below for how to get the most out of your AI chatbot health questions.

✅ Check the sources the chatbot cites. AI chatbots are well known to fabricate sources, so make sure you verify that each source is real and credible.

✅ There are ethical concerns, too. Most of us have no clue how AI comes up with its answers, and sometimes the information it relies on is biased, which can make health inequalities worse instead of better.

✅ There are also safety and legal questions. If AI gives advice that turns out to be wrong, causing someone to delay care or worry unnecessarily, who’s responsible? The patient? The doctor? The company that built the app?

✅ Privacy is an important issue. As of May 2025, Canada has no approved AI regulation framework. The Canadian government has been working on the Artificial Intelligence and Data Act (AIDA), which aims to regulate AI applications at the federal level; healthcare and medical diagnostics are among its most important high-impact use cases. The bill was shelved in 2025 because of the election, but now that Parliament is back in session, AIDA will likely resurface.

Any information covered by Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) or Ontario’s Personal Health Information Protection Act (PHIPA) should not be uploaded into any chatbot. That includes personally identifying information such as your medical records, name, SIN, etc.

Once information is submitted to a chatbot, you can’t guarantee where it’s stored, who within the company can review it, or how it will be used in the future. Hackers and security threats are a reality for any online system. Be very careful with your private information.

From Symptom Search to Storytelling

Even with these challenges, AI is changing the way patients approach their health. What makes it so different from “Dr. Google” is that it doesn’t just hand you a list of possible conditions—it can help you shape a story. You can ask follow-up questions, get explanations in plain language, and rehearse different scenarios before you ever set foot in a doctor’s office.

Instead of showing up with random printouts from the internet, proactive patients now come in with a story already formed about what they think is happening. These stories feel personal, emotional, and convincing. And if doctors don’t take the time to listen to the story that’s already in a patient’s head, they might miss the chance to help shape how that story ends.

AI – A Jumping-Off Point for Discussion with Your Doctor

Used thoughtfully, AI can be a powerful partner in your health journey.

Cheat Sheet: Getting the Most Out of AI for Your Health

AI can be a great tool, but only if you use it wisely. Here are some ways to get better, safer answers when you chat with an AI app:

1. Ask Open-Ended Questions

Instead of: “Do I have diabetes?”
Try: “What are some common causes of increased thirst and fatigue?”

2. Ask for Explanations in Plain Language

Example: “Can you explain this test result to me like I’m a high school student?”

3. Compare Options

Example: “What are the differences between ibuprofen and acetaminophen for pain relief?”

4. Use Clarifying Prompts

Example: “Can you explain that answer a different way?” or “What information would help you give a more specific answer?”

5. Get Help Preparing for Appointments

Example: “What questions should I ask my doctor if I’m worried about chest pain?”

6. Remember the Golden Rule

Always finish by asking:
“What are the limits of your advice, and why should I still see a doctor?”

Some doctors may not want to hear about your journey with AI, but as a proactive patient, it’s your right – and even your responsibility – to share information you found using AI with your physician and ask questions: “Is this accurate? Does it apply to my case? And if not, why not?” You may also ask your doctor whether there are any AI health tools they trust.

So, what does this mean for your relationship with your healthcare team? It means the conversation is shifting. It’s no longer just, “What brings you in today?” but also, “What have you already learned, or been told about what’s going on?”

The best doctors will recognize that you’re walking in with a story already shaped by research and perhaps a “discussion” with AI. Their role is less about being the only source of information and more about being the trusted guide who helps edit, clarify, and confirm your story.

The Era of the Proactive Patient

Patients being more in charge of their health than ever before isn’t a bad thing. It’s inevitable, and an opportunity, especially within our evolving healthcare landscape.

AI isn’t here to replace doctors—but it is changing how we show up in the exam room. And the more we use it wisely and understand potential pitfalls, the more empowered we’ll all be to play an active role in our own care. And for those without a family doctor, this is more important than ever.

The information provided on TheHealthInsider.ca is for educational purposes only and does not substitute for professional medical advice. TheHealthInsider.ca advises consulting a medical professional or healthcare provider when seeking medical advice, diagnoses, or treatment.
