AI in Practice

9 min read

Patient Journey Workshops Need an AI Upgrade (Here's How I'd Do It)

Learn how to upgrade patient journey workshops with AI. Get practical tips for pharma teams from a Senior Product Manager who's seen what works (and what doesn't).

I just came out of a patient journey workshop. You know that moment when you're on hour 2 of mapping pain points, sticky notes in every color are falling off the wall, and someone asks, "wait, do we have data on this?"

Yeah. That moment…

Don't get me wrong. These workshops are valuable. They bring cross-functional teams together to understand what patients actually go through, exchange real insights, and get everyone aligned on priorities. But sitting there with my coffee ☕, I kept thinking: we could do this better…

Here's the thing: I'm not suggesting we kill the workshop. I'm saying we should use AI to make it actually work for us.

What AI Could Do for These Workshops (If We Let It)

Get Us Started Faster
Imagine walking into the room and AI has already analyzed patient forums, clinical guidelines, claims data, and published research: all the stuff we usually scramble to review beforehand. Instead of starting with blank flipcharts, you start with real patterns already mapped out.

Spot What We're Missing
AI can look at thousands of patient records and find patterns we'd never see. Like which symptoms show up together. Where patients drop out of treatment. What actually makes people stop taking their meds. This isn't theory, this is what the data shows.
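To make that concrete, here's a toy sketch of one such pattern check: counting which symptoms co-occur across patient records. The records, symptom names, and counts below are invented for illustration; real input would be de-identified clinical or claims data.

```python
from collections import Counter
from itertools import combinations

def symptom_cooccurrence(records):
    """Count how often each pair of symptoms appears in the same record."""
    pairs = Counter()
    for symptoms in records:
        for pair in combinations(sorted(set(symptoms)), 2):
            pairs[pair] += 1
    return pairs

# Synthetic records -- real input would be de-identified patient data.
records = [
    ["fatigue", "joint pain", "insomnia"],
    ["fatigue", "joint pain"],
    ["insomnia", "anxiety"],
    ["fatigue", "joint pain", "anxiety"],
]
top = symptom_cooccurrence(records).most_common(1)
print(top)  # [(('fatigue', 'joint pain'), 3)]
```

At real scale the same counting idea surfaces pairings no one in the room would have guessed, which is exactly the input a workshop can debate.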

Give Us Different Patient Types
We always end up designing for some "average patient" who doesn't really exist. AI can create personas grounded in real data: young patients vs. older patients, people with other conditions, first-time vs. follow-up patients. We need this.
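As a hedged sketch of what data-driven segmentation might look like: bucket patients into coarse segments and summarize each one. The field names, age bands, and patient rows here are all invented for illustration.

```python
from collections import defaultdict
from statistics import mean

def build_personas(patients):
    """Bucket patients into coarse segments, then summarize each segment."""
    segments = defaultdict(list)
    for p in patients:
        band = "under 50" if p["age"] < 50 else "50 plus"
        line = "first-line" if p["first_treatment"] else "follow-up"
        segments[(band, line)].append(p)
    return {
        key: {"n": len(group), "avg_age": round(mean(p["age"] for p in group), 1)}
        for key, group in segments.items()
    }

# Invented rows -- a real pipeline would segment on far richer features.
patients = [
    {"age": 34, "first_treatment": True},
    {"age": 41, "first_treatment": True},
    {"age": 67, "first_treatment": False},
]
personas = build_personas(patients)
print(personas[("under 50", "first-line")])  # {'n': 2, 'avg_age': 37.5}
```

Each segment summary becomes the seed of a persona card, grounded in counts rather than gut feel.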

Let Us Focus on the Hard Stuff
If AI does the heavy lifting on data synthesis, we can spend workshop time on what humans are actually good at: understanding context, making judgment calls, and being creative about solutions.

How I'd Upgrade the Workshop (Three Simple Steps)

Before the Workshop: Do the Prep Work That Actually Helps

Look, nobody reads 50-page pre-workshop documents. Here's what I'd send instead:

One-page evidence map from AI: Top symptoms, unmet needs, key decision moments. All with references, so we know what's solid and what's a guess.

Three patient personas: Based on actual data, not made up. Different ages, different situations. Each one gets a simple journey map.

Top 5 hypotheses to test: AI ranks them by what's likely to make the biggest difference. Now we have something real to discuss.

Everyone gets this 48 hours before. That's it. No novels to read.

During the Workshop: Use AI as Your Assistant

This is where it gets practical:

Test ideas fast: Show the AI persona. Break into groups. Each group designs an intervention. AI synthesizes all the proposals and shows where people disagree. You just saved 90 minutes of confusion.

Get live notes: AI captures the conversation, clusters themes, suggests opportunity areas as you talk. Humans check if it makes sense and add the stuff AI misses (which is the nuance, the context, the "yeah but" moments).

Prioritize with data: AI scores each idea on impact and how hard it'll be to do. Your team debates the ranking and adjusts. Way better than arguing about sticky note placement.
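A minimal sketch of that impact-vs-effort ranking, assuming the AI supplies draft scores that the team then debates live. The idea names and scores below are made up.

```python
def prioritize(ideas):
    """Rank ideas by impact relative to effort (highest ratio first)."""
    return sorted(ideas, key=lambda i: i["impact"] / i["effort"], reverse=True)

# Draft scores would come from the AI's first pass; the team adjusts them.
ideas = [
    {"name": "Adherence reminder app", "impact": 8, "effort": 5},
    {"name": "Nurse follow-up call", "impact": 7, "effort": 2},
    {"name": "Redesigned leaflet", "impact": 3, "effort": 1},
]
ranked = prioritize(ideas)
print([i["name"] for i in ranked])
# ['Nurse follow-up call', 'Redesigned leaflet', 'Adherence reminder app']
```

The point isn't the formula; it's that the room argues about a visible ranking instead of sticky note placement.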

After the Workshop: Actually Get Stuff Done

Here's what usually happens: we leave feeling great, and then... nothing moves. AI can fix this:

Draft the first version of everything: educational content, one-pagers, measurement plans. Not perfect, but a real starting point instead of a blank page.

Set up tracking: Feed pilot results back to AI, get recommendations on what to adjust.

Document decisions: Auto-generate meeting notes and next steps. No more "wait, what did we agree on?" three months later.

My Framework: How to Use AI Without Screwing It Up

If you're going to do this, here's what I'd insist on:

Keep Humans in Charge
AI does data processing, finds patterns, creates first drafts. Humans validate everything, make ethical judgments, and do creative problem-solving. Don't flip this.

Know Where Your Data Comes From
Track your sources. Know confidence levels and references. Keep records of why you accepted or rejected AI suggestions. This matters when regulatory/medical comes knocking.

Assign Clear Roles
Someone needs to manage the AI inputs, check medical validity, and make sure outputs match real life. Don't skip this.

Start Small
Test one AI-enhanced workshop. See what works. Fix what doesn't. Then scale up.

Real Talk: What Can Go Wrong

AI bias is real: If your data under-represents certain groups, your AI outputs will too. Always check who's missing.

Don't automate everything: You still need to validate AI outputs against reality. I've seen teams waste months designing for problems that don't exist.

Make it explainable: AI needs to show its work. You need to know why it's suggesting something, especially when you're explaining decisions to management.
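The first caveat, bias from under-represented groups, lends itself to a simple automated check. Here's a hedged sketch with invented group names and numbers; the idea is just to compare each group's share of your sample against its share of the population and flag big shortfalls.

```python
def representation_gaps(sample_counts, population_shares, tolerance=0.5):
    """Flag groups whose share of the sample is well below their population share."""
    total = sum(sample_counts.values())
    gaps = []
    for group, expected in population_shares.items():
        observed = sample_counts.get(group, 0) / total
        if observed < expected * tolerance:  # less than half the expected share
            gaps.append(group)
    return gaps

# Invented numbers -- real shares would come from census or registry data.
sample = {"18-39": 120, "40-64": 300, "65+": 30}
population = {"18-39": 0.30, "40-64": 0.45, "65+": 0.25}
print(representation_gaps(sample, population))  # ['65+']
```

Run a check like this on the data feeding your personas before the workshop, not after the interventions ship.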

Bottom Line

AI doesn't replace the human part of patient journey mapping. It handles the grunt work so we can focus on what matters—understanding patients and designing better solutions.

Next time you're planning one of these workshops, ask yourself: Am I making my team do work that AI could handle in minutes? If yes, let's fix that 🚀

The goal isn't to be fancy. The goal is to serve patients better. AI gets us there faster.

Have you tried using AI in your strategy workshops? What worked? What was a disaster? Tell me by email; I'm always learning, usually with coffee in hand ☕

Let's Decode the Future of Medicine with Technology
- Together

The views and opinions expressed on this website are solely those of The Health Tech Advocate and do not necessarily reflect the official policy or position of my current employer or any affiliated organizations.

© 2025 The Health Tech Advocate.

Based on template created by Hamza Ehsan.

