• How the AI Literacy Crisis Is Killing Healthcare | Dr. Steven Labkoff
    2026/05/12

    Send us Fan Mail

    AI is reshaping medicine — but the people in charge may not understand what they're actually using.

    Dr. Steven Labkoff, physician executive and former VP at Bristol Myers Squibb, joins Chris Hutchins on The Signal Room to expose a growing crisis hiding in plain sight: senior healthcare leaders deploying powerful AI tools without the literacy to use them safely.

    Drawing on decades at Pfizer, AstraZeneca, and the Multiple Myeloma Research Foundation, Dr. Labkoff breaks down why treating AI like a search engine is one of the most dangerous mistakes in modern medicine — and why "democratizing data" may be doing more harm than good.

    In this episode:

    • Why AI literacy is the most overlooked risk in healthcare leadership today
    • What "data chemistry" means, and why your AI is only as good as the data behind it
    • The real dangers of over-trusting large language models in clinical care
    • How pharma's failure to treat data as a product is holding the industry back
    • Lessons from the front lines of Pfizer, AstraZeneca, and rare disease research

    Whether you're a clinician, health tech professional, or just someone who cares about where medicine is headed, this conversation will change how you think about AI in the exam room and the boardroom.

    Connect with Dr. Labkoff: www.PracticalAIinHealthcare.com

    Steve@LuminantConsulting.com

    https://www.linkedin.com/in/stevelabkoff/

    Connect with Chris Hutchins: https://www.linkedin.com/in/chutchins-healthcare/

    https://www.youtube.com/@SignalRoomPodcast

    https://signalroompodcast.com/

    Enjoying the show? Chris's new book Beneath The Signal is now available on Amazon → https://www.amazon.com/dp/B0GYJDQBZR


    Support the show

    About The Signal Room: The Signal Room is a podcast and communications platform exploring leadership, ethics, and innovation in healthcare and artificial intelligence. Hosted by Christopher Hutchins, Founder and CEO of Hutchins Data Strategy Consultants. Leadership, ethics, and innovation, amplified.


    Website: https://www.hutchinsdatastrategy.com

    LinkedIn: https://www.linkedin.com/in/chutchins-healthcare/

    YouTube: https://www.youtube.com/@ChrisHutchinsAi

    Book Chris to speak: https://www.chrisjhutchins.com

    49 min
  • Healthcare AI Fails at the Data Layer: Privacy, Governance & Trust | Sid Dutta
    2026/04/30

    Sid Dutta is a 24-year cybersecurity veteran; a former data-protection executive at American Express, Worldpay, Activision Blizzard, and OpenText; and Founder & CEO of Privaclave AI. His work targets the layer most healthcare AI conversations ignore: the runtime data layer, where pilots actually stall.

    In this episode, Chris Hutchins and Sid examine:

    • Why healthcare data governance is the bottleneck in clinical AI
    • What privacy-preserving AI looks like in practice (tokenization, federated learning, secure enclaves, differential privacy)
    • How organizations can collaborate on PHI without exposing it
    • Why static perimeter controls fail in non-deterministic AI workflows
    • What healthcare leaders get wrong about vendor risk
    • The shift from "block first" to enabling controlled data usage at runtime

    For health-system executives, CIOs/CISOs, and data and compliance leaders moving AI from pilot to production without compromising patient trust.



    1 hr 3 min
  • Healthcare Is Losing Its Best People | Provider Burnout, Trust & Ethical Leadership in AI with Poonam Patel
    2026/04/21

    Healthcare AI and ethical leadership must give time back to clinicians, not take more away — Poonam Patel on AI strategy, provider burnout, and trust erosion in healthcare.

    Provider burnout is pushing clinicians out of healthcare at an unsustainable rate. In this episode of The Signal Room, Chris Hutchins sits down with Poonam Patel, a pediatric nurse practitioner turned healthcare strategy advisor, to examine what happens when the system built to care for patients stops caring for its own people. From pajama time documentation burdens to the erosion of trust between patients and providers, Poonam shares what she has witnessed firsthand across clinical and operational settings.

    What We Cover

    • Why provider burnout is a workforce sustainability crisis, not a wellness problem
    • How pajama time documentation burden erodes the patient and provider relationship
    • Where clinical AI and ambient clinical intelligence are actually giving time back
    • Why healthcare interoperability is still the biggest structural barrier to useful AI
    • What empathetic leadership looks like in healthcare organizations under pressure

    Key Takeaways

    • Trust drives adherence, not dashboards. Patients follow clinical guidance when they trust the provider delivering it. Systems that erode trust erode outcomes.
    • Giving time back is a survival strategy. Efficiency gains from AI should flow back to the clinician, not into more patient volume per shift.
    • Empathetic leadership has to run through every layer. Front-line supervisors need empathy training as much as the C-suite. Burnout is solved in the middle, not at the top.
    • Solve one problem well. AI initiatives fail when they try to fix everything at once. Pick one workflow, fix it end-to-end, and consolidate inside the EMR.

    Timestamps

    • 0:00 – Welcome and the shared mission behind the conversation
    • 2:33 – The multi-lens view: clinician, operator, and program builder
    • 6:45 – Pajama time and the intangible ROI of giving time back
    • 8:25 – Trust as the through line for patient adherence
    • 13:19 – The emotional toll on pediatric and frontline providers
    • 18:19 – Burnout, raising your hand, and why clinicians cope alone
    • 25:07 – Solving for the human component first
    • 28:32 – The workforce shortage and the incentive to enter healthcare
    • 32:00 – AI scribing, diagnostics, and early detection that actually helps
    • 36:28 – Interoperability and why AI has to live inside the EMR
    • 39:24 – Trust erosion and the case for empathetic leadership
    • 44:03 – Consolidating patient information and family navigation
    • 46:58 – Empathy as a management training requirement, not a poster
    • 49:21 – Closing thoughts and how to reach Poonam

    About Poonam Patel

    Poonam Patel, NP, is a pediatric nurse practitioner turned healthcare operator and co-founder with 20 years of experience across clinical care, consulting, and healthcare innovation. As Chief Operating Officer and Co-Founder of a care management and remote patient monitoring services company, she led o


    46 min
  • The Dark Side of the $50B AI Medical Boom | Lorraine Fernandes
    2026/04/15

    The $50B AI in healthcare investment wave is outpacing what most health systems can govern — Lorraine Fernandes on AI strategy, AI governance, and the dark side of the medical AI boom.

    The $50 billion AI in healthcare investment wave is accelerating faster than most health systems can evaluate, integrate, or govern the tools arriving on their doorstep. Lorraine Fernandes, a global health information leader with 50 years at the center of clinical data strategy, joins Chris to examine what vendors leave out of their pitch decks and what health system leaders should be asking before signing their next AI contract.

    What We Cover
    • Why data stewardship is the single word that decides whether a $50B AI bet pays off or collapses
    • How the Health Information Management role is shifting from manual data entry to governance of AI-generated records
    • What global standards like ICD-11 and SNOMED reveal about the structural gaps AI cannot close
    • Practical upskilling moves that let HIM professionals thrive as AI tools replace rote work
    • Why leadership at the intersection of clinical, technical, and administrative functions is the real AI readiness test
    Key Takeaways
    • A trustworthy AI in healthcare strategy starts with data stewardship. If the inputs are ungoverned, the outputs are liability.
    • AI governance requires the HIM profession, not the other way around. Health systems that treat HIM as clerical work will inherit every bias, gap, and error their models produce.
    • Global terminology standards are the scaffolding for clinical AI. ICD-11 and SNOMED are not paperwork. They are the prerequisites for AI that can actually be audited.
    Frameworks & Tools Mentioned
    • IFHIMA (International Federation of Health Information Management Associations)
    • IFHIMA AI Toolkit
    • ICD-11 (WHO International Classification of Diseases)
    • SNOMED CT (clinical terminology standard)
    • World Health Organization digital health initiatives
    • Focus on the Future 2026 webinar series

    Timestamps

    • 0:00 – The $50B AI Investment in Healthcare
    • 1:40 – Evolution of HIM: From Paper to Digital Stewardship
    • 4:55 – Curators vs. Creators: The New Role of Data Experts
    • 8:45 – The Trust Factor: Why Stewardship Prevents AI Failure
    • 13:10 – Global Perspectives: The IFHIMA AI Toolkit
    • 17:25 – Digital Health Trends and WHO Initiatives
    • 20:55 – Upskilling for the AI Workforce: Will AI Replace Jobs?
    • 23:45 – Event Preview: Focus on the Future 2026 Series
    • 26:05 – Deep Dive: ICD-11, SNOMED, and Global Classifications
    • 31:00 – Building Better Health Outcomes Through Trusted Data

    About Lorraine Fernandes

    Lorraine Fernandes is a globally recognized expert in health information management whose 50-year career includes leadership roles at IFHIMA and sustained advocacy for data privacy, clinical terminology standards, and ethical digital health implementation. She works at the intersection of global policy and on-the-ground health system operations.


    35 min
  • Strengthen Your AI Projects in 2026: Privacy and AI Governance Insights with Andre Samokish
    2026/04/08

    AI governance is the difference between shipping healthcare AI and watching the project get shut down — Andre Samokish on privacy, AI strategy, and governance for 2026.

    AI governance is becoming the difference between shipping AI in healthcare and watching the project get shut down. Andre Samokish, a privacy and AI governance expert, joins Chris Hutchins to explain why most AI initiatives will fail by 2026 and what responsible AI actually looks like inside organizations that refuse to take vendor assurances at face value.

    What We Cover

    • The concrete difference between privacy governance, AI governance, and cybersecurity, and why conflating them creates blind spots leaders will pay for later
    • Why governance is not a project blocker. It is the pathway that lets teams move fast without inheriting regulatory debt
    • The 3 pillars of AI literacy that separate organizations ready for responsible AI from ones that will inherit their vendor's mistakes
    • How to embed privacy by design into AI product workflows before launch, not after incidents
    • The failure modes hiding in data collection, model deployment, and organizational culture that teams routinely misdiagnose

    Key Takeaways

    • The "vendor has it covered" assumption is the single most dangerous governance gap in AI today. If you cannot explain how a model was trained, you cannot defend the decision it made.
    • AI literacy is not training. It is infrastructure. Organizations treat it as optional, then discover their executives cannot distinguish generative AI risk from traditional IT risk when regulators ask.
    • Data minimization is a governance principle before it is a privacy one. The less data you collect, the less exposure you carry through the model's full lifecycle.

    Frameworks & Tools Mentioned

    • OneTrust (privacy + AI governance platform)
    • IAPP (International Association of Privacy Professionals) certifications
    • Privacy by design methodology
    • AI literacy pillars (technical, operational, governance)
    • Vendor governance frameworks

    Timestamps

    • 00:00 Introduction: The AI project failure wave of 2026
    • 03:00 Andre Samokish on why AI governance is the root cause
    • 09:30 AI strategy beyond proof of concept: what enterprises get wrong
    • 16:00 AI implementation challenges that kill projects at scale
    • 22:30 AI readiness: governance maturity vs. technical capability
    • 29:00 Responsible AI development when privacy controls are inadequate
    • 35:00 AI regulation signals and what they mean for 2026 planning
    • 41:00 Leadership strategies for surviving the AI contraction

    About Andre Samokish

    Andre Samokish is a privacy and AI governance expert whose work spans regulated industries implementing responsible AI at scale. He advises organizations on embedding governance into product workflows, building AI literacy across technical and non-technical teams, and navigating the intersection of privacy law and machine learning practice.

    Related Resources

    • Episode: The Dark Side of the $50B AI Medical Boom | Lorraine Fernandes


    43 min
  • Good People Are Quietly Quitting: Ethical Leadership, AI Strategy & Why Culture Determines AI Success | Carly Caminiti
    2026/04/01

    Ethical leadership and AI strategy collapse when the people executing the strategy are quietly burning out — Carly Caminiti on why culture determines healthcare AI success.

    Healthcare innovation leadership stops working when the people who execute the strategy are quietly burning out. Carly Caminiti, an ICF-certified executive coach and creator of the 5C Leadership Performance System, joins Chris Hutchins to examine why healthcare's best people are disengaging, why AI adoption amplifies the problem, and what ethical leadership in healthcare requires when strategy depends on humans who are under-resourced.

    What We Cover
    • Why "quiet quitting" is a governance signal, not a workforce trend, and what it reveals about leadership capacity
    • How executives promoted for clinical or technical skill end up running teams without ever learning how to lead
    • The 5C Leadership Performance System and why healthcare organizations need a repeatable framework, not more off-site retreats
    • What happens when AI transformation lands on top of existing burnout, and why technology strategy is fundamentally a people strategy
    • How to identify the high performers who are about to leave before they tell you
    Key Takeaways
    • The healthcare leaders who will survive AI transformation are the ones who invest in the people executing it. Tools do not fix culture. Culture determines whether tools get adopted.
    • Ethical leadership in healthcare is not a values statement. It is a weekly operating practice visible in how communication, feedback, and decisions happen across teams.
    • Retention is a leading indicator of AI readiness. Organizations that cannot hold onto their strongest people will not have the capacity to absorb AI-driven change.
    Frameworks & Tools Mentioned
    • 5C Leadership Performance System (Caminiti's 12-week executive coaching framework)
    • ICF (International Coaching Federation) certification standards
    • Executive coaching methodology for healthcare leaders
    • Burnout detection signals
    • Communication frameworks for team performance

    Timestamps

    • 00:00 Introduction: The quiet quitting signal leaders are missing
    • 03:00 Carly Caminiti on why culture eats AI strategy for breakfast
    • 09:30 Ethical leadership as the prerequisite for AI adoption
    • 16:00 AI leadership strategies that actually retain talent
    • 22:45 Leadership ethics when automation changes the work itself
    • 29:00 AI coaching for leaders: what it looks like in practice
    • 35:30 Why quiet quitting is an AI governance signal
    • 41:00 Building organizations where ethical AI and ethical leadership coexist

    About Carly Caminiti

    Carly Caminiti is an ICF-certified executive and personal development coach who works with healthcare and corporate leaders to build performance without burning out their teams. She is the creator of the 5C Leadership Performance System, a 12-week coaching program designed for leaders who need a framework they can actually apply, not another leadership theory.


    49 min
  • Healthcare Experts on Ethical AI in Operational Reality: AI Transformation Strategies and Healthcare Innovation | MarKeisha Snaith
    2026/03/25

    AI strategy for healthcare fails when strategic intent hits operational reality at the bedside — MarKeisha Snaith on ethical AI, transformation, and healthcare innovation.

    Healthcare innovation leadership rarely fails at the strategy level. It fails when strategic intent hits operational reality at the bedside. MarKeisha Snaith joins Chris Hutchins to examine the signals that matter most inside large health systems, why AI leadership strategies stall between planning and execution, and what distinguishes leaders who drive transformation from the ones who announce it.

    What We Cover
    • How AI governance decisions made in the boardroom play out at the point of care, and where the translation breaks
    • Why communication patterns inside health systems determine whether AI transformation strategies survive contact with operations
    • The operational signals leaders routinely miss because they live between departments, between roles, and between what gets measured and what actually happens
    • How to build healthcare leadership capacity for AI readiness before the technology arrives
    • What generational workforce shifts mean for leadership models in health systems
    Key Takeaways
    • AI transformation strategies that do not account for operational reality will not survive their own rollout. The strongest leaders treat clinical execution as the first-class design constraint.
    • Trust is the currency of healthcare innovation leadership. When communication breaks, AI tools inherit the distrust regardless of how good the model is.
    • Healthcare innovation requires both technical fluency and operational empathy. Leaders who have one without the other produce strategy decks nobody executes.
    Frameworks & Tools Mentioned
    • Strategic planning vs. operational execution frameworks
    • Healthcare leadership and system transformation methodology
    • Cross-generational workforce leadership models
    • AI governance decision-making in clinical settings
    • Communication cascades in large health systems
    Timestamps
    • 00:00 Introduction: what healthcare experts really face with AI transformation
    • 03:30 MarKeisha Snaith on AI governance in clinical reality
    • 10:00 AI transformation strategies that survive contact with operations
    • 17:00 AI healthcare innovations: what is working and what is not
    • 24:00 Healthcare innovation leadership at the intersection of tech and care
    • 31:00 Ethical AI when patient outcomes depend on the model
    • 37:00 Building healthcare leadership capacity for AI readiness
    • 43:00 The future of AI transformation strategies in health systems

    About MarKeisha Snaith

    MarKeisha Snaith is a healthcare leader whose work focuses on the operational reality of AI transformation inside complex health systems. She examines how strategic decisions cascade through clinical, technical, and administrative functions, and what it takes to build leadership capacity that


    53 min
  • Healthcare AI and Rare Disease Caregiving: Why Patient Advocates Deserve a Seat at the Table | Amanda Roser
    2026/03/18

    Healthcare AI succeeds or fails at the connective tissue of care — Amanda Roser on AI strategy, rare disease caregiving, and why patient advocates belong at the table.

    AI applications in healthcare succeed or fail at the connective tissue of care delivery, the caregivers, patient advocates, and family members who hold fragmented systems together. Amanda Roser, who has spent 5 years navigating her son's rare genetic disorder across endocrinology, genetics, metabolic medicine, and gastroenterology, joins Chris Hutchins to examine what responsible AI in healthcare requires when the actual users are families, not specialists.

    What We Cover
    • How rare disease caregivers become the de facto data stewards, record keepers, and medical translators the system requires but rarely recognizes
    • Why interoperability failures create a "Groundhog Day" problem where patients retell their history at every appointment, and what AI could actually fix
    • How Amanda trained an AI tool on her son's daily health patterns and lab history, and the clinical conversation that shifted in real time when she showed it to a physician
    • The gap between what caregivers expect from healthcare systems and what systems actually deliver
    • Why patient advocacy panels belong at every healthcare innovation conference
    Key Takeaways
    • AI in healthcare that ignores caregivers is not responsible AI. Every system decision about interoperability, documentation, and coordination lands on the family in the waiting room.
    • Caregivers are the operational infrastructure the healthcare system depends on. Any AI strategy that does not account for this inherits the fragility the system already has.
    • Rare disease care is the stress test for healthcare innovation. If your AI tool does not work for multi-system patients, it will not work for anyone.
    Frameworks & Tools Mentioned
    • Care coordination across multi-specialty clinical teams
    • Healthcare interoperability standards (and where they fail)
    • AI-assisted patient advocacy and symptom tracking
    • Rare disease care models (glycogen storage disease type zero)
    • Digital health tools for caregiver-physician communication
    Timestamps
    • 00:00 Amanda's story: an ER dismissal that became a turning point for caregiver advocacy
    • 02:12 What caregivers expect vs. what the healthcare system actually delivers
    • 05:18 Becoming the coordinator: when parents realize the system depends on them
    • 10:12 The invisible operational burden families carry between appointments
    • 13:30 Gaps in patient tracking, documentation, and clinical communication
    • 16:21 Learning medical terminology as a

    Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.


    50 min