Michele Hart-Henry, Managing Director, CP Health
Having recently returned from the annual health technology conference, ViVE, in Los Angeles, I would summarize the atmosphere as expensive and slightly confused. Between the sprawling, multi-space booths and the high-end giveaways, the investment in “the future of health” is staggering.
Yet, walking the floor, I noticed a curious paradox: for all the capital being poured into the room, many players struggled to articulate exactly what they do for the people at the center of the ecosystem, or why they do it.
AI was, predictably, the oxygen of the conference. It was everywhere. But as I sat through sessions and navigated the show floor, it became clear to me that we are in danger of building a very expensive, sophisticated house without ever asking the residents who will live in it what they need.
We are obsessed with the how of solving healthcare’s problems with AI, yet I’m not sure we’ve agreed on which of healthcare’s myriad problems we’re addressing. Worse, I fear we’re neglecting the who and the why.
Are we reducing stress and burnout in clinicians? Are we making it easier for health systems to accurately bill payers, or to plan for capacity issues? Are we making it less expensive to answer routine questions and phone inquiries? Are we making it quicker to find clinical trials? Are we connecting home devices to remote patient monitoring tools? The answer to all of these questions, and many others like them, is yes. That’s exactly why all of these companies come to conferences like ViVE or HIMSS.
But why? Why are we doing this? And most importantly, for whom are we doing this? At Connelly Partners, we believe that if you start with the humans at the center (their fears, their language, and their motivations), the technology finds its rightful place as a high-powered assistant.
The Inclusion Gap: About Them, Not With Them
One of my biggest takeaways from the week is that AI is currently being built in a vacuum. Most Large Language Models (LLMs) are trained on data written by clinicians and researchers. While that’s great for clinical accuracy, it creates a language divide. Clinicians and patients don’t speak the same language, nor do they share the same priorities.
We see this in the development process: too much AI is being created about the patient, but not with the patient. If we don’t include the consumer in the co-creation process, we risk building “solutions” that are technically brilliant but practically alienating. For example, ambient transcription is a godsend for reducing clinician burnout, and we should celebrate that, but are we sharing those insights with the patient? Are we using that technology to help the patient see the “whole picture” of their own health, or is it just another way to automate a back-office task?
The Data Silo Problem (Again)
It’s 2026, and we’re still talking about interoperability. Even within the same platforms, the mountains of data we’re collecting don’t always talk to each other.
More importantly, we are ignoring a goldmine of unstructured data. Patient-supplied notes, journals, and lived-experience observations often stay outside the system. Yet this “subjective” data is often the most vital context for the real goal of healthcare: helping people live their most fulfilled lives. If AI can’t digest the patient’s voice alongside the clinician’s expertise, it isn’t really “intelligent.” It’s just a fast filing cabinet.
At the conference, I observed three divergent tracks of AI:
- The Enterprise Track: Tools built for and by health systems to streamline operations, which were much more prevalent.
- The Clinician Track: Tools intended to make it easier for practitioners to do their jobs while reducing administrative burden.
- The Consumer Track: Tools patients use independently to increase their health literacy; their builders were clustered in the startup aisles on the fringes of the larger players’ booths.
The concerning point? These worlds aren’t harmonized. Dr. Google is dead; long live Dr. AI. Consumers are already using conversational AI as their first entry point into the health journey. They are turning to AI “companions” to explain their labs or symptoms before they ever call a doctor, or to translate what a doctor told them. If the healthcare system doesn’t keep up and find a way to bridge these tracks, the gap between “system-speak,” “clinician-speak,” and “patient-reality” will only widen.
Moving Toward “More Birthdays”
It wasn’t all tech for tech’s sake, though. There were moments of genuine clarity that reminded me why we do this work.
City of Hope, for example, shared how they are using AI to bring clinical trial opportunities directly to the bedside. By giving doctors the ability to find and recommend trials in the moment, they aren’t just “optimizing a workflow”—they are living up to their mission of providing “More Birthdays.”
The Architecture of Care: Tool vs. Foundation
It feels as though we are treating AI as if it is the healthcare system, when in reality, it should simply be the scaffolding that allows ALL humans in the system to function better. When we prioritize the “How” (the algorithm) over the “Who” (the patient and practitioner) and the “Why” (the outcome), we build a system that is technically efficient but emotionally bankrupt.
My take on all of this is simple: Technology should make the invisible visible. It should capture the patient’s whispered concerns in unstructured data, bridge the literacy gap between clinician-speak and patient-reality, and automate the mundane so that the human connection can flourish.
More AI isn’t the answer. Thoughtfully designed, co-created tools are. AI should be used to eliminate the friction faced by ALL humans when navigating and operating in the healthcare world. Not just patients. Not just doctors. All humans. And as such, all humans (not just those charged with lowering costs) should be part of the discussion that identifies where the friction is and how AI can help.
After all, the most expensive booth in the world can’t buy the trust that comes from finally feeling seen, heard, and understood. That isn’t a tech problem; it’s a human one.