Carla Velastegui is an international advisor who champions placing patients and caregivers at the heart of every digital health innovation—a message for startups worldwide, including those across Asia’s highly diverse regions. In this interview with HealthTechAsia, conducted shortly after the HLTH Europe event in Amsterdam, she shares insights on how to design truly people-centric approaches—where the voices of patients, caregivers, and clinicians are not an afterthought, but central to the entire process.
Carla Velastegui is a Canada- and Switzerland-based healthtech advocate who believes in the transformative potential of technology in healthcare — but she also calls for careful oversight. She is committed to ensuring that innovation is both accessible and equitable, without deepening existing disparities.
“We need to approach this intentionally,” she explained. “Technology must be designed in ways that reduce health and equality gaps — not widen them. It should ease the burden on patients and caregivers, not add to it. If we co-develop solutions with these values in mind, we can truly make a difference.”
Velastegui brings a strong background in digital health, having worked with startups and governments in Canada, along with clinical expertise in rehabilitation and a solid foundation in systems innovation. As if that weren’t impressive enough, she is also a long-term primary caregiver for her mother, who lives with an early-onset neurological condition.
“Healthcare systems must work for everyone — not just the easiest to reach. That includes those who face the greatest barriers to access and quality care: people living in rural and remote communities, those experiencing homelessness, individuals with low digital literacy, and members of immigrant and refugee communities.
These are not edge cases — they are part of our communities. Designing healthcare solutions with and for them is essential,” she said.
Velastegui works to bridge clinical and technical teams by highlighting the critical role of patients and caregivers in healthcare design, particularly given that caregiving often sustains health systems while remaining unpaid, invisible, and isolating. She emphasises that designing inclusive digital health tools is not only the right thing to do; it also makes business sense.
“Too often, we’re not designing for the realities of the populations around us. I recognise that building accessible and inclusive technology takes time and effort — but there’s real value in it. If your product only reaches 20% of the population, you’re leaving out the majority of your potential market. Designing for 90 or 95% opens up far greater opportunities.”
The key, she said, is building genuine relationships with the many communities that reflect the full diversity of our society.
“Work collaboratively with diverse cultural groups and identity-based organisations, as well as trusted local community partners in your target regions. Approach these relationships with genuine humility and a willingness to listen deeply, really learning what matters to them in healthcare.
Quality of life means different things to different people. What it means to me won’t be the same as what it means to you — or to my mother. But through meaningful and respectful engagement, we can identify common themes and shared values, and weave them into better, more inclusive healthcare solutions.”
She acknowledged that this isn’t easy work — but insists it’s necessary, both for business viability and for the kind of health systems we aspire to build.
“If we’re serious about improving care, we have to examine our own biases and be willing to do the work. That includes embedding different perspectives into the entire development process.”
A critical factor is building trust.
“When new tools are introduced, they’re often not trusted or adopted by patients, or they don’t fit into clinical workflows,” Velastegui explained. “As a result, tools with great potential to support healthcare staff — by improving efficiency and reducing administrative burden — fall flat.”
She added that many solutions fail because they weren’t co-designed with the end users in mind. “They miss key factors like trust, or how patients actually live day to day.
If a provider is managing multiple appointments and navigating an EMR system, expecting them to suddenly integrate a new tool that hasn’t been embedded into their reality — it’s just not realistic.”
Velastegui encourages healthtech founders to approach innovation with empathy and humility. “I always ask founders to consider: if this were you, your partner or parent navigating the system – how would you want it to respond? How should these tools work for you? What kind of support would feel not just efficient, but dignified, seamless – and in some ways, simply human? That shift in perspective can be powerful.”
The role of policy in embedding patient voices
Velastegui believes that robust policy frameworks can play a pivotal role in ensuring that patients and caregivers are meaningfully involved in the development of health technologies.
“One big opportunity we have is to strengthen policy requirements or provide clearer guidance around incorporating patients into the design of new tools,” she said. “Health Technology Assessments (HTAs) already exist in regions like Canada, the US, the UK, and Europe — and they offer a natural place to embed these voices.”
She acknowledged, however, that not all digital health tools undergo HTA, and even when they do, the inclusion of patient and caregiver perspectives is often limited or tokenistic.
“While some HTAs include questions about the societal or ethical impact of technologies, there isn’t yet a consistent or explicit requirement to incorporate patient and caregiver input in a meaningful way. That’s a gap we urgently need to address.”
Velastegui calls for the systematic integration of these voices into HTAs, especially for tools going through formal submission and approval processes for medical devices and software.
On a broader level, she points to the need for ethical frameworks and policy guidance that address the rapid evolution of technologies, especially in areas such as artificial intelligence.
“There was a very helpful framework recently released by the US National Academy of Medicine — a code of conduct for artificial intelligence in health and medicine. It offers practical tools for developers, founders, and startups to begin thinking critically about how they include patient voices and assess the broader impact of their innovations.”
While that framework is US-based, Velastegui argues that our thinking around AI in healthcare must extend beyond national boundaries.
“One of the biggest challenges — and opportunities — with AI is that it doesn’t respect borders. The same goes for healthcare itself. Diseases travel. Patients move. Technology scales rapidly across markets. So we can’t afford to regulate AI in silos,” she said.
“I believe we need a shared global conversation — across oceans and continents — about what we value as a society when it comes to health. We need alignment on how AI should be used to genuinely support improved outcomes for communities and populations.
That means thoughtful, coordinated frameworks, regulatory mechanisms and governance structures that reflect our collective values, not just national priorities.”
Velastegui sees these policy tools as essential not only for the ethical development of AI, but also for identifying and addressing bias in datasets, mitigating systemic risks, and building trust in digital health systems.
“We need stronger governmental guidance on how to ethically and systematically bring in patient and caregiver partners. And by caregivers, I mean unpaid individuals — family members or friends — who support someone through an acute or chronic condition, whether on a short-term or ongoing basis.
These individuals are essential stakeholders and must be recognised as such in the design and regulation of new tools.”
Innovating with responsibility
Velastegui also acknowledged the growing role of large technology companies in healthcare—and the importance of balancing innovation with responsibility.
“I think big tech can bring enormous value to healthcare. They offer innovation, resources, and new ways of thinking. But we absolutely need strong policies, clear regulations, and ethical frameworks to ensure that this is done thoughtfully and in a way that earns public trust.”
“It’s not just about innovation for its own sake. We must make sure we’re not only benefiting select subgroups, but truly supporting society as a whole to improve health outcomes.”
She also stresses the role of citizens in holding big tech—and governments—accountable.
“We, as members of society, also have a voice. We’ve seen how collective action—when we raise concerns or push back—can lead to real change. And we must remember that at some point, all of us will be affected by the healthcare system, either directly or while caring for someone close to us. It’s more a question of when than if.”
She concluded with a call for more engaged, informed citizens and forward-thinking governance. “Big tech isn’t inherently bad for healthcare. In fact, these companies can bring capabilities and delivery models that governments or smaller organisations often aren’t equipped to offer.
But we need to stay ahead in terms of regulation—and we need citizens who are actively involved, asking questions, and shaping the future of healthtech.”
CEO of one’s own health journey
Looking ahead, Velastegui envisions a future where people feel truly empowered in their healthcare—where they are no longer passive recipients, but active participants.
“I want patients to feel engaged with their own care—to feel a sense of ownership over their health. And I think technology plays a big role in that. For example, when people have access to their own health data, they can better understand their diagnosis, their care plan, and what actions they can take.
That understanding leads to empowerment. When patients are informed and included in the conversation, they’re far more likely to take meaningful steps to manage their health.”
She continued: “People should be the CEO of their own health journey. And we can support that by using technology thoughtfully—designing tools that are simple, intuitive, and don’t add to cognitive overload. Technology should be integrated seamlessly into healthcare systems, not make things more complicated.”
Velastegui believes the ultimate success of healthtech lies in enabling the human side of healthcare. “Healthcare is deeply personal. It’s human. It requires empathy. So technology should enhance that—not replace it. Tools should free up clinicians to focus on what really matters: the patient in front of them.”
She imagines a future in which patients arrive at appointments as equal partners—bringing their own data, their own questions, and a clear understanding of how to improve their quality of life. “That’s my dream: a world where tools integrate smoothly into both clinical workflows and the lived reality of someone navigating an acute or chronic condition.”
In that vision, people are not overwhelmed by clunky platforms or siloed systems. Instead, Velastegui sees a future where patients and caregivers—no matter their background—can remain active in society, in their communities, and in their jobs.
“Technology should reduce the administrative burden and the mental load. That way, people can still be volunteers, employees, neighbours—fully engaged in life.
That’s what it means to be the CEO of your own wellbeing.”