Patients weren't completing their rehab programs. Not because they didn't want to recover, but because nothing on the market gave them a reason to keep going. I led product and experience strategy from day one at Jintronix: competitive analysis, ethnographic research across clinical populations, team leadership, and agile delivery. Seven years later the platform had 500+ clinical clients across North America.
Jintronix was my deepest build. I joined as the founding PM when there was a prototype and a hypothesis. When I left, there were 500+ clinical clients, peer-reviewed outcome data, and a product that therapists recommended by name. The competitive analysis, the ethnographic research, and the team building all ran at the same time. Clinical outcomes were the one constraint that never flexed across any of it.
First and only PM for the first three years. No inherited roadmap, no existing research practice, no PM playbook. I built all of it from scratch while also doing the work.
As the company grew, I put together the team: one UX designer, four Unity developers, one portal developer, one QA, and eventually a second PM. I ran sprints, kept research moving into the backlog, and tried to make sure nobody was ever blocked waiting on a product decision.
Could we get a patient to finish their prescribed program? Every product decision was evaluated against that single question.
Jintronix was built on a hypothesis that competitive analysis confirmed: no digital rehab platform had solved patient engagement. The market was scheduling tools and exercise libraries. The engagement gap was the opening, and clinical credibility was the gate to every enterprise sale along the way.
I benchmarked every digital rehab platform I could find across patient engagement mechanics, clinical workflow integration, outcome measurement, and provider adoption. The picture was the same everywhere: scheduling tools, exercise libraries, telehealth bolt-ons. Nobody had touched engagement.
Before any design work started, I spent weeks inside rehabilitation centers just watching. Not running structured sessions. Just observing. Where patients stopped. Where therapists intervened. Where the paper program fell apart at the kitchen table or in the clinic hallway. The behavioral gap between what was prescribed and what actually got done wasn't a data problem. It was a design problem nobody had tried to solve.
I synthesized those observations into testable hypotheses, not findings. Each one had a recommended mechanic attached and went to leadership with a business rationale before any design or engineering investment was made. The research established three north-star KPIs: adherence rate, average session length, and patient-reported satisfaction. Those governed every subsequent product decision.
Three calls defined the product. None of them was popular when I made it, and all three turned out to be right. Here's what the data showed and what I recommended in each case.
Research showed patients used what their therapist recommended. Competitive analysis showed no existing platform had invested in therapist-facing tooling at all. The adoption gate was the prescriber. Not the patient.
Build the patient experience and the therapist-facing tooling at the same time, even though it meant more scope. I framed it as a go-to-market risk question, not a development cost question. If therapists couldn't see patient data, they wouldn't prescribe the product. Simple as that.
The therapist dashboard became our most powerful enterprise sales tool. Clinicians with real-time adherence data became product advocates, driving the referral network that took us to 500+ clients without a paid acquisition model.
Pilot data showed leaderboards and peer benchmarking drove higher engagement. But clinical advisor review flagged that comparing recovery rates across patients with different diagnoses was clinically inappropriate and a liability in enterprise sales.
Cut social comparison. Redirect gamification to self-referential progress: patients competed only against their own prior performance. I presented the enterprise sales risk alongside the engagement data so leadership could weigh both.
Short-term, engagement dipped slightly. Long-term, the decision unlocked hospital and senior care facility partnerships that would have been impossible with social comparison in the product.
After year two, the product had strong pilot data but known rough edges. The clinical team wanted more iteration. Sales had three hospital networks ready to sign. New entrants were beginning to surface in competitive analysis.
Scale now. I presented a risk analysis that distinguished cosmetic limitations from functional ones, and proposed expanding with existing partners while addressing gaps in concurrent sprints.
All three hospital networks signed within the quarter. The cosmetic limitations were addressed within two sprints. The first-mover position we protected became a meaningful barrier to later entrants.
The sprint pattern was simple: make a hypothesis, build the smallest version that could test it, measure it against the KPIs, bring a recommendation to the next sprint review. I kept ceremonies tight and made sure research was feeding the backlog continuously. The thing I was most paranoid about was people waiting on product decisions.
The pilots were where the product either proved itself or didn't. I ran sessions at partner rehab centers in person, watching how therapists introduced the system and where patients dropped off. We tested gamification mechanics as separate hypotheses and killed the ones that didn't move adherence numbers. The ones that did, we doubled down on immediately.
The biggest shift in that phase wasn't a feature. It was realizing therapists needed to understand the activity library through their own clinical mental model: organized by function and joint, not by game name. Once we restructured the prescription interface around how therapists already thought about treatment planning, adoption changed. They stopped being cautious about the tool and started recommending it by name.
Before / After: Therapist Prescription Interface
The clinical pilots. Being in the room when a therapist saw the dashboard for the first time and realized they could actually see what their patient did at home. That never got old. It was the clearest signal that the product was doing something real.
Building the team. Going from doing everything myself to having people who cared about the problem as much as I did, and watching them solve things I wouldn't have thought of.
The therapist is the customer. Not eventually. From day one. Everything else flows from that one insight, and you're going to spend two years half-believing it before the data makes it undeniable.
Also: document everything. Not for process reasons. Because the decisions that feel obvious now will feel mysterious in three years, and you'll spend more time relitigating them than it would have taken to write them down.