Curology

Redesigning how patients request prescription formula changes, for both the person asking and the provider answering.

Product Design · UX Research · Healthtech · Q3 2022
57%
Questionnaire completion
In first 25 days post-launch
1,700
Patient interactions
Within the first 25 days
36hrs
Provider wait time eliminated
Structured intake replaced follow-up messages

The Problem

Patients whose skin needs changed had no good way to ask for help. So they waited, or gave up.

Curology's Custom Formula is the core product: a dermatological prescription tailored to each patient's unique skin. But skin changes. Goals change. And when patients needed a formula update, the only path was a 7-step messaging thread that buried the request under generic support flows and left them waiting up to 36 hours for a provider response.

That wait was a churn risk. A patient who feels unheard by their skincare provider doesn't stick around. The challenge was building a path that worked for both sides: simple enough that patients would actually use it, and structured enough that providers could act on it without playing 20 questions first.

How might we give patients a clear, low-friction way to request formula changes while giving providers everything they need to respond without the back-and-forth?

The Process

01

Mapping the Friction

I started by walking through the existing flow end-to-end. Seven steps to send a message about a formula change. Each step added cognitive load and introduced a place to drop off. The 36-hour response window wasn't just a wait time problem. It was a trust problem. Patients who couldn't see progress assumed nothing was happening. I mapped every decision point and identified where the process was asking too much of both sides.

02

Interviewing Providers

The most useful insight in this project came from the medical team, not patients. Provider interviews revealed that the majority of their follow-up messages were asking for the same three things: how often the patient was using their current formula, whether it was causing irritation, and what their goal was. If I could collect that upfront, I could eliminate the back-and-forth entirely. The design problem shifted: this wasn't just about patient UX, it was about building a smarter handoff.

03

Two Rounds of Design

Round one pulled from mobile messaging patterns, making the request feel as lightweight as replying to a text. Round two added a targeted mini-questionnaire after medical team feedback flagged that the first version still didn't give providers enough context. Working with our Senior Content Designer, we shaped the questions to feel personal rather than clinical, and leveraged our existing quiz component so engineering didn't have to build from scratch.

04

Testing and an Unexpected Finding

I collaborated with our Senior User Researcher to run usability tests with six patients (ages 18-45) on mobile. We ran the existing messaging flow as a control and the new card experience as the experiment. Five out of six preferred the old flow. Not because it was better, but because it was familiar and felt like fewer steps, even though resolution took much longer. That result didn't invalidate the new design; it told me the answer was to offer both paths, not pick one.

Findings

01

Providers needed structure, not messages

Medical provider interviews revealed that most follow-up messages asked for the same three pieces of information: usage frequency, irritation levels, and treatment goal. Collecting those three things upfront could eliminate most of the back-and-forth entirely, removing the up-to-36-hour wait and making requests actionable on first read.

02

Familiarity beat efficiency in testing

5 out of 6 usability participants preferred the existing messaging flow over the new card experience. Not because it was better, but because it felt like fewer steps despite taking much longer to resolve. This wasn't a failure of the design. It was a signal that both pathways needed to coexist rather than one replacing the other.

03

Visibility on the dashboard changed the math

Placing the formula change card directly on the patient dashboard, rather than burying it in a messaging flow, meant patients encountered the option before they even thought to go looking for it. Discoverability drove interaction volume. Without that placement decision, the 1,700 interactions in 25 days would not have happened.

The Solution

[ Formula change card on dashboard ]
[ Mini-questionnaire flow ]
[ Confirmation screen ]

Two paths to the same outcome: a formula change card on the dashboard for patients ready to engage, and the familiar messaging route for those who weren't.

The formula change card lived directly on the patient dashboard, surfacing the option before patients went looking for it. Tapping it opened a short questionnaire covering usage frequency, irritation levels, and goals: the exact information providers needed to act without follow-up. The questions used existing quiz components, which kept the build fast and the experience consistent with other parts of the product.

Rather than forcing all patients into the new flow, I kept the messaging path intact. Testing told me that patient preference and patient benefit aren't always the same thing, and the right call was to serve both rather than bet on one.

Design decision

Keeping both pathways wasn't a compromise. It was the finding. Testing revealed that familiarity has real weight for users, even when a better option exists. A design that ignores preference data in favor of efficiency metrics isn't actually user-centered.

Impact

1,700
Patient interactions
In first 25 days post-launch
57%
Questionnaire completion
Submitted full formula change requests
50%
A/B test traffic
Share of patients who saw the new flow
36hrs
Provider wait time eliminated
Structured intake replaced follow-up messages

Learnings

01

Design for both sides of the exchange

The real unlock in this project came from provider interviews, not patient research. The patient-facing problem was visible and obvious. The provider-side problem was invisible until I went looking for it. Solving for both at once is what made the feature actually work: a smoother patient experience that also gave providers what they needed to act without follow-up. I wouldn't have gotten there by only talking to patients.

02

User preference and user benefit aren't the same thing

Testing surfaced a real tension: most participants preferred the old flow, even though the new one led to faster resolution. That preference was real and worth respecting, which is why we kept both paths. But it also clarified something about how to read usability results. Preference data tells you what people are comfortable with today. It doesn't always tell you what's better for them. Knowing the difference is the job.
