Project Summary

Switch season at Wefox is when brokers earn their year. Every new contract starts with the same task. The customer hands over their old insurer's document. The broker copies the fields into our switch form by hand. It's the highest-volume task in their workflow, and the slowest.

I led the design of an OCR-powered flow that extracted the data, prefilled the form, and let brokers review in under a minute. But this was Wefox's first AI feature, and the real challenge wasn't technical. It was trust. Brokers don't trust data they didn't type themselves, and the top performers were the most resistant to change.

So I designed two things in parallel. A review experience that made AI output faster to verify than to retype. And a phased rollout that gave brokers control over when AI moved from optional to default.

The rollout framework became Wefox's template for shipping AI.

Responsibility

Research

Wireframes

User Flow

User Personas

UI/UX Design

Prototyping

Tools

Figma

Miro

The Bottleneck Nobody Wanted To Talk About

It was switch season at Wefox. Every broker I shadowed had the same setup: two monitors, a stack of old insurer documents, a half-finished coffee. Flip a page, type a field. Flip, type. For hours.

Each contract started the same way. The customer handed over their previous insurer's document. The broker copied the fields into our switch form by hand. Name. Address. Date of birth. Policy details. Again and again.

It was the highest-volume task in their workflow. It was also the slowest. And the more brokers we hired, the more we paid for the same repetitive work.

Could AI fix this?

What I Actually Heard In Interviews

Before I sketched a single screen, I sat down with brokers across regions. I expected complaints about typing. I got something more interesting:

"I'd rather type it myself than check someone else's work."


That line, in different forms, came up in nearly every interview. Brokers didn't just want speed. They wanted certainty. And they trusted their own fingers more than they trusted anyone or anything else.

Three things stood out.

Brokers re-checked every field they didn't type themselves. So automation only saves time if review is faster than typing.

Document quality varied wildly. Clean PDFs, blurry phone photos, smudged scans, all in the same week.

The brokers most resistant to OCR were the top performers. They had muscle memory for the old form. To them, any change was friction.

That last finding rewrote the brief. The risk wasn't that OCR would fail technically. It was that we'd ship a working feature, and nobody would use it.

The Decision: Phase Trust, Don't Force It

I pitched a three-phase rollout. The idea was simple. Separate the moment we launched the technology from the moment we changed broker behavior.

Phase 1. Dual entry: Brokers chose between OCR or manual filling. Every correction they made became training data.

Phase 2. AI training: OCR became the recommended path. Manual stayed available. We measured confidence per field and let brokers correct errors.

Phase 3. Full adoption: OCR became the default. Not because we forced it. Because brokers had already chosen it.
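The three phases can be thought of as a gating table: which entry modes are available, which is recommended, and which the form opens with. The sketch below is my own illustration of that structure; the names and shape are assumptions, not the production implementation.

```typescript
// Hypothetical sketch of the phased rollout as configuration.
// All identifiers here are illustrative assumptions.

type Phase = "dual-entry" | "ai-training" | "full-adoption";
type EntryMode = "ocr" | "manual";

interface PhaseConfig {
  available: EntryMode[];        // modes the broker can choose from
  recommended: EntryMode | null; // nudged but not enforced
  default: EntryMode;            // what the switch form opens with
}

const ROLLOUT: Record<Phase, PhaseConfig> = {
  // Phase 1: broker chooses freely; manual stays the default.
  "dual-entry":    { available: ["ocr", "manual"], recommended: null,  default: "manual" },
  // Phase 2: OCR is the recommended path; manual stays available.
  "ai-training":   { available: ["ocr", "manual"], recommended: "ocr", default: "manual" },
  // Phase 3: OCR becomes the default, with manual as a fallback.
  "full-adoption": { available: ["ocr", "manual"], recommended: "ocr", default: "ocr" },
};
```

The point of expressing it this way: manual entry is never removed, only demoted. Each phase change is a config flip, not a behavior change forced on brokers.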

This was the most important decision of the project. It traded short-term adoption metrics for long-term trust. It also gave the ML team a steady stream of broker-validated data to retrain on.

That framework outlived the project. Other teams at Wefox later used it to ship their own AI features.

Understand

User Research

User Interview

Competitive Analysis

Define

User Persona

User Journey

Ideate

User Flow

Information Architecture

Design

Wireframe

Hi-Fi Designs

Prototype

Test

Feedback

Conclusion

Future concept

When The First Prototype Flopped

We built a prototype. I sat down with three in-house brokers to test it. It went badly.

The OCR took over three minutes to populate the form. Long enough that one broker shrugged, opened the form in another tab, and started typing manually while we waited.

And when the form finally filled in, review felt heavier than typing. Brokers scanned every field, top to bottom, looking for errors. By the time they finished, they could've keyed the form from scratch.

The speed problem went to engineering. I worked with the ML team to bring processing time down to a threshold where brokers would wait instead of giving up.


The review problem was mine. I redesigned it around three principles:

Confidence-weighted review. Fields the model was sure about got visually quieter. Uncertain fields got attention. Brokers stopped scanning everything and started scanning what mattered.


Inline correction. Any field could be edited in place. No modals. No leaving the review screen.


Source document side by side. The original document stayed visible next to the extracted fields. Brokers could verify without switching context.
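The first principle, confidence-weighted review, boils down to a simple rule: per-field confidence from the OCR model decides how loudly a field renders, and uncertain fields surface first. The sketch below illustrates that logic; the threshold and field names are assumptions, not the production values.

```typescript
// Hypothetical sketch of confidence-weighted review.
// The 0.9 threshold is an illustrative assumption.

type ReviewState = "quiet" | "attention";

interface ExtractedField {
  name: string;
  value: string;
  confidence: number; // 0..1, from the OCR model
}

const CONFIDENCE_THRESHOLD = 0.9;

// High-confidence fields render visually quiet; the rest demand attention.
function reviewState(field: ExtractedField): ReviewState {
  return field.confidence >= CONFIDENCE_THRESHOLD ? "quiet" : "attention";
}

// Fields the broker should scan first: uncertain ones, least confident first.
function reviewQueue(fields: ExtractedField[]): ExtractedField[] {
  return fields
    .filter((f) => reviewState(f) === "attention")
    .sort((a, b) => a.confidence - b.confidence);
}
```

With a rule like this, a broker reviewing a ten-field form might only need to look hard at two or three fields, which is what makes review faster than retyping.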


The next test went differently. The same brokers who'd given up on the first prototype were finishing review in under a minute.

Outcomes

  • Reduced switch-form completion time against the manual baseline with the OCR-assisted flow.

  • Field validation in under a minute. The success metric I defined upfront, hit in the second iteration.

  • OCR became the default form-filling method once accuracy crossed broker-approved thresholds. Brokers chose it. Nobody forced them.

  • Established the AI rollout pattern Wefox reused for subsequent AI features.

What I Took From It

Trust is a design material. The dual-entry phase looked like a compromise on paper. Two flows to maintain. Slower headline adoption. In practice, it was the reason adoption stuck. Brokers chose OCR because they'd been allowed to verify it on their own terms.

Performance is a UX feature. A slow correct answer loses to a fast manual one. Every time. Latency wasn't an engineering afterthought. It was a design constraint, and I treated it like one.

Reflection

I came into this project thinking the hard part would be the interface. The upload flow. The review screen. The visual treatment of extracted fields.

The hard part turned out to be quieter than that. Brokers weren't worried about AI being wrong. They were worried about being held accountable for AI's mistakes. The interface was just the surface. The real design work happened underneath, in how we structured trust, control, and accountability.


If I were doing this again, I'd spend more time with the brokers who didn't like the feature. The top performers who resisted OCR taught me more about what was actually broken than the ones who loved it from day one.


The thing I'm proudest of isn't the feature. It's that the rollout framework outlived the project.

© 2026 Nasim Raeesi