Case Study

Platform

Omnichannel
Cross-platform system

Role

Product Designer
(UX Strategy, Research, Design)

Team

CEO, Process Engineers, Ops, QA, Dev

The Ask

Improve client satisfaction through quality control

Stakeholders saw AI as a path to better consistency and stronger trust.

Integrate AI into managers’ workflows — without friction

Adoption needed to feel invisible: no added complexity, no learning curve.

Design with both users and marketing in mind

The tool needed to work well for users and double as a marketing asset for sales teams.

What We Built

An AI-powered tool to map site coverage and generate smarter audit routes

Automatically reduced blind spots and redundancy by analyzing history, risk, and site complexity.

A new system integrated into the managers' daily workflow — with zero friction

The system was designed to fit within daily reviews — streamlining existing tasks instead of adding a new one.

A polished feature designed to support sales and marketing narratives

Showcased SBM’s operational intelligence in a way that was easy to demo and hard to ignore.

The Outcome

The new system showed high adoption rates in unmoderated settings. Projections pointed to a 2x increase in site coverage and a 70% drop in redundant inspections, delivering consistent quality control with less manual oversight.

You're about to enter the realm of details. Below is the detailed process that led to the end product.

The Plan & Process

To elevate quality control, I had to break it down first — a layered study approach for a layered process

Before jumping into screens, I stepped back to study the behavior behind the workflow. My goal was to understand how managers make decisions: the key decision points, the external influences, and how their mental model shaped the flow. I interviewed users to map that mental model, then continued with contextual inquiry and workflow mapping to observe real-world behavior and surface hidden steps, informal workarounds, and handoffs between roles that wouldn’t show up in usage data. I scoped the study to follow the full audit lifecycle (planning, assigning, executing, and reviewing) to uncover where design could better support natural behavior and reduce friction.

I conducted several interviews, then created a journey map to visualize the ideal flow described by managers

I mapped the process into key phases (planning, assigning, preparing, executing, documenting, and reviewing) using what managers described in interviews. Even in this idealized flow, gaps surfaced: decisions relied on memory, tools were fragmented, and critical steps lacked system support, all of which hinted at behavioral mismatches.

To validate the journey described in interviews, I observed managers in the field and mapped how they actually handled audits on site.

I conducted on-site observation to see how audits were actually approached in real conditions. This helped me capture behaviors that might not be articulated — including shifts in attention, rerouting in the moment, and real-time decision-making. I translated these observations into a behavioral map to contrast stated expectations with lived experiences.

Findings & Key Insights

The behavior study made one thing clear: we needed a routing tool built on managers’ mental models and decision triggers

Mapping the gap between the expected flow and real behavior exposed the root UX issues. Managers weren’t struggling to use the tool; they were struggling to make it fit how they actually think and operate. Their planning process wasn’t linear or structured: it was reactive, memory-based, and driven by external pressures. The system didn’t reflect that reality, so rather than fixing a feature, I had to redesign around how managers think without adding load to their processes.

Key Insights

The interface lacked coverage state — forcing managers to rely on memory

Managers couldn’t see what had been inspected vs. what was at risk, or where to go next — the UI offered no visual hierarchy or state cues to guide planning.

Without structure or prioritization, audit planning became random and reactive

The system didn’t support sorting by risk, recent issues, or audit history — leaving managers to decide routes on the fly, often based on habit, urgency, or convenience.

The product wasn’t designed for planning — only for reporting.

It showed completed audits and scores, but didn’t help managers know what to do next. And when plans changed mid-walk, the system had no way to flex with them.

The repository worked for reviews and client talks, but…

  • No indicators for what’s been audited vs. what hasn’t

  • No visuals to show urgency, risk, or gaps

  • No way to group by zone, risk, or urgency — everything blends together visually

  • Audit results dominate — but there’s no planning layer

  • UI focuses on presenting scores, not supporting decision-making

Shaping The Solution

I handed the use cases to process engineers: they turned the logic into AI routing rules, while I designed for the pain points

We used observed behavior — like rerouting based on complaints, location, and staff availability — as input for how the AI should prioritize audits. While they worked through the logic layer, I moved forward with addressing the remaining gaps through design: reducing planning guesswork, surfacing site coverage, and supporting the way managers actually think under pressure.
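
To make that input concrete, here is a minimal sketch of how triggers like complaints, audit recency, risk, and site complexity could blend into a single priority score per area. Every name, field, and weight below is an illustrative assumption, not the engineers’ production model.

```python
from dataclasses import dataclass

# Hypothetical inputs distilled from the behavior study; field names
# and weights are illustrative, not the shipped prioritization model.
@dataclass
class Area:
    name: str
    days_since_audit: int  # coverage gap seen in the repository
    open_complaints: int   # external pressure managers reacted to
    risk_level: int        # 1 (low) to 3 (high), set per site
    complexity: int        # 1 (simple) to 3 (complex)

def priority_score(area: Area) -> float:
    """Blend history, risk, and site complexity into one comparable score."""
    return (
        1.0 * area.days_since_audit
        + 5.0 * area.open_complaints
        + 3.0 * area.risk_level
        + 2.0 * area.complexity
    )

# Rank areas so neglected, high-pressure spots surface first.
areas = [
    Area("Lobby", days_since_audit=2, open_complaints=0, risk_level=1, complexity=1),
    Area("Cleanroom B", days_since_audit=9, open_complaints=2, risk_level=3, complexity=3),
]
for area in sorted(areas, key=priority_score, reverse=True):
    print(area.name, priority_score(area))
```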

Process engineers distilled real-world audit behavior into logic paths, one for each use case, turning unpredictable decisions into structured routing logic

While engineers focused on turning observed triggers into audit routing, I used their outputs as a baseline to design for the gaps they couldn’t solve — like helping managers stay oriented, make sense of AI decisions, and adapt when the plan changed.
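
As a rough illustration of one-logic-path-per-use-case (again hypothetical, since the engineers owned this layer), each observed trigger maps to its own rule for deciding which areas enter the day’s route:

```python
from typing import Callable

# Hypothetical logic paths, one per use case observed in the field;
# the real routing rules were specified by the process engineers.
LOGIC_PATHS: dict[str, Callable[[dict], bool]] = {
    # A client complaint just came in: audit that zone today.
    "fresh_complaint": lambda area: area["open_complaints"] > 0,
    # Coverage gap: pick up areas that haven't been audited recently.
    "coverage_gap": lambda area: area["days_since_audit"] > 7,
    # Known trouble spot: keep high-risk zones on a short cycle.
    "high_risk_zone": lambda area: area["risk_level"] >= 3,
}

def select_areas(use_case: str, areas: list[dict]) -> list[dict]:
    """Apply one logic path to decide which areas enter today's route."""
    rule = LOGIC_PATHS[use_case]
    return [area for area in areas if rule(area)]

# Example: the coverage-gap path picks up neglected areas only.
areas = [
    {"name": "Lobby", "open_complaints": 0, "days_since_audit": 2, "risk_level": 1},
    {"name": "Loading dock", "open_complaints": 0, "days_since_audit": 12, "risk_level": 2},
]
print(select_areas("coverage_gap", areas))  # -> only the loading dock
```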

The Solution Design

I redesigned the audit repository to reduce planning friction and lay the foundation for AI-powered routing

Managers started their day in the audit repository, but the original interface was built for reviewing the past — not planning the future. I redesigned the page to surface what’s been inspected, what’s been missed, and what needs attention next. This redesign reduced the mental load of planning and made space to support daily routing, based on real behavior.

The new header is designed for progressive disclosure and highlights KPIs — giving managers a fast, scannable snapshot of site coverage and quality

Managers had been memorizing the high-level metrics, so the new header surfaces just enough context to trigger action while keeping deeper detail accessible when needed. This supports both fast scanning and more informed decisions, without overwhelming the start-of-day flow.

Below the KPIs, I split the repository into three sections, giving a comprehensive and customizable view of the site’s state

  • Redesigned audit cards to give a quick, visual snapshot — serving both usability and marketing goals.

  • Site segmentation and filters reduced guesswork, surfacing priority areas fast.

  • Area type cards showed patterns in what kinds of spaces were failing — supporting more strategic decisions.

Managers react to what they see in the web app — so the generator had to be placed there and designed to work around their flow, not dictate it

Managers didn’t want a system telling them where to go; they wanted help deciding where to start. The generator used simple prompts to guide them through key priorities like missed areas, recent failures, or high-risk zones. This not only reduced mental load but also gave managers a sense of control.
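
A minimal sketch of that prompt-driven flow, assuming a simple flag-based area model (the prompt wording, flags, and helper names are all hypothetical): the manager answers each prompt, and the generator proposes a route rather than dictating one.

```python
# Hypothetical sketch of the generator's prompt flow: the manager picks
# starting priorities, and the system suggests (never forces) a route.
PROMPTS = [
    ("Start with areas you haven't covered recently?", "coverage_gap"),
    ("Revisit areas that failed their last audit?", "recent_failure"),
    ("Begin in the highest-risk zones?", "high_risk_zone"),
]

def generate_route(areas: list[dict], accept) -> list[dict]:
    """`accept` stands in for the manager's yes/no answer to each prompt."""
    route = []
    for question, flag in PROMPTS:
        if accept(question):
            route += [a for a in areas if flag in a["flags"] and a not in route]
    # Anything the manager declined stays out of the proposed route.
    return route

# Example: a manager who only wants to chase recent failures today.
areas = [
    {"name": "Lobby", "flags": ["coverage_gap"]},
    {"name": "Cleanroom B", "flags": ["recent_failure", "high_risk_zone"]},
]
print(generate_route(areas, accept=lambda q: "failed" in q))
```

The design choice the sketch mirrors: declined prompts simply drop out, so the resulting route always reflects the manager’s own priorities.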

The Outcome

We translated manager behavior into system logic — turning audits from reactive tasks into AI-guided routines

What started as scattered audits based on memory and pressure became a system-driven routine grounded in real context. Managers now begin their day with a plan shaped by behavior, not guesswork — and they’re able to adjust with confidence when things change. The design didn’t just make the product better — it made the job easier. And by embedding AI where it mattered, we created a foundation for smarter quality control at scale.

Reflection

Designing with Behavior, Not Just Features

This project reminded me that designing for behavior means going deeper than what users say — it’s about how they actually think, react, and adapt in the real world. The most meaningful shifts didn’t come from adding features, but from restructuring how the system worked around the way managers made decisions under pressure. Collaborating with process engineers also taught me how powerful UX can be when it works hand-in-hand with operational logic.

Thank You for Your Time!

Get in Touch