Home

Design Projects

About Me

Copy Taiki's Email


Table of Contents

What is Fetch?

Fetch is a mobile app that rewards users with points for specific actions like scanning receipts, shopping online, engaging with partner products, and other activities within the app. These points can be redeemed for gift cards from popular retailers.

What does it mean to "Snap" a receipt?

"Snapping" a receipt is the core action where users scan their receipts. Optical character recognition reads the receipt and determines the points a user earns.

This flow is the catalyst for almost every feature in the Fetch app including offers, boosts, personalization, etc.

Role

Product Designer

Timeline

2 Months (Q3-Q4 2023)

Team Formation

Design, Product, Engineering, Data Science


The Problem

The Fetch receipt camera was at the core of the product. As the user base grew, pain points grew louder:

  • Performance gaps: Slow capture and upload times.

  • Outdated UI: The camera's interaction patterns felt static and clunky compared to modern, sticky experiences.

  • Third-party dependency: The receipt scanning pipeline relied on a third-party provider called MicroBlink. While sufficient, it acted as a "black box" bottleneck that allowed little visibility into what signals were available to improve the UX.

We needed to rebuild the camera into a trustworthy and intuitive experience.

  1. What the camera experience looked like

Problem Recap

  • Current experience: Slow, unreliable, and dependent on a third-party "black box".

  • Core flow: Almost every Fetch user relies on this feature, yet it didn't feel exciting or sticky.

Project Goal

Redesign Fetch's core receipt snapping flow by replacing a third-party dependent camera technology with a sticky, intuitive experience.

Alignment

Team Alignment

Initially, there was an underlying tension: were we solving a real problem or just redesigning for the sake of it?

We ran a quick workshop to break down what we were really trying to solve. The results looked like this:

  • Users: A camera that just "works" every time.

    • Captures the right information

    • Can navigate through the flow seamlessly (Open camera, take photo, submit, etc.)

  • Engineering: An architecture that reduced errors and supported ML-driven improvements.

  • Product: Improve camera open to submit rates

Framing the project around the MicroBlink-to-in-house migration gave us confidence that we weren't simply refreshing the visuals for their own sake; we were addressing structural issues and unlocking future opportunities.


  1. Team alignment. No need to reinvent the wheel.

Alignment Recap

Success = reliability and an improvement in the camera-open-to-submit metric.

Framed the redesign as part of the 3P to in-house migration, not just a visual refresh.

The opportunity wasn't about inventing something new; it was about amplifying an underutilized mechanic, making it visible, and turning it into a feature that could scale.

Design Phases

Discovery

By building our own receipt OCR, we gained transparency into the pipelines and flexibility to shape both inputs and outputs. We took this approach to keep breaking down the problem.

Through these channels, we found that our camera experience lacked informational transparency, as well as the excitement and "stickiness" that other camera experiences displayed.

UX and Interaction Ideation

This project was unique in that solidifying the interaction design, which is traditionally saved until after the UI is established, initiated what the UI could shape up to be. To be bold and create a truly intuitive experience, we needed to own the fluidity of the motions and let them lead the design.

This resulted in us following a model of:

  • Early and often prototypes

  • Quick feedback syncs with stakeholders

  • Feasibility jams with engineers

The core of what we were aiming to do was find the right balance of autonomy and control, i.e., what could we remove from the user's cognitive load as they move through this intricate flow of alignment, live camera feed, tap interactions, and data input and output, to ultimately get them to their receipt details screen?


The fragility and complexity of all these factors made this one of the most fun projects I've ever worked on.


One of the most distinctive directions we explored was a "slide to submit" feature. Instead of a static button, users pressed down and slid to confirm, triggering subtle haptics, color shifts, and on-screen animations.

The inspiration came from a core childhood memory of visiting my grandparents in Japan each summer.

A festival game called Kingyo Sukui, which translates to "goldfish scooping." In the game, players use a delicate paper scooper to transfer goldfish from a shared tub into their own bowl.

At first, it's frustrating and fragile, but once you get the hang of the motion with some guidance from the game host, it becomes deeply satisfying. A feeling and motion you won't forget.


I wanted to replicate that same balance: a small moment of friction that, with clear instruction, becomes an addicting and rewarding interaction.

The slide-to-submit motion reinforced user intent while adding a unique, memorable moment of delight.
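To illustrate the mechanic, here is a minimal sketch of how slide-to-submit state logic could work. This is purely hypothetical; the type names, the `SUBMIT_THRESHOLD` value, and the function signatures are my own illustration, not Fetch's actual implementation.

```typescript
// Hypothetical sketch of slide-to-submit gesture logic.
// Names and thresholds are illustrative only.

type SlideState = "idle" | "dragging" | "submitted";

interface SlideToSubmit {
  state: SlideState;
  progress: number; // 0..1 fraction of the track travelled
}

// Release past this point confirms the submission (assumed value).
const SUBMIT_THRESHOLD = 0.85;

function onDragStart(): SlideToSubmit {
  return { state: "dragging", progress: 0 };
}

// Called on every pointer-move event with the drag distance so far.
// Clamps progress to [0, 1]; the UI can map this to color shifts,
// haptic ticks, and animation frames.
function onDragMove(
  s: SlideToSubmit,
  dragPx: number,
  trackPx: number
): SlideToSubmit {
  if (s.state !== "dragging") return s;
  const progress = Math.min(Math.max(dragPx / trackPx, 0), 1);
  return { ...s, progress };
}

// On release: submit if past the threshold, otherwise spring back.
function onDragEnd(s: SlideToSubmit): SlideToSubmit {
  if (s.state !== "dragging") return s;
  return s.progress >= SUBMIT_THRESHOLD
    ? { state: "submitted", progress: 1 }
    : { state: "idle", progress: 0 };
}
```

The key design property is that releasing short of the threshold springs the thumb back rather than submitting, which is what creates the small moment of friction described above.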


But this came with its own set of challenges.

Initially, we explored and released a couple of different ideas to a small cohort of users:

We quickly realized we were overly dependent on tooltips and copy to teach. Further user tests and feedback sessions revealed that the constantly changing, non-static camera feed disoriented users and diverted their attention away from where we needed it.

This failure forced us to rethink how we teach and guide users.

We moved away from text instructions and introduced a more visual, animated approach to submitting the receipt. We also included a first-time-user intro animation to bring delight to the experience.

See the resemblance? :)

After experimenting with this iteration on a small cohort of users, we immediately knew this would become the foundation for the final slide-to-submit interaction.

Hi-Fidelity Specs

Specified motion specs


  1. Current Experience Breakdown/Audit. Fetch User & Partner POV.

Design Phases Recap

  • Start with the foundation: Mapped core user journeys with low-fi sketches and flows, validating concepts and technical constraints early.

  • Emphasis on dev-ready specs: Delivered high-fi interactive prototypes with a distinct visual identity, polished micro-interactions, and full end-to-end user flows.

Project Release

The new snap experience was launched as a cross-functional initiative between Product, Design, Engineering, and Data - not just as a visual redesign, but as a complete re-integration via an in-house OCR system.

  • Rapid iterations: Ideate, iterate, test, and re-iterate.

  • Cross-functional collaboration: Because the backend teams were working in parallel with the design, we had to stay closely aligned throughout the entire project for feasibility and timeline reasons.

  • Bringing more delight to Fetch: With millions of users worldwide, we continued to make Fetch as delightful as possible.

By transforming what was once a fragile, background experience into a confident feature, we saw success across all metrics.

The camera became something that Fetch could proudly highlight.

Project Outcome

Individual performance

The rebuilt camera experience turned Fetch's most used feature from a behind-the-scenes utility tool into a proudly showcased experience - one that redefined why Fetch feels so rewarding.

96%

First snap success

90%

Overall snap success

+15%

Week 2 retention

Reflecting back…

What went well:

  • Fluid prototypes: Continually updating and showing stakeholders and users a prototype (rather than static screens) helped push this project efficiently.

  • Cross-functional partnership: Lockstep alignment with engineering, product, and data allowed us to push the boundaries on this experience.

What could've gone better:

  • Less reliance on copy: We were potentially too conservative with how we taught users. We quickly learned that showing is much better than telling.


Send me a message on LinkedIn!
