Design Document

Designing an Agentic Vision Workflow for Fly Fishing

Author: Chad Stauffer
Role: Sole Designer & Developer
Stack: React Native · OpenAI Vision · Supabase
Status: Live Beta ↗
What is HatchMatch?

AI-first mobile app for identifying flies and insects from photos
Transforms angler uncertainty into visible confidence matching
Built end-to-end as a solo designer / developer
The real design problem wasn't the camera-to-result flow — it was what the interface does when the model isn't sure
Why Agents?

Visual ambiguity requires reasoning, not rules
Identification requires judgment, not just pattern matching
Users need to understand why a result was suggested — not just what it is
A bad rig costs you fish. Uncertainty should be surfaced, not suppressed.
How I Used Agents

The model makes classification and confidence calls
The app adds structure, guardrails, and context
Prompts were tuned iteratively based on real failure cases — not lab conditions
The confidence score is a design output, not just a model artifact — it shapes every UI decision downstream
When the model is uncertain, the interface says so — with a different rig recommendation posture and a clear visual treatment
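To make the confidence-as-design-output idea concrete, here is a minimal sketch of how a raw model score could drive the interface's recommendation posture. The type names, thresholds, and copy are hypothetical illustrations, not HatchMatch's actual implementation:

```typescript
// Hypothetical sketch: mapping a vision model's confidence score to a
// UI "posture" that shapes copy, visual treatment, and how assertively
// a rig is recommended. Thresholds here are illustrative assumptions.
type Posture = "confident" | "tentative" | "uncertain";

interface MatchResult {
  species: string;
  confidence: number; // 0–1, returned by the vision model
}

// The score is bucketed once, and every downstream UI decision
// branches on the posture, not the raw number.
function postureFor(result: MatchResult): Posture {
  if (result.confidence >= 0.8) return "confident";
  if (result.confidence >= 0.5) return "tentative";
  return "uncertain";
}

function rigRecommendation(result: MatchResult): string {
  switch (postureFor(result)) {
    case "confident":
      return `Match the hatch: tie on a ${result.species} imitation.`;
    case "tentative":
      return `Likely ${result.species}. Start there, but keep watching the water.`;
    case "uncertain":
      return "Low confidence. Try a general attractor pattern, or re-photograph in better light.";
  }
}
```

The point of the bucketing step is that uncertainty becomes an explicit state the interface designs for, rather than a number quietly dropped on the user.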
Built With What?

React Native: cross-platform mobile · iOS beta active
OpenAI Vision: insect classification · confidence scoring
Node.js: API layer · prompt orchestration
Supabase: backend & auth · session data
Vercel: web hosting · fast deploys
Solo Build: design + development · 0 to live beta
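The Node.js layer above handles prompt orchestration between the app and the vision model. As a rough sketch of that shape, the server could accept a photo URL, request a structured identification, and normalize the reply before it reaches the client. The prompt text, model name, and field names here are assumptions for illustration, not the app's actual prompts:

```typescript
// Hypothetical orchestration sketch: call a vision-capable model via
// the OpenAI Chat Completions REST API (using Node's built-in fetch),
// then validate and clamp the response before the app sees it.
interface Identification {
  species: string;
  confidence: number; // 0–1, self-reported by the model
  reasoning: string;  // why the match was suggested, not just what it is
}

// Normalize raw model JSON: default missing fields, clamp confidence
// into [0, 1] so a malformed score can never leak into the UI.
function parseIdentification(raw: string): Identification {
  const parsed = JSON.parse(raw);
  return {
    species: String(parsed.species ?? "unknown"),
    confidence: Math.min(1, Math.max(0, Number(parsed.confidence) || 0)),
    reasoning: String(parsed.reasoning ?? ""),
  };
}

async function identifyInsect(imageUrl: string): Promise<Identification> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o",
      response_format: { type: "json_object" },
      messages: [
        {
          role: "user",
          content: [
            {
              type: "text",
              text:
                "Identify the insect or fly pattern in this photo. " +
                'Reply as JSON: {"species": string, "confidence": number 0-1, "reasoning": string}.',
            },
            { type: "image_url", image_url: { url: imageUrl } },
          ],
        },
      ],
    }),
  });
  const data = await res.json();
  return parseIdentification(data.choices[0].message.content);
}
```

Keeping normalization server-side is what makes the guardrails real: the app never trusts the model's raw output directly.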
Where It Stands

Live Beta · iOS · TestFlight
HatchMatch is in active beta testing on iOS via TestFlight. Real anglers, real rivers, real feedback. The app is being refined in the field based on actual use — not simulated edge cases. If you fly fish, your feedback is worth more than a lab test.
Join Beta on TestFlight
@stfrcreative Full Case Study ↗