Evidence of Research Impact & Qualitative Evaluation
These Evidence Briefs document how I reduce risk, improve decision quality, and deliver usable insight in complex systems.
Each brief highlights a scoped problem, the evaluation or research work I personally executed, and the measurable outcomes that followed. This format reflects how I actually work: I come in, run the study, surface human risk early, and give teams evidence they can act on quickly.
SingleStore
Research Operations for Scalable UX in Enterprise SaaS
Project: Enterprise UX Research Program Build | Company: SingleStore | Role: Senior UX Researcher (IC) | Timeframe: 2020–2021
Problem
Enterprise SaaS teams lacked consistent research practices across products. Insights were fragmented, turnaround was slow, and teams struggled to make confident product decisions at scale.
What I Did
- Operationalized an end-to-end UX research program supporting core cloud and on-prem products, focusing on speed, rigor, and reuse.
- Standardized research protocols, templates, and documentation across teams.
- Built scalable insight repositories to centralize findings and reduce duplicate research.
- Enabled Product and Design partners to run low-friction studies independently.
- Led mixed-methods studies aligned to roadmap decisions and customer risk areas.
Impact
- Reduced research turnaround time by 25%.
- Increased stakeholder response rates 3×.
- Doubled Net Promoter Score (NPS) across key workflows.
- Cut free-trial signup time by 50%.
- Improved consistency and decision confidence across product teams.
Why This Matters
This work reduced delivery risk by making research faster, repeatable, and actionable—so teams could ship with evidence instead of assumptions.
Core Skills Demonstrated
Research operations · Program enablement · Mixed-methods UX research · Insight systems · Cross-functional collaboration · Decision clarity at scale
View supporting artifacts →
WEX Inc.
Human-in-the-Loop UX & Qualitative Research in FinTech
Project: Workflow & System Evaluation for High-Risk Enterprise Products | Company: WEX Inc. | Role: Senior UX / Human Factors Researcher (IC) | Timeframe: 2024–2025
Problem
High-stakes operational workflows were complex, and automated monitoring failed to surface the usability, comprehension, and cognitive-load risks affecting both users and downstream operations.
What I Did
- Designed and ran human-in-the-loop evaluation frameworks to detect real-world failure modes early and translate them into actionable product signals.
- Conducted qualitative evaluations and usability audits across 12+ enterprise products.
- Identified edge cases, regression risks, and workflow breakdowns missed by automated metrics.
- Led mixed-methods studies combining task analysis, observation, and structured feedback.
- Built scalable evaluation tools and reporting frameworks to accelerate insight delivery.
Impact
- Reduced Account Manager cognitive load, enabling greater customer self-service.
- Improved early detection of usability and comprehension risks in complex workflows.
- Accelerated roadmap decisions without sacrificing evaluative rigor.
- Strengthened human-in-the-loop validation across operational systems.
Why This Matters
This work ensured product quality was measured against human judgment—not just system outputs—reducing downstream risk and improving decision confidence.
Core Skills Demonstrated
Qualitative evaluation · Human-in-the-loop testing · Usability auditing · Failure-mode analysis · Workflow analysis · Scalable evaluation frameworks
View supporting artifacts →