Helping Customers Understand Overdraft Options



Project Context

The Challenge:
U.S. Bank’s digital Overdraft Options page was confusing and intimidating for users, leading to misunderstandings about overdraft products, low enrollment, and increased calls to customer support. The dense, disclosure-like text, combined with a lack of clear visual separation between options, made it difficult for users to distinguish overdraft coverage from overdraft protection or to make enrollment decisions with confidence.

Project Goals:

  • Improve user comprehension of overdraft services.

  • Increase enrollment in appropriate overdraft products.

  • Reduce customer support calls related to overdraft confusion.

  • Enhance the overall digital experience and user satisfaction.

Background:
The project was initiated as part of a broader effort to modernize U.S. Bank’s digital self-service tools. The overdraft page was identified as a high-friction touchpoint based on analytics and customer feedback.

My Role

As the Lead UX Researcher, I was responsible for:

  • Designing and executing the research plan.

  • Developing reusable research templates and study guides for the team.

  • Coaching junior researchers and reviewing their work.

  • Ensuring ethical standards and data security throughout the project.

  • Synthesizing findings and collaborating with designers, product managers, and compliance teams to translate insights into actionable recommendations.



Research Methodology

Approach:

  • Moderated Usability Testing: 1:1 sessions with 9 participants, focusing on both the current and proposed page designs.

  • Qualitative Interviews: To probe for comprehension, emotional response, and decision-making processes.

  • Comparative Analysis: Participants compared the current and proposed versions, including an explanatory video.

  • Metrics Tracked: Task completion rate, self-reported understanding (1-5 scale), and qualitative feedback on clarity and confidence.

Why These Methods?
Given the complexity of financial products and the need to understand both cognitive and emotional responses, qualitative methods were prioritized. Usability testing allowed us to observe real-time confusion points, while interviews provided depth on user motivations and needs.



Research Process

1. Recruitment & Panel Management:

  • Recruited a diverse panel of 9 participants, reflecting a range of banking experience and language proficiency.

  • Used a secure, GDPR-compliant panel management tool to track participation and incentives.

  • Incentivized participants with digital gift cards, ensuring fair compensation and high engagement.

2. Tooling & Tech Stack:

  • Remote video conferencing for moderated sessions.

  • Digital whiteboards for collaborative note-taking and affinity mapping.

  • Secure cloud storage for all recordings and transcripts, with access restricted to the research team.

3. Compliance & Ethics:

  • Obtained informed consent from all participants.

  • Anonymized all data before analysis.

  • Followed U.S. Bank’s strict data security and privacy protocols.

  • Instituted a review process to ensure all research activities met ethical and legal standards.

4. Data Analysis:

  • Thematic coding of qualitative data.

  • Quantitative scoring of user understanding and task completion.

  • Collaborative synthesis workshops with cross-functional stakeholders.
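The quantitative scoring step above can be sketched as a small script. The session records and field names here are hypothetical illustrations, not the study's actual data:

```python
from statistics import mean

# Hypothetical session records: task success and the study's
# self-reported understanding rating on a 1-5 scale.
sessions = [
    {"participant": "P1", "completed_task": True,  "understanding": 4},
    {"participant": "P2", "completed_task": False, "understanding": 2},
    {"participant": "P3", "completed_task": True,  "understanding": 5},
]

# Task completion rate: share of participants who finished unaided.
completion_rate = sum(s["completed_task"] for s in sessions) / len(sessions)

# Mean self-reported understanding across the panel.
avg_understanding = mean(s["understanding"] for s in sessions)

print(f"Completion rate: {completion_rate:.0%}")
print(f"Avg understanding: {avg_understanding:.1f}/5")
```

In practice these scores were computed per design variant (current vs. proposed) so the two could be compared side by side.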



Key Findings


1. Clarity & Comprehension:

  • The proposed version was consistently rated as clearer and easier to digest.

  • Participants found the current version “intimidating” and “like a disclosure statement.”

  • Most users could not distinguish between overdraft coverage and protection in the current design.

2. Visual Hierarchy:

  • Lack of clear separation between coverage and protection led to confusion.

  • Users preferred boxed, visually distinct sections for each product.

3. Video Content:

  • The explanatory video was well-received, especially by visual learners.

  • Users appreciated its placement at the top of the page.

  • Feedback indicated the video was too fast and used abstract graphics; participants wanted real-life scenarios and a slower pace.

4. Decision Confidence:

  • The proposed design, with improved layout and video, increased users’ confidence in making enrollment decisions.

5. Accessibility:

  • Fast narration in the video was a barrier for English learners.




Impact


Design Decisions:

  • The research directly informed the adoption of the proposed page design, with clear visual separation and improved content hierarchy.

  • The video was revised to be shorter, slower, and more relatable, featuring real people and scenarios.

Business Outcomes:

  • Reduced Customer Support Calls: Early analytics post-launch showed a measurable decrease in overdraft-related support queries.

  • Increased Task Completion Rate: More users successfully enrolled in overdraft services without assistance.

  • Improved User Satisfaction: Post-launch surveys indicated higher Net Promoter Scores for the overdraft enrollment experience.

  • Decreased Bounce Rate: Fewer users abandoned the page before completing their intended action.



Deliverables

  • Usability Testing Reports: Detailed findings and recommendations.

  • Annotated Wireframes: Highlighting user pain points and proposed solutions.

  • Reusable Research Templates: For future usability studies.

  • Coaching Materials: Onboarding guides for new researchers and cross-functional partners.

  • Ethics Checklist: To ensure ongoing compliance and participant safety.



Reflections

What I Learned:

  • Even small changes in visual hierarchy and language can have a profound impact on user comprehension and confidence, especially in regulated industries.

  • Video content is powerful, but must be accessible and relatable to diverse audiences.

  • Building reusable research assets and coaching teammates amplifies the impact of research across the organization.

What I’d Do Differently:

  • Incorporate more quantitative measures (e.g., A/B testing with larger samples) to complement qualitative insights.

  • Engage with customer support teams earlier to identify additional pain points.

  • Pilot accessibility improvements (e.g., subtitles, multiple language options) in video content from the outset.



UX Research Metrics

  • Reduced support calls (measured via call center analytics)

  • Decreased bounce rate (web analytics)

  • Increased task completion rate (usability testing)

  • Improved Net Promoter Score (post-launch survey)

  • ROI: cost savings from a streamlined enrollment process and fewer support calls
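For reference, the Net Promoter Score cited above follows the standard formula: the percentage of promoters (ratings 9-10 on a 0-10 scale) minus the percentage of detractors (ratings 0-6). A minimal sketch, using made-up survey responses rather than the study's actual data:

```python
def net_promoter_score(ratings):
    """Standard NPS on a 0-10 scale:
    % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical pre- and post-launch survey responses.
before = [6, 7, 8, 5, 9, 6, 7]
after = [9, 8, 10, 7, 9, 9, 8]

print(net_promoter_score(before))
print(net_promoter_score(after))
```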