Improving the Notebooks Experience for Data Professionals

Project Context

Background & Problem Statement
Our team was tasked with enhancing the Notebooks feature within a data analytics platform. Notebooks allow users to create documents containing both code and rich text, enabling interactive data analysis. Despite the feature's potential, user feedback and support tickets indicated that the current experience was confusing, lacked visual feedback, and fell short of the familiar Jupyter Notebook environment.
Goals

  • Identify pain points and usability barriers for core user groups (Data Analysts, Data Engineers, Database Specialists, Data Scientists).

  • Increase task completion rates and user satisfaction.

  • Reduce support calls and bounce rates by making the experience more intuitive and reassuring.


My Role

As the Lead UX Researcher, I owned the end-to-end research process:

  • Designed and executed the research plan.

  • Created reusable study guides and onboarding materials for the team.

  • Led recruitment, panel management, and compliance efforts.

  • Synthesized findings and collaborated with designers and product managers to translate insights into actionable design changes.

  • Mentored junior researchers and instituted ethical guardrails throughout the project.


Research Methodology

Methods Used

  • User Interviews: To understand user expectations, workflows, and pain points.

  • Remote Usability Testing: Observed participants interacting with a live preview of the Notebooks feature.

  • Heuristic Evaluation: Benchmarked the experience against industry standards (e.g., Jupyter).

  • Surveys: Collected quantitative data on satisfaction and task completion.

Rationale
A mixed-methods approach allowed us to capture both the “why” behind user frustrations and the “how” of their interactions, ensuring a holistic understanding of the problem space.


Research Process

Recruitment & Panel Management

  • Recruited 4 participants via Respondent.io, representing a cross-section of our target user base (Database Specialist, Lead Data Scientist, Data Engineer, Data Analyst).

  • Managed panel records securely, ensuring GDPR compliance and participant confidentiality.

  • Provided fair incentives aligned with industry standards.

Execution

  • Conducted remote sessions using Zoom and screen-sharing tools.

  • Used a standardized script and scenario-based tasks to ensure consistency.

  • Recorded sessions and transcribed key moments for analysis.

Data Analysis

  • Thematic coding of qualitative feedback.

  • Quantitative metrics: task completion rates, error rates, time-on-task.

  • Triangulated findings with support ticket data and product analytics.
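
As an illustration, the quantitative roll-up behind these metrics can be sketched in a few lines of Python. The session records below are hypothetical and the field names are illustrative, not the study's actual data schema:

```python
# Hypothetical per-session records from a usability study; the real
# participant data is not reproduced here.
from statistics import mean

sessions = [
    # participant, task completed?, errors observed, time on task (seconds)
    {"participant": "P1", "completed": True,  "errors": 0, "time_s": 210},
    {"participant": "P2", "completed": True,  "errors": 2, "time_s": 340},
    {"participant": "P3", "completed": False, "errors": 3, "time_s": 420},
    {"participant": "P4", "completed": True,  "errors": 1, "time_s": 280},
]

# Share of sessions in which the task was completed.
completion_rate = sum(s["completed"] for s in sessions) / len(sessions)
# Average number of errors per session.
error_rate = mean(s["errors"] for s in sessions)
# Average time-on-task across sessions.
avg_time_s = mean(s["time_s"] for s in sessions)

print(f"Task completion rate: {completion_rate:.0%}")
print(f"Mean errors per session: {error_rate:.2f}")
print(f"Mean time-on-task: {avg_time_s:.1f}s")
```

These three numbers then feed the triangulation step: a low completion rate or a spike in errors on a given task points the qualitative coding at the sessions worth re-watching.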

Ethical Guardrails

  • Informed consent obtained from all participants.

  • Data stored securely and anonymized for reporting.

  • Regular compliance reviews with legal and product teams.


Key Findings

Pain Points

  • Confusing Iconography: The “Play” button was mistaken for a video control, not a code execution command.

  • Lack of Visual Feedback: Users were unsure whether their actions (e.g., running code) had taken effect; displaying the execution time alone was insufficient reassurance.

  • Error States: Errors were not visually distinct; users wanted clear, color-coded feedback (e.g., red highlights, boxed error messages).

  • Guide Structure: Users preferred a clear separation between “How-To” instructions and a “Sandbox” for experimentation.

  • Partial Jupyter Parity: The experience felt familiar but incomplete compared to Jupyter, leading to hesitation in adoption.

Positive Feedback

  • Core notebook actions (create, edit, delete, move) were clear.

  • Default endpoints met user needs.


Impact

Design & Product Changes

  • Replaced the “Play” icon with a purple “Run” button, aligning with user expectations and existing Query Editor conventions.

  • Introduced visual success cues (e.g., status messages, chart updates) to reassure users of successful execution.

  • Enhanced error messaging with color highlights and boxed formatting for clarity.

  • Redesigned onboarding: separated “How-To” guides from interactive sandboxes, allowing users to learn before experimenting.

Business Outcomes

  • Reduced Support Load: Early metrics showed a 20% decrease in Notebooks-related support tickets.

  • Increased Task Completion Rate: Post-redesign usability testing showed a 30% improvement in users successfully running code and interpreting results.

  • Improved NPS: Net Promoter Score for the Notebooks feature increased by 15 points.

  • Decreased Bounce Rate: Fewer users abandoned the feature after initial interaction.
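
For readers unfamiliar with how an NPS shift is measured, the standard calculation can be sketched as follows. The survey scores below are made up to illustrate a 15-point increase; they are not the actual response data:

```python
# Standard NPS: % promoters (scores 9-10) minus % detractors (scores 0-6),
# with passives (7-8) counted in the denominator only.
def nps(scores):
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return round(100 * (promoters - detractors) / len(scores))

# Illustrative 0-10 survey responses, pre- and post-redesign.
before = [10, 9, 9, 9, 10, 9, 9, 10, 6, 5, 4, 6, 7, 8, 7, 8, 7, 8, 7, 8]
after  = [10, 9, 9, 9, 10, 9, 9, 10, 9, 10, 6, 5, 6, 7, 8, 7, 8, 7, 8, 7]

print(nps(before), nps(after), nps(after) - nps(before))  # 20 35 15
```

Because passives dilute both percentages, moving users from detractor to passive and from passive to promoter both raise the score, which is why clearer feedback cues can shift NPS without changing every response.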


Deliverables

  • Usability Testing Reports: Detailed session notes, video highlights, and task completion metrics.

  • Personas & Journey Maps: Synthesized user archetypes and mapped their workflows and pain points.

  • Annotated Screenshots: Before-and-after visuals illustrating design changes.

  • Reusable Study Guides: Templates for future research, shared with the team.

  • Onboarding Toolkit: Step-by-step guides and checklists for new users and researchers.


Reflections

What I Learned

  • The importance of clear, immediate feedback in data tools: users need reassurance and transparency at every step.

  • Iconography must align with user mental models; even small mismatches can cause significant confusion.

  • Separating instructional content from interactive practice empowers users to learn at their own pace.

What I’d Do Differently

  • Expand the participant pool for broader quantitative validation.

  • Integrate more longitudinal tracking to measure sustained impact.

  • Involve users earlier in the design ideation phase for co-creation opportunities.


Tooling & Tech Stack

  • Research Tools: Zoom, Respondent.io, Dovetail (for analysis), Google Sheets (panel management).

  • Design Collaboration: Figma, Miro.

  • Data Security: Encrypted storage, anonymized records, regular compliance audits.


UX Research Metrics

  • Reduced support calls by 20%

  • Decreased bounce rate

  • Increased task completion rate by 30%

  • Improved Net Promoter Score (+15)

  • Demonstrable ROI through increased user adoption and satisfaction