Providing Context

So how did we get here?

Capstone Project

As a senior in Informatics, I partnered with Microsoft for a 6-month capstone to tackle a real-world design challenge: improving adoption of Z.AI, an internal tool for experimenting with prompt engineering using customer support data and Azure OpenAI.

Cross-Functional Collaboration

I worked alongside one researcher, one PM, and two engineers, with sponsorship from a Principal Engineer, a Data Science Manager, and a UX Designer. Together, we balanced user research, technical feasibility, and business priorities.

Z.AI is…

An internal Microsoft tool that lets engineers quickly experiment with prompt engineering using secure customer support data and Azure OpenAI. It empowers teams to test ideas safely and provide world-class support.

Z.AI is also…

Difficult to adopt. A steep learning curve and usability issues slowed early uptake, and many users struggled to complete basic tasks, limiting the tool’s impact despite its potential.

Providing Context

Why does Z.AI matter?

Enhance Support Efficiency

Reduce errors and streamline workflows so support teams can resolve issues faster.

Empower Support Teams

Equip employees with AI-driven tools that improve accuracy and speed in decision-making.

Drive AI Innovation

Strengthen Microsoft’s leadership in AI-powered customer service by making cutting-edge tools usable at scale.

Research-to-solutions process

Uncovered Pain Points

Through user research and heuristic analysis, we identified where Z.AI’s complexity created barriers for adoption.

Defined Key Personas

Synthesized findings into two user archetypes (Creator and Consumer), capturing their motivations and frustrations.

Designed Targeted Solutions

Created and iterated on design concepts that addressed usability and onboarding challenges.

Validated with Users

Tested prototypes with Z.AI users to refine solutions and measure improvements in usability.

Why these personas matter

Our research revealed two distinct user groups that shaped the design challenge:

→ Creators pushed the limits of Z.AI but struggled with its steep learning curve, often losing time to trial and error.
→ Consumers valued efficiency and quick wins but were frustrated when the platform prioritized flexibility over clarity.

Research Methods

Through surveys (30), usability testing (10 participants), and heuristic evaluation, we uncovered consistent friction points across navigation, workflows, and onboarding. These patterns highlighted where Z.AI’s steep learning curve was holding users back.

Opportunities for improvement

We translated these pain points into four design opportunities, ensuring each solution targeted a specific user frustration.

User Flows

Based on our personas and pain points, we mapped user flows to capture how Consumers and Creators would move through Z.AI. These flows helped us uncover friction points early and align on opportunities to simplify interactions.

Use Case #1

Goal: Create and share experiments with flexibility and control

Key Friction: Overwhelming configuration screens, unclear naming conventions

Design Focus: Streamlined creation flow with clearer inputs and progressive disclosure

Use Case #2

Goal: Explore experiments and analyze results without technical overhead

Key Friction: Too many setup steps before meaningful output

Design Focus: Reduce clicks, surface default values, and guide users with contextual prompts

Sketches

I started with quick sketches to translate research insights into possible design directions. These low-fidelity explorations allowed me to rapidly test different layouts for onboarding, experiment creation, and support resources before committing to wireframes.

Lo-Fidelity Wireframes

Building on my sketches, I translated key concepts into low-fidelity wireframes. The goal wasn’t visual polish but speed: to test flows, prioritize features, and validate that our design direction solved real user pain points before investing further.

Homepage: Simplified entry point that makes tools easier to find (addresses navigation issues)

AI Chat: Centralized interaction model with clearer flow (addresses workflow inefficiencies)

Learn Page: Central hub for tutorials and resources (addresses lack of guidance)

Experiment Creation: Streamlined form for setup with advanced options hidden until needed (addresses onboarding difficulty)

Home Page Improvements

The old homepage was cluttered with unnecessary tabs and text overload, making it hard for users to find core actions.

Redesign: Reduced the sidebar to five tabs, added a global search, and surfaced the three most important CTAs (Explore, Create, Chat). Added a Social Hub for community updates.

Impact: Clearer navigation, less cognitive load, and a stronger sense of community.

Before

After

Create Page Improvements

The experiment creation flow had eight steps and no clear guidance, overwhelming new users.

Redesign: Reduced steps to five with a wizard UI, added guidance text and tutorial videos, and introduced progress tracking.

Impact: Faster, less confusing setup that supports both beginners and advanced users.

Before

After

AI Chat Page Improvements

The original page overloaded users with text and technical terms like “temperature” and “top-p value.”

Redesign: Simplified the layout, moved the chat input to the bottom for immersion, and added tooltips to explain advanced settings.

Impact: More intuitive and approachable chat experience, lowering the barrier to entry.

Before

After
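For readers unfamiliar with the settings those tooltips explain, the sketch below shows the kind of Azure OpenAI request that temperature and top-p ultimately feed into. It is an illustrative assumption, not Z.AI’s actual code; the endpoint, deployment name, prompt, and parameter values are hypothetical placeholders.

# Minimal sketch of an Azure OpenAI chat call (hypothetical values, not Z.AI's code)
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="<your-key>",
    api_version="2024-02-01",
    azure_endpoint="https://<your-resource>.openai.azure.com",
)

response = client.chat.completions.create(
    model="<deployment-name>",  # Azure deployment the prompt runs against
    messages=[{"role": "user", "content": "Summarize this support case in two sentences."}],
    temperature=0.2,  # lower values make responses more deterministic
    top_p=0.9,        # nucleus sampling: restrict choices to the top 90% of probability mass
)
print(response.choices[0].message.content)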

Q&A Page Improvements

The old Q&A page was text-heavy and overwhelming, burying critical guidance in long paragraphs.

Redesign: Transformed it into a new Learn page with clear sections for tutorials, FAQs, community discussions, and office hour recordings. Used visuals and structured layouts instead of walls of text.

Impact: Simplified onboarding, reduced cognitive load, and gave users faster access to help and peer support.

Before

After

Our Impact

In redesigning Z.AI, Microsoft's internal LLM experimentation platform, we streamlined prompt engineering workflows and introduced progressive disclosure to simplify onboarding, reduce friction in experiment setup, and improve support efficiency across engineering teams.

Capstone Showcase

For the Informatics Capstone showcase, we set up a display with a poster, a prototype walkthrough video, and a video prototype. I designed the poster layout, filmed the walkthrough, helped create the prototype video, and presented the ideation phase and the features I led. Our design received positive feedback, and we had great conversations with judges and guests throughout the event.