ux / product designer

© karena vongampai

seller support research @ Amazon

student ux designer / researcher | 1 year

Due to a Non-Disclosure Agreement, I cannot share every aspect of my work.

I worked with other designers and researchers to explore how contextualization could be applied to a machine learning support system for Amazon sellers, using multiple UX research and design methods to meet the dynamic needs of stakeholders. We began by studying how contextualization is used across different websites, then worked to understand the unmet needs of 3rd party sellers. From that research, we developed large design concepts to meet those needs. Lastly, we switched gears to design and test feedback systems within the Seller Support system.


I worked with a variety of undergraduate and graduate students in the Human Centered Design and Engineering program at the University of Washington. We were advised by HCDE professor Jan Spyridakis and Amazon Senior UX Researcher Michael Berg. I am truly thankful to the collaborative team I worked with, who all shaped the way I approach problem solving.

User Research with Amazon Sellers

Interviews with sellers to understand their experience with Seller Central.

High Level Concept Designs

Exploring design solutions that help sellers manage their businesses.

Designing and Testing User Feedback Systems

Designing and evaluating methods of digitally collecting user feedback.

HCDE Research Showcase Presentation

Presenting my experience, process, and learnings at a departmental showcase.


I worked with a group of 4 to plan and conduct a survey and semi-structured interviews with Amazon sellers. We designed our survey questionnaire to focus on understanding how sellers seek help, and we conducted 4 semi-structured interviews. From this experience, I learned how to plan and troubleshoot remote usability tests.

Skills Gained
  • Edit and write non-biased survey and interview questions
  • Logistically plan and execute remote usability tests
  • Probe and question participants
  • Analyze data to interpret meaningful research findings

After conducting research with Amazon sellers, we switched gears and designed solutions for 2 tasks:

  • Help sellers manage their businesses
  • Help sellers access help and support for their problems in Seller Central
Participatory Design with Sellers

We conducted a participatory design session with 30 sellers. In these collaborative sessions we presented our preliminary sketches and ideas, then listened to user feedback. We split the sellers into 4 different groups, each evaluating designs with a different purpose. All groups followed the same format: an informal focus group discussion, concept sketching, sketch refinement, and a final discussion about the session.


To summarize our high level concept designs, we created moodboards to inspire other designers working on the same problem. These moodboards contained our refined sketches, participant sketches, and other pieces of inspiration.

Overlay allows sellers to annotate and interpret their Seller Central accounts for themselves.

Seller Connect enables sellers to seek support and advice from a community of sellers prior to reaching out to Seller Support.

Outreach allows sellers to attend digital and in-person seminars, where they can learn about managing their accounts from Support representatives.


The last UX task we had with Amazon was to design and test different methods for collecting user feedback. During this time we had to reflect on what type of information would be the most useful to Amazon, and what forms of data can best measure the user experience of a Support system. Additionally, we had to understand what would encourage users to interact with the feedback system.

Design Challenge

How can we engage sellers to provide meaningful feedback about the system?

Iteratively Sketching Ideas and Solutions

We began by thinking big and sketched "out of the box" ideas. The sketch below shows a feedback system that allows users to provide contextual feedback with a wearable, inspired by Life Alert.

The next step was to brainstorm how Amazon could collect feedback within the constraints of their digital help experience. The image below showcases a preliminary sketch of a few contextual chatbot ideas.


After brainstorming how we wanted our contextual chatbot to interact with users, we created wireframes to explain and test how we could collect feedback. An example of a wireframe we created is shown below. This wireframe identified different ways of collecting data on how useful and relevant sellers found their support experience in solving their problem.

Guerrilla Usability Test

We had 3 days to test our concepts, so we decided to seek out anyone who did not have a background in UX. For this evaluation stage, I walked friends and schoolmates through my click-through prototype, which can be found here. We then asked users about their general perception of feedback and how they interact with it, followed by which of the options above best embodied their support experience. This was important since we wanted to ask the correct question when collecting feedback.

Usability Test with Amazon Sellers

After performing our guerrilla usability test, we walked away with 3 questions to design and test around:

  • How often should the contextual chatbot appear?
  • When should the contextual chatbot appear?
  • What kind of data should the chatbot be collecting?

To wrap up our experience with Amazon, my research group and I presented at the 2017 HCDE Research Showcase. The following image is the poster we created to summarize and reflect on our UX journey.