My Role: UX Designer
Timeline: Mar - Aug 2020
Team: 2 x UX Designers, 1 x Researcher
Designed and conducted research with users who are blind to understand their current assistive technology usage and shopping behavior. Synthesized insights into design principles and design opportunities.
Designed the conversational experience of the smart glasses, building sample dialogues and the VUI flow.
Designed the prototype and testing sessions to iteratively improve the experience, leveraging users' existing behaviors.
Scripted and directed a video to showcase the features and scenarios where Iris can bring value to users.
Product Demo Video scripted and co-directed by Weixi
8.96 million blind and
visually impaired by 2050
According to the CDC, around 4.3 million people in the US currently live with blindness or visual impairment, and this number is projected to double by 2050. In a world designed largely by and for sighted people, the blind often face daily obstacles in perceiving their environment, such as reading printed instructions, sorting mail, and differentiating medicine bottles.
A user who is blind uses Seeing AI to frame a picture and read text
01. Simply hold an object
Iris aims to remove the restrictions of current technologies by letting users recognize objects simply by holding or pointing at an item with their hands.
02. Ask for desired information
Users can ask Iris for specific information about the item they are holding, for example, the ingredients of a sanitizer gel.
03. Receive information directly
Iris’ sound system and haptic feedback let users ask for and receive information through headphones when they need privacy.
After a literature review, we gained a general understanding of how blind people use the Internet. With that knowledge, we spoke with experts in the field to deepen our understanding of blind people’s online shopping behaviors.
The key takeaways from expert interviews are:
• Blindness is a spectrum; narrow down the target user.
• Don’t assume how the blind use technology.
• Don’t exclude those who have lower tech proficiency.
• Examine the current accessibility guidelines.
• Start talking with the users NOW!
9 Phone Interviews with Directed Storytelling
At first, we wanted to understand our users' online shopping journeys by collecting first-hand data about blind users' past experiences, opinions, and attitudes through interviews. We also wanted to identify any advancements and frustrations within the current technologies.
5 Remote Moderated Studies
Through Zoom screen sharing, we observed participants’ online shopping behaviors, such as browsing and selecting both familiar and unfamiliar items. Our goal was to gain a deep understanding of blind people’s online shopping behaviors and their pain points with current tools, and to spot any missing features that could ease the process of online shopping.
Intentional Shopping For Efficiency
Most blind people are intentional shoppers who are not interested in browsing items without purpose or taking websites’ recommendations as references.
Familiarity and Recurring Items
Familiarity with items and websites is crucial for blind customers while shopping.
Relevant and Specific Information
Blind users want relevant, specific information about products: they use screen readers to skim websites by reading only key elements.
Interdependent with Assistive Technology
Most blind people like using VUI devices for short commands to complete everyday tasks, such as setting alarms or checking the time and weather.
I designed the dialogue flows for recognizing an object or text, identifying specific information from an object, saving items on the glasses for easy future recognition, and calling family and friends through the glasses. Possible error cases are also highlighted in these user journeys.
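A dialogue flow like the ones described above can be sketched as a simple intent-to-response mapping with an explicit repair path for error cases. The intent names, replies, and fallback prompt below are illustrative assumptions for this sketch, not the actual Iris implementation.

```python
# Illustrative VUI dialogue-turn handler. All intent names and replies
# are assumptions for illustration; Iris's real flows are more detailed.

RESPONSES = {
    "recognize_object": "This looks like a bottle of hand sanitizer.",
    "read_text": "Reading the label now.",
    "save_item": "Saved. I'll recognize this item faster next time.",
    "call_contact": "Calling your contact now.",
}

def handle_turn(intent: str) -> str:
    """Return Iris's reply for one conversational turn.

    Unrecognized intents fall through to a repair prompt, mirroring
    the error cases highlighted in the user journeys.
    """
    if intent in RESPONSES:
        return RESPONSES[intent]
    # Error case: ask the user to rephrase instead of failing silently.
    return "Sorry, I didn't catch that. Could you repeat your request?"

print(handle_turn("recognize_object"))
print(handle_turn("mumbled_request"))
```

Modeling the error case as a first-class branch, rather than an afterthought, is what lets the same flow diagram cover both success and failure journeys.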
To validate the scenarios in which people would want to use our product, we conducted an online survey that received 35 responses, aiming to understand the importance of each use case and to surface scenarios we might have missed.
Our objective for testing was to understand the conversational and interactive experience rather than the ergonomic design of the wearable, which is why we chose to use a head mount to stabilize the smartphone on the participant.
We shipped the headset to 3 participants. Once they wore the headset with a phone and joined a video call, we could see the items they were holding.
We used Wizard of Oz techniques: I acted as Iris, following the dialogue flows we had prepared and speaking the information visible in the user’s camera frame, while my teammates acted as facilitator and note-taker. With that setup, we tested both success and error cases with the participants.
Users simply hold or point at an item with their hands and ask for the desired information about it. The camera on the glasses captures images to extract the requested information, providing an efficient experience with real-time feedback.
We leverage users’ existing habit of using a screen reader to jump between paragraphs. Users can tap or speak a request to skip or revisit any part of the document. They can also adjust Iris’ speaking pace to a comfortable level to read long documents faster or slower.
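The skip, revisit, and pace controls described above can be sketched as a small reader object. The class name, method names, and the 0.5x-2x rate bounds are assumptions made for this sketch, not details of the Iris product.

```python
# Illustrative sketch of screen-reader-style document navigation:
# skip forward, revisit, and adjust speaking pace. Names and rate
# bounds are assumptions, not the actual Iris implementation.

class DocumentReader:
    def __init__(self, paragraphs):
        self.paragraphs = paragraphs
        self.index = 0
        self.rate = 1.0  # 1.0 = normal speaking pace

    def current(self):
        return self.paragraphs[self.index]

    def skip(self):
        """Jump to the next paragraph, like a screen reader's skip gesture."""
        self.index = min(self.index + 1, len(self.paragraphs) - 1)
        return self.current()

    def revisit(self):
        """Go back one paragraph to re-hear it."""
        self.index = max(self.index - 1, 0)
        return self.current()

    def set_rate(self, rate):
        """Clamp the speaking pace to an assumed comfortable range."""
        self.rate = max(0.5, min(rate, 2.0))
        return self.rate

# Usage: jump past the ingredients to the directions, then speed up.
reader = DocumentReader(["Ingredients: ...", "Directions: ...", "Warnings: ..."])
reader.skip()
reader.set_rate(1.5)
```

Clamping both the paragraph index and the rate keeps voice commands forgiving: "skip" at the end of a document or "faster" past the limit simply holds at the boundary instead of erroring.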
With Iris, users can save an item under a specific name for faster recognition in the future, such as medicine bottles they use often. For staple items, they can also use Iris to recognize the item and add it to a linked shopping account.