Usability and Inclusive Design
A Low Fidelity Prototyping Tool for Visually Impaired Users
Developing an accessible design tool to include visually impaired users early in the design process.
My Role: UX Visual Collaborator and Testing Analyst. Throughout the process, I collaborated closely with Rachel to translate her vision into tangible prototypes and to communicate findings without relying on visual cues. This collaboration not only produced a practical tool but also underscored the importance of inclusive design methodologies that address diverse user needs from the very start of a project.
Methods: Walkthrough Analysis, Think-Aloud Protocol Interviews, Low-Fidelity Prototyping, Usability Testing
Backstory: As a blind Interaction Designer, Rachel Magario observed that prototyping for accessibility and the inclusion of visually impaired users typically happened in the later stages of the design process, primarily to integrate voice-over technology. Recognizing the importance of including these users from the start, Rachel set out to create a low-fidelity prototyping tool as part of her Interaction Design thesis research. Given her own visual impairment, Rachel relied on my assistance to produce visuals and to convey research findings to her through non-visual means.
The inspiration for Rachel's project came from RouteMe, a campus wayfinding initiative designed to help people navigate interior spaces. That project served as a catalyst, prompting Rachel to explore and innovate in accessibility-focused design. By beginning her research at the earliest stages of design, Rachel aimed to address the gap she had observed in inclusive design practice and to ensure that the needs of visually impaired users were considered from the outset.
Observation and Evaluation Stage
Identifying barriers visually impaired users face when navigating unfamiliar environments.
To explore these challenges, Rachel employed Walkthrough Analysis and conducted Think Aloud Protocol Interviews with visually impaired students attempting to locate rooms on campus. This methodology was selected to understand the resources and mental models these users relied upon for spatial navigation.
Key discoveries from the research included the diverse range of strategies visually impaired students used to navigate physical spaces: reading braille, asking others for directions, exploring room signs by touch to locate room numbers, tracing walls to understand spatial layout and identify entrances, and relying on canes or service animals.
In my role, I documented user interactions on video and analyzed the footage to capture gestures, non-verbal cues, directional preferences, and interactions with tactile graphics. This analysis produced a library of gestures that would inform the design of accessible interaction methods in future projects.
Through this rigorous observational and evaluative process, Rachel and I gained insight into the practical challenges visually impaired individuals face and developed an understanding of the varied strategies they use for spatial orientation.




Make and Iterate Stage
Create a prototyping tool to include visually impaired users in the early stages of the design process.
The idea for a low-fidelity prototyping tool struck Rachel while she was using a device called PENfriend to label groceries. The device used RFID stickers as voice labels read aloud by a handheld reader. Rachel saw potential in leveraging this existing technology, without expensive custom software, to create a low-fidelity prototyping solution tailored to visually impaired users.
Guided by the RouteMe prototype project, we developed a series of low-fidelity prototypes and conducted real-time usability testing to assess how effectively they helped users navigate through spaces.
Prototype Modeling Included:
Design 1: Paper prototype held together by staples.
Cons: Paper crumpled under the pressure of the PENfriend.
Design 2: Foam backed paper prototype.
Cons: Bulky, difficult to hold, and folded too easily, causing navigation errors.
Design 3: Card stock only prototype held with rings.
Cons: Users could not tell where the prototype started and ended.
Design 4 (Winner): Transparency paper, foam board prototype held with rings.
Advantages: This final design used transparency sheets for increased durability and enabled re-use of RFID stickers. A thin piece of foam board was added to indicate when volunteers had reached their desired location.
Additional discoveries emerged during testing. For instance, we replaced RFID stickers with a more cost-effective velcro-backed adhesive method, which enhanced reusability and allowed for real-time adjustment of features.
Throughout this process, my responsibilities included physically constructing each prototype, evaluating their practicality, and analyzing recorded footage to understand user interactions beyond visual cues. This iterative approach not only refined the prototyping tool but also deepened our understanding of how to effectively incorporate accessibility features early in the design phase.




Reflect – Test Stage
Including visually impaired users early in the design process led to more satisfying experiences.
Visually impaired users who were involved from the outset navigated to their desired locations seamlessly and appreciated being part of the design process from the beginning. Their early inclusion also resulted in clearer voice-over assistance prompts and instructions, which significantly enhanced usability and accessibility. Moreover, exploring gestures as part of the interaction design process led to a library of potential features, enriching the overall functionality of the product.
Deliverable: PENfriend Paper Prototype Tool.