Human-Computer Interaction and Accessibility Researcher
📚 Blind-Parent/Sighted-Child Co-Reading
We know: Reading books together is a common activity for many parents and their children.
But: For blind parents with pre-literate children, neither reading partner may be able to read printed text alone.
So, we wondered: How do blind parents currently read with their children?
To find out: We conducted a content analysis of posts in an online network of blind parents, in which members discussed and recommended strategies for reading with young children.
We found: Blind parents had a variety of innovative techniques for making shared book reading accessible that could inform the design of interactive technologies and groupware, even outside the context of shared reading. Additionally, blind parents and their pre-literate children often scaffolded each other’s reading abilities, such that they could read print together, even if neither could read print alone.
👨👩👧👦 Voice Assistants in Mixed-Visual-Ability Homes
We know: Voice assistants, like Google Assistant, Siri, and Alexa, are particularly accessible to people who are blind and, often, particularly frustrating to people who are sighted.
But: Many voice assistants are used in homes where blind and sighted people live together.
So, we wondered: How do these seemingly contrasting opinions of voice assistants affect adoption and use in families where some members are blind and others are sighted?
To find out: We conducted pair interviews with cohabiting intimate partners, in which one partner had a visual impairment and the other did not.
We found: Together, partners weighed a variety of concerns about adopting voice assistants in their homes, including the privacy and safety of their children. But blind partners were typically the primary users of these devices, a role that carried both positive and negative responsibilities.
🗣 Accessibility of Voice Assistants for Blind Users
We know: Voice assistants use a voice-in/audio-out interaction paradigm that makes them uniquely accessible, compared to most mainstream devices, for people who are blind.
But: Voice assistants are not designed explicitly to be accessible and likely have room for improvement.
So, we wondered: What accessibility issues do voice assistants have for users who are blind?
To find out: We conducted a content analysis of podcasts hosted by, or directed at an audience of, blind technology users which discussed popular voice assistants.
We found: Despite the accessibility of their core interaction mechanisms, the wider ecosystems in which voice assistants exist, like their mobile companion apps and the physical infrastructures in which they are located, produce a variety of accessibility issues for blind users.
💻 Information Seeking by Blind and Low-Vision Software Developers
We know: Seeking and synthesizing information is a core component of developing software, and blind and low-vision people who use screen readers have different strategies for finding digital information than their sighted peers.
But: Researchers have not explored whether these differences in information seeking strategies impact blind and low-vision developers’ workflow and/or the overall accessibility of coding.
So, we wondered: How and where do blind and low-vision developers seek information while coding? What accessibility issues exist within highly technical information sources?
To find out: We conducted remote observations and interviews with professional software developers who use screen readers to code.
We found: Blind and low-vision developers seek information in many of the same sources as their sighted peers, and accessibility issues in technical information sources were similar to issues common in non-technical information sources. But, examining coding as an information seeking activity highlighted opportunities for understanding accessibility within the process of coding, like information transfer and the impact of social others on development teams, that are not apparent when examining accessibility of individual tools.