Experience Response
I'm impressed with the experiential formats presented by Groups 7 and 8. From past experience, I've observed that one of the biggest design challenges is facilitating active engagement across a large group, such as an entire class, especially within a short timeframe of 15 minutes. Both groups navigated this challenge skillfully. Group 7 implemented a real-time voting game that enabled immediate feedback, while Group 8 cleverly adapted Group 1's format by involving a select few in a competitive activity and having the rest of the class vote on the outcome. This structure sustained attention and interaction throughout the session.
Additionally, Group 8's use of a screen reader to audibly convey the rules was a masterstroke. It proved to be more effective than a student speaking into a microphone or merely displaying instructions on a screen. This method greatly enhanced my comprehension, allowing me to grasp the rules swiftly.
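Out of curiosity, here's a minimal sketch of how audible instructions like these could be scripted in a browser with the built-in Web Speech API. This is an assumption on my part; the group may well have used an actual screen reader like VoiceOver, and the rule text below is made up for illustration:

```typescript
// Minimal sketch: reading game rules aloud with the browser's built-in
// speech synthesis (Web Speech API). Runs in any modern browser.
const rules: string[] = [
  "Round one: guess which description was written by the AI.", // hypothetical rule text
  "Round two: vote for the most accurate interpretation.",
];

function speakRules(lines: string[]): void {
  for (const line of lines) {
    const utterance = new SpeechSynthesisUtterance(line);
    utterance.rate = 1.0; // experienced screen-reader users often prefer much faster rates
    window.speechSynthesis.speak(utterance); // utterances queue and play in order
  }
}

speakRules(rules);
```

Even this toy version hints at why the approach worked: the pacing and pronunciation are consistent, which is exactly what made the rules easy to follow.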
I also appreciate the choice of visuals for the three distinct rounds. The mix of paintings and sculptures from various eras with a photograph of our classroom, taken from behind, created a rich range of engagement. The classroom photo added dramatic flair and became the emotional peak of the experience. Watching the AI interpret that photo in real time was also enlightening. It prompted reflections on the nature of perception: while we can never truly know how another person visualizes an image from our description, this activity offered a glimpse into the AI's "cognitive" process, almost as if we could converse with it and immediately understand its "thoughts".
Speaker Response: Chancey Fleet
I previously worked as a software engineer on Google Maps. Whenever we planned to launch a new feature, we'd allocate one to two weeks specifically for accessibility testing and for addressing any related issues. Google has a clear, well-established guideline for mobile and web feature accessibility, and we would routinely assess our new features against this benchmark. However, as product engineers, we seldom considered refining the accessibility standard itself. This talk was enlightening, offering deeper insight into the challenges visually impaired people face when navigating digital platforms, and it gave me a fresh perspective on my past work.
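To make "assessing against a benchmark" concrete, here's a minimal sketch of an automated audit using the open-source axe-core library. This is an assumption on my part, standing in for internal tooling, and automated checks like this catch only a fraction of real accessibility issues:

```typescript
// Minimal sketch: auditing a page against WCAG A/AA rules with axe-core.
// Intended to run in a browser bundle, where `document` is available.
import axe from "axe-core";

async function auditPage(): Promise<void> {
  // Run the audit on the whole document, restricted to WCAG 2.0 A/AA rules.
  const results = await axe.run(document, {
    runOnly: { type: "tag", values: ["wcag2a", "wcag2aa"] },
  });

  for (const violation of results.violations) {
    // Each violation reports the rule, its impact, and the offending DOM nodes.
    console.warn(`${violation.id} (${violation.impact}): ${violation.help}`);
    for (const node of violation.nodes) {
      console.warn("  at", node.target.join(" "));
    }
  }
}

auditPage();
```

In practice, a pass like this would run alongside manual screen-reader testing, which is where the issues Chancey described actually surface.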
Several points from the discussion truly resonated with me:
Firstly, our approach to accessibility testing was flawed. We predominantly tested on the latest phone models, which doesn't reflect the reality for many visually impaired users. As Chancey highlighted, 70% of visually impaired individuals lack a college degree, and many face financial constraints that put high-end devices out of reach. Our focus should shift toward testing on older phone models, or even basic handsets like the "Obama phone" (a free phone from the federal Lifeline program), which more accurately represent the devices these users actually own.
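As a rough sketch of what shifting the test matrix might look like, browser automation can at least approximate low-end hardware, for example with Puppeteer's device emulation and CPU throttling. The URL below is a placeholder, and this is only a proxy, never a substitute for testing on real budget devices:

```typescript
// Sketch: approximating a low-end phone in automated web tests with Puppeteer.
import puppeteer, { KnownDevices } from "puppeteer";

async function main(): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Emulate an older handset's viewport and user agent.
  await page.emulate(KnownDevices["Galaxy S5"]);

  // Slow the CPU 6x to roughly approximate a budget phone's processor.
  await page.emulateCPUThrottling(6);

  await page.goto("https://example.com"); // placeholder: the feature under test
  // ... drive the feature and assert it remains usable at this speed ...

  await browser.close();
}

main();
```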
Secondly, I was taken aback by the apparent lack of training for Obama phone customer service representatives in assisting visually impaired users. That these reps cannot even guide users to the accessibility settings is a significant oversight. For those unfamiliar with these challenges, the urgency to learn and adapt may not be evident, so there is a critical need to bridge this gap by training customer service personnel and ensuring that crucial information actually reaches these users.
Another revelation was how a seemingly simple feature, like Siri's accelerated speech in iOS 17, could be groundbreaking for the visually impaired community. While this might look like a straightforward software addition, its delayed arrival shows how easily such vital needs are overlooked, and it underscores the vast difference between how we perceive the digital realm and how those with visual impairments do. Emphasizing user feedback and actively engaging with the visually impaired community can provide invaluable insights for improving accessibility guidelines and testing methods.
Furthermore, many technological advancements, such as YouTube tutorials or ChatGPT, remain difficult for visually impaired users to use because their accessibility layers were an afterthought. This discussion is a wake-up call for software developers, reminding us of a significant community we've inadvertently sidelined. I fervently hope this dialogue reaches the larger tech companies, influencing engineers and reshaping their everyday priorities.