The challenges of creating interfaces are not just about technological potential; they are about our humanity and our ability to empathize, and then to extrapolate insights into products that lessen work, improve communication, alleviate disparity and provide worthwhile idioms that let us connect across the breadth and diversity of human experience. In no realm of User Experience Design does the territory of general usability get more attention than accessibility, and the question of how to imagine, design and develop devices and interfaces for those who have difficulty seeing, moving, hearing or understanding interactive informational engagements. Going out on a limb here, with a nod to Nathan Shedroff’s treatise on sustainable design practices, Design Is the Problem: Section 508 law, the Web Content Accessibility Guidelines and the implementation of WAI-ARIA are each and all equivalency hacks, not sustainable responses to the challenge of humane design. Although such standards mark an important social acknowledgement, they are developed, written and distributed in a way that reinforces the notion that making things work for others the way they work for me is good design. That approach diminishes and marginalizes a better understanding of difference: how a sighted person approaches riding a bike differs from, say, a former colleague of mine, born with a severe visual impairment, who rode his bike using echolocation to avoid moving cars, pedestrians and other riders. To be clear, accessibility standards like Section 508 do matter; they hold organizations legally responsible for acknowledging, providing access to and generalizing the experiences they provide.
All the same, for someone who experiences the world with visual impairments, ‘access’ to the language and culture of the visual world is not the problem; the problem is the assumption that access to the visual world is what is sought after, when nothing could be further from the truth. What is sought is a means to fully live, operate and experience the world without reliance on the protections and interpretations of third parties, a critical factor in emotional well-being that standards boards, governments and organizations have yet to articulate in a more sophisticated and meaningful manner.
Over the last decade, while standards committees have continued to focus on the generalized-access model of engagement and service delivery, the Assistive Technology sector has shifted its developmental focus from software toward hardware: a change in disposition away from interpretive controllers and toward innovations that allow people to direct their own interaction and degree of engagement in the world. OrCam, the brainchild of Amnon Shashua, is one such device. While you’ve been reading up on Google Glass and Edward Snowden, his Israel-based company has been working to realize a wearable device that doesn’t encumber the user, relies on body motion and sound as the interface, and empowers wearers to engage with the world as they wish to direct themselves in it, no longer reliant on network capacity and secondhand schemas as a gateway of engagement. The wearer simply dons the OrCam and points a finger at whatever they do not recognize in their visual field; the device describes the object, and the wearer makes use of the information as they see fit. It is a mental model that gives the wearer a See ’n Say they can use to guide themselves, rather than the tired and mostly useless concept of someone reading a map to them: an evolutionary leap forward in the world of Assistive Technology, and a challenge to those who do not require such devices to conceptualize the issues in terms of context, not merely content.
Watch an interview with Amnon Shashua from Bloomberg News:
>>>Learn More About OrCam
>>>Q&A About OrCam
>>>Read the study Guidelines Are Only Half the Story: Accessibility Problems Encountered by Blind Users on the Web by Christopher Power et al.