How can information architecture improve the lives of the visually impaired?

Stella Park
2 min read · Jan 4, 2022
[Image: A hand reaching for an Amazon smart speaker]

Information is everywhere.

Everywhere you look, your eyes work together with the rest of your senses to take in information, which your neurons process and synthesize into digestible knowledge for your brain.

But what about the 12 million people aged 40 and over who are visually impaired, in the United States alone?

This is where information architecture joins forces with UX (user experience).

What is information architecture?

Just as a map or blueprint lays out the guidelines for a geographical location or a project, information architecture has classically been about logically organizing bits of information (ontology), such as labels or tags, into larger taxonomies through related classifications or hierarchies. The result is something like a well-organized mind map; in fact, that is precisely what I would compare a taxonomy to. One category branches off into two, each of which branches off into two more, and these continue branching into smaller categories, all kept within the confines of the product, helping to catalog its information.
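To make that branching concrete, here is a minimal sketch of such a taxonomy as a nested Python dictionary; the categories and tags are purely illustrative, not taken from any real product.

```python
# A toy taxonomy: each category branches into smaller ones,
# all kept within the confines of the product's catalog.
# The category names below are illustrative only.
podcast_taxonomy = {
    "Science": {
        "Psychology": ["habits", "memory"],
        "Physics": ["space", "quantum"],
    },
    "News": {
        "Politics": ["elections", "policy"],
        "Technology": ["AI", "gadgets"],
    },
}

def list_paths(tree, prefix=()):
    """Walk the taxonomy, yielding every path from the root down to a tag."""
    if isinstance(tree, dict):
        for label, subtree in tree.items():
            yield from list_paths(subtree, prefix + (label,))
    else:  # a leaf: a list of tags
        for tag in tree:
            yield prefix + (tag,)

for path in list_paths(podcast_taxonomy):
    print(" > ".join(path))  # e.g. Science > Psychology > habits
```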

How can this help the visually impaired?

While new voice technology like Amazon's Alexa or Apple's Siri bears the brunt of modern jokes, there is beauty in these brilliant tools. Most commands understood by Alexa or Siri are quite simple, and even people who are not visually impaired can benefit from a more fully developed vocal information architecture.

For instance, have you ever been driving and wanted to listen to a specific podcast episode or play a certain YouTube video? Depending on when you last played that episode (Siri seems quick to open the most recently played one), you may or may not have achieved your goal. This is where information architecture (IA) could help.

Just as IA needs guidelines to act as guardrails, voice assistants also act within certain guidelines (they are always listening but are only summoned by the wake word).

Thus, if we were to create voice assistant guidelines for navigating within a certain app, a command could be as simple as, “Siri, open up the latest episode on habits from Hidden Brain in Apple Podcasts.”

Topic — “on”

Title — “from”

App selection — “in”

These sorts of prepositions, though small, may be vastly important for accurate voice recognition and effective voice assistants. If you own an Amazon Alexa or a Google Home, or use the voice assistant on your phone, how many times have you had to phrase a more complex request in slightly different ways to get the same result?
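As a rough sketch of how such a convention might work, here is a hypothetical parser that treats “on,” “from,” and “in” as delimiters for the topic, title, and app fields. The pattern and function names are my own illustration, not an actual Siri or Alexa API.

```python
import re
from typing import Optional

# Hypothetical command grammar: the topic follows "on", the title follows
# "from", and the app selection follows "in".
COMMAND_PATTERN = re.compile(
    r"open (?:up )?the latest episode"
    r" on (?P<topic>.+?)"
    r" from (?P<title>.+?)"
    r" in (?P<app>.+)$",
    re.IGNORECASE,
)

def parse_command(utterance: str) -> Optional[dict]:
    """Extract topic, title, and app from a spoken command, if it matches."""
    match = COMMAND_PATTERN.search(utterance)
    return match.groupdict() if match else None

print(parse_command(
    "Siri, open up the latest episode on habits from Hidden Brain in Apple Podcasts"
))
# {'topic': 'habits', 'title': 'Hidden Brain', 'app': 'Apple Podcasts'}
```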

Until there is a more foundational improvement in voice assistants' ability to recognize the prepositions within commands, knowing and using this structure can give both designers and users of voice assistants a more efficient and enjoyable experience.

--

Stella Park

My weekend escapades include matcha lattes and libraries with the smell of old books.