Philip Jenkinson – Media Access Australia

Why usability testing is so crucial


Web and digital accessibility is not rocket science. And you don’t have to be a Big Bang Theory type to get it done. Making digital communications accessible to people with disability is about following a series of processes so that no-one is excluded, and one of the most important components in ensuring accessibility for people of all abilities is usability testing.

A woman using a laptop

Digital agencies, government utilities and other organisations often employ automated tools, and sometimes conduct manual accessibility audits, to evaluate how accessible a web user experience is, and this provides important insights. However, these two testing modalities on their own can leave gaps in understanding the true experience of using a website.

Every person’s experience of disability is different, depending on the severity of their impairment and whether they have multiple disabilities, and there is a correspondingly wide variety of combinations of assistive technologies and viewing platforms in use.

Accessibility, usability and inclusive design are closely related, and usability testing can go beyond an audit’s objective findings to provide subjective, personal responses to web processes. Evaluations of the user experience are most effective when conducted with real end-users – and better still when the testers are living with disability.

Why? Because around 20% of the population has some form of disability (vision, hearing, cognitive, mobility) and at least a further 5% have low levels of literacy and/or English as a second language. So testing with this in mind provides valuable insights into how to communicate clearly and engage with people, which will benefit not just 25% of Australians but the entire community.

Did you know that texting on mobile phones began as an accessibility feature for Deaf people and those with a hearing impairment? Very soon afterwards, user testing confirmed that most people in the community would use and value this new way of communicating, and texting was rapidly adopted into the mainstream.

The limitations of automated tools and manual testing.

Automated accessibility tools help developers diagnose accessibility issues in their web projects against Web Content Accessibility Guidelines (WCAG) 2.0 Level AA conformance, and can provide a quick overview of the accessibility issues found within web pages.
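
As a rough illustration of how such tools are typically driven, the sketch below runs an automated check in the browser using axe-core – one widely used open-source checker, named here for illustration only, as the article itself does not specify a tool – and lists any WCAG 2.0 A/AA violations it finds.

```typescript
// Illustrative sketch only: axe-core is one widely used open-source checker,
// not necessarily the tool referred to above. Runs in the browser once the
// page has loaded.
import axe from "axe-core";

async function reportViolations(): Promise<void> {
  // Limit the scan to WCAG 2.0 Level A and AA rules, matching the
  // conformance target discussed in this article.
  const results = await axe.run(document, {
    runOnly: { type: "tag", values: ["wcag2a", "wcag2aa"] },
  });

  for (const violation of results.violations) {
    console.log(`${violation.id} (${violation.impact}): ${violation.description}`);
    for (const node of violation.nodes) {
      // node.target is a list of selectors pointing at the offending element
      console.log(`  affected element: ${node.target.join(" ")}`);
    }
  }
}

reportViolations();
```

As the article notes, output like this identifies issues quickly but says little about how they should be fixed, or how usable the page is in practice.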

This is particularly important for government web projects, as basic compliance with the WCAG 2.0 Level AA standard is required under the Disability Discrimination Act. As a result, some digital agencies and organisations believe they have met accessibility requirements simply by running these tools, regardless of the tool’s reported result. While automated tools are excellent for identifying issues, they will not provide the best advice on how to remediate them. Some of the shortcomings of relying solely on automated accessibility tools are highlighted in a recent article, Why automated tools need manual testing to deliver what you want.

Manual evaluations by accessibility experts are sometimes conducted alongside automated checks to provide a deeper and more thorough analysis of accessibility issues. The experts can provide detailed, actionable remediation suggestions. A typical accessibility audit conducted by Media Access Australia involves evaluating the mark-up and user interface elements on a page, testing the site with various screen readers, and navigating through the site using only keyboard controls.
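
The sketch below gives a flavour of the kind of mark-up and keyboard spot checks an auditor makes by hand; it is not Media Access Australia’s audit methodology, just a few illustrative checks written against standard DOM APIs.

```typescript
// A minimal sketch of the kind of mark-up spot checks a manual audit covers;
// illustrative only, not an audit procedure.
function spotCheckMarkup(root: Document = document): string[] {
  const findings: string[] = [];

  // Images should carry an alt attribute (an empty alt is fine for decorative images).
  root.querySelectorAll("img:not([alt])").forEach((img) => {
    findings.push(`Image missing alt attribute: ${img.outerHTML.slice(0, 80)}`);
  });

  // Positive tabindex values override the natural keyboard order and usually
  // signal a problem for keyboard-only navigation.
  root.querySelectorAll("[tabindex]").forEach((el) => {
    const value = Number(el.getAttribute("tabindex"));
    if (value > 0) {
      findings.push(`Positive tabindex (${value}) on <${el.tagName.toLowerCase()}>`);
    }
  });

  // Links and buttons need a text label a screen reader can announce.
  root.querySelectorAll("a, button").forEach((el) => {
    const name = el.textContent?.trim() || el.getAttribute("aria-label");
    if (!name) {
      findings.push(`Unlabelled <${el.tagName.toLowerCase()}> control`);
    }
  });

  return findings;
}
```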

While an auditor can read through a site using a screen reader with their eyes closed, they have already seen the site and have a rough visual model of the interface in mind. We can tell you whether or not your site is compliant, but the best way to understand how usable it is, is to test it with real users.

Goal-based testing with real users can be the missing piece of the puzzle.

Media Access Australia’s usability testing involves identifying a range of processes across a site and asking a variety of people with different experiences of disability to complete those processes, without prior knowledge of the site’s structure. Testers use whatever assistive technology they require, and an observer is on hand to record their feedback and to assist or guide them if a problem proves unsolvable.

Unlike manual evaluations, where auditors deliberately look for failures and improvements against the 38 WCAG success criteria, usability testing is purely goal-based, and so provides a unique and more accurate insight into whether or not a site will actually be usable. The result is a realistic representation of what a real user’s experience might be when using the site outside a test environment.

Sometimes when Media Access Australia conducts usability testing, we encounter usability issues that were not identified in any accessibility audit or automated assessment. This is because usability issues, while still related to WCAG AA compliance, do not always constitute a failure against a success criterion. Usability testing reveals the different ways that people with different disabilities interact with technology, and the mental models they bring to different processes on a page; these insights are vital in creating a truly accessible user experience.

Some recent examples where usability testing made the difference…

  • For an airline: when a blind user tested a drop-down combo box, the 55 city names within it had to be read through one at a time using their screen reader and the “tab” key, rather than, for instance, being able to type the first few letters of a city name to dramatically speed up the process (a sketch of one possible fix follows this list).
  • For a Government utility: a low-vision user struggled with an excess of whitespace in the design of a webpage, as it pushed some of the content out of view or made things difficult to find because they missed their presence entirely.
  • For a tertiary institution: while working through a process, a user who speaks English as a second language found a course code positioned too close to the course name, making it difficult to find the course they wanted. Changing the order, spacing and font sizing would make this easier to understand for other users too.
  • For a Government organisation: when filling out a form, users found that questions and error messages were ambiguous or poorly worded. This affected screen-reader users, who had to rely on hearing the questions announced, as well as users with cognitive disabilities or low literacy, who did not understand the wording and were confused by “strange headings for the form sections”.
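
As a rough sketch of one possible fix for the airline example above, a custom listbox can be given “type the first few letters” behaviour with a small key handler. The element structure, roles and ids here are hypothetical, and a native <select> element provides similar type-ahead for free.

```typescript
// Sketch of first-letters type-ahead for a hypothetical custom listbox whose
// options use role="option" and have ids. Illustrative only.
function addTypeAhead(listbox: HTMLElement): void {
  let buffer = "";
  let resetTimer: number | undefined;

  listbox.addEventListener("keydown", (event: KeyboardEvent) => {
    // Ignore arrows, Enter, modifier combinations, etc.
    if (event.key.length !== 1 || event.ctrlKey || event.metaKey) {
      return;
    }

    buffer += event.key.toLowerCase();
    window.clearTimeout(resetTimer);
    resetTimer = window.setTimeout(() => (buffer = ""), 500); // reset after a pause

    // Jump to the first option whose label starts with what was typed.
    const options = Array.from(listbox.querySelectorAll<HTMLElement>('[role="option"]'));
    const match = options.find((o) =>
      (o.textContent ?? "").trim().toLowerCase().startsWith(buffer)
    );
    if (match) {
      options.forEach((o) => o.setAttribute("aria-selected", "false"));
      match.setAttribute("aria-selected", "true");
      listbox.setAttribute("aria-activedescendant", match.id);
      match.scrollIntoView({ block: "nearest" });
    }
  });
}
```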

More information is available.

For more information on usability testing, you can check out the Media Access Australia services website, email the Media Access Australia team or call (02) 9212 6242. The team can assist your organisation with usability testing, annual web audits, digital accessibility maturity assessments, accessibility training and more. You can watch a short video on how Media Access Australia can make what you do online accessible. An Audio Described version of the video is also available.

