Make your Android apps more accessible with Android Accessibility

Overview

Android's accessibility features are a remarkable part of the platform: users can improve their overall experience just by changing their phone's settings.

It is pretty incredible to think that these devices, which fit in our pockets, can aid people with visual, auditory, physical or other impairments, and completely change the way they interact with not only the internet but the world itself.

This is why it is so important that, as developers, we make our apps more accessible. Keeping that in mind, apart from using higher color contrast for better readability, there are two major accessibility features in Android we need to focus on for spoken feedback:

  1. ScreenReader/Select to Speak
  2. TalkBack

When enabled, both features provide text-to-speech along with several other options, but they work a little differently from each other.

For better understanding, the article is divided into two sections: the first shows how to enable and use the features on your device, and the second explains how to make your app compatible with these accessibility features.

Enabling and Using Accessibility Features

Select to Speak:

Configuring:

  1. Open your device’s Settings app
  2. Tap Accessibility, then tap Select to Speak

  3. Turn on Select to Speak shortcut

How to use:

  1. To start Select to Speak, use your Select to Speak shortcut (the little accessibility button at the bottom of the screen)
  2. Tap an item on your screen, like text or an image
  3. To hear multiple items, drag your finger across the screen
  4. To hear everything, tap Play

TalkBack:

Configuring:

  1. Open your device’s Settings app
  2. Select Accessibility, then select TalkBack
  3. Select Use TalkBack
  4. Set up a shortcut for TalkBack in the same settings screen for ease of use

How to use:

  1. Open the app where you want to use TalkBack
  2. Use the shortcut to turn on TalkBack
  3. Tap the view you want to hear
  4. Double-tap if the view is clickable

Additional features:

  1. Use two fingers to scroll up or down while in TalkBack mode
  2. You can also swipe left or right to move between items while in TalkBack mode
  3. When a view is selected by tapping on it, swipe up or down for additional information and actions for that view

When a view is selected, swipe up then right in a single motion to open the Local Context Menu.


Making Apps Compatible with Accessibility Features

Now that we understand how to use these accessibility features, let's look at how common Android views behave with them. This is especially important when we want to adjust our app's views to give the user correct feedback while an accessibility feature is in use.

Keep in mind that some views require a content description for TalkBack or ScreenReader to say something meaningful in response. This can be done with:
android:contentDescription="@string/desc"
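
The same description can also be set programmatically, which helps when it depends on runtime data. Here is a minimal Kotlin sketch; the activity, layout and view ID are hypothetical:

import android.os.Bundle
import android.widget.ImageView
import androidx.appcompat.app.AppCompatActivity

class ProfileActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Hypothetical layout containing an ImageView with id profile_avatar.
        setContentView(R.layout.activity_profile)

        // Runtime equivalent of android:contentDescription in XML,
        // useful when the description depends on dynamic data.
        val avatar = findViewById<ImageView>(R.id.profile_avatar)
        avatar.contentDescription = getString(R.string.desc)
    }
}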

Here is an additional tip: ImageViews need a content description because, unlike TextViews, they have no text content. Leaving it out may cause a lint warning that reads "Missing contentDescription". The warning can be removed by adding:
android:importantForAccessibility="no"
But remember to remove this attribute if you later decide to add a contentDescription for that ImageView.
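
The same can be done from code when a view is purely decorative. A small sketch, assuming a hypothetical divider image that carries no meaning for the user:

import android.view.View
import android.widget.ImageView

// Hypothetical decorative image, e.g. a divider graphic with no meaning for the user.
fun hideDecorativeImage(decorativeDivider: ImageView) {
    // Runtime equivalent of android:importantForAccessibility="no":
    // TalkBack and ScreenReader will skip this view entirely.
    decorativeDivider.importantForAccessibility = View.IMPORTANT_FOR_ACCESSIBILITY_NO
}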

Below are some of the most commonly used views and how they behave with these accessibility features:

TextView
ScreenReader: Only reads the text shown on the screen, rather than the contentDescription provided programmatically.
TalkBack: Reads the contentDescription provided by us. If none is provided, it reads the text set for the TextView.

EditText
ScreenReader: If the EditText is empty, it reads the contentDescription provided by us. If content such as an OTP is entered later by the user, it reads that content instead. If no contentDescription is provided either, it ignores the EditText/OTP view completely.
TalkBack: If the EditText is empty, it reads the contentDescription provided by us. If content such as an OTP is entered later by the user, it reads that content instead. If no contentDescription is provided, it tries to recognise the type of item touched and reads that back (e.g. "Edit box").

ImageView
ScreenReader: Reads the contentDescription provided by us. If none is provided, it ignores the ImageView.
TalkBack: Reads the contentDescription provided by us. If none is provided, it ignores the ImageView.

WebView
ScreenReader: Recognises all the elements in the WebView and reads them out for the user.
TalkBack: Recognises all the elements in the WebView, reads them out for the user, and keeps the double-tap-to-open behaviour.

Buttons
ScreenReader: Reads the contentDescription provided by us. If none is provided, it reads the text set for the button.
TalkBack: Reads the contentDescription provided by us. If none is provided, it reads the text set for the button, and TalkBack recognises it as a button.

Links
ScreenReader: Does not recognise links; it only reads the content provided.
TalkBack: Recognises links present in TextViews, notifies the user with a chime and, when the Local Context Menu is opened, shows the link in the menu.
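
Building on the behaviour above, it can also help to know at runtime whether a spoken-feedback service is active and to announce dynamic changes, such as an auto-filled OTP, that TalkBack might otherwise miss. A hedged Kotlin sketch using the standard AccessibilityManager and View.announceForAccessibility APIs; the function and parameter names are placeholders:

import android.content.Context
import android.view.accessibility.AccessibilityManager
import android.widget.EditText

// Returns true when an accessibility service with touch exploration
// (such as TalkBack) is currently running.
fun isSpokenFeedbackEnabled(context: Context): Boolean {
    val manager = context.getSystemService(Context.ACCESSIBILITY_SERVICE) as AccessibilityManager
    return manager.isEnabled && manager.isTouchExplorationEnabled
}

// Announce dynamic content, such as an auto-filled OTP, so TalkBack users
// get feedback without having to re-focus the field.
fun announceOtpFilled(otpField: EditText, otp: String) {
    if (isSpokenFeedbackEnabled(otpField.context)) {
        otpField.announceForAccessibility("One time password $otp entered")
    }
}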

Summary

Many new devices, OS designs and web browsers have made major improvements in accessibility for better content reach. When we build these features into our apps from the get-go, we can design significantly better apps.

These are some of the main things I've learnt about accessibility, but that's not all! Make sure to go through Google's accessibility checklist for a complete guide on how to make your app more accessible.

Be curious and keep learning. Thank you!