Implement VoiceView Accessibility Features (Fire TV)

The VoiceView screen reader built into Amazon Fire TV enables visually impaired users to navigate the Fire TV user interface. With VoiceView, these users can use the standard Fire TV navigation mechanisms — the Up, Down, Left, and Right buttons on the remote — to move the input focus around the screen. When the focus changes, VoiceView automatically speaks the currently focused item. This functionality is similar to the "Standard Navigation" mode of the previous version of VoiceView.

Several key changes were made to the VoiceView screen reader on Fire TV in the December 2016 software update. This page discusses user experience (UX) recommendations for implementing VoiceView support in your app, to help you create the best experience for users of your Fire TV app who are blind.

Checklist for VoiceView Implementation

Use the following checklists to ensure that your Amazon Fire TV app interacts with the VoiceView screen reader to meet the standards recommended by Amazon.

Technical Implementation Recommendations

Your app should implement the following aspects of VoiceView:

  • Verify that your app's UI is compatible with the Android Accessibility Framework.
  • Implement Content Descriptions so that VoiceView can read descriptions for images, buttons, and other on-screen objects (a minimal sketch follows this checklist).
  • Add Usage Hints to list and grid content so that VoiceView can help guide the interactions of visually impaired users.
  • Implement Described By for static content, so that VoiceView can read that content when the associated object gains focus.
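
Content Descriptions use the standard Android accessibility API rather than a VoiceView-specific extra. The following is a minimal sketch; the view id R.id.movie_poster and string resource R.string.movie_poster_description are hypothetical names used for illustration.

// Give the poster image a content description so that VoiceView can
// announce it when the image gains focus.
ImageView poster = (ImageView) findViewById(R.id.movie_poster);
poster.setContentDescription(getString(R.string.movie_poster_description));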

User Experience (UX) Recommendations

Your app should implement the following UX recommendations for VoiceView:

  • VoiceView reads the content for items that gain focus, including all information that is visually represented onscreen, such as a title or episode number.
  • VoiceView reads Orientation Text and Usage Hints to help users learn how to navigate the screen UI.
  • VoiceView reads static content either automatically through Described By text, or manually when a user presses the Menu button or enters Review Mode and steps through on-screen items.
  • VoiceView reads setup information, including activation URLs and codes.
  • VoiceView supports features and tasks such as program selection, program information, playback, and CC settings.

Overview of VoiceView Behavior

Users can navigate the Amazon Fire TV user interface and VoiceView-enabled apps using the Menu and other buttons.

Using the Menu button with VoiceView

Use the following conventions for the Menu button when VoiceView is enabled:

  • To enable or disable VoiceView, press the Back and Menu buttons at the same time for two seconds.
  • When VoiceView is enabled, VoiceView controls the Menu button and the Play/Pause button (when VoiceView is speaking).
  • The system or app receives double-press events from the Menu button.
  • Single-pressing the Menu button initiates the reading of information from the screen in the following order:
    1. Usage Hints
    2. Orientation Text
    3. Described By text
    4. All other static content

Using the Play/Pause, Rewind, and Fast Forward buttons to control speech

Use the following navigation conventions when VoiceView is enabled:

  • When VoiceView reads Usage Hints, Orientation Text, Described By text, and all other static content, the user can control navigation using the Rewind and Fast Forward buttons.
  • When a user accesses a screen for the first time, VoiceView automatically reads the Orientation Text.
  • When a user moves focus to a control and pauses briefly, VoiceView reads Usage Hints and Described By text.
  • While VoiceView is speaking, pressing the Play/Pause button silences VoiceView.
    • If a movie or music is playing and VoiceView is speaking, pressing Play/Pause one time silences VoiceView.
    • After VoiceView is silenced, pressing Play/Pause a second time pauses the playing media.
  • VoiceView does not read Described By content as part of static content when a user presses the Menu button.

Using Review Mode

VoiceView's Review Mode allows a user to explore the grid layout of an Amazon Fire TV screen in detail — similar to using linear navigation with a screen reader on a tablet. Use the following conventions for using Review Mode in your app:

  • Press-and-hold the Menu button to enter Review Mode, which will be announced by VoiceView.
  • The Left and Right buttons on the directional controller control linear navigation.
  • If a user presses Select when a non-actionable item is in focus, VoiceView speaks "Item not selectable."
  • Press-and-hold the Menu button to exit Review Mode.
  • After exiting Review Mode, accessibility focus returns to the previously cached location of keyboard focus, and VoiceView announces the focused item again.

Using Review Mode with WebViews

A WebView can represent a complex UI within an application, containing both actionable items, such as links, and non-actionable items, such as static text and images. VoiceView supports WebViews because they are frequently used by many apps in the Amazon Appstore. In a VoiceView-enabled WebView, a user can use Review Mode to navigate all items on-screen including actionable items and gauge the context of static content items. Described By content is not available in WebViews.

When you first enable Review Mode, the level of granularity defaults to moving by individual controls. Press Up or Down to cycle through the available granularity options, such as character, word, control, or window. Granularity options in web content include link, list, or heading. These reading granularities allow the user to do things like spell the name of an actress, or navigate more efficiently through web content. Use the following conventions to navigate the various granularity options:

  • Press Left to return to the previous item at the currently selected granularity.
  • Press Right to move to the next item.
  • The granularity level is reset to "control" when you enter Review Mode.

Implementing VoiceView

This section discusses guidelines for implementing VoiceView for the following types of content:

  • Static/non-focusable content
  • Orientation text
  • Usage hints

Implementing support for VoiceView on Static (Non-focusable) Content

VoiceView provides two mechanisms to navigate static (non-focusable) content so that visually impaired users can access this content: (1) using the Menu, Fast Forward, and Rewind buttons, and (2) Review Mode (discussed below). VoiceView supports markup that apps can use to associate this static (non-focusable) content with a focusable item.

Use Described By when a focused item has additional details on the screen that are not included in the item's content description. These details are spoken after the user pauses briefly on that item; moving continuously between items does not trigger them.

The following examples explain how a user would navigate static content in two different scenarios:

Example 1

Consider the Amazon Fire TV launcher screen used to navigate a movie catalogue. The text at the top of the screen updates to show the title, description, rating, and other information about the item currently in focus. Because this non-focusable content updates each time the user selects a movie, the node containing the selected movie should be described with the describedBy extra. This will cause this information to be spoken automatically after a brief pause on the item.

Example 2

Consider a Fire TV movie details view where the movie title, year, duration, star ratings, etc. are all static content. The user presses Menu to prompt VoiceView to speak this content. The user then navigates using the Fast Forward and Rewind buttons.

To associate a piece or container of static content with a focusable View:

  1. Set the key for com.amazon.accessibility.describedBy in the extras bundle from the AccessibilityNodeInfo for that View.
  2. Set the value to be a string containing a space-delimited list of the view Ids of the containers or views which contain the description of this item.

    When VoiceView encounters an item with the com.amazon.accessibility.describedBy key set, it will request a list of AccessibilityNodeInfo objects for the view Ids specified by the com.amazon.accessibility.describedBy value. VoiceView then reads the appropriate text or content descriptions, depending on your verbosity settings.

Code Sample:

// You can set extras on a button which is described by some
// static text elsewhere on the screen as follows.
Button button = (Button) findViewById(R.id.button);
button.setAccessibilityDelegate(new View.AccessibilityDelegate() {
  @Override
  public void onInitializeAccessibilityNodeInfo(View host, AccessibilityNodeInfo info) {
    super.onInitializeAccessibilityNodeInfo(host, info);
    info.getExtras().putString("com.amazon.accessibility.describedBy",
        R.id.movie_title_1 + " " + R.id.actors_1 + " " + R.id.description_1);
    info.setEnabled(host.isEnabled());
  }
});

Implementing Orientation Text for VoiceView

To help new users understand how Amazon Fire TV screens are laid out, VoiceView supports Orientation Text, which is localized text that VoiceView reads the first time a container with that text is seen, or when the user presses the Menu button. Apply Orientation Text to the container View of a region where a high-level description of the general layout will be useful to the user on first encounter.

Example 1

The first time a user lands on the main Fire TV home page, VoiceView reads, "Main Menu. Contains top-level choices such as Search, Home, Movies, and Settings. When an item is selected, the lower portion of the screen updates to contain related content."

Example 2

Consider a screen of your app which has a LinearLayout containing many Views of movie cover art. To communicate the purpose of this screen to a blind customer, you can set the Orientation Text on the AccessibilityNodeInfo of the LinearLayout to explain, "This screen contains a list of movies. Navigate to each movie to hear a plot synopsis. Select a movie to play it."

Set the com.amazon.accessibility.orientationText extra to configure Orientation Text for an AccessibilityNodeInfo (a sketch follows the list below).

  • The first time a user encounters an AccessibilityNodeInfo containing Orientation Text, VoiceView reads the text.
  • On subsequent visits, users can request context information by pressing the Menu button, which causes VoiceView to read the Orientation Text.
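
As a sketch, Orientation Text can be set with the same AccessibilityDelegate pattern used in the other code samples on this page. The view id R.id.movie_list and string resource R.string.movie_list_orientation below are hypothetical names for the LinearLayout described in Example 2.

// Set Orientation Text on the container View so VoiceView reads it the
// first time the user encounters this region, or on a Menu button press.
LinearLayout movieList = (LinearLayout) findViewById(R.id.movie_list);
movieList.setAccessibilityDelegate(new View.AccessibilityDelegate() {
  @Override
  public void onInitializeAccessibilityNodeInfo(View host, AccessibilityNodeInfo info) {
    super.onInitializeAccessibilityNodeInfo(host, info);
    info.getExtras().putString("com.amazon.accessibility.orientationText",
        getString(R.string.movie_list_orientation));
    info.setEnabled(host.isEnabled());
  }
});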

Implementing Usage Hints for VoiceView

For users without visual impairment, the layout of a screen provides visual cues as to how to navigate and interact with that screen. For example, if item A is located above item B on a screen, the user intuitively knows to press the Down button to navigate from A to B. However, visually impaired users might require additional hints to aid their interactions with a screen. To help with this issue, VoiceView supports a set of extras that define Usage Hints to help with navigation within a screen.

Example 1

Consider a screen that has a horizontal list of movies. A best practice is to include a Usage Hint such as "Use Left and Right to move between movies."

Example 2

Consider a screen with a multi-row grid of content where the first row, for example "Customers Also Watched", is in focus. In this case, use com.amazon.accessibility.usageHint.remote with text such as "Press Left and Right to find an item, press Up and Down to move between collections of items." Set this extra on the grid's container View, the parent of all the rows and columns of the grid.

Avoid setting usage hints for the individual objects within a grid, which would result in unnecessary repetition of the Usage Hint while navigating. A Usage Hint is read only when entering a container and not repeated while moving between items inside the same container.

Key: com.amazon.accessibility.usageHint.remote (Fire TV only)
Value: String describing how to use or navigate the item using a remote.
Example: "Press left and right to find an item."

Key: com.amazon.accessibility.usageHint.touch (Fire Tablets only)
Value: Description of how to interact with an item using a touch screen.
Example: "Double tap to select. Double tap and hold for options."

Code Sample:

// You can set extras on a list to provide a usage hint as follows.
ListView listView = (ListView) findViewById(R.id.my_list);
listView.setAccessibilityDelegate(new View.AccessibilityDelegate() {
  @Override
  public void onInitializeAccessibilityNodeInfo(View host, AccessibilityNodeInfo info) {
    super.onInitializeAccessibilityNodeInfo(host, info);
    info.getExtras().putString("com.amazon.accessibility.usageHint.touch",
        getString(R.string.double_tap_hint));
    info.getExtras().putString("com.amazon.accessibility.usageHint.remote",
        getString(R.string.press_select_hint));
    info.setEnabled(host.isEnabled());
  }
});

Last updated: Oct 29, 2020