Dictating Text

 

The quickest and easiest way to enter text into an edit field is by speaking into your device. Google voice recognition is surprisingly accurate, and for major languages, it works both on- and offline. This post covers how to dictate text instead of typing.

 

Dictation

 

To start dictation, explore to the Voice Input button, located above the P on the Google keyboard, and double-tap. The device chimes to indicate it is listening. Speak in a conversational tone, enunciating clearly, as if you were on a job interview. When you finish speaking, wait for another chime, which indicates the device has stopped listening.

 

Dictation Tips

 

For best results, keep the following in mind when using voice recognition:

  • Speak as long as you want. The device keeps listening as long as you keep talking. It stops listening after about 5 seconds of silence, and it may stop sooner if you pause frequently.
  • Speak punctuation. Say the names of punctuation marks, pausing briefly before and after the punctuation and dropping the tone and volume of your voice slightly (e.g., I like cake, comma, ice cream, comma, and cookies, period). Experiment with various symbols and with phrases like “blank line” and “new paragraph.” These vary with the system language. If you don’t say the names of punctuation marks, no punctuation is added to the text you dictate.
  • Speak in a relatively quiet environment. If there’s background noise, speak close to the microphone, which is usually near the bottom of the phone, where your mouth would be if you were taking a call. If TalkBack is likely to interrupt dictation with incoming notifications or other status messages, turn the screen reader volume down. Volume should be low enough that your voice is definitely louder, but high enough for you to hear it.
  • Cover the device speaker if TalkBack speaks what is being typed. If you’re running Android 4.1, TalkBack speaks all the new text that appears in the edit field as you dictate. This creates a loop: you say X; it appears in the edit field; TalkBack says X; the microphone picks up TalkBack’s speech; X appears again; and so on. Sometimes the solution to this problem is to simply turn TalkBack down to a fairly low volume. Other times, you need to cover the audio speakers with the pad or side of your index finger after the listening chime is played. Speakers are usually on the back of the device near the bottom. A third option is to use earphones when dictating.

 

Stop Dictation

 

In relatively quiet environments, the stop-listening chime is played about 5 seconds after you stop speaking. In noisier environments, however, the chime may never play on its own. To stop dictation, do one of the following:

  • Use the TalkBack Back gesture (down then left). This stops dictation without playing a stop-listening chime. It also removes the keyboard from the screen. You can double-tap the Back button instead, but this produces spoken feedback that the microphone picks up and adds to the dictation.
  • Double-tap the Pause Listening button. This is a relatively large area that is located near the bottom of the screen, above the home button. Explore to it; then practice double-tapping it. When you do, the keyboard does not return; instead, what appears is a dictation screen.

 

Resume Dictation

 

When dictation stops on its own or as a result of double-tapping the Pause Listening button, the dictation screen appears. It contains the following items, which cannot be reached by swiping:

  • A language dropdown list (below the edit field)—double-tap to choose the recognition language.
  • Tap to speak button (below the language dropdown)—double-tap to continue dictation. You may move the cursor to another part of the edit field before activating this button. The Pause Listening button is located in the same area.
  • Keyboard Entry button (to the left of the language dropdown)—double-tap to return to the keyboard. This button sometimes needs to be double-tapped repeatedly.

 

Ensure Offline Speech Recognition

 

To use dictation without an internet connection, download the recognition data for your language.

  1. Tap the Search item at the top of the home screen. When the app launches, use the Back gesture or Back button to remove the on-screen keyboard.
  2. Tap Menu in the lower right-hand corner of the screen, above the Android buttons.
  3. Tap Settings>Voice>Offline Speech Recognition.
  4. Tap the All Languages dropdown; then tap the desired language, and follow any prompts.

Voice recognition is now available offline. You don’t have to do anything special to use it.

Typing with the On-Screen Keyboard

Typing with the on-screen keyboard is easy, but many new users find it challenging and slow. If you’re new to on-screen typing, you may want to start with short writing tasks, like search strings, contacts, and text messages. You can use voice dictation for longer tasks, like emails and notes to self, until you feel more comfortable with the keyboard. This chapter describes how to use the Google keyboard, but most of the information applies to other on-screen keyboards as well.

 

When you find an edit field you want to type in, double-tap it to bring up the on-screen keyboard. TalkBack says something like, “Showing text keyboard.” Before double-tapping, it’s a good idea to slide your finger to the lower third of the screen to find out whether the keyboard is already showing.

 

To type letters, simply slide your finger to the letter and lift it. This is the only typing mode in Android.

 

To type capital letters, first touch the shift key, located in the lower left-hand corner of the screen near the letters Z and A. TalkBack says, “Shift enabled.” Touch the letter you want to capitalize; then continue typing.

 

To type in all-caps, slide your finger to the shift key and hold it there for a couple of seconds. TalkBack says, “Shift enabled,” immediately followed by “Caps lock enabled.” Type the letters you want to capitalize. When you want to return to lower case, briefly touch the shift key.

 

To type numbers, punctuation, and other special characters, touch the Symbols key, located in the lower left-hand corner of the screen below the shift key. In Symbols mode, the Symbols key becomes the Letters key, and the shift key becomes the More Symbols key. Touch More Symbols to access additional special characters, and touch Letters to return to the main typing keyboard. If you press the spacebar after typing a symbol, the keyboard returns to letters mode.

 

To type accented characters and numbers, slide your finger to the closest character and hold it there for a couple of seconds. TalkBack says, “Alternative characters are available.” Wait a moment, then slide your finger up to explore the character set. The popup that shows the characters is small. If it contains lots of options, as for the letter A, items are arranged in one or two rows of four or five characters. If the character set is made up of one or two items, as for the letter T, items are immediately above the character. Slide your finger to the desired option and lift. If you slide out of the popup area before making your selection, the popup is dismissed, so you need to start over. Keep in mind that TalkBack doesn’t always correctly announce the character you typed; just trust that it worked.

 

To delete characters, touch the Delete key, located in the lower right-hand corner of the screen near M and L. This key deletes the character to the left of the cursor.

 

To move to the next edit field, when working with a series of edit fields, you can sometimes touch the Next button, located in the lower right-hand corner of the screen below the Delete key. If no Next button is present, remove the keyboard with the Back button or Back gesture and double-tap on the next edit field.

 

To remove the on-screen keyboard, go back. You can double-tap the Back button at the very bottom of the screen, or you can use the down-then-left right-angle gesture. TalkBack says, “Keyboard hidden.”

Gestures for Explore by Touch

TalkBack is the screen reader developed by Google for Android. Technically, it’s an accessibility service that provides spoken feedback. The part of the accessibility service that enables blind users to touch the screen without inadvertently activating controls is called Explore by Touch. This post discusses how to enable and disable TalkBack and how to use the Explore by Touch gestures, frequently referred to as TalkBack gestures.

 

Enabling and Disabling TalkBack and Explore by Touch

 

To turn TalkBack on or off, go into Settings>Accessibility>TalkBack. The TalkBack screen has some text explaining what TalkBack is, a Settings button, and an on/off switch. Touch the on/off switch, which is located in the upper right-hand corner of the screen, below the notification bar. If you tap the switch on, the next screen shows a vertical list of the things TalkBack has access to (e.g., the text you type) as well as the Cancel and OK buttons. Touch OK in the lower right-hand corner to start speech. If you tap the switch off, TalkBack warns you that spoken feedback will stop; tap OK in the lower right-hand corner to disable speech. Explore by Touch is turned on and off automatically.

 

To turn Explore by Touch on or off, go into Settings>Accessibility>TalkBack>Settings, and tap the Explore by Touch item to check or uncheck it. This shouldn’t be necessary because Explore by Touch is turned on and off automatically with TalkBack; however, the two features are controlled independently because low-vision users and people with some print disabilities may want speech without the added gestures.

 

To go through the Explore by Touch tutorial, go into Settings>Accessibility>TalkBack>Settings and touch the Explore by Touch tutorial item. Once in TalkBack Settings, you may need to swipe up with two fingers to scroll to the item. The tutorial starts as soon as you double-tap the item.

 

Starting with Basic Gestures

 

These simple gestures are used to navigate your Android device. Use a touch that is light and fairly rapid. If you’re swiping, scrolling, or using right-angle or two-part gestures, the touch and movement are even lighter and faster, as if you were brushing dust or fine crumbs from a tabletop. Keep in mind that these gestures are available only when TalkBack and Explore by touch are on:

 

  • Explore – Slide one finger over the screen in any direction. TalkBack announces text and controls as your finger moves over them.
  • Swipe (right or down) using one finger – Quickly slide a finger over the screen to move to the next item. This is comparable to pressing tab on a Windows or Apple computer.
  • Swipe (left or up) using one finger – Quickly slide a finger over the screen to move to the previous item. This is comparable to pressing shift+tab on a Windows or Apple computer.
  • Double-tap (anywhere on the screen) – Tap the screen twice with one finger to activate the last item you heard. By default, this is how you click controls, check and uncheck boxes, and open dropdown lists. If you’re new to touch-screens, the double-tap is a rapid sequence. Think of a two-syllable word, like “double” or “Android”; each tap corresponds to a syllable.
  • Single-tap (directly on the item) – Explore to the item and tap it once to activate. This is an alternative method for clicking controls, checking and unchecking boxes, and opening dropdown lists. To set this mode, go into Settings>Accessibility>TalkBack>Settings and check Single-tap mode. You can also swipe and double-tap anywhere when in this mode.
  • Long-press (by double-tapping on an item, keeping your finger on the screen after the second tap) – Slide a finger directly to an item, tap twice, and after the second tap, hold your finger on the screen instead of lifting. This gesture is used to bring up additional options (comparable to pressing the context key in Windows). It also opens and closes the on-screen keyboard (if you tap on an edit field), and it enables you to drag items on the home screens. You can often long-press anywhere, but long-pressing directly on an item produces the most predictable results.
  • Scroll vertically (by swiping up or down using two fingers) – Slide two fingers up or down to scroll through a list. To scroll successfully, move focus to the list by sliding a finger to an item in the list before beginning the two-finger swipe. To move to the screen below, swipe up, and to move to the screen above, swipe down. To remember, imagine you’re handling something on a roll (e.g., toilet paper, paper towels, or fabric). Short swipes scroll the list by an item or two; several long, rapid swipes move through a couple of screens. Be sure to scroll in the work area only. Starting the two-finger swipe too close to the bottom of the screen may inadvertently launch Google Now, and starting the two-finger swipe too close to the top of the screen may inadvertently open Notifications.
  • Scroll horizontally (by swiping left or right using two fingers) – Slide two fingers right or left to change pages and screens. Scroll in the work area of the screen, or slide a finger to an item in the list before scrolling horizontally. Scrolling right to left moves to the screen on the right, and scrolling left to right moves to the screen on the left. To remember, imagine you’re turning the pages of a book. In most cases, you can swipe anywhere in the work area, but if your goal is to dismiss a notification or clear an item from recent apps, you need to explore to the item and two-finger swipe horizontally exactly over it.

 

Using Two-Part Gestures

 

These gestures enable you to move to specific parts of the screen. Remember to use a touch that is light and fairly rapid. If you’re swiping, scrolling, or using right-angle or two-part gestures, the touch and movement are even lighter and faster, as if you were brushing dust or fine crumbs from a tabletop. Keep in mind that these gestures are available only when TalkBack and Explore by touch are on:

 

  • Swipe up then down with one finger in a single fluid motion – By default, this gesture moves focus to the first item on the screen (i.e., what appears in the upper left-hand corner). Alternatively, this gesture can be used to change granularity (i.e., transition to the next reading level); to use this gesture for changing granularity, go into Settings>Accessibility>TalkBack>Settings>Manage Gestures>Two-Part Vertical Gestures.
  • Swipe down then up with one finger in a single fluid motion – By default, this gesture moves focus to the last item on the screen (i.e., what appears in the lower right-hand corner). Alternatively, this gesture can be used to change granularity (i.e., transition to the previous reading level); to use this gesture for changing granularity, go into Settings>Accessibility>TalkBack>Settings>Manage Gestures>Two-Part Vertical Gestures.
  • Swipe right then left with one finger in a single fluid motion – This gesture moves to the next vertical or horizontal page. When scrolling vertically, the last item on one page becomes the first item on the next page. For example, if you use this gesture to move through a list of 35 items, TalkBack may announce, “Items 1 through 12 of 35.” After swiping right then left once, TalkBack may say, “Items 12 through 23 of 35.”
  • Swipe left then right with one finger in a single fluid motion – This gesture moves to the previous vertical or horizontal page. When scrolling vertically, the first item on one page becomes the last item on the previous page. For example, if you use this gesture to move through a list of 35 items, TalkBack may announce, “Items 12 through 23 of 35.” After swiping left then right once, TalkBack may say, “Items 1 through 12 of 35.”

 

Moving on to Right-Angle Gestures

 

These gestures provide quick access to commonly used functions. They combine a vertical line and a horizontal line. For example, the right-then-down gesture involves a horizontal line that goes from left to right followed by a vertical line that goes from top to bottom. Experiment to find the gesture that works best for you. Some people draw crisp right angles while others draw sloppy curves; some find they’re more successful with large motions while others get results with small ones. Many people notice that gestures work differently on different devices. Whatever the case, use a touch that is light and fairly rapid. If you’re swiping, scrolling, or using right-angle or two-part gestures, the touch and movement are even lighter and faster, as if you were brushing dust or fine crumbs from a tabletop. Keep in mind that these gestures are available only when TalkBack and Explore by touch are on:

 

  • Go Home with up then left – Using one finger, swipe upward then to the left to move to the Home screen. This is the same as double-tapping the Home button at the bottom of all screens.
  • Go back with down then left – Using one finger, swipe downward then to the left to return to the previous screen. This is the same as double-tapping the Back button at the bottom of all screens.
  • Go into notifications with right then down – Using one finger, swipe to the right then downward to pull down the notification shade. This is the same as touching the notification bar, then sliding two fingers down the screen. To close notifications, use the Back gesture or button.
  • Go into Recent Apps with left then up – Using one finger, swipe to the left then upward to open the Recent apps screen. This is the same as double-tapping the Recent Apps button at the bottom of all screens.
  • Go into the Global Context Menu with down then right – Using one finger, swipe downward then to the right to go into a TalkBack-specific menu that enables you to read the entire screen, spell the last utterance, read from the current position, jump to specific words and other text, suspend TalkBack, and go into TalkBack Settings. Draw a circle, lifting your finger when you hear the option you want. The Global Context menu is covered in more detail in another post.
  • Go into the Local Context Menu with up then right – Using one finger, swipe upward then to the right to go into a TalkBack-specific menu that enables you to move the text cursor to the beginning or end of the edit field, change granularity, select text, and locate features like links. Draw a circle, lifting your finger when you hear the option you want. You may need to draw a second circle to continue the operation. The Local Context menu is covered in more detail in another post.

 

Two other right angle gestures are available: right then up and left then down. You can assign these to Read from Top and to Read from Next. You can also assign any of the commands covered in this section to any of the right-angle gestures. To assign or modify right-angle commands, go into Settings>Accessibility>TalkBack>Settings>Manage shortcut gestures, tap on the gesture, and check the preferred function.

 

Modifying Standard Commands

 

As is clear from this post, Explore by Touch gestures are not the same as standard Android gestures. They are not so different, however, which is worth keeping in mind when reading mainstream blogs, forums, and articles and when helping sighted friends and family members with their devices. Though they don’t hold true one hundred percent of the time, the following two tips are good rules of thumb for translating standard gestures to TalkBack gestures.

  • Swipe with an extra finger – If a mainstream tutorial tells you to swipe with one finger, try swiping with two. If the tutorial tells you to swipe with two fingers, swipe with three.
  • Double-tap more than once – if a mainstream tutorial tells you to double-tap on something, try double-tapping twice. If the tutorial tells you to triple-tap, double-tap three times.

 

 

Screen Layout

People new to touch-screen devices sometimes feel overwhelmed. The layout feels random, so locating controls takes longer than it should. This post describes the general layout of most screens and provides tips for finding controls and other content.

5-Part Layout

The Android screen is divided into 5 horizontal layers. The top 2 and bottom 2 are narrow, like ribbons or strips of tape, and the middle layer is large, taking up most of the screen. The sections are as follows:

  1. At the top edge of the screen is the notification bar. This area lists status information like network, signal strength, battery level, and time and date. It also mentions special events, like missed calls, new messages, and update alerts. The information is read all at once. To listen to individual notifications, go into the notification area, a topic covered in a later post. The notification bar is about half an inch or 1 cm tall.
  2. Below the Notification bar is a row of controls. This is sometimes called the Search bar because Search is generally one of the items found here. The specific controls vary though they tend to be the buttons and dropdowns used most often. For example, email apps usually place the Compose, Reply, and Delete buttons here, and podcatchers usually place buttons for favoriting and subscribing here. Up to 5 controls can appear here, and in some apps, there may be 2 rows of controls. Three of the controls commonly found in the Search bar are Navigate Up (left-hand corner), which takes you to the previous screen in the app; Search (right of center), which allows for searching within the app; and More Options (right-hand corner), which displays more app-related options. When app tutorials refer to buttons found at the top of the screen, they are generally referring to this area. This screen layer is about half an inch or 1 cm tall.
  3. The bulk of the screen is the work area. This is where most of the content of an app is located. In email apps, a list of messages is here; in reading apps, a list of books or the page being read is here; in a recording app, the recording and playback controls are here; and in a memo app, the edit field for writing is here. Items are most often arranged in a vertical list. Another common configuration is to have one or two rows of buttons in the lower half of the screen. This layer covers most of the screen.
  4. Below the work area and near the bottom of the screen is a row of controls. This is occasionally called the Dock or Favorites Bar because the home screens have a fixed set of controls here. The specific controls vary though they tend to be buttons and dropdowns that are frequently used. For example, email apps usually place buttons for archiving and marking as read or unread here, and contacts apps usually place the New Contact button here. Up to 5 controls can appear here, and in some apps, there may be 2 rows of controls. Search is occasionally placed here as is a More Options button (right-hand corner). When app tutorials refer to buttons found at the bottom of the screen, they are generally referring to this area. This screen layer is about half an inch or 1 cm tall.
  5. At the bottom of the screen are the Android buttons. These are Back, Home, and Recent Apps. Older Android versions include Menu/Options and Search, and newer Android devices running custom skins also include the Options/Menu button here. On some devices, these buttons need to be double-tapped to be activated, even if single-tap mode is enabled. On other devices, these buttons are activated as soon as they’re touched, so users need to learn their locations and activate them through muscle memory.

Tips for Working with the Screen

The following observations may be helpful:

  • To learn an app screen, pretend you’re reading a braille book. Start at the left end of the Search bar, and slide a finger to the right. When you get to the right edge of the screen, move back to the left edge, dropping your finger slightly as if you were on the next line of braille. Then sweep your finger from left to right, and repeat the process until you reach the level of the Favorites Bar. This gives you a general idea of what is on the screen, how information is laid out, and how the app works. It also gives you a sense of how accessible the app is (e.g., button labels, ad placement, readable text). If you have trouble sweeping in a straight line, place another finger along the right edge of the screen so you can sweep toward it.
  • To double-check content, swipe through the screen. Swipes are quick vertical or horizontal strokes on the screen. Each swipe right or down moves focus to the next item, like pressing tab on a Windows or Apple computer. Each swipe left or up moves focus to the previous item, like pressing shift+tab on a Windows or Apple computer. This helps you identify anything you may have missed while exploring the screen. It also tells you which items are focusable and which are not.
  • To make sure a list is just a list, slide your finger down the center of the screen; then slide your finger down near the left and right edges. Sometimes list items are associated with additional controls. For example, on the screens that list available keyboards and text to speech engines, a Settings button is to the right of each keyboard and engine, and in some Twitter clients, buttons for replying to the tweet or checking the person’s profile are often found to the left or right of the tweet.
  • To complete an action, check for Cancel or OK in the lower half of the screen, or check for Done in the top or bottom right-hand corner. Most of the time, when changing a setting either for the device or within an app, you don’t need to do anything to save your setting. It’s saved automatically when you go back, home, or somewhere else, like to Notifications or Recent Apps. Occasionally, however, you need to tap OK or Done, for example, when changing a ringtone or setting a date or time in the calendar. Generally speaking, Cancel (left) and OK (right) appear somewhere in the lower third of the screen, and Done appears at the level of the Search or Favorites Bar, almost always at the right edge of the screen.
  • To find controls in More Options, check the lower third of the screen. Typically, when you tap a More Options button, you are presented with a list of controls. In newer apps, these controls tend to appear as a vertical list, but on many apps that have been around a long time, controls tend to be located near the bottom of the screen, often in 2 rows of 3 or 4.

Slide to Unlock

The single action you do most often with your Android device is unlock the screen. This post explains how to wake up your device and unlock the screen using the slide-to-unlock method.

The cool thing about Android is that you interact with your device by touching the screen. You can slide one or two fingers over it, or you can tap it. This is efficient, especially once you have a general idea of how the screen is laid out.

There are times, however, when you don’t want the screen to respond. When the phone is in your pocket, for example, you don’t want to accidentally dial or text anyone, and you don’t want to add esoteric events to your calendar. So most of the time, the screen is unresponsive or off.

To use the device, you need to turn on the screen. By default, this is a two-step process, in which you wake up the device, then unlock the screen.
1. To wake up the device, simply press the power button. When you do, TalkBack announces the time, and if you slide a finger over the screen, you may hear status information near the top, or you may hear information about unlocking the screen about two thirds of the way down. If you do nothing, the device goes back to sleep after a few seconds, playing a chime and announcing, “Screen off.”
2. To unlock the screen, place a finger about two thirds of the way down the screen and slide it to the right. In other words, divide the screen into three equal parts (top, middle, and bottom). Your finger should be roughly between the middle and bottom sections and halfway between the left and right edges. If you’re in the right place, TalkBack says, “Slide to unlock.” Depending on your device, TalkBack may say something like, “Slide right to unlock. Slide left for camera.” If TalkBack says something very different, move your finger around the screen until you hear the unlock message. On most devices, you can slide your finger right, up, or left, but it’s a good idea to get into the habit of sliding right because this is the gesture used to answer incoming calls. If you are successful, you hear a click, like an opening car door, and you can hear the contents of your home screen if you slide your finger around.

If Step 2 doesn’t work, try the following:
• Slide right with two fingers. On some devices, a two-finger slide is required.
• Lift your finger, press the power button twice (to put the device to sleep and wake it up again), and start over, being careful not to move your finger around too much. Sometimes as you’re sliding your finger around the screen, TalkBack thinks you want to go into one of the context menus. If this happens, you hear a whirr, and TalkBack tells you to explore in a circle. The quickest way to correct TalkBack is to press the power button twice, then touch the screen again, either exploring in very small increments or lifting your finger and putting it down again.

Setup

This post assumes two things:
• Your device is running pure Android, and
• You’re on the setup screen (i.e., no one else has started the setup process).
These setup instructions apply to most Android phones and tablets. One well-known exception is devices running Samsung TouchWiz. Regardless, it’s a good idea to have a sighted person nearby in case bad things happen and speech dies.

Ingredients

To set up your Android device, you need the following:

• Android phone or tablet (for obvious reasons)
• WIFI network and WIFI password (for signing into Google)
• Headset or speakers with 3.5 mm wired connector (for listening to password fields) – All Android devices include a set of earbuds. Get them out and have them ready to use.
• Gmail address and password (for connecting your device to Google services) – You can create a Gmail account during the setup process, but things go more smoothly if you do this ahead of time by visiting
https://accounts.google.com/SignUp
You may need to solve a captcha as part of the Gmail account creation process.
• Sighted friend or family member (for cases when the shirt hits the fan)

Preparation

These instructions are general because the process varies slightly. They assume your device comes with Android 4.1 or higher.

1. Plug your device into a power source. Insert the small end of the micro USB cable into the host port, usually on the bottom panel of the device. Plug the large end of the USB cable into the power adaptor. Plug the adaptor into an electrical outlet.
2. Press the Power button for about 5 seconds and wait another 5 seconds or so for the device to turn on.
3. Place 2 fingers in the general center of the screen, holding them slightly apart (with about half an inch or 1 cm of space between them). Don’t move them, even a little. Just keep them there until the phone speaks.
4. Follow all directions, including the first one, which tells you to keep your fingers on the screen until accessibility is enabled.
5. Go through the accessibility tutorial, which guides you through the various gestures associated with TalkBack and Explore by Touch. Android gestures involve a touch that is faster and lighter than on other touch-screen devices, so understand that you’re practicing the touch as well as learning the gestures.
6. When prompted to do so, select your WIFI network by double-tapping it, and enter your WIFI password. The screen includes a Show Password checkbox. Double-tap to check it, and enter your password.
7. When prompted to do so, enter your Gmail address and password. Start with your user name. Then touch the Next button (near the Delete key), or double-tap Back to remove the on-screen keyboard and double-tap the Password field. At this point, exploring the keyboard with your finger produces multiple instances of the word “dot.” Plug your earphones or speaker in. Now the keyboard announces letters and other characters as you explore, but still says “dot” as you lift your finger. This is normal. Depending on your device, you may hear the password itself when you touch the password field, or you may hear multiple instances of the word “dot.” Double-tap the Login button when you’re done.
8. Finish the setup process by following any additional prompts.

Serving Instructions

Before you start enjoying your Android device:
• Go to the Play Store, search for TalkBack, and install it. This not only ensures you’re running the latest version of TalkBack, but also tells the Play Store to check for TalkBack updates in the future.
• Go through the TalkBack/Explore by Touch tutorial again. This introduces you to any changes to TalkBack. Find it at Settings>Accessibility>TalkBack>Settings. Swipe up with 2 fingers multiple times to scroll to the Explore by Touch tutorial option, and double-tap it.

Notes:

A. If you want to know more about using the on-screen keyboard, refer to the post on typing.
B. If you have another Android device and a sighted person helps you with setup, the device starts talking as soon as you’re logged in.

Terms to Know

This blog explains how to use an Android device with TalkBack enabled. It includes information for using the device in general and for using specific apps. For the most part, the same instructions apply to phones and tablets.

What follows is a list of terms that appear on this blog. You may want to refer to it as you read the individual posts:

Accessibility service = a complex program that is always running and provides additional information to people with disabilities. Examples of accessibility services are TalkBack (which reads the screen), BrailleBack (which enables braille input and output), and Just Speak (which provides hands-free use).
Android buttons = Home, Back, and Recent Apps, three buttons that appear at the bottom of all Android screens. Some devices also have a Menu/Option button, but it’s not officially one of the Android buttons. Visually, these may disappear until the bottom of the screen is touched, but they are always available.
App = application, a program that does something specific, like list current weather and include a five-day forecast.
App drawer = the complete list of all the applications/programs on the device.
Device = phone or tablet.
Explore = a method for reviewing the contents of the screen by moving a finger over it systematically, actually touching controls and other information.
Favorites bar = a line of controls located on the home screen between the work area and the Android buttons. It contains buttons to frequently used apps, like Phone and Browser. Most apps present controls at the level of the favorites bar.
More Options = a button usually located at the right edge of the screen at the level of the search or favorites bar, which provides additional options (e.g., pressing More Options in the calendar app brings up buttons for creating a new event, refreshing the screen, adding other calendars, and so on). Though there is generally only one More Options button (more often at the top right), some screens have two, one at the top and one at the bottom. This button is sometimes also referred to as Options or Menu.
Notification bar = the top line of the screen, which contains status information (e.g., battery level and signal strength) as well as information about new events (e.g., new emails, missed calls, reminders).
Notification shade = a full screen showing each notification on a new line. Notifications often include buttons (e.g., an email notification typically includes buttons to delete or reply to the message, and a missed call notification typically includes buttons for returning the call and checking call details). To access notifications, we pull the notification shade down or we open the notification shade. This screen is often simply called Notifications.
Search bar = a line of controls located on the home screen between the work area and the notification bar. It contains the Search and Voice Search buttons. Most apps present controls at the level of the search bar.
Swipe = a method for reviewing the contents of the screen by making quick horizontal or vertical movements over the screen, without necessarily touching controls and other information.
System settings = the app that controls device configuration (e.g., adjusting the screen timeout, changing TTS settings, setting a lock screen password). This is often simply called Settings and is not to be confused with the settings of specific apps.
Tap = activate a control. When TalkBack is on, this usually means “double-tap.” If Single-Tap Mode is enabled, tapping once is sufficient. For more information about double-tapping, refer to the post on accessibility gestures.
Widget = a tiny program that does one specific thing, usually to help a more complex program. A widget in a music player enables users to play and stop the current track without going into the music app, and a weather widget displays the current temperature at all times.
Work area = the main part of the screen, where most controls and information are presented.

A Brief History of Android Accessibility

In the beginning (okay, 2009), there was Android. It was called Cupcake (1.5). It had no accessibility, and that was not good.

Then came the others:

  • 1.6 – Donut
  • 2.0 & 2.1 – Éclair
  • 2.2 – Frozen Yogurt or Froyo
  • 2.3 – Gingerbread
  • 3.0, 3.1, & 3.2 – Honeycomb
  • 4.0 – Ice Cream Sandwich
  • 4.1, 4.2, & 4.3 – Jelly Bean

All had accessibility, and that was good.

Enough already!

This is a rough outline of changes to Android accessibility. If anyone has corrections or additions to make, feel free to post them to the comments.

The Early Years

Basic accessibility was introduced in Donut (1.6), expanding gradually through Gingerbread (2.3). The first level of accessibility was delivered by screen readers:

  • TalkBack (2009) by Google’s Eyes-Free Project.
  • Spiel (2009) by Nolan Darilek.
  • MobileAccessibility (2011) by Code Factory, which combines a screen reader with a suite of simplified apps.

All three screen readers offered roughly the same level of accessibility. With them, blind and low-vision Androidites could use their D-pads or trackballs to navigate to all focusable items and activate buttons, checkboxes, and other objects. They could also enter and review text. Initially (1.6) text input was possible only with a hardware keyboard, and text review involved moving focus away from and then back to an edit field, listening to the entire block of text, and counting presses of the arrow keys or deleting to a desired point. Over time (by 2.2, as Darilek, Code Factory, and Google developed virtual keyboards), however, on-screen typing and voice input became available. So did aural text review in edit fields and use of a virtual D-pad.

Though devices were quite usable in Donut through Gingerbread, creativity and workarounds were part of the eyes-free Android adventure. They were needed to deal with unlabeled buttons and edit fields, unfocusable objects, inaccessible web views, and the disruptions caused by device manufacturer overlays. Despite general improvements partially brought about through apps that took advantage of accessibility APIs, blind and low-vision users continued to choose devices that sported a hardware keyboard (strongly recommended) and a joystick (required) that could be located by touch (e.g., 4 arrow keys, a D-pad, a trackball, or a trackpad).

The Middle Years

A significant accessibility improvement was released with 3.0 (Honeycomb), the OS version for tablets: Enhanced Web Access, the injection of self-voicing JavaScript that makes web views accessible. In Gingerbread and earlier, developers had to make web views accessible manually, an undertaking that must have seemed above and beyond the call of duty, assuming more than a few of them even knew the accessibility hooks existed. With Enhanced Web Access, many apps became screen-reader friendly without any developer intervention, email readers and web browsers being common and notable exceptions.

More exciting was the introduction of touch exploration in Ice Cream Sandwich (4.0). For the first time, eyes-free Android users could touch the screen and get spoken feedback about what they were touching without inadvertently phoning home or starting up the music player. They could simply slide a finger over the screen, and when they heard the screen reader announce the right button or checkbox, they could lift their finger and tap the object directly. Hardware keyboards and physical navigational controllers were no longer needed, but an accessible virtual D-pad was still required for text review in edit fields.

Ice Cream Sandwich also included system-wide font adjustment for low-vision users. In earlier Android versions, font size could be increased in some apps, not all or even most, and it could only be resized to levels set by developers, who mostly had little experience with people who read really large text.

An additional feature introduced in Ice Cream Sandwich, which sounds minor but isn’t, was the ability to set up the device and turn on accessibility without sighted assistance. End users could now draw a clockwise rectangle on the activation screen to start TalkBack before setup. Though welcomed by most eyes-free users, the gesture caused a stir, as many had trouble getting it to work.

The Present

Each version of Jelly Bean (4.1, 4.2, and 4.3) has included important improvements to eyes-free accessibility.

4.1 had lots of new goodies.

Accessibility focus was easily the biggest and flashiest addition. Eyes-free users could swipe vertically or horizontally to hear the next or previous focusable object, then double-tap anywhere on the screen to activate it, an approach many found easier than locating one button in a crowd.

Other screen-reader gestures became possible as well. Gestures could be used to go back, home, to notifications, and to recent apps. Gestures could also be used for reviewing text in and out of edit fields, reading continuously, and quickly jumping to part of the screen.

Braille support became available to eyes-free users. They could read on-screen content, except in web views and a few other environments such as Play Books, and they could input text using braille displays. Earlier versions of Android did include braille through Code Factory’s MobileAccessibility product, but the option became available to all Android users in Jelly Bean, first through an accessibility service called BrailleBack, developed by Google, and later through the BRLTTY project, developed by Dave Mielke.

Screen magnification, which made a brief appearance in an earlier TalkBack beta, was cooked into the operating system in 4.1. Low-vision users could seriously zoom in on parts of the screen, pan, and enjoy other features of magnification software, whether a screen reader was being used or not.

Finally, a new gesture for turning on accessibility at the setup screen was introduced in response to the hubbub over Google’s first offering. Now eyes-free users had the option of drawing the infamous rectangle or holding two fingers down on the screen after pressing the power button. While there were a few hitches—such as problems produced by inadvertently placing the fingers on the language item or disruptions caused by manufacturer overlays—the new gesture was well received.

4.2 had more goodies.

The ability to restart accessibility/TalkBack was the most important of these. If the Accessibility Shortcut gesture was enabled in Settings>Accessibility, eyes-free users could restart TalkBack by waking up their devices and holding two fingers on the screen. The significance of this feature is huge, especially when sharing a device with a sighted family member, friend, or main squeeze.

4.3 has lots of beautiful tweaks. Since they’re not listed in one tidy place, eyes-free users are comparing notes to get a full list. Two are worth mentioning at this time:

Accessible text selection has been long awaited. Now eyes-free users working in edit fields can select chunks of text for copying, cutting, and pasting. Previously, text selection was an all-or-nothing deal.

Enhanced Web Access has been the other interesting development. Since their introduction in Honeycomb, the self-voicing JavaScript files have been an option in Settings>Accessibility, which end users have needed to activate. In 4.3, the option has disappeared from the Accessibility screen, suggesting that the scripts are now part of the operating system itself.

Who was your first?

What is Android?

Android is a touch-screen-based operating system (OS) for cell phones and tablets. It includes off-the-shelf accessibility for blind and low-vision users via services and features like

  • TalkBack, which speaks the content of the screen.
  • Explore by Touch, which provides an accessibility layer so eyes-free users can run one or more fingers over the screen to interact with their devices.
  • BrailleBack, which enables input and output with a Bluetooth braille display.
  • Large fonts, which adjusts font size system-wide.
  • Screen magnification, which zooms in on parts of the screen for better viewing.

Though these features also benefit end users with other accessibility needs, this blog focuses on eyes-free and low-vision use.

Eyes-free users shopping for a new device should keep in mind that all devices running pure Android are accessible, but accessibility varies on devices with custom UIs, like Samsung TouchWiz, Motorola Motoblur, and HTC Sense. Of these manufacturer overlays, TouchWiz (in its newer versions) is the most accessible, having features some eyes-free users prefer over stock Android. Motoblur is also relatively unintrusive, becoming less of an issue with each version of Android. Sense is now and has always been the most troublesome, requiring that blind and low-vision users find alternative apps for such basics as the dialer and contacts. Other manufacturers tend to be less disruptive with their custom skins, often doing little more than changing the launcher, which can easily be replaced. So when researching devices, it is important to find out whether a manufacturer UI is part of the installation.

Another thing to take into account is that newer is better when it comes to Android versions. Android accessibility, though very good, still has a few areas that are works in progress. Text selection is one example: through 4.2, the only way to select chunks of text while editing was to use a physical keyboard. In 4.3, however, it is possible to select text with the touch screen alone. Hence, for the best accessibility experience, eyes-free users should invest in the newest version of Android (4.3 at this time).

This blog covers Jelly Bean, with occasional references to earlier versions of Android. Most posts provide step-by-step instructions on how to perform tasks on a stock/pure Android device. Users of customized phones and tablets will be able to follow along as differences are relatively small.

For information about the accessibility of devices running Gingerbread (2.3) and earlier, visit the Accessible Android blog on Blogger (http://AccessibleAndroid.blogspot.com), the previous incarnation of this site.