Example of Single Switch Scanning to Access A Dynamic Display Voice Output Communication Device

After watching the video, “I Love Assistive Technology,” you may be interested in learning more about “Mr. P’s” communication device and how he is able to communicate using it.

Mr. P communicates using an ECO-14 manufactured by the Prentke Romich Company (PRC). The ECO-14 falls under the category of “high-tech dynamic display voice output communication devices.”

Dynamic display voice output devices come from the manufacturer with preprogrammed vocabulary, so they are ready to be used with little customization. Each manufacturer organizes that vocabulary differently, using “language systems” designed to increase the speed and ease of communication.

These language systems dictate how the vocabulary is organized within the device. For example, a well-known manufacturer, DynaVox, employs a language system called Gateway, which organizes single-meaning icons using a set of the most frequently used core vocabulary words in combination with fringe vocabulary organized in page sets or folders. In contrast, the Prentke Romich Company (PRC) employs a language system called Minspeak, which codes a small number of semantically rich icons, in different configurations, to speak different words.

Both DynaVox and PRC devices come with several page sets, with varying levels of complexity, that organize vocabulary in an effort to suit each user’s individual cognitive and linguistic needs.

Now that you know more about the language systems used by dynamic display voice output devices, I will explain how Mr. P uses his ECO-14 to communicate.

Mr. P’s ECO-14 uses a preprogrammed overlay (or page set) called Unity 45 1-hit. This is the most basic Minspeak overlay, and as its name implies, there are 45 icons on the Unity 45 1-hit core page; no coded icons are used.

Mr. P is not limited to only 45 words, however. Unity 45 1-hit is a dynamic display, which means portions of the page change to provide access to more vocabulary. For example, when the apple icon is selected, the device speaks the word “eat.” At the same time, the top row of icons (i.e., the activity row) changes to show Mr. P’s favorite food choices so he can quickly select what he wants to eat. Additionally, Mr. P has access to category pages, which act like a dictionary for fringe vocabulary words, and conversation-based pages, which provide vocabulary for various topics of conversation.
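To make the idea of a dynamic display more concrete, here is a minimal sketch in Python. The icon names, page layout, and the linked activity rows are invented for illustration; the ECO-14’s actual software is proprietary and works differently in its details.

```python
# Hypothetical sketch of a dynamic-display page. Icon names and the
# activity rows are invented placeholders, not PRC's actual page sets.

CORE_PAGE = ["eat", "drink", "more", "stop", "help"]      # core words stay in place
ACTIVITY_ROWS = {
    "eat": ["pizza", "apple", "yogurt", "juice"],         # shown after "eat" is selected
    "drink": ["water", "milk", "coffee", "tea"],
}

def select_icon(word, activity_row):
    """Speak the selected word and, if one is linked, swap in a new activity row."""
    print(f'Device speaks: "{word}"')
    return ACTIVITY_ROWS.get(word, activity_row)

activity_row = []
activity_row = select_icon("eat", activity_row)    # speaks "eat", shows food choices
print("Activity row now shows:", activity_row)
activity_row = select_icon("pizza", activity_row)  # speaks "pizza"
```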

You now have a better understanding of the type of language Mr. P has access to, so let’s discuss how he builds simple phrases and sentences on his ECO-14 device.

Mr. P does not have voluntary use of his upper extremities, so pointing to a communication device (i.e., direct selection) is not an option for him. Instead, he accesses his communication device through a method of selection called scanning.

He has a single switch positioned at the right side of his head. (If you look carefully in the video you can see the switch’s mounting system on the right side of his headrest.)

There are several different types of scanning patterns available to choose from. Mr. P’s scanning pattern is called 4-quadrant, column/row scanning. This means the communication device is set up to scan through the page in four portions. In the video you can see that, first, the device scans left to right through three equal portions of symbols on the bottom three quarters of the screen, then it moves up to the message bar in the top quarter of the screen, which is considered the fourth quadrant.

Mr. P must wait until the word he wants to say is in the currently highlighted quadrant. At this point, he hits the switch with his head before the selector moves on to the next quadrant. The device then scans through that quadrant using a column/row pattern. This time, Mr. P must wait until the column he wants is highlighted before hitting the switch again to select that column. Then each row within that column is highlighted until the selector lands on the picture he wants, and he hits the switch to FINALLY select a word.
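For readers who want to picture the selection logic, here is a minimal Python sketch of a quadrant → column → row scan driven by a single switch. The vocabulary grid, quadrant names, and the simulated “switch press” are invented stand-ins, and the real device scans on a timer that Mr. P must beat, which this sketch compresses.

```python
# Hypothetical single-switch scanning sketch. The grid contents and the
# simulated switch presses are invented for illustration, not the ECO-14's code.

# Each quadrant holds columns, and each column holds the words in its rows.
QUADRANTS = {
    "message bar": [["(message window)"]],
    "quadrant 1": [["I", "you"], ["want", "like"]],
    "quadrant 2": [["eat", "drink"], ["more", "stop"]],
    "quadrant 3": [["yes", "no"], ["help", "finished"]],
}

def scan(options, is_target):
    """Highlight each option in turn; a 'switch press' selects the current one."""
    for option in options:
        print(f"  highlighting: {option}")
        if is_target(option):          # in real use, this is where the switch is hit
            print(f"  switch pressed -> selected {option}")
            return option
    return None

def select_word(target):
    """Quadrant -> column -> row scanning to reach a single target word."""
    quadrant = scan(list(QUADRANTS), lambda q: target in sum(QUADRANTS[q], []))
    column = scan(QUADRANTS[quadrant], lambda col: target in col)
    word = scan(column, lambda w: w == target)
    print(f'Device speaks: "{word}"')

select_word("eat")   # three switch presses: quadrant, then column, then row
```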

As you can see in the portion of the video where Mr. P demonstrates how he uses his device in “real time,” scanning is a slow process, but it yields BIG results!
Without scanning, Mr. P would not be able to access his ECO-14 voice output communication device and thus, independent communication would not be possible.

Because communication can be slow, the device is equipped with notebooks that store large amounts of text that are spoken all at once by selecting the “speak notebook” icon. Mr. P used the notebook feature on his communication device to deliver his speech on this video.

The speech was developed through an extensive interview process with Mr. P. Following the interview, I wrote a draft of the speech based on the information Mr. P provided and he helped to edit the final version of the speech. I then entered the speech into three different notebooks on his communication device so he could easily access all three parts of the speech that you see in the video.
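As a rough illustration, again in Python and again with invented names and placeholder text rather than PRC’s actual software, a notebook can be thought of as a stored passage that is spoken in its entirety with a single selection:

```python
# Hypothetical notebook sketch: the notebook names and contents are placeholders.
NOTEBOOKS = {
    "speech part 1": "Hello everyone, and thank you for coming today. ...",
    "speech part 2": "Assistive technology has changed my life. ...",
    "speech part 3": "Thank you for listening. ...",
}

def speak_notebook(name):
    """Selecting the 'speak notebook' icon speaks the stored text all at once."""
    print(f'Device speaks: "{NOTEBOOKS[name]}"')

speak_notebook("speech part 1")   # one selection speaks the whole passage
```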

Notebooks are a wonderful tool for AAC users, both for presentations and in everyday life, to store messages to be spoken at a later time.

In summary, there are a variety of high-tech dynamic display voice output communication devices on the market, each with its own method of vocabulary organization. Individuals with physical impairments like Mr. P are able to access these devices through a variety of access methods, including scanning via a single switch. Some devices also contain additional features to improve speed of communication, such as the notebook feature, which enables Mr. P to ‘speak’ an entire notebook’s contents with the activation of only one icon or symbol.

To learn more about partner assisted visual scanning, read my article, "What is Partner Assisted Visual Scanning?"


References:

  1. Bruno, J. (n.d.). Who can use Gateway To Language and Learning? Retrieved July 17, 2009, from http://www.gatewaytolanguageandlearning.com/index.html
  2. Semantic Compaction Systems. (2009). What is Minspeak? Retrieved July 17, 2009, from http://www.minspeak.com/what.php

