Ispeak (activity)

This bundle, File:IconSpeak.activity-1.xo, is an initial prototype of the OLPC IconSpeak.activity. Caution: it is likely to undergo rapid and radical change. The activity is based on Speak. The goal is to let the user build a message from an array of icons representing words; the message is then spoken using speech synthesis.

The user of this activity clicks on a button in an array. A button may have an image (icon) or a label (text). If the button corresponds to a category, the buttons in the array are replaced by the icons/text from the next level down; in this case the first icon will be an up-arrow, and clicking on it returns the user to the previous level. If the button corresponds to a word, the word is copied to the output message. The word corresponding to a higher level can be selected by pressing the space bar while the left mouse button is down; otherwise, the next lower level will be displayed. Finally, the user can press the 'play' button and the message will be spoken by eSpeak.
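
For illustration, here is a minimal sketch of the final 'play' step, assuming the finished message is handed to the espeak command-line tool. The actual activity, being derived from Speak, may drive eSpeak through a different interface.

# Minimal sketch: speak a message string with the espeak command-line tool.
# Assumes espeak is installed and on the PATH; the real activity may use
# an eSpeak interface inherited from Speak instead.
import subprocess

def play_message(message):
    # -s sets the speaking rate in words per minute (optional)
    subprocess.call(['espeak', '-s', '120', message])

play_message('I want water')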

The activity reads an XML-format file, asl.xml. This file specifies the correspondence between icons and words. For example:

<ispk>
  <category icon = 'someimage.gif'>text
    <word icon = 'word.gif'>word
    </word>
  </category>
</ispk>
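
For illustration, a minimal sketch of reading this file with Python's standard xml.etree.ElementTree module follows. The element and attribute names match the example above, but the activity's own parsing code may differ.

import xml.etree.ElementTree as ET

def load_vocabulary(path='asl.xml'):
    """Return a list of (category label, category icon, [(word, word icon), ...])."""
    categories = []
    for cat in ET.parse(path).getroot().findall('category'):
        words = [((w.text or '').strip(), w.get('icon'))
                 for w in cat.findall('word')]
        categories.append(((cat.text or '').strip(), cat.get('icon'), words))
    return categories

# Example: list every word and the icon file it should display.
for label, icon, words in load_vocabulary():
    for word, word_icon in words:
        print(label, '->', word, '(' + word_icon + ')')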

The activity includes a set of icons. The example set is in .gif format, although .svg or .png formats could also be used. The icons are drawings based on American Sign Language [[1]]. The activity looks for them in the icons folder and matches them to words using the asl.xml file described above.
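
A minimal sketch of resolving an icon name from asl.xml to a file in the icons folder might look like this; the exact path handling in the activity may differ.

import os

def icon_path(icon_name, bundle_path='.'):
    # asl.xml stores bare file names such as 'word.gif'; the activity
    # looks for them inside its icons folder.
    return os.path.join(bundle_path, 'icons', icon_name)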

This version has been replaced by File:Ispeak-1.xo.