Talk:Accessibility

== Keyboard -Dave ==

Hi. I have an XO B2 machine here for trial. I also have a physical disability and use a mouth stick. I'm finding huge hurdles to get anywhere without sticky keys. Can anyone tell me a) whether the sticky keys functions are installed and, if so, b) how I can turn them on?

If they are not included: a) how can I get and install/run them so I can use the machine, and b) how can we make sure these functions are included and easily accessible in the release?

Secondly, the wiki says that B2 machines should have a working resistive touch pad (the two outer pads). These don't respond to the stylus on my mouth stick. As I cannot use my hands, this is also proving an access problem.

Is this likely a software update issue? I'm trying to find my way around the system, despite the access hurdles, to find out what version of the system it has and to get some help if it needs updating.

Some dialog with someone working on accessibility issues would be good.

Dave

: The good news: The XO's X Window system has stickykeys built in (among other accessibility features), and I have gotten it working in emulation. The bad news: I have only managed to do this via a command-line executable. I don't think the OLPC team has gotten to the point where this is user-configurable—I understand they are still working out a configuration system—but it wouldn't be too hard. In the meantime, here is what I did:

:# Download the accessx source code [http://www.cita.uiuc.edu/software/accessx/freewareaccessx.php here].
:# Build it on a Linux machine with a compiler (I used my Ubuntu 7.10 box).
:# Copy the "ax" executable to the XO (I copied it to a website, then downloaded it via wget).
:# Open the [[Developer Console]] on the XO.
:# chmod the "ax" program to make it executable, then run "./ax +stickykeys -stickytwokeydisable".

: It shouldn't be too hard to port the executable's functionality to a Python activity, and perhaps I'll give this a try if someone thinks this would be useful before the OLPC team implements their own solution.

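: (Along the lines of the Python-activity idea above, here is a minimal sketch of what such a wrapper might look like. It simply shells out to the "ax" binary built in the steps above; the path /home/olpc/ax is a placeholder for wherever you actually copied it, and the flags are the ones quoted above. This is illustrative, not a tested Sugar activity.)

 #!/usr/bin/env python
 # Minimal sketch: enable X StickyKeys on the XO by invoking the accessx
 # "ax" binary built and copied over in the steps above.
 import os
 import stat
 import subprocess
 
 AX_PATH = "/home/olpc/ax"  # placeholder; use wherever you put the binary
 
 def enable_sticky_keys():
     # Equivalent of "chmod +x ax": make sure the binary is executable.
     os.chmod(AX_PATH, os.stat(AX_PATH).st_mode | stat.S_IXUSR)
     # Flags quoted earlier in this thread: turn StickyKeys on and disable
     # the "press two keys at once to turn it off" behaviour.
     return subprocess.call([AX_PATH, "+stickykeys", "-stickytwokeydisable"])
 
 if __name__ == "__main__":
     raise SystemExit(enable_sticky_keys())
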
: Regarding the mousing surface: When I was talking to an XO demoer several months ago, he said that the central touch pad is the only part of the pad surface that can be used to move the mouse. The left and right pads are intended to act like a graphics tablet for drawing activities, but there aren't any activities that use them yet. I'm not sure why the whole surface cannot be used to move the mouse. —[[User:Leejc|Joe]] 00:11, 29 October 2007 (EDT)

: A further note: As part of its accessibility features, X also makes available something called MouseKeys, which lets the user manipulate the mouse pointer with the numeric keypad. It's not clear how well this would work on the XO, however, as I don't think any of the keys on the [[OLPC Keyboard layouts|XO keyboard]] are configured to send numeric keypad events. (Although, IIRC from a mailing list post a while ago, the game controller/buttons may be mapped to these keys...) —[[User:Leejc|Joe]] 13:13, 30 October 2007 (EDT)

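: (One quick way to check the keypad question from the Developer Console: dump the current keymap with xmodmap and look for keypad keysyms. The sketch below only reports whether any KP_* keysyms are mapped, which is the precondition MouseKeys needs; it does not set up MouseKeys itself.)

 #!/usr/bin/env python
 # Minimal sketch: report which numeric-keypad keysyms (names beginning with
 # "KP_", e.g. KP_1, KP_Add) the running X keymap exposes.
 import subprocess
 
 def keypad_keysyms():
     # "xmodmap -pk" prints the keycode-to-keysym table.
     output = subprocess.check_output(["xmodmap", "-pk"]).decode("utf-8", "replace")
     names = (token.strip("()") for token in output.split())
     return sorted(set(name for name in names if name.startswith("KP_")))
 
 if __name__ == "__main__":
     syms = keypad_keysyms()
     if syms:
         print("Keypad keysyms present: " + ", ".join(syms))
     else:
         print("No KP_* keysyms mapped; MouseKeys would have nothing to drive.")
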
Hi Dave,

Did anyone answer your questions? If so, what did they tell you?

Tanya

September 25, 2007, 8 PM

== Augmentative and Alternative Communication ==

I am a speech pathologist in the U.S. and I'm very interested in the XO as a communication device for children with severe speech disabilities. I'm glad that there is already work on text-to-speech modules. Some software to enable graphical representation of language (since many children without speech are also unable to read) is the next step. This program might be a starting place: [http://www.pvoice.org/]. The CVS repository has source that the developer claims will now run (sort of) on Linux. I'm not sure where to start, though. I've got some interest at my university already.

Physical access is going to be an issue. I hope the XO2 has a touch-screen.

[[User:Gobo Fraggle|Gobo Fraggle]] 12:23, 28 October 2007 (EDT)

: When the author of pvoice says it might run on Linux, I assume he means that the user interface will run. I haven't looked at the application (and accessibility is hardly my area of expertise), but a cursory glance at the web page reveals that the program uses Perl and wxWindows for its interface—both of which are highly portable (e.g. to Mac and Linux), but neither of which ships with the XO laptop, so some amount of porting would be necessary. But the big problem is that pvoice uses a text-to-speech engine that is built into Windows: the hardest part of porting the application would be coming up with an equivalent, free, multicultural text-to-speech engine for the XO... Or, I suppose, one could just make recordings of lots of words. In any case, a fair amount of redesign would be necessary. But you are right that pvoice provides a great starting point, as it provides word categories, pictures, and a UI prototype. —[[User:Leejc|Joe]] 13:33, 30 October 2007 (EDT)

: Thanks for taking a look at that, Joe. [[User:Gobo Fraggle|Gobo Fraggle]]

: Regarding the free multicultural text-to-speech engine ... the XO currently ships with espeak, which has many language accents. See [[Speech_synthesis]] for details. [[User:Jminor|jminor]] 00:25, 13 February 2008 (EST)

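: (For anyone exploring the AAC idea above, here is a minimal sketch of the glue involved: map a symbol label to a phrase and speak it with the espeak engine mentioned just above. The phrase board and the voice names are made-up examples; a real activity would also need the picture grid, switch/scanning access, and so on.)

 #!/usr/bin/env python
 # Minimal sketch: speak the phrase attached to a symbol, using the espeak
 # command-line engine that ships on the XO.
 import subprocess
 
 # Placeholder symbol board: label -> phrase (pictures in a real UI).
 BOARD = {
     "drink": "I would like a drink, please.",
     "help": "I need help.",
     "yes": "Yes.",
     "no": "No.",
 }
 
 def speak(text, voice="en"):
     # "-v" selects one of espeak's language voices (e.g. "es" for Spanish).
     return subprocess.call(["espeak", "-v", voice, text])
 
 if __name__ == "__main__":
     speak(BOARD["help"])
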
We need to be careful when pre-installing word categories and pictures, too. These tend to be culturally specific and if they are too far removed from the everyday experiences and cognitive conceptualizations of the kids in the local cultures, we might as well be using something completely abstract (like the written word) since guessability would be broken. See the research by Erna Alant [http://www.google.com/scholar?hl=en&lr=&q=erna+alant&btnG=Search] for more discussion on that and on the use of high-tech AAC in Africa.

The easiest speech solution is to use spoken recordings for everything, but I think we'll fill the memory pretty quickly that way. We might be able to get some charity or grant funding to distribute SD cards for these particular users.

[[User:Gobo Fraggle|Gobo Fraggle]] 23:53, 25 November 2007 (EST)

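To put rough numbers on the memory concern, here is a back-of-the-envelope sketch. Every figure in it is an assumption chosen for illustration (roughly 2-second clips at 16 kHz, 16-bit mono, a 5,000-entry vocabulary, about 1 GB of flash on the XO, and a 10:1 speech codec such as Speex or Vorbis), not a measurement.

 #!/usr/bin/env python
 # Back-of-the-envelope: storage needed for prerecorded words.
 SAMPLE_RATE = 16000      # samples per second (16 kHz mono), assumed
 BYTES_PER_SAMPLE = 2     # 16-bit audio
 SECONDS_PER_CLIP = 2.0   # rough length of one recorded word/phrase, assumed
 VOCABULARY = 5000        # number of recordings, assumed
 FLASH_BYTES = 1024 ** 3  # ~1 GB of flash assumed for the XO
 COMPRESSION = 10.0       # assumed ratio for a speech codec (Speex/Vorbis)
 
 raw = SAMPLE_RATE * BYTES_PER_SAMPLE * SECONDS_PER_CLIP * VOCABULARY
 compressed = raw / COMPRESSION
 print("Uncompressed: %.0f MB (%.0f%% of flash)" % (raw / 1e6, 100.0 * raw / FLASH_BYTES))
 print("Compressed:   %.0f MB (%.0f%% of flash)" % (compressed / 1e6, 100.0 * compressed / FLASH_BYTES))
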
: I've been thinking about this for a while, and at least for the higher level of alternative communication, the [[Speak]] activity looks like it would be something we should look into developing. Better yet if we can add some sort of menu with common phrases and maybe some small prediction engine. It seems like it's definitely better than the spoken recordings solution. --[[User:Neskaya|Neskaya]] 23:59, 12 February 2008 (EST)

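: (The prediction part could start very simply. A minimal sketch, not tied to Speak's actual code: rank a small phrase list by how often each phrase has been used and by whether it matches what has been typed so far. The phrase list and counts here are placeholders.)

 #!/usr/bin/env python
 # Minimal sketch: suggest common phrases from a typed prefix, preferring
 # phrases that have been spoken most often.
 from collections import Counter
 
 PHRASES = [
     "I need help.",
     "I would like a drink.",
     "I am hungry.",
     "I want to play.",
 ]  # placeholder phrase list
 
 usage = Counter()  # times each phrase has actually been spoken
 
 def suggest(prefix, limit=3):
     matches = [p for p in PHRASES if p.lower().startswith(prefix.lower())]
     # Most-used phrases first, then alphabetical for a stable order.
     return sorted(matches, key=lambda p: (-usage[p], p))[:limit]
 
 def spoken(phrase):
     usage[phrase] += 1  # record a phrase whenever it is spoken aloud
 
 if __name__ == "__main__":
     spoken("I need help.")
     print(suggest("i "))  # ['I need help.', 'I am hungry.', 'I want to play.']
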
Hi,

My name is Becky and I'm in the same boat as Dave. I posted my problem at another site to solicit any and all help:

http://olpcnews.com/forum/index.php?topic=2063.0

Please see and comment on the OLPCIconSpeak project:

[[Speech_synthesis#OLPCIconSpeak.2C_under_development]]

Thanks!