[laptop-accessibility] Screen reader software -- any progress?

Tim hobbs tim.thelion at gmail.com
Sat Feb 9 21:09:01 EST 2008


I am strongly opposed to having Activities communicate with a vanilla
Speech Dispatcher, for the following reasons:
1.  Visually impaired and cognitively impaired persons *must* have a
standardised keyboard navigation system which is both intuitive (with a
low learning curve) and well integrated into the speech server.  Speech
Dispatcher does not mandate such a thing.
2.  There is no geometric or locational information sent with the
text.  I, for one, have very low visual acuity, which means that my
eyes don't stay in one place and have difficulty tracking from line to
line.  I very successfully use a setup where the current line is
positive polarity and the background text is negative
( http://www.timthelion.com/emacs-current-line-polarity.png ); I work
even better if the current line of text is magnified.  A plug-in to
Speech Dispatcher which, instead of speaking or outputting to braille,
magnified text at the appropriate location on the screen would be
impossible unless the Activity tracked that location.
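To make the second point concrete, here is a minimal sketch of what a
magnifier plug-in would need from each event.  Everything here is
hypothetical (TextEvent and magnify_region are illustrative names, not
part of Speech Dispatcher or Sugar); the point is only that the
enlarged region cannot be computed unless the event carries a bounding
box alongside the text:

```python
from dataclasses import dataclass

@dataclass
class TextEvent:
    # Hypothetical event carrying the spoken text *and* its location.
    text: str
    x: int  # left edge, pixels
    y: int  # top edge, pixels
    w: int  # width, pixels
    h: int  # height, pixels

def magnify_region(event, factor=2):
    """Return the enlarged box a magnifier plug-in would render.

    The box is scaled about its centre, so the magnified line stays
    anchored where the reader's eyes already are.
    """
    cx, cy = event.x + event.w / 2, event.y + event.h / 2
    w, h = event.w * factor, event.h * factor
    return (int(cx - w / 2), int(cy - h / 2), int(w), int(h))

# The current line at (100, 200), 300x20 px, magnified 2x.
event = TextEvent("Hello from the current line", 100, 200, 300, 20)
print(magnify_region(event))  # -> (-50, 190, 600, 40)
```

Without x, y, w, h in the protocol, magnify_region simply has nothing
to work with; text alone is enough for speech or braille, but not for
a visual output module.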

I have written a proposal for a different protocol which solves these
problems: http://wiki.laptop.org/go/Accessibility_Line_Based_Interface
I urge you to look at it and consider what it provides that Speech
Dispatcher does not, namely:

1. A very simple, intuitive, "navigational" (rather than "magic
hotkey") keyboard accessibility setup.

2. Geometric information about text.

Otherwise, I believe the two standards/protocols to be functionally
identical, so there should be no real need to change course.  Maybe my
ideas can be implemented as easily as having SayText take (text, x, y,
w, h) and implementing line-based keyboard accessibility which triggers
such events in Sugar Activities.
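The extension really could be that small.  A sketch of the idea, with
all names hypothetical (say_text and register_output are illustrative,
not real Speech Dispatcher API): the Activity reports where its text
sits on screen, and each output module decides what to do with the
geometry.

```python
# Registered output modules: speech, braille, or a screen magnifier.
outputs = []

def register_output(callback):
    """Register an output module; callback receives (text, x, y, w, h)."""
    outputs.append(callback)

def say_text(text, x, y, w, h):
    """Hypothetical extended SayText carrying the text's bounding box."""
    for callback in outputs:
        callback(text, x, y, w, h)

# A speech module ignores the geometry; a magnifier depends on it.
spoken = []
magnified = []
register_output(lambda text, x, y, w, h: spoken.append(text))
register_output(lambda text, x, y, w, h: magnified.append((x, y, w, h)))

say_text("File  Edit  View", 0, 0, 640, 24)
print(spoken)     # -> ['File  Edit  View']
print(magnified)  # -> [(0, 0, 640, 24)]
```

Speech-only clients lose nothing by the extra arguments, while visual
output modules gain exactly the information they are missing today.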

Thank you,
Timothy


On 2/7/08, Duane King <dking at pimpsoft.com> wrote:
> Brad,
>  Another advantage to using Speech Dispatcher and its API is that a
> lot of work can be leveraged from it; there is an entire community
> gathered around it already, and it is already used in production
> systems by blind computer professionals like myself, which is why I
> suggested it to Hemant off the list.
>
> I would personally love it if more people joined our little
> cooperative effort, and by all means your group - or any others - is
> more than welcome to do so.
>
> - Duane
>
> On Saturday 09 February 2008 09:20:51 am Hemant Goyal wrote:
> > Brad,
> >
> > Maybe we should coordinate our efforts.  We are presently working
> > to bring speech synthesis capabilities to the XO.
> >
> > We have made significant progress and are documenting our results
> > here: http://wiki.laptop.org/go/Screen_Reader  [It is slightly
> > outdated with respect to the D-Bus speech server, as we are
> > planning to drop that approach and instead use speech-dispatcher.]
> >
> > At present Assim Deodia is working to improve the eSpeak phoneme
> > data for better voice quality/output on the XO, and I am working
> > on integrating speech-dispatcher into the XO as a means of
> > providing a simple-to-use speech synthesis API.
> >
> > I have opened a ticket which might interest you:
> > http://dev.laptop.org/ticket/6284
> >
> > Best,
> > Hemant
> >
> > > This posting is very new and, at the moment, consists only of a
> > > block diagram of the approach I am proposing.  I have several
> > > pages of narrative in the works at the moment and I am hoping to
> > > post the first version of it to the Free Speech wiki article
> > > sometime this weekend.  In the meantime, please check out the
> > > links in the "See also" section of the article.  Those linked-to
> > > articles contain links to other speech-related efforts currently
> > > underway for the XO (for example, eSpeak).
> > >
> > > Cheers,
> > >
> > > Brad
>
>
>
> _______________________________________________
> accessibility mailing list
> accessibility at lists.laptop.org
> http://lists.laptop.org/listinfo/accessibility
>


-- 
Tim
tim.thelion at gmail.com

