Controlling the Glide/Pressure sensor
Mike C. Fletcher
mcfletch at vrplumber.com
Thu Jan 10 10:46:40 EST 2008
I have a student who's interested in doing a term project on the UI for
the XO's glide/pressure sensor. I've put together this quick summary. The
deadline for starting the project is looming, so if people have "don't do
that" or "we've already done that" feedback, please speak up ASAP.
Background:
* The XO has two different devices: a resistive glide sensor and a
pressure-sensitive tablet
o Both of these currently show up as "core pointer" events in X,
AFAIK (see the device-listing sketch after this list)
o Switching between pressure and glide-sensor activity has the
potential to cause "jumps" of the cursor (absolute versus
relative mode)
* There is currently no UI to map the pressure-sensitive tablet's
area onto a particular area of the screen (nor, AFAIK, an API to
accomplish the mapping)
o Use case: use the entire drawing area to draw into a small
area of a drawing in Paint
* Activities currently have no control over the mapping of the area
o Use case: in a penmanship course, collect samples of the
child's letters in special widget areas within a "test";
focusing a new area should remap the pen to that area
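
As a quick orientation for the absolute-versus-relative issue above, here
is a minimal PyGTK sketch (my assumption is that both devices simply
appear in GDK's device list; I haven't verified this on real hardware)
that prints every input device GDK knows about, with its source and mode:

    import gtk

    # Enumerate the input devices GDK knows about.  By default both the
    # glide sensor and the pressure tablet reportedly feed the core
    # pointer, so print each device's source and input mode to see what
    # X actually exposes.
    for device in gtk.gdk.devices_list():
        print('%-30s source=%s mode=%s axes=%d' % (
            device.name,
            device.source,   # e.g. SOURCE_MOUSE, SOURCE_PEN
            device.mode,     # MODE_DISABLED / MODE_SCREEN / MODE_WINDOW
            device.num_axes,
        ))

Note that the mode GDK reports here is its extension-event mode, not the
X driver's absolute/relative setting, so the driver-level ground truth
will still need the X11 poking mentioned below.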
Trackpad UI Design Requirements:
* API for configuring the resistive/pressure sensor, allowing control
of all the major features (a hypothetical sketch of what this might
look like follows this list)
o Note that there will likely be some X11 hacking here to get
the pointer mapping working
* Standard UI controls for redirecting input areas
o Standard GTK UI for positioning and scaling
o Standard GTK widget for e.g. handwritten text entry, providing
access to the input as a bitmap (or, optionally, as a series of
strokes); see the capture-widget sketch after this list
+ Allow capturing either all data (full resolution) or just
scaled data, as a configuration option
o Intuitive (HIG-compliant) standard mechanisms for
controlling the various configuration parameters
o A six-year-old should be able to figure out how to direct
their drawings, written text, and the like into the desired areas
o Standard on-screen feedback showing where the tablet area is
bounded on the screen while drawing with the tablet
* System-level trigger (possibly on Sugar's Frame) to bring up the
control mechanisms (optional)
o Most pen-aware applications will likely use internal logic
and the API to determine position and the like, but a
general trigger for the functionality should be useful for
non-pen-aware activities
* Paint controls
o Work with Paint's authors to provide controls that make
using the pen/tablet intuitive within the context of Paint
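
To make the API requirement above concrete, here is a purely hypothetical
sketch of what the mapping calls might look like from an activity's point
of view; none of these names exist today, and the real thing would sit on
top of the X11 work noted above:

    class TabletMapping(object):
        """Hypothetical controller for the pressure tablet's active area."""

        def map_to_screen_rect(self, x, y, width, height):
            """Map the whole tablet surface onto the given screen rectangle
            (e.g. a small region of a Paint canvas)."""
            raise NotImplementedError("needs the X11-level pointer mapping")

        def map_to_widget(self, widget):
            """Convenience: map the tablet onto a widget's on-screen area,
            e.g. a penmanship sample box when it receives focus.  Assumes
            the widget has its own GDK window."""
            x, y = widget.window.get_origin()
            allocation = widget.get_allocation()
            self.map_to_screen_rect(x, y, allocation.width, allocation.height)

        def reset(self):
            """Restore the default mapping of the tablet to the full screen."""
            raise NotImplementedError("needs the X11-level pointer mapping")

In the penmanship use case an activity might then just call
mapping.map_to_widget(sample_box) from the widget's focus-in handler and
mapping.reset() on focus-out.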
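
And here is a rough sketch of the handwriting-capture widget (again a
placeholder, not an existing Sugar or GTK widget): a gtk.DrawingArea that
requests extension events so it sees per-device pressure, and records the
input as full-resolution strokes for the activity to consume:

    import gtk

    class StrokeCapture(gtk.DrawingArea):
        """Sketch of a capture widget: records pen input as strokes."""

        def __init__(self):
            gtk.DrawingArea.__init__(self)
            self.strokes = []     # each stroke is a list of (x, y, pressure)
            self._current = None
            self.add_events(gtk.gdk.BUTTON_PRESS_MASK |
                            gtk.gdk.BUTTON_RELEASE_MASK |
                            gtk.gdk.POINTER_MOTION_MASK)
            # Ask GDK to deliver extended device events (pressure etc.)
            self.set_extension_events(gtk.gdk.EXTENSION_EVENTS_CURSOR)
            self.connect('button-press-event', self._begin_stroke)
            self.connect('motion-notify-event', self._extend_stroke)
            self.connect('button-release-event', self._end_stroke)

        def _pressure(self, event):
            # The core pointer has no pressure axis; treat it as full pressure.
            value = event.get_axis(gtk.gdk.AXIS_PRESSURE)
            if value is None:
                return 1.0
            return value

        def _begin_stroke(self, widget, event):
            self._current = [(event.x, event.y, self._pressure(event))]
            return True

        def _extend_stroke(self, widget, event):
            if self._current is not None:
                self._current.append((event.x, event.y, self._pressure(event)))
            return True

        def _end_stroke(self, widget, event):
            if self._current is not None:
                self.strokes.append(self._current)
                self._current = None
            return True

        def get_strokes(self):
            """Full-resolution data: the strokes exactly as captured.
            Scaled or bitmap access would be layered on top of this."""
            return self.strokes

Rendering the captured strokes back to the screen for feedback (the
feedback item above) and producing a scaled bitmap would both build on
the same stroke list.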
Obviously we would need to find a machine to work on to make the project
feasible. I'll see whether we can repurpose a local machine for the task.
Discussions welcome,
Mike
--
________________________________________________
Mike C. Fletcher
Designer, VR Plumber, Coder
http://www.vrplumber.com
http://blog.vrplumber.com