[sugar] Sugar <==> Etoys bridge.
Simon Forman
forman.simon at gmail.com
Sat Mar 17 05:44:06 EDT 2007
Sugar <==> Etoys bridge.
Or a Possible Way to Make an Etoys-like Environment in Python, Soon.
Before I get to the meat of this post I want to mention up front some
salient points about "xerblin". It's a very simple, usable metaphor,
with very simple linguistic *and* AST representations. It's easy to
teach to kids, baby-boomers, and non-computer-literate adults. And it
can represent assembler, C, python, smalltalk, etc. in the same system.
It could also switch to Develop to edit its command Words (which are
written in python).
(A bit more about that last point: Using Develop to edit the basic
primal command Words would be a much easier-to-digest python
environment than an Activity or regular script.
There may be a nice progression there. Using a scholastic metaphor,
xerblin itself is kindergarten, editing xerblin Words in Develop is
First Grade, editing Activities in Develop is Second Grade, and from
there you can see where it might go.)
Ok, so, first I'm going to briefly sketch a path from the Xerblin UI to
python class statements. (This may not make a ton of sense unless you
can fill in some of the relevant context yourself, sorry.) Then I'm
going to talk about the minimal system you need to have something that
could be called Squeak-like, as I understand it from reading the
documents Bert mentioned last week, and how to get there from "here".
You start with the basic Xerblin UI: you have a Stack (Clipboard) and
there are the "Words", which correspond to tools like grep, bc, man,
etc., but they use the clipboard/stack as a communications bus rather
than pipes and the like.
The Words are stored in a Dictionary, keyed to their names, and
there's a simple Interpreter that accepts strings and executes them.
Here's pseudo-code for a xerblin interpreter:
    while True:
        # read a line, split it into words, look each one up by name,
        # and run it against the shared Stack
        for word in raw_input().split():
            Dictionary[word].execute(Stack)
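
(To make that concrete, here's a minimal sketch of what that loop
assumes. The Word class, the "add" word, and the list-as-Stack are
just my illustrations; the real xerblin classes may look different.)

    class Word:
        def __init__(self, name, function):
            self.name = name
            self.function = function  # any callable that acts on the stack

        def execute(self, stack):
            self.function(stack)

    def add(stack):
        # pop two numbers off the stack and push their sum
        b = stack.pop()
        a = stack.pop()
        stack.append(a + b)

    Stack = []
    Dictionary = {'add': Word('add', add)}

    Stack.extend([2, 3])
    Dictionary['add'].execute(Stack)
    print Stack  # [5]
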
The Words are structured into more complex commands by means of four
"ComboWords": Sequential, Loop, Branch, Parallel. When Words get
composed into more complex commands the resulting data-structure can
be drawn as a tree, similar to the ones on the screenshot I mentioned
in a previous post.
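
(Here's a rough sketch of how two of those ComboWords might be built;
the class names and details are mine, not the actual xerblin code.)

    class Sequential:
        "Run each child word in order, left to right."
        def __init__(self, *children):
            self.children = children
        def execute(self, stack):
            for child in self.children:
                child.execute(stack)

    class Branch:
        "Pop a flag off the stack and run one of two child words."
        def __init__(self, if_true, if_false):
            self.if_true = if_true
            self.if_false = if_false
        def execute(self, stack):
            if stack.pop():
                self.if_true.execute(stack)
            else:
                self.if_false.execute(stack)

Since every node exposes the same execute(stack) interface, a composed
command is just a tree of Words and ComboWords, and that tree is what
gets drawn.
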
To draw that tree I used a scaled-down duplicate of [some of] the
Oberon system. Briefly, each widget controls dispatch of messages to
its contained child widgets. There's a single handler function (in
each widget) that accepts messages, runs them through an if..else
dispatch tree, and acts on them and/or passes them on to its children.
But we can use a xerblin instance as the dispatch "engine" for each
widget; the Words in the Dictionary act as a widget's
methods/messages. Given a rich enough set of Words, including ones that
act on the Canvas (draw, etc.), we have, if you squint, a primitive
Squeak-like environment (more below).
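
(Again, only a sketch under my own naming, but the shape of the idea is
something like this: the widget's handler just looks messages up in its
Dictionary instead of walking a hard-coded if..else chain.)

    class Widget:
        def __init__(self, dictionary):
            self.dictionary = dictionary  # message name -> Word
            self.children = []
            self.stack = []

        def handle(self, message, *args):
            word = self.dictionary.get(message)
            if word is not None:
                # push the arguments and the widget itself for the Word to act on
                self.stack.extend(args)
                self.stack.append(self)
                word.execute(self.stack)
            # pass the message on to child widgets, Oberon-style
            for child in self.children:
                child.handle(message, *args)
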
To make widget factories it would help to have a specification
language like the MakeWords little language but more expressive...
maybe we can use python itself.
Xerblin instances have Dictionaries inside them, a lot like the
__dict__ attributes in python objects. Once you have an object with a
dict in it you can have a translator "read" it and emit a python class
statement, or read a class statement to generate an object (or object
factory).
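
(A toy version of that translator, going in one direction only: it
reads a dict of word names and bodies and emits the text of a python
class statement. Real xerblin Words hold trees rather than source
strings, so take this as an illustration of the mapping, not the
mechanism.)

    def emit_class(name, words):
        lines = ['class %s:' % name]
        for word_name, body in sorted(words.items()):
            lines.append('    def %s(self):' % word_name)
            for line in body.splitlines():
                lines.append('        ' + line)
        return '\n'.join(lines)

    print emit_class('Counter', {
        'increment': 'self.n = getattr(self, "n", 0) + 1',
        'reset': 'self.n = 0',
    })
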
If you run the emitted python specs directly on python, rather than
converting them into xerblin trees, the widget with its xerblin
interpreter becomes a regular python instance with a __dict__, and the
Stack becomes each method's arguments, locals, and return values.
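
(For example, the same "swap" behaviour written both ways; the names
are just for illustration:)

    def swap_word(stack):
        # xerblin style: the values live on the shared Stack
        stack[-2], stack[-1] = stack[-1], stack[-2]

    def swap_method(a, b):
        # plain python style: arguments in, return value out
        return b, a
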
If we're careful and the good kind of clever, we can have xerblin
repr's of live python objects that can perform a variety of
interesting automated tasks on those objects. This might be useful for
the "View Source" button somehow.
On to Squeak-like behaviour, by way of answering jm3's question about
media authoring with xerblin...
Authoring in Xerblin. Simple answer: clone Squeak. A Smalltalk to
Xerblin translator? Much more interesting and immediately useful than
a language scanner/parser/translator would be a widget that could take
a live squeak object (and its minimal environment) and transmute it
into something made out of xerblin words and Interpreter instance(s).
Probably a cooperative task between Squeak and xerblin, or better yet,
make something in smalltalk to emit "MakeWords" scripts.
An important thing to grasp is that squeak et al. scoop up ALL
"media" expression (output forms: visual, auditory, ...) into ONE
representation, eliminating distinct "applications" and making
"authoring" ONE activity of mapping from datastructures-and-code to
visual representations.
(I was worried about the "best" ways of doing this and how to
determine them when it dawned on me: 10**8 children... a rich, fun
region to explore and enjoy... If we can just give them a useful,
minimal set of "legos" they'll take care of finding the good mappings
on their own. Start with the simplest building blocks. Currently
available are xerblin and the Cairo API.)
The data-structures are self-referentially defined and built out of
self-describing object/behaviour bundles in a way analogous to the way
your body is built out of self-describing object/behaviour bundles of
molecules called cells. (The visual representations are calls to the
Cairo API, orchestrated in the time domain. Basically animated SVG.)
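
(For a sense of scale, a canvas "draw" Word only needs a handful of
Cairo calls. A minimal pycairo sketch, with arbitrary sizes, colours,
and output file:)

    import cairo

    surface = cairo.ImageSurface(cairo.FORMAT_ARGB32, 200, 200)
    ctx = cairo.Context(surface)

    ctx.set_source_rgb(1, 1, 1)   # white background
    ctx.paint()

    ctx.set_source_rgb(0, 0, 0)   # black stroke
    ctx.move_to(20, 180)
    ctx.line_to(100, 20)
    ctx.line_to(180, 180)
    ctx.close_path()
    ctx.stroke()

    surface.write_to_png('triangle.png')
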
I think this can be done with xerblin. We need a way to translate
ensembles of Squeak-like xerblin entities into python source code, and
vice versa, that preserves comments and formatting.
And we need a way to translate Squeak/Smalltalk into those xerblin
entities so we don't have to reinvent all those marvellous wheels. We
should be able to either scan and parse smalltalk or make Squeak emit
some sort of intermediary form.
And in the meantime, we need a minimal DS+code-to-visual
representation toolkit for bootstrapping and later teaching.
Those probably sound like crazy tasks. I'm going to give it a try. :-)
Thanks,
~Simon