[Wikireader] [Foundation-l] dumps

Samuel Klein sj at laptop.org
Wed Feb 25 03:50:02 EST 2009


Compression allowing random access is definitely the way to go for
large selections.
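
For readers new to the indexing idea, here is a minimal sketch (in
Python; not anyone's actual implementation) of what such random
access looks like.  It assumes the dump is written as many
independent bzip2 streams and that a plain-text index of
"offset:title" lines maps each article to the byte offset of its
stream; the index format and file names are invented for
illustration:

    import bz2

    def load_index(index_path):
        # Map each title to the byte offset of the bzip2 stream
        # that contains it.
        index = {}
        with open(index_path, encoding="utf-8") as f:
            for line in f:
                offset, title = line.rstrip("\n").split(":", 1)
                index[title] = int(offset)
        return index

    def read_stream(dump_path, offset):
        # Seek straight to one stream and decompress only it,
        # stopping at its end-of-stream marker instead of reading
        # the rest of the dump.
        decomp = bz2.BZ2Decompressor()
        chunks = []
        with open(dump_path, "rb") as f:
            f.seek(offset)
            while not decomp.eof:
                block = f.read(64 * 1024)
                if not block:
                    break
                chunks.append(decomp.decompress(block))
        return b"".join(chunks).decode("utf-8")

    index = load_index("eswiki-index.txt")        # hypothetical names
    xml = read_stream("eswiki-pages.xml.bz2", index["Lima"])

Only one small stream is ever decompressed, so a lookup costs a seek
and a few kilobytes of work rather than a pass over the whole dump.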

Ángel, that's an interesting reader you wrote.  I'm cc'ing a list for
offline wikireaders (most of them designed around MediaWiki).  A
similar idea is in use by schools across Peru[1] to provide offline
access to the Spanish Wikipedia, based on the wikipedia-iphone code:
   http://dev.laptop.org/git/projects/wikiserver

It doesn't have the Windows/IE dependency, but it leaves out many of
your features, such as special pages, full template support, and
categories.

SJ

[1] The same schools also want offline access to images, so a smarter
reader is desired: one that looks for an image locally first, then on
a school server, and only then online.
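
A sketch of that lookup chain (hypothetical paths and hosts, with a
plain HTTP fetch standing in for whatever the reader really uses):

    import os
    import urllib.request

    LOCAL_DIR = "/var/cache/wikireader/images"          # hypothetical
    SCHOOL_SERVER = "http://schoolserver.local/images"  # hypothetical
    UPSTREAM = "https://upload.wikimedia.org/wikipedia/commons"

    def fetch_image(rel_path):
        # Try the local cache, then the school server, then the
        # live site; return None if every tier fails.
        local = os.path.join(LOCAL_DIR, rel_path)
        if os.path.exists(local):
            with open(local, "rb") as f:
                return f.read()
        for base in (SCHOOL_SERVER, UPSTREAM):
            try:
                url = "%s/%s" % (base, rel_path)
                with urllib.request.urlopen(url, timeout=5) as resp:
                    return resp.read()
            except OSError:
                continue  # unreachable or missing: try the next tier
        return None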


On Tue, Feb 24, 2009 at 5:34 PM, Ángel <keisial at gmail.com> wrote:
> Anthony wrote:
>> I've looked at the numbers and thought about this in detail and I don't
>> think so.  What definitely *would* be much more user friendly is to use a
>> compression scheme which allows random access, so that end users don't have
>> to decompress everything all at once in the first place.
>
> I did make indexed, random-access, backwards-compatible XML dumps.
> http://lists.wikimedia.org/pipermail/wikitech-l/2009-January/040812.html
>
> It wouldn't be hard to plug this into the dump process (just replace
> bzip2 in a new DumpPipeOutput), but so far nobody has seemed
> interested in it.
>
> And there's the added benefit of the offline reader I implemented using
> those files.
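
For anyone wondering what replacing bzip2 there buys: concatenated
bzip2 streams are themselves a valid bzip2 file, so a dump writer can
compress pages in small groups, one independent stream per group, and
record where each group starts, while still emitting a file that
stock bunzip2 reads end to end; that is the backwards compatibility
Ángel mentions.  A rough sketch of the writing side (not Ángel's
code; the group size, index format, and names are invented):

    import bz2

    GROUP_SIZE = 100  # pages per independently decompressible stream

    def flush_group(out, idx, texts, titles):
        # One bz2.compress() call per group yields one complete
        # bzip2 stream; remember where in the file it begins.
        offset = out.tell()
        out.write(bz2.compress("".join(texts).encode("utf-8")))
        for title in titles:
            idx.write("%d:%s\n" % (offset, title))

    def write_indexed_dump(pages, dump_path, index_path):
        # `pages` yields (title, page_xml) pairs.
        with open(dump_path, "wb") as out, \
             open(index_path, "w", encoding="utf-8") as idx:
            texts, titles = [], []
            for title, page_xml in pages:
                texts.append(page_xml)
                titles.append(title)
                if len(texts) == GROUP_SIZE:
                    flush_group(out, idx, texts, titles)
                    texts, titles = [], []
            if texts:
                flush_group(out, idx, texts, titles)

The index it writes is the same "offset:title" format the reading
sketch earlier in this message consumes.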

