[Wikireader] Because I'm too impatient to wait to see SJ at home...

Samuel Klein meta.sj at gmail.com
Sun Jul 13 00:32:40 EDT 2008


A question -- who on the list is coming to Wikimania this year?  We should
set up a workshop time to discuss the current wikireader tools and their
roadmaps.  If you wish you could be here but want people to talk about
/your/ latest work anyway, please send me a one-paragraph description of
where your work is at and how mediawiki devs could help.

Mako and I are in Alexandria from tomorrow night on; you can reach us
through his cell: 206-409-7191

Cheers,
SJ

On Sun, Jul 13, 2008 at 12:01 AM, Daniel Clark <dclark at pobox.com> wrote:

> Below is a dump of an early version of some doc regarding creating and
> editing local copies of a mediawiki (specifically the one we use at
> the FSF for Sys Admin doc); I thought it might be interesting to some
> people on the list, and also I'm wondering if I've missed anything. I
> didn't try OLPC WikiBrowse, as it looks very XO-specific, at least in
> terms of doc (which mentions Sugar as a dependency).
>
> It's interesting to see the IBM project, given that IBM Lotus Notes
> has had this kind of thing as its killer feature for many years; in
> fact the reason I'm so interested in getting this set up is because I
> remember how useful it was to have a local copy of critical Sys Admin
> doc when I was with IBM Lotus Software. Are the Extreme Blue students
> working on that project in the Westford (or is it Littleton now?) or
> Cambridge offices (where Notes is developed)?
>
> Weird how this list/topic is combining 3 out of 3 jobs I've had since
> high school...
>
> --
> Danny Clark # Sys Admin, Free Software Foundation
> # http://www.fsf.org # http://opensysadmin.com
>
> = mvs - A command line Mediawiki client =
> It would be really nice if this supported some form of recursion...
> All these tools are way too "you are only going to use this with
> wikipedia, so we can't possibly provide features that would be useful
> for smaller wikis" oriented...
>
> == Basic Use ==
> '''Install:'''
>  aptitude install libwww-mediawiki-client-perl
>
> '''Initial Setup:'''
>  mvs login -d cluestick.office.fsf.org -u USERNAME -p 'PASSWORD' -w '/index.php'
>
> Where USERNAME is your username (note that mediawiki autocapitalizes
> this, so for example this would be Dclark, not dclark) and PASSWORD is
> your mediawiki password (note that this is a very insecure way to pass
> a password to a program, and should only be used on systems where you
> are the only user or where you trust all other users).
>
> '''Example Checkout:'''
>  mvs update User:Dclark.wiki
>
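> A typical round trip then looks something like this (a sketch from
> memory of the mvs doc - the -m commit-summary flag is an assumption,
> so double-check with the man page):
> <pre>
> mvs update User:Dclark.wiki                 # pull the current wikitext
> $EDITOR User:Dclark.wiki                    # edit the local copy
> mvs commit -m 'fix typos' User:Dclark.wiki  # push the change back up
> </pre>
>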
> == See Also ==
> * [http://wikitravel.org/en/User:Mark/WWW-Mediawiki-Client
> User:Mark/WWW-Mediawiki-Client] - the mvs author's page on mvs
> * [http://search.cpan.org/~markj/WWW-Mediawiki-Client/bin/mvs
> CPAN > Mark Jaroski > WWW-Mediawiki-Client > mvs]
> * [http://en.wikibooks.org/wiki/OpenSolaris/Reference_Manual#wiki_page_batch_update_by_mvs_perl_module
> wiki page batch update by mvs perl module] - has a useful-looking mvs
> makefile
>
> = Flat Mirror of Entire Wiki =
>
> == Google Gears ==
> If you have [http://gears.google.com Google Gears] installed, you will
> see a "gears localserver" box on the lower left-hand side of the
> cluestick mediawiki screen, under the "navigation", "search", and
> "toolbox" boxes. This is done with the
> [http://wiki.yobi.be/wiki/Mediawiki_LocalServer Mediawiki LocalServer:
> Offline with Google Gears] extension. The
> [http://andreas.schmidt.name/blog/2007/10/google-gears-hack-mediawiki-offline-functionality-in-less-than-one-hour.html
> original version] provides slightly clearer install doc. In general,
> put the .js files with the other .js files in the common skins
> directory.
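>
> On our Debian install that means something like the following (a
> sketch - the skins path and the extension's .js filenames are
> assumptions, so check the extension's own doc for the real names):
> <pre>
> # Copy the extension's JavaScript alongside MediaWiki's other .js files
> # (path and filenames below are guesses - adjust for your install)
> cd /var/lib/mediawiki/skins/common
> sudo cp /tmp/Mediawiki_LocalServer/gears_init.js .
> sudo cp /tmp/Mediawiki_LocalServer/localserver.js .
> </pre>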
>
> After creating the local store and waiting for it to finish
> downloading, you will be able to go offline and browse the wiki -
> however search and "Special:" pages will not work in Google Gears
> offline mode, and you will not be able to edit pages in offline mode.
>
> == Local Django-based server ==
> The directions at
> [http://users.softlab.ece.ntua.gr/~ttsiod/buildWikipediaOffline.html
> Building a (fast) Wikipedia offline reader] produce an environment
> that takes more time to set up than Google Gears, but is arguably a
> bit nicer (including local search of page titles - and it shouldn't be
> hard to extend that to full-text search).
>
> <pre>
> # THESE ARE NOT STEP-BY-STEP INSTRUCTIONS... interpretation is required.
> sudo aptitude install apt-xapian-index xapian-tools libxapian-dev php5-cli
> wget http://users.softlab.ece.ntua.gr/~ttsiod/mediawiki_sa.tar.bz2
> wget http://users.softlab.ece.ntua.gr/~ttsiod/offline.wikipedia.tar.bz2
> # Populate wiki-splits with the raw .xml.bz2 dump
> mv mediawiki_sa offline.wikipedia
> # Edit Makefile to have the line "XMLBZ2 = cluestick-articles.xml.bz2"
> # Edit mywiki/gui/view.py 4th line to: return article(request, "Main Page")
> make wikipedia
> # (Then follow the directions it spews)
> </pre>
>
> TODO: Set up cron job to produce rsync-able cluestick-articles.xml.bz2
> on a regular basis. Package this up.
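>
> Something like this /etc/cron.d entry would do it (a sketch - the
> output path is made up; the dumpBackup.php invocation is the same one
> used for wikipediaDumpReader below):
> <pre>
> # Nightly dump of cluestick articles, bzip2ed where rsync can get at it
> 0 3 * * * root cd /usr/share/mediawiki/maintenance && php dumpBackup.php --current | bzip2 > /var/www/dumps/cluestick-articles.xml.bz2
> </pre>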
>
> == Tried / Don't work or no doc ==
> These did yield some useful doc on how to make Perl and Python
> modules into Debian packages, however...
>
> === libmediawiki-spider-perl ===
> [http://search.cpan.org/~cselt/Mediawiki-Spider-0.31/lib/Mediawiki/Wikicopy.pm
> CPAN > Emma Tonkin > Mediawiki-Spider-0.31 > Mediawiki::Spider]
> <pre>
> sudo aptitude install dh-make-perl fakeroot dpkg-dev build-essential
> sudo aptitude install libwww-perl libhtml-tree-perl
> sudo apt-file update
>
> wget
> http://search.cpan.org/CPAN/authors/id/C/CS/CSELT/HTML-Extract-0.25.tar.gz
> tar -pzxvf HTML-Extract-0.25.tar.gz
> dh-make-perl HTML-Extract-0.25
> cd HTML-Extract-0.25
> fakeroot dpkg-buildpackage -uc -us
> cd ..
> sudo dpkg -i libhtml-extract-perl_0.25-1_all.deb
>
> wget
> http://search.cpan.org/CPAN/authors/id/C/CS/CSELT/Mediawiki-Spider-0.31.tar.gz
> tar -pzxvf Mediawiki-Spider-0.31.tar.gz
> dh-make-perl Mediawiki-Spider-0.31
> cd Mediawiki-Spider-0.31
> fakeroot dpkg-buildpackage -uc -us
> cd ..
> sudo dpkg -i libmediawiki-spider-perl_0.31-1_all.deb
> </pre>
>
> It's unclear what to do after this. Emailed the author; should bug
> Ward when he gets back.
>
> === fuse-mediawiki ===
> Mediawiki FUSE filesystem: git clone git://repo.or.cz/fuse-mediawiki.git
> <pre>
> sudo aptitude install git-core gvfs-fuse fuse-utils fuse-module python-fuse
> git clone git://repo.or.cz/fuse-mediawiki.git
> cd fuse-mediawiki
> mkdir cluestick-fuse
> python fuse-mediawiki.py -u Dclark http://cluestick.office.fsf.org/index.php cluestick-fuse
> </pre>
>
> This works, but brings up a nonsense file system that you can't cd
> into beyond one level or run ls in. It seems to be under active
> development, though, so it's probably worth checking back in a few months.
>
> === wikipediafs ===
> [http://wikipediafs.sourceforge.net/ WikipediaFS] - View and edit
> Wikipedia articles as if they were real files
> <pre>
> sudo aptitude install gvfs-fuse fuse-utils fuse-module python-fuse python-all-dev
> sudo easy_install stdeb
> wget http://internap.dl.sourceforge.net/sourceforge/wikipediafs/wikipediafs-0.3.tar.gz
> tar xvfz wikipediafs-0.3.tar.gz
> cd wikipediafs-0.3
> vi setup.py # Edit so version is correct
> stdeb_run_setup
> cd deb_dist/wikipediafs-0.3/
> dpkg-buildpackage -rfakeroot -uc -us
> sudo dpkg -i ../python-wikipediafs_0.3-1_all.deb
> man mount.wikipediafs
> </pre>
>
> This is sort of useless for the purpose of this section, as it
> requires the user to get a specific set of pages before going offline.
> Didn't spend enough time with it to see if it worked as advertised.
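>
> For reference, basic use is roughly the following (a sketch from
> memory - the config location, mount point, and article name are all
> assumptions, so see man mount.wikipediafs for the real details):
> <pre>
> # Sites are declared in ~/.wikipediafs/config.xml; each read of a
> # file under the mount point fetches that article from the wiki
> mkdir ~/wfs
> mount.wikipediafs ~/wfs
> cat ~/wfs/wikipedia-en/GNU   # pulls down the article's wikitext
> </pre>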
>
> === wikipediaDumpReader ===
> [http://www.kde-apps.org/content/show.php/Wikipedia+Dump+Reader?content=65244
> Wikipedia Dump Reader] - KDE app - reads the output of dumpBackup.php
> <pre>
> cd /usr/share/mediawiki/maintenance
> php dumpBackup.php --current | bzip2 > cluestick-articles.xml.bz2
> </pre>
>
> Too wikipedia-specific. Didn't work with cluestick dump at all.
>
> === kiwix ===
> <nowiki>#kiwix</nowiki> on freenode / svn co
> https://kiwix.svn.sourceforge.net/svnroot/kiwix kiwix
> * No documentation or response from IRC channel. No doc in svn.
>
> == See Also ==
> * [http://en.wikipedia.org/wiki/Wikipedia:Text_editor_support#Command_line_tools
> Wikipedia Command_line_tools page]
> * The [http://lists.laptop.org/pipermail/wikireader/ Wikireader
> Archives] - Mailing list for discussing wiki readers.
> * [http://wiki.laptop.org/go/WikiBrowse WikiBrowse]: a self-contained
> wiki server
> * [http://wiki.laptop.org/go/Wikislices Wikislices]: collections of
> articles pulled from a MediaWiki for WikiBrowse