A question -- who on the list is coming to Wikimania this year? We should set up a workshop time to discuss the current wikireader tools and their roadmaps. If you can't make it but want people to talk about /your/ latest work anyway, please send me a one-paragraph description of where your work stands and how mediawiki devs could help.<br>
<br>Mako and I are in Alexandria from tomorrow night on; you can reach us through his cell : 206-409-7191<br><br>Cheers,<br>SJ<br><br><div class="gmail_quote">On Sun, Jul 13, 2008 at 12:01 AM, Daniel Clark <<a href="mailto:dclark@pobox.com">dclark@pobox.com</a>> wrote:<br>
<blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">Below is a dump of an early version of some doc regarding creating and<br>
editing local copies of a mediawiki (specifically the one we use at<br>
the FSF for Sys Admin doc); I thought it might be interesting to some<br>
people on the list, and also I'm wondering if I've missed anything. I<br>
didn't try OLPC WikiBrowse, as it looks very XO-specific, at least in<br>
terms of doc (which mentions Sugar as a dependency).<br>
<br>
It's interesting to see the IBM project, given that IBM Lotus Notes<br>
has had this kind of thing as its killer feature for many years; in<br>
fact the reason I'm so interested in getting this set up is because I<br>
remember how useful it was to have a local copy of critical Sys Admin<br>
doc when I was with IBM Lotus software. Are the Extreme Blue students<br>
working on that project in the Westford (or is it Littleton now?) or<br>
Cambridge offices (where Notes is developed)?<br>
<br>
Weird how this list/topic is combining 3 out of 3 jobs I've had since<br>
high school...<br>
<br>
--<br>
Danny Clark # Sys Admin, Free Software Foundation<br>
# <a href="http://www.fsf.org" target="_blank">http://www.fsf.org</a> # <a href="http://opensysadmin.com" target="_blank">http://opensysadmin.com</a><br>
<br>
= mvs - A command line Mediawiki client =<br>
It would be really nice if this supported some form of recursion...<br>
wikipedia, so we can't possibly provide features that would be useful<br>
wikipedia, so we can't possibly provide features that would be useful<br>
for smaller wikis" oriented...<br>
<br>
== Basic Use ==<br>
'''Install:'''<br>
aptitude install libwww-mediawiki-client-perl<br>
<br>
'''Initial Setup:'''<br>
mvs login -d <a href="http://cluestick.office.fsf.org" target="_blank">cluestick.office.fsf.org</a> -u USERNAME -p 'PASSWORD' -w '/index.php'<br>
<br>
Where USERNAME is your username (note that mediawiki autocapitalizes<br>
this, so for example this would be Dclark, not dclark) and PASSWORD is<br>
your mediawiki password (note that passing a password on the command<br>
line like this is very insecure, and should only be done on systems<br>
where you are the only user or you trust all other users).<br>
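The capitalization rule is easy to get wrong when scripting; here is a minimal sketch (my own helper, not part of mvs) that mimics mediawiki's canonicalization so you can predict names like User:Dclark.wiki:<br>

```shell
# MediaWiki uppercases only the first letter of a username or page title;
# this helper (mine, not part of mvs) mimics that canonicalization so you
# can predict filenames like User:Dclark.wiki from a login name of dclark
canonical() {
  first=$(printf '%s' "$1" | cut -c1 | tr '[:lower:]' '[:upper:]')
  rest=$(printf '%s' "$1" | cut -c2-)
  printf '%s\n' "$first$rest"
}

canonical dclark   # prints: Dclark
```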
<br>
'''Example Checkout:'''<br>
mvs update User:Dclark.wiki<br>
<br>
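In the absence of real recursion, one can fake a batch checkout by looping mvs update over a list of titles. A sketch; the title-to-filename mapping here is my assumption, based on mediawiki's own page naming and the User:Dclark.wiki example above:<br>

```shell
# turn a page title into the filename mvs works with; the mapping
# (spaces to underscores, ".wiki" suffix) is an assumption
wiki_filename() { printf '%s.wiki\n' "$(printf '%s' "$1" | tr ' ' '_')"; }

wiki_filename "Main Page"   # prints: Main_Page.wiki

# then, given titles.txt with one page title per line:
# while IFS= read -r t; do mvs update "$(wiki_filename "$t")"; done < titles.txt
```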
== See Also ==<br>
* [<a href="http://wikitravel.org/en/User:Mark/WWW-Mediawiki-Client" target="_blank">http://wikitravel.org/en/User:Mark/WWW-Mediawiki-Client</a><br>
User:Mark/WWW-Mediawiki-Client] - mvs author's page on mvs<br>
* [<a href="http://search.cpan.org/%7Emarkj/WWW-Mediawiki-Client/bin/mvs" target="_blank">http://search.cpan.org/~markj/WWW-Mediawiki-Client/bin/mvs</a> CPAN ><br>
Mark Jaroski > WWW-Mediawiki-Client > mvs]<br>
* [<a href="http://en.wikibooks.org/wiki/OpenSolaris/Reference_Manual#wiki_page_batch_update_by_mvs_perl_module" target="_blank">http://en.wikibooks.org/wiki/OpenSolaris/Reference_Manual#wiki_page_batch_update_by_mvs_perl_module</a><br>
wiki page batch update by mvs perl module] - has useful looking mvs<br>
makefile<br>
<br>
= Flat Mirror of Entire Wiki =<br>
<br>
== Google Gears ==<br>
If you have [<a href="http://gears.google.com" target="_blank">http://gears.google.com</a> Google Gears] installed, you will<br>
see a "gears localserver" box on the lower left-hand side of the<br>
cluestick mediawiki screen, under the "navigation", "search", and<br>
"toolbox" boxes. This is done with the<br>
[<a href="http://wiki.yobi.be/wiki/Mediawiki_LocalServer" target="_blank">http://wiki.yobi.be/wiki/Mediawiki_LocalServer</a> Mediawiki LocalServer:<br>
Offline with Google Gears] extension. The<br>
[<a href="http://andreas.schmidt.name/blog/2007/10/google-gears-hack-mediawiki-offline-functionality-in-less-than-one-hour.html" target="_blank">http://andreas.schmidt.name/blog/2007/10/google-gears-hack-mediawiki-offline-functionality-in-less-than-one-hour.html</a><br>
original version] provides slightly clearer install doc. In<br>
general, put the .js files with the other .js files, in the common<br>
skins directory.<br>
<br>
After creating the local store and waiting for it to finish<br>
downloading, you will be able to go offline and browse the wiki;<br>
however, search and "Special:" pages will not work in Google Gears<br>
offline mode, and neither will editing.<br>
<br>
== Local Django-based server ==<br>
The directions at<br>
[<a href="http://users.softlab.ece.ntua.gr/%7Ettsiod/buildWikipediaOffline.html" target="_blank">http://users.softlab.ece.ntua.gr/~ttsiod/buildWikipediaOffline.html</a><br>
Building a (fast) Wikipedia offline reader] produce an environment<br>
that takes more time to set up than Google Gears, but is arguably a<br>
bit nicer (including local search of page titles - and it shouldn't be<br>
too hard to extend that to full text).<br>
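As a sketch of what that full-text extension could look like, here is a toy stand-in (mine, not part of the reader) that scans a dump and prints the titles of pages whose text matches a pattern; a real version would instead feed page text into the Xapian index the reader already builds for titles. The two-page sample file below stands in for the real dump; for the real thing, bzcat cluestick-articles.xml.bz2 into the same awk.<br>

```shell
# toy full-text lookup over a MediaWiki XML dump: remember the current
# <title>, print it once when the page text matches the pattern
cat > sample.xml <<'EOF'
<page><title>Backups</title><text>rsync nightly backup</text></page>
<page><title>Printers</title><text>cups setup</text></page>
EOF
matches=$(awk -v pat="backup" '
  /<title>/ { t = $0; sub(/.*<title>/, "", t); sub(/<\/title>.*/, "", t) }
  $0 ~ pat  { if (t != last) { print t; last = t } }' sample.xml)
echo "$matches"   # prints: Backups
```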
<br>
<pre><br>
# THESE ARE NOT STEP-BY-STEP INSTRUCTIONS... interpretation is required.<br>
sudo aptitude install apt-xapian-index xapian-tools libxapian-dev php5-cli<br>
wget <a href="http://users.softlab.ece.ntua.gr/%7Ettsiod/mediawiki_sa.tar.bz2" target="_blank">http://users.softlab.ece.ntua.gr/~ttsiod/mediawiki_sa.tar.bz2</a><br>
wget <a href="http://users.softlab.ece.ntua.gr/%7Ettsiod/offline.wikipedia.tar.bz2" target="_blank">http://users.softlab.ece.ntua.gr/~ttsiod/offline.wikipedia.tar.bz2</a><br>
populate wiki-splits with raw .xml.bz2 dump<br>
mv mediawiki_sa offline.wikipedia<br>
Edit Makefile to have line "XMLBZ2 = cluestick-articles.xml.bz2"<br>
Edit mywiki/gui/view.py 4th line to: return article(request, "Main Page")<br>
make wikipedia<br>
# (Then follow directions it spews)<br>
</pre><br>
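The two manual edits above can be scripted with sed; the substitution patterns are my guesses at the tarball's stock Makefile and view.py, so check them before running (demonstrated here on stand-in files rather than the real checkout):<br>

```shell
# stand-in files so the sed lines can be shown end to end; on a real
# checkout, run only the two sed commands inside offline.wikipedia
# (the second against mywiki/gui/view.py)
printf 'XMLBZ2 = enwiki-articles.xml.bz2\n' > Makefile
printf 'line1\nline2\nline3\nreturn index(request)\n' > view.py

sed -i 's/^XMLBZ2 *=.*/XMLBZ2 = cluestick-articles.xml.bz2/' Makefile
sed -i '4s/.*/return article(request, "Main Page")/' view.py

head -1 Makefile   # prints: XMLBZ2 = cluestick-articles.xml.bz2
```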
<br>
TODO: Set up cron job to produce rsync-able cluestick-articles.xml.bz2<br>
on a regular basis. Package this up.<br>
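A minimal sketch of that cron job; the schedule, the /etc/cron.d placement, and the dump destination are all assumptions:<br>

```
# hypothetical /etc/cron.d/cluestick-dump: nightly rsync-able dump
0 3 * * * root cd /usr/share/mediawiki/maintenance && php dumpBackup.php --current | bzip2 > /var/lib/cluestick-dumps/cluestick-articles.xml.bz2
```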
<br>
== Tried / Didn't work or no doc ==<br>
None of these panned out, but the notes do include some useful doc on<br>
how to make Perl and Python modules into Debian packages...<br>
<br>
=== libmediawiki-spider-perl ===<br>
[<a href="http://search.cpan.org/%7Ecselt/Mediawiki-Spider-0.31/lib/Mediawiki/Wikicopy.pm" target="_blank">http://search.cpan.org/~cselt/Mediawiki-Spider-0.31/lib/Mediawiki/Wikicopy.pm</a><br>
CPAN > Emma Tonkin > Mediawiki-Spider-0.31 > Mediawiki::Spider]<br>
<pre><br>
sudo aptitude install dh-make-perl fakeroot dpkg-dev build-essential<br>
sudo aptitude install libwww-perl libhtml-tree-perl<br>
sudo apt-file update<br>
<br>
wget <a href="http://search.cpan.org/CPAN/authors/id/C/CS/CSELT/HTML-Extract-0.25.tar.gz" target="_blank">http://search.cpan.org/CPAN/authors/id/C/CS/CSELT/HTML-Extract-0.25.tar.gz</a><br>
tar -pzxvf HTML-Extract-0.25.tar.gz<br>
dh-make-perl HTML-Extract-0.25<br>
cd HTML-Extract-0.25<br>
fakeroot dpkg-buildpackage -uc -us<br>
cd ..<br>
sudo dpkg -i libhtml-extract-perl_0.25-1_all.deb<br>
<br>
wget <a href="http://search.cpan.org/CPAN/authors/id/C/CS/CSELT/Mediawiki-Spider-0.31.tar.gz" target="_blank">http://search.cpan.org/CPAN/authors/id/C/CS/CSELT/Mediawiki-Spider-0.31.tar.gz</a><br>
tar -pzxvf Mediawiki-Spider-0.31.tar.gz<br>
dh-make-perl Mediawiki-Spider-0.31<br>
cd Mediawiki-Spider-0.31<br>
fakeroot dpkg-buildpackage -uc -us<br>
cd ..<br>
sudo dpkg -i libmediawiki-spider-perl_0.31-1_all.deb<br>
</pre><br>
<br>
It's unclear what to do after this. Emailed the author; should bug<br>
Ward when they get back.<br>
<br>
=== fuse-mediawiki ===<br>
Mediawiki FUSE filesystem: git clone git://<a href="http://repo.or.cz/fuse-mediawiki.git" target="_blank">repo.or.cz/fuse-mediawiki.git</a><br>
<pre><br>
sudo aptitude install git-core gvfs-fuse fuse-utils fuse-module python-fuse<br>
git clone git://<a href="http://repo.or.cz/fuse-mediawiki.git" target="_blank">repo.or.cz/fuse-mediawiki.git</a><br>
cd fuse-mediawiki<br>
mkdir cluestick-fuse<br>
python fuse-mediawiki.py -u Dclark <a href="http://cluestick.office.fsf.org/index.php" target="_blank">http://cluestick.office.fsf.org/index.php</a> cluestick-fuse<br>
</pre><br>
<br>
This works, but brings up a nonsense filesystem: you can't cd more<br>
than one level deep or ls inside it. It seems to be under active<br>
development, so it's probably worth checking back in a few months.<br>
<br>
=== wikipediafs ===<br>
[<a href="http://wikipediafs.sourceforge.net/" target="_blank">http://wikipediafs.sourceforge.net/</a> WikipediaFS] - View and edit<br>
Wikipedia articles as if they were real files<br>
<pre><br>
sudo aptitude install gvfs-fuse fuse-utils fuse-module python-fuse python-all-dev<br>
sudo easy_install stdeb<br>
wget <a href="http://internap.dl.sourceforge.net/sourceforge/wikipediafs/" target="_blank">http://internap.dl.sourceforge.net/sourceforge/wikipediafs/</a><br>
tar xvfz wikipediafs-0.3.tar.gz<br>
cd wikipediafs-0.3<br>
vi setup.py # Edit so version is correct<br>
stdeb_run_setup<br>
cd deb_dist/wikipediafs-0.3/<br>
dpkg-buildpackage -rfakeroot -uc -us<br>
sudo dpkg -i ../python-wikipediafs_0.3-1_all.deb<br>
man mount.wikipediafs<br>
</pre><br>
<br>
This is sort of useless for the purpose of this section, as it<br>
requires the user to get a specific set of pages before going offline.<br>
Didn't spend enough time with it to see if it worked as advertised.<br>
<br>
=== wikipediaDumpReader ===<br>
[<a href="http://www.kde-apps.org/content/show.php/Wikipedia+Dump+Reader?content=65244" target="_blank">http://www.kde-apps.org/content/show.php/Wikipedia+Dump+Reader?content=65244</a><br>
Wikipedia Dump Reader] - KDE App - Reads output of dumpBackup.php<br>
<pre><br>
cd /usr/share/mediawiki/maintenance<br>
php dumpBackup.php --current | bzip2 > cluestick-articles.xml.bz2<br>
</pre><br>
<br>
Too Wikipedia-specific. Didn't work with the cluestick dump at all.<br>
<br>
=== kiwix ===<br>
<nowiki>#kiwix</nowiki> on freenode / svn co<br>
<a href="https://kiwix.svn.sourceforge.net/svnroot/kiwix" target="_blank">https://kiwix.svn.sourceforge.net/svnroot/kiwix</a> kiwix<br>
* No documentation or response from IRC channel. No doc in svn.<br>
<br>
== See Also ==<br>
* [<a href="http://en.wikipedia.org/wiki/Wikipedia:Text_editor_support#Command_line_tools" target="_blank">http://en.wikipedia.org/wiki/Wikipedia:Text_editor_support#Command_line_tools</a><br>
Wikipedia Command_line_tools page]<br>
* The [<a href="http://lists.laptop.org/pipermail/wikireader/" target="_blank">http://lists.laptop.org/pipermail/wikireader/</a> Wikireader<br>
Archives] - Mailing list for discussing wiki readers.<br>
* [<a href="http://wiki.laptop.org/go/WikiBrowse" target="_blank">http://wiki.laptop.org/go/WikiBrowse</a> WikiBrowse]: a self-contained<br>
wiki server<br>
* [<a href="http://wiki.laptop.org/go/Wikislices" target="_blank">http://wiki.laptop.org/go/Wikislices</a> Wikislices]: collections of<br>
articles pulled from a MediaWiki for WikiBrowse<br>
_______________________________________________<br>
Wikireader mailing list<br>
<a href="mailto:Wikireader@lists.laptop.org">Wikireader@lists.laptop.org</a><br>
<a href="http://lists.laptop.org/listinfo/wikireader" target="_blank">http://lists.laptop.org/listinfo/wikireader</a><br>
</blockquote></div><br>