Talk:Wikislices

From OLPC
Revision as of 14:18, 24 March 2008 by Sj (talk | contribs) (merging talk from article, moving old minutes to subpage)

see Talk:Bundles for scripts used here

Universalism

The question of universal use of this content needs to be considered. Do we run this project under OLPC entirely? Or do we try to create logical bundles for anyone with a wikireader? What are our ideas that may differ from other Wikipedians?

Meeting minutes 2008-02-2?

Care of Mel.

Meeting minutes moved to Wikislice meetings/2008-02.


Meeting notes 2/21/08

Overall goal of meeting: wiki-hacking session to improve on the tools that Zdenek and others are currently using to make & refine wikislices. Held in #olpc-content on freenode.

Wikipedia snapshots

Developing snapshots of Wikipedia at every order of magnitude from 10 MB to 100 GB.

snapshot tools

We need...

  • libraries for producing different styles of Wikipedia snapshots (wikitext, HTML, txt, PDF) from categories (Special:Export), topics/pages (Wikiosity), and index pages (wikislices)
  • libraries that can do intelligent things with metadata from page histories and Wikimedia Commons pages
  • libraries that support selecting no images, thumbnails, or mid-resolution images
  • libraries that recalculate blue vs. red links given a wikislice
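The last item above can be sketched as a small function: given the set of page titles included in a slice and the link targets parsed out of an article, classify each link as blue (target is in the slice) or red (target is missing). This is an illustrative sketch, not an existing tool; the function and its normalization rule are assumptions, apart from MediaWiki's convention that the first character of a title is case-insensitive and underscores equal spaces.

```python
def recalc_links(slice_titles, link_targets):
    """Classify link targets as 'blue' (included in the wikislice) or
    'red' (not included). Titles are normalized the way MediaWiki does:
    underscores become spaces and the first character is uppercased."""
    def norm(title):
        title = title.replace("_", " ").strip()
        return title[:1].upper() + title[1:]

    included = {norm(t) for t in slice_titles}
    return {t: ("blue" if norm(t) in included else "red")
            for t in link_targets}

# Example: a two-article slice, checked against three outgoing links.
links = recalc_links({"Water", "Oxygen"}, ["water", "Hydrogen", "Oxygen"])
```

A reader or renderer could then style links from this mapping instead of querying the live wiki.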

Wiki format glue

We need glue code and scripts to interface between similar projects (WP WikiReaders, Wikibooks, Wikipedia wikislice projects, Webaroo wikislices, Kiwix snapshots, schools-wikipedia snapshots, Ksana snapshots, WP 1.0 revision-vetting), at least at the level of sharing index selections and a list of "good revisions" for included articles.
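One minimal interchange format for the shared "good revisions" list, sketched below as an assumption rather than a format any of these projects has agreed on, is a plain text file with one article per line: the title, a comma, and the vetted revision ID.

```python
import csv
import io

def parse_good_revisions(text):
    """Parse a shared 'good revisions' list in a hypothetical
    'Title,revision_id' per-line format. Returns a dict mapping
    each article title to its vetted revision ID."""
    reader = csv.reader(io.StringIO(text))
    # Skip blank lines; row[0] is the title, row[1] the revision ID.
    return {row[0]: int(row[1]) for row in reader if row}

# Example: a two-article selection exchanged between projects.
shared = "Water,198274321\nOxygen,197554310\n"
good = parse_good_revisions(shared)
```

Because it is just title/revision pairs, the same file can drive a dump filter, a reader's cache, or a revision-vetting review queue.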

Offline readers

As a recent focal point, Zvi Boshernitzan and Ben Lisbakken have both made offline Wikipedia readers using Google Gears that are fast and offer some nice features: they let you select a set of articles, cache them locally, and browse an index. We talked last week about how to integrate Gears more tightly into a browsing experience, with hopes of pursuing a prototype within a couple of weeks. It would be helpful to inform such a client with lists of good revisions of articles, such as those Martin Walker and Andrew Cates have developed for their own projects, and to plan for it to support offline editing as well as reading, using synchronization tools such as Mako's distributed wiki client.

What can people do?

  • wikipediaondvd - Pascal and Guillaume are trying to help


Older notes

Code libraries

  • KsanaForge and their KsanaWiki project have a set of scripts that process raw XML dumps from MediaWiki. They are working on producing read-only flash drives and SD cards for distribution.
  • Linterweb, developer of one of the freely available static selections of Wikipedia, has an open-source toolchain for building it; they are also working on wiki search engines (see Kiwix) and have offered to help build the local-filesystem search for the Journal.
  • The Moulinwiki project and Renaud Gaudin have a toolchain for processing HTML output from the MediaWiki parser. They are now combining forces with Linterweb.
  • PediaPress has a freely available "mwlib" library for parsing MediaWiki text.
  • The "Wikipedia 1.0" team and Andrew Cates (user:BozMo on en:wp) are using their own scripts to generate and review static collections from a list of constantly changing wiki articles.