Design considerations, specs
4 use cases to consider.
 The wiki itself
A wiki is a component of each of the synchronization options below. In the context of a distributed wiki system, a wiki is what is traditionally called a server. Users will use a browser interface, but will typically be interacting with a wiki on a local machine. Wikis might also run on other collaborators' machines or on shared servers. The wiki's job is to store pages locally, to display them, and to provide a way to edit them.
Nice to haves within any of the wiki options include:
- WYSIWYG editing (via editors with native wikitext/MediaWiki markup support).
- Native integration with the XO journal
- A MoinMoin-based wiki client
- Update: A Moin-via-browser implementation is working for a class project. Next version: a separate Moin activity. 08:50, 2 March 2007 (EST)
- A from-scratch system
- Update: A super-simple version with one-page synch but no history or metadata is now working on openmako. A full text-based implementation is coming next week. 08:50, 2 March 2007 (EST)
 Simple synchrony: lightweight wiki linked to a central wiki
Pages are imported into the light wiki from a central wiki. (They can be imported to the central wiki from elsewhere when identified as useful.) There they can be modified and revised through a series of changes. At some point later, the user can push the page back to the central wiki.
There's no history imported from the central wiki to the light wiki... the series of changes made locally produces a set of metadata that should include:
- a set of standard fields per revision [standard across different source wiki platforms... the central wiki needs to use this standard when importing]
- a marker to note the page has been imported [and from where]
On commits back to the central wiki, this should hide/lowlight the local history. After resolving any conflicts, the latest version from the central wiki should overwrite the local copy. [You don't want to confuse history from the server with local history.]
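The per-revision metadata and import marker described above might look like the following sketch. The field and class names here are illustrative assumptions, not a settled standard:

```python
# Hypothetical per-page metadata for the light wiki. Field names
# (rev_id, imported_from, source_rev, ...) are assumptions for
# illustration; the actual standard is still to be defined.
from dataclasses import dataclass, field

@dataclass
class LocalRevision:
    rev_id: int        # local revision number
    timestamp: str     # ISO 8601, UTC
    user: str
    summary: str       # edit summary
    diff_size: int     # bytes changed

@dataclass
class PageRecord:
    title: str
    imported_from: str          # marker: source wiki URL ("" if locally created)
    source_rev: str             # central-wiki revision id at import time
    revisions: list = field(default_factory=list)

page = PageRecord(title="Water purification",
                  imported_from="http://wiki.laptop.org",
                  source_rev="12345")
page.revisions.append(LocalRevision(1, "2007-03-02T08:50:00Z", "mako",
                                    "first local edit", 412))
print(len(page.revisions))  # 1
```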
Set up a client (cf. mvs/libwww-mediawiki-client-perl, the FUSE Wikipedia filesystem, etc.) that grabs a page or set of pages, stores a single local copy, and lets you edit locally.
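A minimal "grab and store one local copy" client, in the spirit of mvs, might look like this sketch. It uses MediaWiki's `index.php?action=raw` interface to fetch wikitext; the base URL and storage layout are assumptions for illustration:

```python
# Sketch of a minimal grab-and-store client. Assumes a MediaWiki
# source (action=raw); the storage layout (one .wiki file per page
# under local-wiki/) is an illustrative choice, not a spec.
import os
import urllib.parse
import urllib.request

def raw_url(base, title):
    """Build the URL for a page's raw wikitext on a MediaWiki site."""
    return base + "/index.php?" + urllib.parse.urlencode(
        {"title": title, "action": "raw"})

def fetch_page(base, title, store_dir="local-wiki"):
    """Fetch one page and store a single local copy; return its path."""
    os.makedirs(store_dir, exist_ok=True)
    text = urllib.request.urlopen(raw_url(base, title)).read().decode("utf-8")
    path = os.path.join(store_dir, title.replace("/", "_") + ".wiki")
    with open(path, "w", encoding="utf-8") as f:
        f.write(text)
    return path

print(raw_url("http://wiki.laptop.org", "Library"))
```

Editing then happens on the stored file with any local editor; pushing back is the separate commit step described below.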
Commits of local changes to the central wiki are clustered into atomic commits, aggregating all interim changes into one revision. As a result, metadata about interim revisions [list, timestamps, users, edit summaries, diff sizes, ...] should be stored on a canonically-named talk or user page and linked to from the edit summary.
Things to further define:
- how do we define the set of pages to sync automatically or with computer assistance?
- build a simple/graphical interface for synch commands (synch, check, local-commit [start a new atomic commit chain]; cf. TortoiseSVN)
- standard & format for metadata about interim edits/revisions
- graphical client design
- metadata about type of markup [different by source wiki].
- pywikipediabot framework (horrible, possible for April)
 Complex synchrony: not needed yet
Put more complex thoughts in this section... we don't need more than simple synchrony for launch work or the UNICEF wiki.
 Global wiki for materials; UNIwiki coordination
Define a hosting solution for the material above:
- semi-offline edits of shared resources, books in progress, &c
- user-generated material, local and often non-academic
- project-specific materials, supported and coordinated by an external group
Define namespaces that can be used for individual works / books / projects; and future functionality for local links within a work, skins for each work, and more.
Quantify sustainability of hosting.
- What are long-term storage and bandwidth needs?
 Mediawiki/other extensions
What extensions to mediawiki and other platforms are needed to make this work well? Consider:
- A patch that says "there are merge issues to work out" in the right way while there are outstanding merge conflicts from uploads [i.e., until an authorized user does the merge or says it need not be done].
- A patch that handles a series of atomic commits piped in all at once from a remote wiki. When I make 500 local page edits with my friends, I should have the option of pushing 1 to 500 different changes to the global page [with perhaps some rate limiting, or some client verification, to avoid spam].
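One way the rate limiting suggested above might look server-side is a simple token bucket capping how many remote-pushed revisions a client can land per unit time. The parameters (10 revisions/second, burst of 50) are illustrative assumptions:

```python
# Hypothetical token-bucket rate limiter for batched remote pushes.
# Rate and burst values are placeholders, not proposed policy.
import time

class TokenBucket:
    def __init__(self, rate_per_s, burst):
        self.rate = rate_per_s
        self.capacity = burst
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self, n=1):
        """Refill by elapsed time, then spend n tokens if available."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= n:
            self.tokens -= n
            return True
        return False

# 500 pushed edits arriving at once: roughly the burst is accepted
# immediately, the rest would be deferred or queued.
bucket = TokenBucket(rate_per_s=10, burst=50)
accepted = sum(bucket.allow() for _ in range(500))
print(accepted)
```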
 3/10-11 : discussion @ UNICEF
Sunday @ UNICEF ... it gets hot here on weekends so dress cool
1200 - Introductions, timelines, and scope
+ UNICEF team, structure
+ OLPC team, structure
+ UNICEF timelines
+ OLPC timelines: Mar 30, July 15
1300 - Usability
1430 - Break
1445 - Wiki platform comparisons
- Socialtext
- MediaWiki, DekiWiki
- MoinMoin, OLPCWiki
1530 - Language and web design; global wiki
1630 - Synchronization (tiered repository) - use cases
1745 - Division of labor, results for tomorrow
- Draft handout for Monday (finish overnight)
Monday @ UNICEF HQ, 44th st b/t 1st and 2nd ave : 9th floor conf [Tech and granular breakouts]
0900 - Introductions and overview
+ Summary of previous day's sessions
+ Timelines and open issues
1000 - UNICEF goals and projects
+ Initial project goals (4 countries)
+ Coordination: language, with various client devices
1030 - break
1100 - Specifications, timelines (Ivan joins by phone)
+ UNICEF project design; VOY
+ Globaltext Wiki requirements
+ OLPC Wikireader requirements
+ Eduvision reader & annotations
1200 - working lunch (Journal and bookreader coordination)
1315 - Wiki platform integration
+ XO wikireader
+ External wikis
+ Tasks - revised specs and milestones
1430 - UNIWiki, Global wiki, and OLE
+ Definitions, use cases, requirements
+ Indexing
+ Backups and synchronization
1515 - Break
1530 - Session 1: Local teams and languages
+ Language focus: English, Spanish, Arabic
+ Local & other teams
1530 - Session 2: Design and portals
+ World Digital Library design
+ UNICEF designs
+ OLPC interface guidelines
1615 - Session summaries
1630 - March 19 and March 30 milestones
+ Wrapup and immediate deadlines
+ Weekly meetings?