Talk:WikiReaders
Design considerations, specs
4 use cases to consider.
0. wiki
A wiki is part of each of the following options for synchronization. In the context of a distributed wiki system, a wiki is what is traditionally called a server. Users will use a browser interface but will typically connect to a wiki on the local machine. Wikis might also run on other collaborators' machines or on shared servers. The wiki's job is to store pages locally, to display them, and to provide an ability to edit them.
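As a concrete anchor for those three jobs, a minimal sketch in Python (all names here are hypothetical, not an existing WikiReaders implementation; a real client would add markup rendering and a browser front end):

  # Minimal sketch of the wiki's three jobs: store pages locally,
  # display them, and provide an ability to edit them.
  from pathlib import Path

  class LocalWiki:
      def __init__(self, root="pages"):
          self.root = Path(root)
          self.root.mkdir(exist_ok=True)

      def save(self, title, text):
          """Store a page locally as a plain-text file."""
          (self.root / (title + ".txt")).write_text(text, encoding="utf-8")

      def display(self, title):
          """Return page text for rendering in the browser interface."""
          return (self.root / (title + ".txt")).read_text(encoding="utf-8")

      def edit(self, title, transform):
          """Apply an edit (a text -> text function) to a stored page."""
          self.save(title, transform(self.display(title)))

  wiki = LocalWiki()
  wiki.save("HomePage", "Welcome to the local wiki.")
  wiki.edit("HomePage", lambda t: t + "\nEdited offline.")
  print(wiki.display("HomePage"))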
Nice-to-haves within any of the wiki options include:
- WYSIWYG editing (via editors with native wikitext/MW markup support).
- Native integration with the XO journal
Implementation options:
- A MoinMoin-based wiki client
- Update: A Moin-via-browser implementation is working for a class project. Next version: a separate Moin activity. 08:50, 2 March 2007 (EST)
- A from-scratch system
- Update: A super-simple version with one-page synch but no history or metadata is now working on openmako. A full text-based implementation is coming next week. 08:50, 2 March 2007 (EST)
Methods
1. simple synchrony: lightweight wiki linked to central wiki
Pages are imported into the light wiki from a central wiki. (They can be imported to the central wiki from elsewhere, when identified as useful.) There they can be modified and revised through a series of changes. At some point later, the user can push the page back to the central wiki.
No history is imported from the central wiki to the light wiki; the series of changes made locally produces a set of metadata (see the sketch after this list) that should include:
- a set of standard fields per revision [standard across different source wiki platforms... the central wiki needs to use this standard when importing]
- a marker to note the page has been imported [and from where]
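A hedged sketch of what those records might look like in Python (the field names and the import marker are illustrative assumptions, not a settled standard across wiki platforms):

  # Illustrative shape for the per-revision metadata and the
  # imported-page marker; all field names are assumptions.
  import time

  def revision_record(user, summary, diff_size):
      """One standard record per local revision."""
      return {
          "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
          "user": user,
          "summary": summary,
          "diff_size": diff_size,
      }

  page_metadata = {
      "imported": True,                          # marker: page was imported
      "source_wiki": "http://example.org/wiki",  # hypothetical source wiki
      "local_revisions": [
          revision_record("xo-user", "fixed a typo", 14),
      ],
  }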
On commits back to the central wiki, the client should hide/lowlight the local history; after resolving any conflicts, the latest version from the central wiki should overwrite the local copy. [You don't want to confuse history from the server with local history.]
Set up a client (cf. mvs/libwww-mediawiki-client-perl, FUSE Wikipedia filesystem, etc.) that grabs a page or set of pages, stores a single local copy, and lets you edit locally.
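For the grab-and-store step, a rough Python sketch against the standard MediaWiki HTTP API (the endpoint URL and page title are placeholders; error handling is omitted):

  # Grab a page from a central MediaWiki and store a single local copy.
  from pathlib import Path
  import requests

  API = "http://example.org/w/api.php"  # hypothetical central wiki

  def pull(title, localdir="local"):
      """Fetch the latest wikitext of a page and store one local copy."""
      r = requests.get(API, params={
          "action": "query",
          "prop": "revisions",
          "rvprop": "content",
          "titles": title,
          "format": "json",
      })
      page = next(iter(r.json()["query"]["pages"].values()))
      text = page["revisions"][0]["*"]
      Path(localdir).mkdir(exist_ok=True)
      (Path(localdir) / (title + ".wiki")).write_text(text, encoding="utf-8")
      return text

  # Editing is then just modifying the local file; pushing back is a
  # separate step (see the atomic-commit sketch below).
  pull("Main_Page")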
Commits of local changes to the central wiki are clustered into atomic commits, aggregating all interim changes into one revision. As a result, metadata about interim revisions [list, timestamps, users, edit summaries, diff sizes, ...] should be stored on a canonically-named talk or user page and linked to from the edit summary.
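A sketch of how that clustering might work (the talk-subpage naming convention and the push_page upload hook are assumptions, not an existing mechanism):

  # Cluster all interim local changes into one atomic commit.
  import json

  def atomic_commit(title, final_text, interim_revisions, push_page):
      """Push one revision to the central wiki plus interim metadata."""
      # Interim metadata (list, timestamps, users, summaries, diff
      # sizes) goes to a canonically named talk subpage.
      meta_page = f"Talk:{title}/Offline_revisions"
      push_page(meta_page, json.dumps(interim_revisions, indent=2))
      # One revision on the central wiki; the edit summary links to
      # the metadata page.
      summary = (f"Offline edits: {len(interim_revisions)} interim "
                 f"revisions, see [[{meta_page}]]")
      push_page(title, final_text, summary=summary)

  def print_push(title, text, summary=""):
      """Stand-in for the real upload mechanism."""
      print(f"pushing {title!r} ({summary})")

  atomic_commit("HomePage", "final text",
                [{"user": "xo-user", "summary": "typo", "diff_size": 14}],
                print_push)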
Things to further define:
- how do we define the set of pages to sync automatically or with computer assistance?
- build a simple/graphical interface for synch commands (synch, check, local-commit [start a new atomic commit chain]; cf. TortoiseSVN); see the sketch after this list
- standard & format for metadata about interim edits/revisions
- graphical client design
- metadata about type of markup [different by source wiki].
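As a starting point for the synch command set, a thin command-line sketch in Python; the graphical client would wrap the same three operations (command names follow the list above, behavior is stubbed):

  # Illustrative command set behind a graphical synch client.
  import argparse

  def cmd_synch(args):
      print(f"synchronizing {args.page} with the central wiki")

  def cmd_check(args):
      print(f"checking {args.page} for upstream changes and conflicts")

  def cmd_local_commit(args):
      print(f"starting a new atomic commit chain for {args.page}")

  def main():
      parser = argparse.ArgumentParser(prog="wikisync")
      sub = parser.add_subparsers(dest="command", required=True)
      for name, fn in [("synch", cmd_synch), ("check", cmd_check),
                       ("local-commit", cmd_local_commit)]:
          p = sub.add_parser(name)
          p.add_argument("page")
          p.set_defaults(func=fn)
      args = parser.parse_args()
      args.func(args)

  if __name__ == "__main__":
      main()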
Implementation options:
- pywikipediabot framework (horrible, possible for April)
- Moin?
2. global wiki for materials; UNIwiki coordination
Define a hosting solution to host the above material:
- semi-offline edits of shared resources, books in progress, &c
- user-generated material, local and often non-academic
- project-specific materials, supported and coordinated by an external group
Define namespaces that can be used for individual works/books/projects, and future functionality for local links within a work, skins for each work, and more.
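One hedged sketch of how local links within a work could resolve (the Work:Title/Subpage convention is an assumption, not an agreed namespace scheme):

  # Resolve a bare link relative to the work (book/project) that
  # contains the current page.
  def resolve_link(current_page, target):
      """Map a bare link inside a work onto the work's own namespace."""
      if ":" in target:                       # fully qualified, leave as-is
          return target
      work = current_page.rsplit("/", 1)[0]   # e.g. "Book:Physics"
      return f"{work}/{target}"

  print(resolve_link("Book:Physics/Chapter_1", "Chapter_2"))
  # -> Book:Physics/Chapter_2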
Quantify sustainability of hosting.
3. MediaWiki/other extensions
What extensions to MediaWiki and other platforms are needed to make this work well? Consider:
- a patch that says "there are merge issues to work out" in the right way while there are outstanding merge conflicts from uploads [i.e., until an authorized user does the merge or says it need not be done].
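A real version would be a MediaWiki extension in PHP; as a language-neutral sketch of the state such a patch would track (all names are hypothetical):

  # Per-page merge state for uploaded offline edits. A page stays
  # flagged until an authorized user merges or says it need not be done.
  pending_merges = {}  # page title -> list of conflicting upload ids

  def record_conflict(title, upload_id):
      """Flag a page when an uploaded batch of edits fails to merge."""
      pending_merges.setdefault(title, []).append(upload_id)

  def banner(title):
      """Notice shown while merge conflicts are outstanding."""
      if pending_merges.get(title):
          return f"There are merge issues to work out on {title}."
      return ""

  def resolve(title, user, authorized_users):
      """An authorized user does the merge or dismisses it."""
      if user in authorized_users:
          pending_merges.pop(title, None)

  record_conflict("HomePage", "upload-1")
  print(banner("HomePage"))
  resolve("HomePage", "teacher", {"teacher"})
  print(banner("HomePage") or "no outstanding merges")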