Project guidelines
This page contains guidelines for good OLPC projects to follow, whether they are software activities, content collections, or descriptions of how to engage in collaboration in a class or a school. They are guidelines to help people pursue the OLPC mission.
Some proposed criteria for inclusion:
Education
- Epistemological impact—to what degree does this activity positively impact learning? (This is of course the most important criterion.)
- Fun—is it fun? engaging?
- Sharable: is it sharable locally? Over time? Does it lead to long-term collaborations?
- Discoverable: is the core activity discoverable? (This is not to say that it shouldn't be hard work to fully exploit the power of an activity, but it should have a low barrier to entry.)
- Constructive: does it help children learn long-term skills, or does it promote an attitude of violence?
Style
See also the human interface guidelines pages.
- System quality — is the activity sufficiently robust in its implementation that it will not compromise the integrity or supportability of the system? Is the overall quality of the implementation adequate to meet our standards? Can the community be engaged in the process of testing and "certifying" and maintaining the activity?
- Sugarized—to what extent has the activity been integrated into Sugar, including UI, Journal, security, internationalization, etc.? Does the activity require the folding in of additional libraries and resources? (This has an impact on robustness—positive and negative—support, bloat, and the overall usability, aesthetics, and perception of quality of the machine.) A minimal sketch of this integration follows the list.
- FOSS—is the activity and all of its dependencies free and open?
- Extensible—is the activity something the community can extend? Does it span multiple needs? (And does it have—or the potential of having—an upstream community of support?)
- Uniqueness—does the activity add a unique feature to the core?
- Expectations—does the activity meet the expectations of its various audiences (children, teachers, parents, the G1G1 audience, etc.)?
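To make the "Sugarized" criterion above more concrete, here is a minimal sketch of the code-level integration points, assuming the Python sugar.activity API as shipped on the XO around 2008 (the Sugar 0.82-era interface); the activity name and label text are invented for illustration.

    # Minimal "Sugarized" activity sketch (hypothetical activity; assumes the
    # Sugar 0.82-era Python API: sugar.activity plus PyGTK).
    from gettext import gettext as _  # internationalization: UI strings pass through gettext

    import gtk
    from sugar.activity import activity


    class HelloActivity(activity.Activity):
        def __init__(self, handle):
            activity.Activity.__init__(self, handle)

            # Stock Sugar toolbar, so the activity matches the standard UI chrome.
            toolbox = activity.ActivityToolbox(self)
            self.set_toolbox(toolbox)
            toolbox.show()

            # Main canvas; the label string is wrapped for translation.
            label = gtk.Label(_('Hello, Sugar'))
            self.set_canvas(label)
            label.show()

        def read_file(self, file_path):
            # Journal integration: called by Sugar when the entry is resumed.
            pass

        def write_file(self, file_path):
            # Journal integration: called by Sugar when the session is saved.
            pass

The read_file/write_file hooks are what tie an activity into the Journal, and internationalization follows largely from routing every user-visible string through gettext; these hooks cover much of what "Sugarized" means in practice.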
Age rating
This tag is a cross between the target audience and an age-appropriate rating system. The rating system could be similar to one of the following:
- ESRB - Entertainment Software Rating Board (United States)
- PEGI - Pan European Game Information (Europe)
- CERO - Computer Entertainment Rating Organization (Japan)
Some have suggested the following, but these ratings are not usually applied to software:
We cannot use the ESRB per se, because we cannot follow the official ratings process (there is no "final copy"), and I doubt whether developers would want to pay their fees.
The PEGI system may be a good fit, since participation is usually voluntary. However, it is not clear what their process is, or whether we can use their ratings without paying a fee.
A long and emotional mailing-list thread centered around the [in]appropriateness of DOOM for many of the XO's users. The ability to link to a list of activities that fall under certain rating subsets is important.
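As a sketch of what "linking to a list of activities that fall under certain rating subsets" could look like if some rating scheme were adopted: the age_rating field and the PEGI-style values below are purely hypothetical, and the ratings shown are examples rather than actual classifications.

    # Hypothetical sketch: filtering an activity catalogue by age-rating subset.
    # The 'age_rating' field and the PEGI-style values are illustrative only;
    # no such field exists in the activity bundle format.
    ACTIVITIES = [
        {'name': 'TamTam', 'age_rating': '3+'},
        {'name': 'Write',  'age_rating': '3+'},
        {'name': 'DOOM',   'age_rating': '16+'},
    ]

    def rating_subset(activities, allowed):
        """Return only the activities whose rating falls in the allowed set."""
        return [a['name'] for a in activities if a['age_rating'] in allowed]

    # A deployment page for young children could then link to just this subset:
    print(rating_subset(ACTIVITIES, allowed={'3+', '7+'}))  # ['TamTam', 'Write']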
Functional user criteria
Aside from age, another set of evaluation/marking criteria is the set of skills a user is expected to have, gain, or improve before or during use of the activity (a sketch of how these might be recorded as data follows the list below). Some of these include:
- Literacy (in various languages, on various topics)
- Fine motor manipulation and reaction time (for keyboard/mouse/peripherals, and software that requires acting within a time limit)
- Auditory capabilities (both perception and discrimination)
- Visual capabilities (both perception and discrimination)
- Memory and cognitive abilities (do users have to be able to remember information taught earlier in the activity? how much concentration does it require and for how long? will children with ADHD, dyslexia, etc. have difficulty playing?)
- Subject background (does the activity presuppose certain content knowledge, i.e. that the user has been exposed to particular topics in mathematics, music, etc.?)
- Abstract thinking (what level of abstraction must the user be able to handle? Roughly, this refers to the Piagetian shift toward abstract thinking that typically occurs around ages 7-8 and again in the very early teens.)
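These skill prerequisites could be recorded as structured data alongside an activity, so that catalogues can be filtered on them in the same way as an age rating. The schema below is invented for illustration; none of these keys exist in Sugar or the activity bundle format.

    # Hypothetical skill-prerequisite declaration for a single activity.
    # Every key and value here is invented for illustration.
    SKILL_PROFILE = {
        'literacy':           {'languages': ['es'], 'level': 'basic'},
        'fine_motor':         'mouse and keyboard; no timed input',
        'auditory':           'optional (all audio cues have visual equivalents)',
        'visual':             'required (relies on colour discrimination)',
        'memory_load':        'low (instructions are repeated on every screen)',
        'subject_background': ['counting to 20'],
        'abstract_thinking':  'concrete operations (roughly ages 7 and up)',
    }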
Notes
- Security guidelines: (1) file-path compliance; (2) a cryptographic signature; and (3) a permissions declaration (see "Bitfrost compliance" in the mailing-list archives). A hypothetical sketch of these checks follows this list.
- Style guidelines: see #style above.
- Educational guidelines: see #education above.
- User review and input: to discuss and review projects for brilliance, see OLPC:Featured content.
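As referenced in the security note above, the three items lend themselves to mechanical checks at packaging or review time. The sketch below is purely illustrative: the permission names, signature file name, and bundle layout are invented, and the real requirements are defined by the Bitfrost specification and the mailing-list discussion it references.

    # Hypothetical sketch of the three security checks named in the notes above.
    # Permission names, file names, and layout are invented for illustration.
    import os

    # (3) Permissions declaration: the bundle states up front what it needs.
    REQUESTED_PERMISSIONS = ['network', 'audio_record']

    def is_path_compliant(bundle_root, written_paths):
        """(1) File-path compliance: the activity only writes inside its own tree."""
        root = os.path.abspath(bundle_root)
        return all(os.path.abspath(p).startswith(root + os.sep) for p in written_paths)

    def has_signature(bundle_root):
        """(2) Cryptographic signature: a detached signature ships with the bundle."""
        return os.path.exists(os.path.join(bundle_root, 'bundle.sig'))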