Innovation in Evaluation
On April 4–5, 2011, the workshop "Innovation in Evaluation" took place at OLPC in Cambridge, MA.
Organizers: Claudia Urrea, Walter Bender and Bakhtiar Mikhak
Nearly two million XO laptops have been distributed to children in over 40 countries. In five countries in particular, XO laptops embody a deep commitment by a group of politicians, community leaders, and educators to implement disruptive, large-scale education reform initiatives that will advance their countries into the twenty-first century and prepare their children for interconnected, global, creative knowledge economies. The stakes for the success of these initiatives are high, and local stakeholders as well as numerous international organizations look to these bold experiments with cautious optimism. These programs hold the promise to radically expand and realize the learning and creative potential of entire nations at all societal levels. As such, arguably the greatest challenges and opportunities facing these initiatives lie in designing and implementing evaluation programs that make the outcomes visible, understandable, and actionable by as broad an audience as possible.
Objectives
The goal of this workshop was to bring together some of the leading practitioners and researchers in learning, education, and technology to share and reflect on methodologies and data from active OLPC implementations, and to critically review promising approaches to data collection, assessment, and decision making in one-to-one computing and learning projects. The facilitators and invited presenters were selected for their expertise in new media and computational literacies and curricula, data visualization and alternative forms of assessment, and educational leadership. Workshop participants included researchers and experts from OLPC laptop initiatives in Uruguay, Paraguay, Peru, Nicaragua, and Colombia.
Participants
- María De La Paz Peña
Educational Manager of Paraguay Educa, Paraguay.
- Félix Garrido Ching
Educational Coordinator of Zamora Teran Foundation, Nicaragua.
- Andrés Peri Hada
Director of the Research, Evaluation, and Statistics Department of ANEP, Uruguay.
- Heddy Beatriz Becerra Tresierra
Volunteer Program Coordinator, Peru.
- Sandra Barragan
Country Manager of OLPC Colombia, Colombia.
Guest Speakers
- Ann Koufman-Frederick
Superintendent of Watertown Public Schools. Expertise: school leadership, professional development, program evaluation, testing and assessment, curriculum.
- Joan DiMicco
Research Scientist and Manager, Visual Communication Lab, IBM Research, Cambridge, MA. Expertise: design of studies in social media.
- Evangeline Stefanakis
Associate Professor and Faculty Fellow with the Provost, Educational Foundations, Leadership and Counseling program, School of Education, Boston University.
- Margaret Weigel
Designer and researcher in new digital media.
- Andres Monroy
PhD candidate at the MIT Media Lab; leader of the Scratch online community.
Agenda Day 1
- Opening Session
Welcome and overview; introductions and goals, by Claudia Urrea, Director of Learning of OLPC LA
- Thematic Session 1:
Professional Development, New Media and Curriculum, and Assessment: How do all these elements fit together? by Ann Koufman-Frederick http://users.rcn.com/koufman/resume/index.html
- Breakout Session 1:
What do our current professional development and educational curricula promote? What can be improved, and how? (All)
- Thematic Session 2:
Data Visualization: Design of Studies in Social Media. "Motivations for Social Networking at Work" (File:Dimicco-cscw08-beehive-motivations.pdf) by Joan DiMicco https://researcher.ibm.com/researcher/view.php?person=us-joan.dimicco
- Breakout Session 2:
Bring Your Own Data: What data are we collecting? How are we sharing the data with stakeholders and reflecting on them? (All)
- Day 1 Closing
Reports from Breakout Sessions and Reflections
Main Ideas Day 1
Participants described the following expected outcomes for the workshop:
- To make explicit what everyone is doing in terms of evaluation, and why we need to evaluate.
- To find common indicators: what to measure and how to measure competences and skills.
- To make the tools (software/Journal) easier to use for collecting relevant data.
Why do we need to evaluate? For different audiences:
- The public sector/government, which needs valid data.
- Donors, both private and public.
- Ourselves, to get feedback on our work and to know whether it is working.
- To tell the story at scale.
What do we need to measure?
- Creativity
- Digitalization
- Technological Fluency
- Cost-Benefit Analysis
- Games-Learning
Other closing ideas:
- Technical infrastructure and professional development are the base of the pyramid of any technology-based education project.
- Innovative assessment measures should complement evaluations; standardized tests are still the most trusted source for investors and decision makers.
- Social networks for teachers are a key ingredient.
Agenda Day 2
| Topic | Info | Presenter |
|---|---|---|
| Overview of the day | Goals and future actions | Claudia Urrea, Director of Learning of OLPC LA |
| Thematic Session 3: Alternative ways of assessment and understanding impact | "Failing Our Students," NYTimes article (File:Failing Our Students-Stefanakis.pdf); Multiple Intelligences and Portfolios: A Window into the Learner's Mind (book, available on arrival) | Evangeline Stefanakis http://www.bu.edu/sed/about-us/faculty/evangeline-harris-stefanakis/ |
| Breakout Session 3 | How do we make sense of the whole by understanding a few? | All |
| Scratch Community and Data Collection | Characteristics of social networking via the Scratch community http://scratch.mit.edu/; video of presentation (in Spanish): File:Andres monroy.mov | Andres Monroy http://www.mit.edu/~amonroy/ |
| Thematic Session 4: Framework for new media literacy | Confronting the Challenges of Participatory Culture: Media Education for the 21st Century http://newmedialiteracies.org/files/working/NMLWhitePaper.pdf; slideshow of presentation: File:New media literacy.ppt; video of presentation: Part 1 File:Margaret weigel.mov, Part 2 File:Margaret weigel part 2.mov | Margaret Weigel http://www.margaretweigel.com/ |
| Breakout Session 4: Actions to be taken | What is important to measure? Why and how? What data should Sugar collect, and how? How do we communicate the impact, and to whom? | All |
| Day 2 Closing | Reports from breakout sessions and reflections | All |