Innovation in Evaluation
On April 4th and 5th, 2011, the workshop "Innovation in Evaluation" took place at OLPC in Cambridge, MA.
Organizers: Claudia Urrea, Walter Bender and Bakhtiar Mikhak
Nearly two million XO laptops have been distributed to children in over 40 countries. In five countries in particular, XO laptops embody a deep commitment by a group of politicians, community leaders, and educators to implement disruptive, large-scale education reform initiatives that will advance their countries into the twenty-first century and prepare their children for interconnected global creative knowledge economies. The stakes for the success of these initiatives are high, and local stakeholders as well as numerous international organizations look to these bold experiments with cautious optimism. These programs hold the promise of radically expanding and realizing the learning and creative potential of entire nations at all societal levels. Arguably, the greatest challenges and opportunities facing these initiatives therefore lie in designing and implementing evaluation programs that make the outcomes visible, understandable, and actionable for as broad an audience as possible.
For Spanish, see Innovación en la Evaluación.
Objectives
The goal of this workshop was to bring together some of the leading practitioners and researchers in learning, education, and technology to share and reflect on methodologies and data from active OLPC implementations and to critically review promising approaches to data collection, assessment, and decision making in one-to-one computing and learning projects. The facilitators and invited presenters were selected for their expertise in new media and computational literacies and curricula, data visualization and alternative forms of assessment, and educational leadership. Workshop participants included researchers and experts from OLPC laptop initiatives in Uruguay, Paraguay, Peru, Nicaragua and Colombia.
Participants
From Country Deployments:
- María De La Paz Peña
Educational Manager of Paraguay Educa, Paraguay.
- Félix Garrido Ching
Educational Coordinator of Zamora Teran Foundation, Nicaragua.
- Andrés Peri Hada
Director of the Research, Evaluation and Statistics Department of ANEP, Uruguay.
- Heddy Beatriz Becerra Tresierra
Volunteer Program Coordinator, Perú.
- Sandra Barragan
Country Manager OLPC Colombia, Colombia.
From OLPC:
- Claudia Urrea
Director of Learning Latin America
- Melissa Henriquez
Educational Coordinator
- Giulia D'Amico
Director of Business Development
Other guest participants:
- Walter Bender
Director of Sugar Labs
- Bakhtiar Mikhak
Director of the Grassroots Invention group (GIG) at the MIT Media Laboratory
Guest Speakers
- Ann Koufman-Frederick
Superintendent of Watertown Public Schools. School Leadership, Professional Development, Program Evaluation, Testing and Assessment, Curriculum. http://users.rcn.com/koufman/resume/index.html
- Joan DiMicco
Design of Studies in Social Media. Research Scientist and Manager, Visual Communication Lab, IBM Research, Cambridge, MA. https://researcher.ibm.com/researcher/view.php?person=us-joan.dimicco
- Evangeline Stefanakis
Associate Professor and Faculty Fellow with the Provost, Educational Foundations, Leadership and Counseling Program, School of Education, Boston University. http://www.bu.edu/sed/about-us/faculty/evangeline-harris-stefanakis/
- Margaret Weigel
New digital media, design and research work http://www.margaretweigel.com/
- Andres Monroy
PhD candidate at the MIT Media Lab, leader of the Scratch online community http://www.mit.edu/~amonroy/
Agenda Day 1
- Opening Session
Welcome and Overview. Introductions and Goals by Claudia Urrea, Director of Learning of OLPC LA
- Thematic Session 1:
Professional Development, New Media and Curriculum, and Assessment. How do all these elements fit together? by Ann Koufman-Frederick
Video of Presentation Part 1 File:Ann Koufman Frederick part 1a.mov, Part 1B File:Ann Koufman Frederick part 1b.mov
Part 2 File:Ann Koufman Frederick part 2.mov, and Part 3 File:Ann Koufman Frederick part 3.mov
Slideshow of presentation File:Ann koufman.ppt
Summary
As superintendent of Watertown Public Schools in Massachusetts, Ann Koufman-Frederick outlined some of the best practices for assessing learning results in technology-based projects. Ann described a pyramid in which the foundational elements for Curriculum Application and Assessment are: access to technology and connectivity (network infrastructure and hardware), technical services and data management, leadership and administrative support (administrative council), and communication and collaboration strategies. To promote the district of Watertown's mission, Ann focuses on two components that encompass all other strategies: 1:1 Access and Professional Learning/Development. Other important tools include Communication & Collaboration, Student Assessment, Content Creation & Publishing, and 21st Century Learning Environments. To measure learning, Ann mentioned examples of innovative performance-based assessment tools. Finally, as a way to measure their own progress, the district analyzes on a yearly basis the stages of successful implementation, evaluating the practices and strategies applied within the school year. This particular event (a one-day retreat with all administrators) provides guidelines for school improvement planning and strategic action plans for the following school year.
- Breakout Session 1:
What do our current professional development and educational curricula promote? What can be improved, and how?
- Thematic Session 2:
Data Visualization: Design of Studies in Social Media by Joan DiMicco. Motivations for Social Networking at Work File:Dimicco-cscw08-beehive-motivations.pdf
Video of Presentation File:Joan DiMicco part 1.mov Slideshow of presentation File:Presentation-JoanDimico.ppt
Summary
Joan DiMicco shared effective strategies and evaluation methods from her research on social software at IBM. The goal of creating social software for the organization is to build a community and simultaneously evaluate the system. The question "What is the ROI (return on investment) of creating a social networking site?" can be translated into "Why are people using the site?", which brings insight into personal benefits. This, in turn, can inform the kind of impact the site has on the organization as a whole. Surveys designed to measure social capital are used to establish a correlation between a user's amount of social capital and the actual behavior observed on the site, the latter obtained through automated system logging. Many of the ideas Joan presented could be effective in our context: providing networking tools for users (teachers, students, technical teams, etc.) and collecting data on user behavior immediately.
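The survey-to-log comparison described above can be illustrated with a minimal sketch. Everything below is hypothetical (invented user IDs, scores, and log events, not the instruments or data from the IBM study); it simply pairs each user's survey-based social-capital score with an activity count derived from site logs and computes their correlation.

 # Hypothetical sketch: relate survey-measured social capital to behavior
 # observed in automated site logs. All names and numbers are invented
 # for illustration; this is not the instrument or data from the study.
 from statistics import correlation  # Python 3.10+

 # Survey results: user id -> social-capital score (e.g. mean of Likert items)
 survey_scores = {"u01": 3.2, "u02": 4.5, "u03": 2.1, "u04": 3.9, "u05": 4.8}

 # Automated log events captured by the site: (user id, action) pairs
 log_events = [
     ("u01", "comment"), ("u02", "post"), ("u02", "comment"),
     ("u03", "login"),   ("u04", "post"), ("u04", "comment"),
     ("u05", "post"),    ("u05", "comment"), ("u05", "friend_add"),
 ]

 # Count logged actions per surveyed user
 activity = {uid: 0 for uid in survey_scores}
 for uid, _action in log_events:
     if uid in activity:
         activity[uid] += 1

 # Align the two measures and compute a Pearson correlation
 users = sorted(survey_scores)
 scores = [survey_scores[u] for u in users]
 counts = [activity[u] for u in users]
 print("Pearson r:", round(correlation(scores, counts), 2))

In a deployment, the survey half might measure teacher or student attitudes, and the log half might come from the laptops' own usage records.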
- Breakout Session 2:
Bring Your Own Data. What data are we collecting? How are we sharing the data with stakeholders and reflecting on it?
- Day 1 Closing
Reports from Breakout Sessions and Reflections
Melissa Henriquez presented a review of the evaluations conducted across the region, "Evaluations of OLPC projects: A review of 2009-2010 reports", which summarizes the most common indicators measured across eight evaluations of OLPC projects as well as the most common tools used for data collection.
Main Ideas Day 1
Participants described the following expected outcomes for the workshop:
- To make explicit what everyone is doing in terms of evaluation and why we need to evaluate.
- To find common indicators: what to measure, and how to measure competences/skills.
- How can we make it easier to collect relevant data with the tools (software/Journal)?
Why do we need to evaluate? For different audiences:
- The public sector/government needs valid data.
- Donors, private and public sector.
- Ourselves, to get feedback on our work and to know if it is working.
- To tell the story at scale.
What do we need to measure?
- Creativity
- Digitalization
- Technological Fluency
- Cost-Benefit Analysis
- Games-Learning
Agenda Day 2
- Overview of the day:
Goals and future actions by Claudia Urrea, Director of Learning of OLPC LA
- Thematic Session 3:
Alternative ways of assessment and understanding impact by Evangeline Stefanakis
Failing Our Students, NYTimes Article File:Failing Our Students-Stefanakis.pdf
Multiple Intelligences and Portfolios: A Window into the Learner's Mind (book, available on arrival)
Video of Presentation Part 1 File:Evangeline stefanakis part 1a, Part 1B File:Evangeline stefanakis part 1b,
Part 2 File:Evangeline stefanakis part 2
Summary
Evangeline Stefanakis demonstrated how useful digital portfolios can be as a comprehensive system for assessing student learning. As a complement to formal, curriculum-based assessments, which can only represent a snapshot at a particular time, we learned how portfolios can make students' learning over a period of time evident. Furthermore, portfolios can improve students' learning and teachers' teaching, since students take ownership of their work and both learning and teaching become objects of reflection. By incorporating the Multiple Intelligences approach, we engage students in projects that target all aspects of their abilities and incorporate different media that allow students to express themselves in the ways they feel most confident. Evangeline's presentation gave us clear ideas on how we can improve the Journal in Sugar by taking advantage of its uses and adapting it toward a more structured way to collect data on children's learning.
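One way to picture the more structured Journal data collection suggested above is a small aggregation sketch. The entry format below is a simplified, hypothetical stand-in (it is not the actual Sugar datastore API): it groups a student's Journal-style entries by activity and counts how many carry a written reflection, the kind of signal a portfolio review could start from.

 # Hypothetical sketch: summarize Journal-style entry metadata for a
 # portfolio review. The entry fields are a simplified stand-in and
 # not the actual Sugar datastore API.
 from collections import defaultdict

 # Simplified entries: which activity was used and any reflection the child wrote
 entries = [
     {"activity": "Write",   "reflection": "My story about the river"},
     {"activity": "Paint",   "reflection": ""},
     {"activity": "Scratch", "reflection": "I made the cat dance"},
     {"activity": "Write",   "reflection": ""},
 ]

 # Group entries per activity and count how many include a reflection
 summary = defaultdict(lambda: {"entries": 0, "with_reflection": 0})
 for e in entries:
     s = summary[e["activity"]]
     s["entries"] += 1
     if e["reflection"].strip():
         s["with_reflection"] += 1

 for activity, s in sorted(summary.items()):
     print(activity, "-", s["entries"], "entries,", s["with_reflection"], "with reflection")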
- Breakout Session 3:
How do we make sense of the whole, by understanding a few?
- Scratch Community http://scratch.mit.edu/
Characteristics of Social Networking and Data Collection by Andres Monroy
Video of Presentation (in Spanish) File:Andres monroy.mov
- Thematic Session 4:
Framework for new media literacy by Margaret Weigel
Confronting the Challenges of Participatory Culture: Media Education for the 21st Century http://newmedialiteracies.org/files/working/NMLWhitePaper.pdf
Slideshow of Presentation File:New media literacy.ppt
Video of Presentation Part 1 File:Margaret weigel.mov and Part 2 File:Margaret weigel part 2.mov
Summary
Margaret Weigel outlined highlights from the New Media Literacy white paper, drawing on her work as research director for the project "Confronting the Challenges of Participatory Culture: Media Education for the 21st Century". The proposed skills to be cultivated in an online context are: Play, Performance, Simulation, Appropriation, Multitasking, Distributed Cognition, Collective Intelligence, Judgment, Transmedia Navigation, Networking, and Negotiation. Margaret gave examples of positive and negative ways students can develop these skills, and she presented clear guidelines for educators to assess students' performance on each skill.
- Breakout Session 4:
Actions to be taken. What is important to measure? Why and how? What data should Sugar collect, and how? How do we communicate the impact, and to whom?
- Day 2 Closing
Reports from Breakout Sessions and Reflections
Additional Resources
Newsletter Elementary Curriculum File:MARCH NEWSLETTER final.pdf
EdLeader21 http://www.edleader21.com/
21st Century Skills: Rethinking How Students Learn http://go.solution-tree.com/21stcenturyskills/
Learning Powered by Technology http://www.ed.gov/technology/netp-2010
Literacy21: Learning in a Changing World https://prezi.com/secure/deb482089982c79d2f3d2296e64f99ecd1e6f776/
P21 Mile Guide Self-Assessment http://www.p21.org/mileguide/
Stages of Successful Implementations http://www.fpg.unc.edu/~nirn/implementation/06/06_stagesimple.cfm
The Horizon Report http://wp.nmc.org/horizon2010/
Honeycomb: Visual Analysis of Large Scale Social Networks File:Vanham-honeycomb-interact09.pdf
The Tripod Project. Tripod Survey Assessments: Multiple Measures of Teaching Effectiveness and School Quality File:TripodProgramassessments-flyer.PDF
Drawing on Education: Using Drawings to Document Schooling and Support Change File:DrawingOnEducation.pdf