Tuesday, June 24, 2014

Exciting Learning from people involved in South African ICT in Education

I've been fortunate to be invited to a small gathering of people working with Coza Cares in the ICT space in South Africa. The luxury of sitting down for two days and listening to people talk about what they are passionate about is something to truly savour.

I did a presentation on some ideas I have to define and measure learners' 21st Century Skills in the context of the ICT4RED project. I currently have more questions than answers, but I'm sure we will get there soon. Here is a link to a summary table comparing different Definitions of 21st Century Skills.

Other presentations I really enjoyed were:
* Barbara Dale Jones on the role of Bridge and learning communities and knowledge management
* Fiona Wallace on the CoZaCares model of ICT intervention
* John Thole on Edunova's programme to train and deploy youth to support ICT implementation in schools
* Siobhan Thatcher on Edunova's model for deploying Learning Centres in support of schools
* Brett Simpson from Breadbin Interactive on the lessons they've learnt from deploying their content repository
* Ben Bredenkamp from Pendula ICT on their model for ICT integration and their experience of the One Laptop per Child project in South Africa
* Dylan Busa from Mindset on the relaunch of their website content
* Merryl Ford and Maggie Verster on the ICT4RED project




Wednesday, May 28, 2014

Impact Evaluation Guidance for Non-profits

InterAction has this lovely Guidance Note and Webinar Series on Impact Evaluation available on their website.

Impact Evaluation Guidance Note and Webinar Series

With financial support from the Rockefeller Foundation, InterAction developed a four-part series of guidance notes and webinars on impact evaluation. The purpose of the series is to build the capacity of NGOs (and others) to demonstrate effectiveness by increasing their understanding of and ability to conduct high quality impact evaluation.
The four guidance notes in the series are:
  1. Introduction to Impact Evaluation, by Patricia Rogers, Professor in Public Sector Evaluation, RMIT University
  2. Linking Monitoring & Evaluation to Impact Evaluation,  by Burt Perrin, Independent Consultant
  3. Introduction to Mixed Methods in Impact Evaluation, by Michael Bamberger, Independent Consultant
  4. Use of Impact Evaluation Results, by David Bonbright, Chief Executive, Keystone Accountability
Each guidance note is accompanied by two webinars. In the first webinar, the authors present an overview of their note. In the second webinar, two organizations - typically NGOs - present on their experiences with different aspects of impact evaluation. In addition, each guidance note has been translated into several languages, including Spanish and French. Webinar recordings, presentation slides and the translated versions of each note are provided on the website.

Wednesday, May 21, 2014

Resources on Impact Evaluation

This post consolidates a list of impact evaluation resources that I usually refer to when I am asked about impact evaluations. 

This cute video explains, in two minutes, the factors that distinguish impact evaluation from other kinds of evaluation. Of course randomization isn't the only way of credibly attributing causes and effects - and this is a particularly hot evaluation methodology debate. For an example of why this is sometimes an irrelevant debate, see this write-up on parachutes and Chris Lysy's cartoons on the topic.

Literature on the Impact Evaluation Debate

The Impact Evaluation debate flared up after the report titled "When Will We Ever Learn" was released in 2006. In the United States there was also a prominent funding mechanism (from about 2003) that required programmes to include experimental evaluation methods in their design or forgo funding.

The bone of contention was that Randomized Control Trials (RCTs) and experimental methods (and to some extent quasi-experimental designs) were held up as the "gold standard" in evaluation - which, in my opinion, is nonsense. So the debate about what counts as evidence started again. The World Bank and big corporate donors were perceived to push for experimental methods; evaluation associations (with members committed to mixed methods) pushed back, saying methods can't be determined without knowing what the questions are; and others pushed back saying that RCTs are probably applicable in only about 5% of the cases in which evaluation is necessary.

The methods debate in evaluation is really an old debate. Some really prominent evaluators decided to leave the AEA because it took a position that they equated with "the flat earth movement" in geography. Here is a nice overview article (The 2004 Claremont Debate: Lipsey vs. Scriven. Determining Causality in Program Evaluation and Applied Research: Should Experimental Evidence Be the Gold Standard?) that summarises some of it.
The Network of Networks in Impact Evaluation then sought to write a guidance document, but even after this was released, there was a feeling that not enough was said to counter the "gold standard" mentality. The document titled "Designing impact evaluations: different perspectives" provides a bit more information on the "other views".

Literature on Impact Evaluation Methods
If you are interested in literature on evaluation methods, look at Better Evaluation to get a quick overview.

I like Cook, Campbell and Shadish for understanding experimental and quasi-experimental methods, but this online knowledge base resource is good too.
 
For resources on more mixed-methods approaches to impact evaluation, look at Realist Synthesis, the General Elimination Method, Theory-Based Evaluation, and something that I think has potential: the Collaborative Outcomes Reporting approach.


The South African Department for Performance Monitoring and Evaluation's guideline on impact evaluation is also relevant if you are interested in work in the South African context.

Wednesday, April 02, 2014

Getting authorisation to do Research and Evaluation in Schools

 
A colleague working in an educational NGO asked this question about working in schools in South Africa:

I just wanted to ask a quick question. Do I need to get permission from the relevant Provincial Department of Education to carry out research in schools if the schools are part of a project we’re running? In other words, the district is aware of us and probably interacting with us?
 
My answer: 
I've only done research or evaluations in a few Provinces, not all of them, but in all of those Provinces the Education Departments have guidelines for researchers that require you to fill in forms, submit your research proposal (and sometimes evaluation instruments) for review, and also bind you to some promises about the use of your research or evaluation findings (e.g. the Province may require copies of reports, may require you to present your findings, etc.). Check any Province's annual report to see which Director in the Provincial office is in charge of research, and lodge your enquiry about requirements there if you can't find details on the Provincial Education website.

The officials in Education Districts are often not aware of the Provincial requirements, so one might be able to get away without Provincial authorisation, but this is a bad idea for at least two reasons:

* It helps if the Research Directorate in the Provincial Education Department has your details on their database, because it promotes the use and coordination of research, and
* It can save you a lot of headaches should someone complain about your research going forward.

Since education in schools is a Provincial competence, I have been unable to get blanket approval from the National Education Department to work in multiple Provinces - so that meant filling in different forms, providing different details to the different Provinces, and following up on the outcome of each of these processes.

Besides Provincial approval, some clients might also require that any human subject research be vetted by a research ethics approval board, like the ones attached to universities or science councils. I've only dealt with a few of these, but they mostly require you to prove that you have authorisation to conduct the research, so the two go hand in hand.

Of course, approval by the Province and Research Ethics Boards is still not all you need to ensure that you conduct your work ethically. Some fields (e.g. marketing research - see the ESOMAR guidelines) have their own ethics guidelines, so it would be good to study these and make sure your practice remains above board.

And then this, of course, is also true:

Live one day at a time emphasizing ethics rather than rules.
Wayne Dyer


 

Thursday, March 27, 2014

I am because you are

In a previous blogpost I reflected on how African values shape my practice of Evaluation.

This week I attended a seminar during which Gertjan van Stam shared some provocative views on development in Africa. I started reading his book 'Placemark'. I love the way he gives voice to rural Africa. I find it interesting that this Dutch engineer manages to give voice to Africa in a way that I can relate to.

His beautifully written take on Ubuntu:

I am, because You are

Is it possible that people in rural areas of Africa can connect with people in urban areas around the world?

That one can walk into a scene and meet someone who walks into the same scene, even if it is geographically separated?

That we explore and connect rural and urban worlds worldwide without anyone being forced into cultural suicide?

That we meet around the globe and relate, embrace, love, and build meaningful relationships?

That we find ways to be of significance and support to each other and together shuffle poverty and disease into the abyss?

That we encourage each other to withstand drunkenness and drugs, bullying, self-harm, and greed?

That we share spiritual nutrition to deal with wealth, loss, alienation and pain in this generation?

That we unite through social networks, overcoming divides and separations?

That we share ancient, tested, and new resources, opportunities, visions, and dreams that lead to knowledge, understanding and wisdom?

That we collaborate to discuss, and engineer tools, taking into account the integral health of all systems?

That together, South and North, build capacity, mutual accountability, and progress, for justice and fairness?

That I am, because You are?