Wednesday, October 07, 2015

What I'm up to at the 2015 SAMEA Conference

The SAMEA conference is happening from 12 to 16 October and I'm looking forward to it. 

Since January, I've had to temporarily downscale my professional involvement in the M&E and educational networks, and I've had to neglect this little blog a bit, because of a second long-term development project I took on in January 2015. The project has lovely brown eyes, an infectious laugh and goes by the name of Clarissa. I'm happy to report that no major clashes with the first development project (named Ruan) have so far occurred, but it's been a bit of an adjustment to balance work, volunteering, and life in general. 

So what am I up to at the conference?
I'll be tweeting from @benitaw if you are interested in my perspective on the conference. I will also be at the IOCE stand, aiming to promote the VOPE Institutional Capacity Toolkit, which my consultancy developed under the EvalPartners leadership of Jennifer Bisgard, Patricia Rogers, Jim Rugh, and Matt Galen. This is an online toolkit full of helpful resources, aimed at equipping VOPEs (Voluntary Organisations for Professional Evaluation) to become more accountable and more active. 

Then, I'll be teaming up with Cara Waller (from CLEAR) and Donna Podems (from OtherWise) in a session for African VOPEs on Friday 16th October. This is a ‘world-café’ style event, from 10:00 to 11:30am, held as a joint ‘Made in Africa’ and ‘Discussing the Professionalisation of Evaluation and Evaluators’ stream session. The aim of the session is to provide a space for those involved with VOPEs in the region (and those with an interest in strengthening African VOPEs) to come together to discuss current topics around building quality supply and generating demand for evaluation in contextually specific ways. So please come and chat all things VOPE on the day!

Good luck to my colleague Fazeela Hoosen and the rest of the SAMEA board on hosting this year's conference with the DPME and the PSC. I know (and boy... do I know) it is very hard work. So thanks in advance for all of the hours you are putting in to make this event happen. 

Thursday, October 16, 2014

True Confessions of an Economic Evaluation Phobic

You know how the forces at work in the universe sometimes conspire and confront you with a persistent nudge... over and over again? Well, this week's nudge was "You know nothing about economic evaluation... do something about it - other than ignoring it".

Words like "cost-benefit analysis, cost-efficiency analysis, cost-utility analysis"... actually anything with the word "cost" or "expenditure" in it... makes me nervous. So my usual strategy is to ignore the "Efficiency" criterion suggested by the OECD DAC, or I start fidgeting around for the contact details of one of my economist friends, and pass the job along. I have even managed to be part of a team doing a Public Expenditure Tracking Survey without touching the "Expenditure" side of the data.

But then I found these two resources that helped me to start to make a little bit more sense of it all. They are:

The South African Department of Planning, Monitoring and Evaluation's Guideline on Economic Evaluation. At least it starts to explain the many different kinds of economic evaluation you should consider if you work within the context of South Africa's National Evaluation Policy Framework.

And then this.

A free ebook by Julian King that presents a short theory to help answer the question "Does XYZ deliver good (enough) value for investment?", which is essentially the question any evaluator is supposed to help answer.
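Just to convince myself that the basic arithmetic is not the scary part, here is a toy example (my own made-up numbers, not taken from either resource) of two of those "cost" terms: a cost-effectiveness ratio (cost per unit of outcome) and a benefit-cost ratio (monetised benefits divided by costs).

```python
# Toy illustration (made-up numbers) of two common "cost" calculations.

# A hypothetical programme: total cost, outcomes achieved, and the
# monetary value we are willing to attach to each outcome.
total_cost = 1_200_000          # e.g. Rand spent on the programme
learners_reached = 3_000        # outcome in natural units
value_per_learner = 650         # assumed monetised benefit per learner

# Cost-effectiveness: cost per unit of outcome, no monetising of benefits.
cost_per_learner = total_cost / learners_reached
print(f"Cost-effectiveness: R{cost_per_learner:,.2f} per learner reached")

# Cost-benefit: put a money value on the outcome, then compare it to cost.
total_benefit = learners_reached * value_per_learner
benefit_cost_ratio = total_benefit / total_cost
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f} "
      "(above 1 means benefits exceed costs on these assumptions)")
```

The division is the easy bit, of course; the hard part is deciding what counts as a cost, what counts as a benefit, and how to value them, which is where resources like these come in.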

So, now, there is one more topic on my ever-expanding reading list! If there is a "Bible" of economic evaluation, let me have the reference, ok?

Friday, September 12, 2014

What if, mid-career as a researcher, you become interested in Evaluation?

An old classmate, who took the market research route after completing her Research Psych Master's degree, asked me for a couple of references to check out to develop her evaluation knowledge and skills. What came to mind are the following professional development resources. I'm sure there are many more easily accessible ones, but this is a good start for a list!

Tuesday, August 12, 2014

Further Resources and Links for those who attended the Bridge M&E Colloquium on 12 August 2014

Today, I got the opportunity to present at the Bridge M&E Colloquium on the work I'm doing with the CSIR Meraka Institute on the ICT4RED project. My first presentation gave some background on the project. 

I referred to the availability of the Teacher Professional Development course under a Creative Commons licence here. This resource also includes a full description of the micro-accreditation, or badging, system. 

What seemed to get the participants in the meeting really excited is the 12-component model of the project, which suggests that one has to pay attention to much more than just technology when implementing a project of this nature. My colleagues published a paper on this topic here.

The "Earn as you Learn" model that the project follows also resonated with participants: if teachers demonstrate that they meet certain assessment criteria, they earn technology and peripherals for themselves and for their schools. A paper on the gamification philosophy that underlies the course is available here. The Earn as you Learn model was also documented in a learning brief here.

And then I was able to speak a little more about the evaluation design of the project. The paper that underlies this work is available here, and the presentation is accessible below:

I think what sets our project evaluation apart from many others being conducted in South Africa is that it truly uses "Developmental Evaluation" as the evaluation approach. For more information about this (and for a very provocative evaluation read in general), make sure you get your hands on Michael Patton's book. A short description of the approach and a list of other resources can also be found here.

People really liked the idea of using Learning Briefs to document learning for and from team members, and to share it with a wider community. This is an idea inspired by the DG Murray Trust. I blogged about the process and template we used before. An example of the learning brief that the M&E team developed for the previous round is available here. More learning briefs are available on the ICT4RED blog.

I also explained that we use the Impact Story Tool for capturing and verifying an array of anticipated and unanticipated impacts. I've explained the use and analysis of the tool in more detail in another blog post. There was immediate interest in this simple little tool.

A neat trick that also got some people excited is how we use SurveyMonkey. To make sure that our data is available quickly to all potential users on the team, we capture our data (even data collected on paper) in SurveyMonkey, and then share the results with our project partners via SurveyMonkey's sharing interface, even before we've really been able to analyse the data. The SurveyMonkey site explains this in a little more detail, with examples.
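We do all of this through SurveyMonkey's own interface rather than through code, but for the curious, here is a rough sketch (not our actual script; the file name and column names below are made up) of how an exported SurveyMonkey responses CSV could be turned into a quick one-screen summary to circulate before any proper analysis happens.

```python
# Rough sketch: turn an exported SurveyMonkey responses CSV into a quick
# summary that can be shared with project partners straight away.
# The file name and question/column names are placeholders.
import pandas as pd

responses = pd.read_csv("post_training_feedback_export.csv")  # placeholder file

print(f"Responses captured so far: {len(responses)}")

# Answer counts for a few closed questions (hypothetical column names).
for question in ["Overall rating", "Would recommend to a colleague"]:
    if question in responses.columns:
        print(f"\n{question}:")
        print(responses[question].value_counts(dropna=False).to_string())
```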

The idea of using non-traditional electronic means to help with data collection also got some participants excited. I explained that we have a WhatsApp group for facilitators, and we monitor this, together with our more traditional post-training feedback forms, to ascertain whether there are problems that need solving. In an upcoming blog post, I'll share a little bit about exactly how we used the WhatsApp data and what we were able to learn from it.
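That post will have the detail, but as a taste: WhatsApp can export a group chat as a plain-text file, and even a small script can then flag messages that mention trouble. Here is a rough sketch (not our actual process; the export format varies by phone and locale, and the file name and keyword list are just examples).

```python
# Rough sketch: scan an exported WhatsApp group chat for messages that
# mention possible problems. The export format assumed here is the common
# "DD/MM/YYYY, HH:MM - Sender: message" layout; adjust the pattern if your
# phone exports chats differently.
import re

LINE = re.compile(r"^(\d{2}/\d{2}/\d{4}), \d{2}:\d{2} - ([^:]+): (.+)$")
PROBLEM_WORDS = ("problem", "broken", "not working", "stolen", "help")  # example keywords

flagged = []
with open("facilitators_group_chat.txt", encoding="utf-8") as chat:  # placeholder file
    for raw in chat:
        match = LINE.match(raw.strip())
        if not match:
            continue  # skip continuation lines, media notices, etc.
        date, sender, message = match.groups()
        if any(word in message.lower() for word in PROBLEM_WORDS):
            flagged.append((date, sender, message))

print(f"{len(flagged)} possible problem messages")
for date, sender, message in flagged[:10]:
    print(f"{date} | {sender}: {message}")
```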