“There is a great satisfaction in building good tools for other people to use”
– Freeman Dyson, Disturbing the Universe
Current clients (2015)
(I am now working part-time :-)
- Keynote speaker and discussant at Expert Meeting on Impact Measurement, The Hague. “Oxfam Novib has developed an approach to measure impact, called the World Citizens Panel. The World Citizens Panel is a balanced approach of qualitative and quantitative methods with a focus on changes in people’s lives and making these changes visible in an empowering, inclusive and rigorous way”
- Quality Assurance and technical advisory support to Evaluation Department staff, in relation to the macro-evaluation of DFID’s investments in two policy areas: Empowerment and Accountability, and the Strategic Vision for Girls and Women
- Member of the Evaluation Reference Group for the DFID “Work In Freedom” project, which seeks to prevent the trafficking of women and girls from Nepal, India and Bangladesh into the garment and domestic work sectors in the Gulf.
- Member of the independent advisory group supporting the providers of Monitoring, Evaluating and Learning support to the International Climate Fund
- Developing and implementing an M&E Framework for the Australia–Mekong Non-Government Organisation Engagement Platform (AM-NEP), based in Hanoi
- Advisory support on network analysis of research team collaboration being undertaken by the Institutional Learning and Change Initiative
- Advisory support to the evaluation unit within SolarAid, London.
- Advisory support on evaluation issues to the Humanitarian Centre, Cambridge.
- Exploration of content analysis methods with MSC stories, with Results In Health, Leiden
- Exploration of methods suitable for evaluating portfolios of projects, with Comic Relief, London
- Synthesis of literature on evaluability assessments. Purpose: to produce a short practical note that summarises the literature on evaluability assessments and highlights the main issues for consideration in commissioning an evaluability assessment.
- Lead adviser on the design and implementation of an evaluability assessment for evaluations of DFID investments in ‘Empowerment and Accountability’ and DFID’s Strategic Vision for Girls and Women.
- Working with a team led by Elliot Stern with the task of “Developing a broader range of rigorous designs and methods for impact evaluations”.
- Member of Advisory Group for the ODI Policy Influence Monitoring Project – which will seek to monitor, evaluate and build the capacity of 3ie grantees as they seek to ensure that results of their impact evaluations are used by policy makers
- Complex systems tools: a PRF-funded research project – working with Ben Ramalingam on exploring the use of network models for conceptualising and measuring change in DFID-funded women and girls empowerment programmes.
- Building M&E capacity of Government of Vietnam partners in phase 4 of the AID funded Human Rights Technical Cooperation Project, implemented with the assistance of the Australian Human Rights Commission. This follows on from my earlier involvement in the review and design stages of phases 3 and 4 respectively, and the 2010 review of the first three phases of this project, described below.
- Working with Tracey Delaney to review World Vision’s experience with the use of the Most Significant Change (MSC) technique.
- Two one-day workshops for BBC Media Action staff on possible approaches to the evaluation of portfolios of projects, including QCA, Decision Tree modelling, and evaluability assessments
Other ongoing activities
- Managing the website and moderating the associated email lists (2500+ members)
- Contributor to the BetterEvaluation website, implemented by RMIT University, ILAC, PACT and ODI.
- Peer review of papers for publication in the American Journal of Evaluation, Evaluation and Program Planning, and Development in Practice
Conference presentations and workshops
- UK Evaluation Society Conference, London, March 2014: panel presentation on the use of evaluability assessments
- European Evaluation Society Conference, Dublin, October 2014: workshop on the use of evaluability assessments, and a presentation on the triangulation of the results of QCA analyses