Our Quality Assurance System is built on the three principles of our partnership: Focus, Collaboration, and Integrity.
Principle One: Focus
We will only ever work where we can truly add value. By choosing to specialise in development and humanitarian evaluation, we seek to continuously grow our cadre of talented evaluators: people who combine intellectual and ethical rigour with creative passion.
Principle Two: Collaboration
We have a deeply held commitment to partnership as a means of creating change; indeed, our business structure is itself a partnership. We believe that two heads are always better than one: our consultants always draw on our wider network to reflect on, enhance, and share their understanding. Our way of working is that of the facilitator: we build the capacity of those with whom we work.
Principle Three: Integrity
We recognise the nature of evaluation as an intervention in people’s lives. The responsibility of the evaluator is that of one who can speak truth to power and give voice to the marginalised. We approach evaluation from a perspective of advantaged thinking: recognising the knowledge and capacity of all people as agents of change. We bring integrity not only to our relationships with our clients and their stakeholders, but also to the coherence of our values, methodological designs, and outputs.
The design of our quality assurance system draws from our combined experiences as partners from a broad range of backgrounds: in international development, social enterprise, and the private sector. Our partners have designed and implemented Evaluation Quality Assurance systems, including helpdesks, for UNICEF, IOD PARC and DFID.
We are committed to continuous learning and improvement. This is based on hands-on experience of implementation through our operation of The CHILD Trust and pro bono support of early stage social enterprise start-ups with OpenGround.
We are committed to leading the way in integrating human-centred design into evaluation: enhancing the usability and experience of evaluations for our clients and their stakeholders.
QA Level 1 – Recruitment
Our Partners, Advisors and Fellows are hand-picked. We only ever invite people to join us if we have previously worked with them. Each is a unique individual who combines technical expertise with creative passion. All of our team members hold, or are actively working towards, relevant Master's-level qualifications.
Whilst each of our team has different – and complementary – knowledge and skills, we have a shared mindset: an openness to and passion for understanding the Other; a lifelong commitment to social justice; and determination to deliver excellence in our work.
QA Level 2 – Proposals
We approach each potential assignment as an opportunity to enhance the impact our clients can create. If we believe that others can deliver this better than we can, then we either refrain from applying, or we partner with others whose capacity we can complement.
For each proposal we draw on international standards, such as UNEG and OECD-DAC, and continuously keep up-to-date with cutting-edge developments in evaluation, development, and social technology. Our Senior Partner for Consulting is a member of the International Development Evaluation Association (IDEAS) and an Observer Member of ALNAP.
QA Level 3 – Evaluation Design
Each and every evaluation is different. We work to ensure that each study’s findings, learning, conclusions and recommendations are clearly located within a rigorous process, which meets international and UN standards.
Our team draws on a wide range of methodologies, designs, and tools to develop the most appropriate solutions. Wherever possible and appropriate, we use mixed methods to enhance reliability and evaluative depth. Our theory-based, mechanism-based, and difference-based designs are always informed by ethical standards, developed to real-world criteria, and apply participatory principles. We never choose an evaluation design because it is fashionable.
QA Level 4 – Process Monitoring
The Team Leader holds overall responsibility for quality assurance. This is based on close working with each client to further refine the ToR and identify robust mechanisms for validating and communicating emerging findings. We have piloted an innovative approach to iterative reporting and documentation: a single evaluation report that goes through multiple small updates as the evaluation proceeds. These version changes are continuously shared with our clients for comment, leading to strong understanding and ownership of the final report.
We implement systematic time and task management using Asana. Combined with clearly structured evaluation plans, this enables us to attribute time-spent to specific deliverables. For large or complex evaluations, we appoint one of our M&E Fellows to act as the Evaluation Coordinator.
QA Level 5 – Learning and Communication
A Senior Partner not involved in the assignment conducts internal quality assurance and learning reviews according to an appropriate schedule. These reviews can result in action plans to ensure effective process documentation and the implementation and monitoring of any required changes. The QA review process also benefits from our involvement across multiple sectors and the multiple perspectives this brings.
In 2013 we launched a brand new version of ImpactReady.org with a space dedicated to sharing learning. For each assignment we endeavour to develop a ‘Million Dollar Slide’: one slide that communicates the key insight from the process (we never share insights about our clients’ work, only the process we followed). By making these publicly available, we hope both to build the knowledge of the sector and to invite feedback and discussion for further improvement.