How to test our innovations? Defining criteria to select the most appropriate methods

Authors: Philipp Grunewald & Thomas Baar

The objective of innovation is to add value. Through a pilot you might want to learn whether, and how, your innovation works in the real world. But how do you successfully generate the required evidence? How do you select the most appropriate method(s) to generate evidence during your pilot?

During the Humanitarian Innovation Exchange (26 June 2019) in The Hague, the Isôoko project contributed to a session on “Planning Research in a Pilot Project”. On the basis of a case study provided by us, we reflected together with others on how to set up a research design for our trials.

Humanitarian Innovation Exchange

The Humanitarian Innovation Exchange brought together over 100 supporters and practitioners of humanitarian innovation to share learning and experience, and to evaluate and develop new tools and approaches for humanitarian innovation. 

Drawing on the Humanitarian Innovation Guide and other new resources under development, the event hosted several workshops to focus on key challenges in humanitarian innovation processes such as generating evidence through pilots.

Planning Research in a Pilot Project

The objective of a pilot is to learn whether, and how, your innovation works in the real world. But, how do you select the most appropriate method(s) to generate evidence during your pilot? Despite growing investments in humanitarian innovation, there are very few resources available for selecting appropriate methods by which to generate and apply evidence on innovation. 

In this session, we explored how generating evidence on innovation differs from traditional monitoring and evaluation processes. We focussed on which criteria should be taken into consideration when selecting appropriate research methods for innovation pilots, on the basis of two case studies, one of which we provided ourselves.

Democratic Reflections

In March 2019, we conducted a first series of trials on the use of Democratic Reflection. This is a tool developed by the Open University to find out how viewers react to what they see and hear in televised election debates. We explored how this tool can be used on top of existing audiovisual content to develop capacities for active listening, empathic engagement and critical thinking.

Together with Aegis Trust, we tested this tool in relation to some of the audiovisual content developed for their peace and values education programme. We ran several tests with groups of both youth and parents. During this session, we reflected on the research design we had adopted and which criteria might help us judge whether it was the most appropriate method.

Some lessons learned

There were valuable exchanges of experiences and insights among the participants of the workshop.

  • Methodologies often have to respond to competing demands (addressing different priorities and questions from different stakeholders);
  • Finding the most appropriate methods for any trial is a complex process (partly due to the competing demands outlined above) influenced by many variables;
  • Focussing on the decisions and practices the trial findings are meant to inform can be very valuable in arriving at a methodology, and appropriate methods, that work for a specific situation;
  • Frameworks can be invaluable in helping lay (as well as professional) researchers come up with robust methodologies that produce viable findings.

The workshop helped us to develop new resources for assessing the appropriateness of different research methods for advancing both trials and pilots. These resources will be published as part of the Humanitarian Innovation Guide, and relate to our work around Isôoko as a Process. They will also support us in designing subsequent trials and pilots for testing the Isôoko platform in both Rwanda and Kenya.

If you want to read more about this breakout session and the Humanitarian Innovation Exchange, please see the following blog post: Does humanitarian innovation really work? New ways to think about evidence.
