Why is something working? Or not working? Qualitative Comparative Analysis as an evaluation tool provides useful insights into the combinations of factors at play

How can we explain why certain media products (television, radio, printed media, internet, social media) trigger a response from powerful actors, while others do not? This was the key question that Valérie Pattyn, university professor at the University of Leiden, addressed when she conducted a qualitative evaluation for the Hivos Media Program in Kenya and Tanzania. The goal of this program was to increase the accountability of the government and other powerful actors. The assumption behind the intervention logic was that strengthening investigative journalism would increase accountability. The program intervention focused on financing critical investigative journalism and on providing mentoring to journalists in Kenya and Tanzania through coaching and learning by doing.

[Photo: Valérie Pattyn]

Which factors influence the outcomes of the program, and which do not? For her research, Valérie Pattyn applied Qualitative Comparative Analysis (QCA), a social science research method that applies systematic comparison to case studies. QCA helps to explore why some interventions were successful in achieving a particular outcome while others were not. The evaluation focused on generating qualitative rather than quantitative information, since the intention was to generate explanations of why something worked or did not work. Valérie's presentation was organized by Rutgers (knowledge centre for sexuality) in partnership with the Learning Community Evaluation of Nedworc. The meeting was attended by M&E officers from the Strategic Partnership Partners - Dialogue and Dissent, and by monitoring & evaluation experts from Nedworc.


[Photo: visitors at the meeting]

Comparing successful and unsuccessful cases, and collecting data
The set-up of the evaluation process consisted of four steps:
1. Design of the evaluation;
2. Data collection;
3. Data analysis;
4. Interpretation of the findings.

In the design stage, a number of successful and unsuccessful cases were selected by the multi-disciplinary evaluation team. The selection criterion was that cases should not have been influenced by extraordinary contextual factors, so that the case studies would be comparable: the conditions under which the media products were applied had to be similar. It was decided to analyse the Kenya and the Tanzania case studies separately, since the political context and the educational and media environments in the two countries differed significantly. In QCA, 'conditions' are the factors under investigation: under which conditions do media products generate a response from citizens, and under which conditions do they not? With QCA you can investigate which (combinations of) conditions are necessary and/or sufficient to accomplish the desired outcome.
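To make the logic of necessity and sufficiency concrete, the sketch below computes both measures for a single condition on a handful of invented crisp-set (0/1) scores. The condition and outcome names ("mentoring", "actor_response") are hypothetical; a real analysis would use dedicated QCA software, such as the tools available via COMPASSS.

```python
# Minimal sketch of necessity and sufficiency in crisp-set QCA.
# All case data and names are invented for illustration only.

cases = [
    {"mentoring": 1, "actor_response": 1},
    {"mentoring": 1, "actor_response": 1},
    {"mentoring": 0, "actor_response": 0},
    {"mentoring": 1, "actor_response": 0},
]

def sufficiency(cases, condition, outcome):
    """Share of cases with the condition that also show the outcome.
    A value close to 1.0 suggests the condition is sufficient."""
    selected = [c for c in cases if c[condition] == 1]
    return sum(c[outcome] for c in selected) / len(selected)

def necessity(cases, condition, outcome):
    """Share of cases with the outcome that also show the condition.
    A value close to 1.0 suggests the condition is necessary."""
    selected = [c for c in cases if c[outcome] == 1]
    return sum(c[condition] for c in selected) / len(selected)

print(sufficiency(cases, "mentoring", "actor_response"))  # ~0.67: not sufficient
print(necessity(cases, "mentoring", "actor_response"))    # 1.0: necessary
```

In this toy data set, mentoring appears necessary (every case with a response was mentored) but not sufficient (one mentored case saw no response); in practice QCA looks for such patterns across combinations of several conditions.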
This stage was followed by a regional workshop, where stakeholders had the opportunity, in a systematic discussion, to give constructive input on the conditions proposed by the evaluation team. Usually a shortlist of three to six conditions to be evaluated is drawn up in this stage. Here, it was decided to link the conditions to the journalist and to the media product.


[Figure: conditions linked to the media product]

The data collection was done through an extensive survey, complemented by interviews to collect additional narrative information. Some of the interviews were conducted anonymously, since some informants wanted to preserve their confidentiality. Part of the data collection consisted of calibrating the data by coding them on a scale between 0 and 1.

[Figure: calibration example for the salience of a media item]
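As an impression of what such a calibration can look like, the sketch below implements the so-called direct method of fuzzy-set calibration: a logistic transformation that maps raw values onto membership scores between 0 and 1 using three anchors. The salience scale and all anchor values are assumptions for illustration; the coding scheme actually used in the evaluation may have differed.

```python
import math

def calibrate(raw, full_non, crossover, full_in):
    """Map a raw value to a fuzzy membership score in [0, 1].

    full_non  -> score ~0.05 (fully out of the set)
    crossover -> score  0.5  (point of maximum ambiguity)
    full_in   -> score ~0.95 (fully in the set)
    """
    # Logistic transformation: deviations from the crossover are scaled
    # so that the two outer anchors land on log-odds of -3 and +3.
    if raw >= crossover:
        log_odds = 3.0 * (raw - crossover) / (full_in - crossover)
    else:
        log_odds = 3.0 * (raw - crossover) / (crossover - full_non)
    return 1.0 / (1.0 + math.exp(-log_odds))

# Hypothetical example: salience of a media item on a 0-10 scale,
# with assumed anchors at 2 (out), 5 (crossover) and 8 (in).
for raw in (1, 4, 5, 7, 9):
    print(raw, round(calibrate(raw, 2, 5, 8), 2))
```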



Identifying combinations of factors that lead to success
During the data-analysis phase, the evaluation team identified the combinations of conditions that were sufficient for the outcome of the project (actor response) and the conditions that were necessary for that response.
During this phase, specific software was used to transfer all the collected data into the 'truth table'. This was done by coding the data from the case studies as 0 or 1, based on the conditions that had been calibrated in the data collection phase.

[Figure: truth table]
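The sketch below gives an impression of this step: coded (0/1) case data are grouped so that every distinct configuration of conditions becomes one row of the truth table, together with the outcomes observed for that configuration. All condition names and scores are invented for illustration; the evaluation itself used dedicated QCA software.

```python
from collections import defaultdict

# Minimal sketch of transferring coded case data into a truth table.
# Condition names and 0/1 scores are invented for illustration.

CONDITIONS = ["senior", "mentored", "salient"]

cases = [  # hypothetical coded case studies
    {"senior": 1, "mentored": 1, "salient": 1, "response": 1},
    {"senior": 0, "mentored": 1, "salient": 1, "response": 1},
    {"senior": 1, "mentored": 0, "salient": 0, "response": 0},
    {"senior": 0, "mentored": 1, "salient": 1, "response": 1},
    {"senior": 1, "mentored": 1, "salient": 0, "response": 0},
]

# Group cases that share the same configuration of conditions:
# each distinct configuration becomes one row of the truth table.
rows = defaultdict(list)
for case in cases:
    configuration = tuple(case[c] for c in CONDITIONS)
    rows[configuration].append(case["response"])

for configuration, outcomes in sorted(rows.items()):
    print(dict(zip(CONDITIONS, configuration)),
          "cases:", len(outcomes), "outcomes:", outcomes)
```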
Paths that generate success in one case study but failure in another are eliminated in this phase; these are called the contradictory paths. Finally, based on the processing of the data, the research identified seven paths, i.e. combinations of factors, that led to a high response from citizens.
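A minimal sketch of this elimination step, on invented truth-table rows: configurations whose cases consistently show the outcome are kept as candidate paths, while contradictory rows are dropped. Real QCA software then applies Boolean minimization to reduce the surviving rows to a minimal set of paths (in this evaluation, that step yielded the seven paths mentioned above).

```python
# Sketch of eliminating contradictory paths from a truth table.
# Rows are invented; keys are (senior, mentored, salient) scores.
truth_table = {
    (1, 1, 1): [1, 1],       # always success
    (0, 1, 1): [1, 1, 1],    # always success
    (1, 0, 0): [0],          # always failure
    (1, 1, 0): [1, 0],       # contradictory: success AND failure observed
}

# Keep only configurations that consistently produce the outcome;
# contradictory rows are eliminated from the analysis.
paths_to_success = [
    config for config, outcomes in truth_table.items()
    if all(outcome == 1 for outcome in outcomes)
]
print(paths_to_success)   # [(1, 1, 1), (0, 1, 1)]
```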


The QCA intervention was finalized with the interpretation of the paths that led to success. This is the stage where meaning was given to the different paths, and where interaction with the stakeholders is required to identify and explain the causalities between the different factors that led to success.
For example, in this evaluation it was found that journalistic experience (seniority) played a major role in the ABSENCE of actor response. This finding generated a lot of questions that needed further discussion and analysis. Is experience not equal to journalistic talent, or do experienced journalists feel themselves too mature to be influenced by the mentoring advice offered by the program? This is thus the final stage, where underlying meaning is generated through discussion involving the stakeholders.

Challenges and lessons
Based on the experience gained from this research in Kenya and Tanzania, the team encountered the following challenges and drew the following lessons:
· The data collection was a tremendous and challenging job. The differences in data quality, the absence of essential data, and the translation of a huge amount of data into summarized data meant a lot of work for the evaluation team. The objectivity of the data was also a dilemma: some factors, such as education level or geographical outreach, were not 100% objective. The heterogeneity of the case studies also made it a challenge to find common criteria for calibration.
· QCA demands more than a regular evaluation process: it requires a lot of systematic and thorough work. The method carries the danger that stakeholders will develop participation fatigue if the evaluation is not well prepared. It is therefore recommended to combine this evaluation with other methods, such as process tracing of typical case studies. Two weeks, the timetable for most external evaluation interventions, is too short for conducting a QCA. The method requires a wider time span and sufficient financial resources, especially for the design and data collection phases. For this evaluation, completing the research took twelve months.
· QCA can be applied with a limited number of case studies; however, you still need to be able to select suitable case studies from a medium-sized project. In this research, 60 of the 200 case studies were selected for further research.
· QCA is a method that is still under development and in an experimental phase. The number of cases where QCA has been applied is still limited; best practices are therefore needed to get a better idea of where the method can have the most benefit and potential.
· For the donor, the results were still puzzling: what to do with the pattern findings? More time should have been allocated to joint sense-making after the analysis.
· It is difficult to get people to talk about negative cases. Yet where the outcome is absent, it is essential for this method to find data on why something did not work.


Added value and the potential of QCA
Although QCA is a method in development, it has great potential for further application. The method uncovers causalities in programs that take place in a complex environment, and it helps to explain why certain combinations of factors work and why others do not. This creates great potential for learning and evaluation. QCA is based on evaluating existing programs by comparing cases that have been generated by the specific intervention. The method is accessible and transparent: you can consult and download the procedures and software on the website of COMPASSS, http://www.compasss.org/. Another interesting publication is an evaluation of violence against women and girls, in which a combination of QCA and process tracing was used. This evaluation, implemented by DFID, won the best poster award at the EES Conference. >>>> Read report. The instruction manual for conducting a QCA is available here: >>>> Handout QCA.

Simon Koolwijk
Expert in participatory video

July 2016
E-mail: faccom@xs4all.nl
