14 Process Tracing

Estelle Raimondo


Process tracing is a theory-based evaluation approach. Based on the formulation of a process theory-of-change, it collects evidence to ascertain how the intervention unfolded in a single case and whether it plausibly contributed to change in outcomes. Often described as a qualitative method, process tracing can in fact rely on a diversity of qualitative and quantitative methods. Particularly useful to evaluate complex interventions, it addresses the questions of “under what conditions, how, and why” an intervention worked, rather than how much impact it produced.

Keywords: Qualitative methods, processes, theory-based evaluation, process theory-of-change, causal principles, contribution pathways, evidence, fingerprints, Bayesian reasoning

I. What does this approach consist of?

When evaluators conduct Process Tracing (PT), they behave a bit like “detectives”. When applying process tracing, evaluators are interested in explaining, rather than simply describing, change processes. To put it simply, evaluators seek to trace how the activities of actors/entities and their motivations are interlinked to trigger change in the behavior and actions of others. Empirically, process tracing is also akin to “detective work” in that it consists of assembling a body of evidence (what Derek Beach calls ‘fingerprints’) to ascertain how the intervention unfolded in a single case and whether it plausibly contributed to change in outcomes. In slightly more technical terms, process tracing is a theory-based evaluation approach for studying how interventions worked in actual cases (see separate chapter on theory-based evaluation). As such, process tracing belongs to the family of methods that seek to answer the questions of “how, why and under what circumstances” programs and policies work by studying how they play out in the real world. Visually, process tracing seeks to understand what is going on inside the arrow linking interventions and results in a typical theory of change. Its comparative advantage over other methods lies in fully opening the black box of change processes.

Process tracing is often considered a “qualitative” approach because it tends to rely on qualitative evidence (from interviews, observations, documents, etc.) but, like many other theory-based evaluation approaches, it resists simple classification and is better described as ‘methods agnostic’. It can accommodate and use a range of methods of data collection and analysis, quantitative or qualitative, in seeking to assemble a body of evidence that is robust enough to adjudicate between the process theory of change under scrutiny and alternative explanations. More recently, some evaluators have also mathematically formalized the use of process tracing through the application of Bayes’ theorem (Befani 2021).

At its core, process tracing has two main phases and a few distinctive features that set it apart from other theory-based evaluations; we highlight them briefly below.

I.I. The first phase of process tracing consists of formulating a process theory-of-change (pToC).

A pToC is a detailed theory of how an intervention produced a contribution to an outcome of interest. Formulating it means unpacking the activities of the actors/entities that together constitute the inner workings of a program (the arrow). Actors are the people or organizations doing things, whereas actions are what they are doing. Understanding why the actions of one actor led other actors to do things requires making as explicit as possible what Cartwright and Hardie (2012) term the causal principles.

To do that, the evaluators initially brainstorm about what ‘contribution’ might have realistically been produced by an intervention and start laying out plausible contribution pathways between intervention and outcomes. This might mean drawing from existing theoretical literatures in the social sciences on the topic, or from repositories of evaluative evidence in the grey literature. It also means exploiting program and policy documents. In this sense, process tracing does not limit its investigation to the stated policy goals but extends it to any plausible intended or unintended pathways to outcomes of interest.

When determining what contribution might have been produced by an intervention, it is also important to explore competing explanations outside of the scope of program activities that could also account for the outcomes.

The level of detail provided in a pToC varies. A more detailed pToC is required when the evaluation seeks to produce actionable knowledge that can help with project implementation. In contrast, if the goal is to understand how a type of intervention works across several cases, a simplified, mid-range pToC can be sufficient.

I.II. The second phase consists of testing the pToC empirically and figuring out how it actually worked in a case.

Process tracing seeks to test and refine its theory by observing how the intervention worked in a single case. In process tracing, a granular pToC is used as scaffolding for the empirical assessment of how a contribution was actually produced. This means that before engaging in actual data collection, evaluators must anticipate the type of plausible “fingerprints” left by the change mechanism and figure out the type of evidence they need or want to see to boost their confidence in their theory. There are two types of useful evidence that evaluators are looking for. Some evidence “needs to be found” to avoid decreasing evaluators’ confidence in, or disconfirming, the pToC (sometimes called a “hoop test”). Evaluators are also seeking evidence that they would “love to find” because it would significantly boost their confidence in the pToC (sometimes called a “smoking gun test”).

When thinking about the evidence evaluators would need/love to see, they should cast the net widely in a search for a variety of different potential “fingerprints”. In process tracing, each individual piece of evidence typically tells us little, but combined, they might act as a unique, confirmatory signature that a given action and linkage took place in the case. Working with evidence therefore often involves a form of bricolage (for more on this, see Beach and Pedersen, 2019: 232-233).

Once the data collection has started, a critical assessment of the observations and evidence must take place. Bayesian reasoning is often used as the logical framework to assess the strength (probative value) of the evidence, either in an informal way, similar to how Bayesian reasoning informs evidence evaluation in criminal investigations (e.g. Beach and Pedersen, 2019), or more formally through the application of Bayes’ theorem and the estimation of probabilities of finding/not finding evidence (e.g. Befani and Stedman-Bryce, 2017). Essentially, evaluators conducting process tracing must ask the following questions:

  • If expected “fingerprints” are not found, did we have full access to the empirical record, and can we trust that our sources were not hiding something from us?

  • If expected “fingerprints” are found, have we interpreted what our sources have told us correctly in this context, and can we trust them?
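The formal, Bayesian variant of this reasoning can be illustrated with a short sketch. The function below applies Bayes’ theorem to update confidence in the pToC after searching for a single “fingerprint”; the function name and all probability values are illustrative assumptions for exposition, not figures from any actual evaluation. The example also shows why hoop tests and smoking-gun tests behave asymmetrically.

```python
def update_confidence(prior, p_e_given_h, p_e_given_not_h, found):
    """Bayesian update of confidence in the pToC after one evidence search.

    prior            -- P(H): confidence in the contribution claim before the test
    p_e_given_h      -- P(E|H): chance of finding the fingerprint if the pToC holds
    p_e_given_not_h  -- P(E|~H): chance of finding it under rival explanations
    found            -- whether the fingerprint was actually observed
    """
    if found:
        num = p_e_given_h * prior
        den = num + p_e_given_not_h * (1 - prior)
    else:  # update on the *absence* of the evidence
        num = (1 - p_e_given_h) * prior
        den = num + (1 - p_e_given_not_h) * (1 - prior)
    return num / den

# Hoop test: evidence almost certain to exist if the theory holds (high P(E|H)),
# but also fairly common under rival explanations (moderate P(E|~H)).
# Failing it sharply lowers confidence; passing it raises it only modestly.
print(update_confidence(0.5, p_e_given_h=0.95, p_e_given_not_h=0.50, found=False))  # ~0.09
print(update_confidence(0.5, p_e_given_h=0.95, p_e_given_not_h=0.50, found=True))   # ~0.66

# Smoking-gun test: evidence rarely found unless the theory holds (low P(E|~H)).
# Finding it strongly boosts confidence; not finding it changes little.
print(update_confidence(0.5, p_e_given_h=0.30, p_e_given_not_h=0.02, found=True))   # ~0.94
print(update_confidence(0.5, p_e_given_h=0.30, p_e_given_not_h=0.02, found=False))  # ~0.42
```

In formal applications, the probabilities of finding or not finding each piece of evidence are elicited from evaluators and stakeholders before data collection, which is where the questions above about access to the empirical record and trust in sources come into play.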

II. How is this approach useful for policy evaluation?

When process tracing made its way into evaluation practice, the field of impact evaluation had been dominated by (quasi-)experimental approaches, which have strong comparative advantages in establishing the average treatment effect of relatively straightforward interventions whose effect can be measured quantitatively. However, it became increasingly pressing to expand the evaluators’ toolbox with approaches that could answer different types of impact evaluation questions and investigate interventions that are more complex and less amenable to quantification and controlled comparisons. Process tracing emerged as a useful approach for evaluations that seek to explain change processes and are less concerned with “how much” an intervention impacted a desired outcome than with understanding “under what conditions, how, and why” an intervention worked in the real world.

Process tracing has been used to assess the impact of a range of interventions, but it has a comparative advantage over other methods in studying ‘intangible’ or ‘soft’ interventions, such as the influence of knowledge and data work, advocacy and communication campaigns, or policy dialogue on decision-making. It also works well to assess the impact of interventions that target behavioral changes among participants through sensitization and incentive mechanisms.

Process tracing can serve various decision needs, but it fits particularly well with the adaptive management of interventions, when seeking to test and refine implementation modalities in various contexts. It can also be useful during a piloting or scale-up phase, to gauge whether the change mechanisms are triggered when interventions are replicated or scaled up. It tends to work well as an embedded or retrospective approach.

III. Examples of the use of this approach in the field of development

A few examples of real-world applications of process tracing in evaluation, primarily drawn from development evaluation, include: the use of process tracing to assess the sustainability of budget support interventions (Orth et al. 2017), to study the impact of advocacy campaigns on the preservation of biodiversity (D’Errico et al. 2017), and to understand the contribution of citizen engagement mechanisms to the improvement of public service delivery in the Dominican Republic (Raimondo, 2020).

In this latter example, the evaluation sought to respond to the intensification of aid agencies’ efforts to put citizens front and center in defining their development agenda. The World Bank decided in 2014 to mainstream citizen engagement activities in all of its projects where direct beneficiaries could be identified. In making this policy commitment, the World Bank claimed that engaging citizens was not only the “right” thing to do, but it was also going to improve the effectiveness of its projects. The evaluation selected a typical case of using citizen engagement mechanisms to improve the delivery of health and education services for poor households in the Dominican Republic to test that claim. Unpacking and testing the causal mechanism underlying citizen engagement activities certainly enhanced the evaluation team’s understanding of the behavioral, operational, and institutional inner workings of the intervention and the conditions under which citizen engagement could transmit causal power to change the quality of services. Based on this granular understanding, the evaluation made practical recommendations to the program in terms of how meetings with citizens should be facilitated and by whom to ensure an effective feedback loop and service improvement. However, process tracing needed to be complemented with cross-case comparisons to enhance the generalizability of the findings and their policy relevance for the entire program, which was implemented across regions.

IV. What are the criteria for judging the quality of the mobilisation of this approach?

The quality of process tracing’s implementation hinges on how well theory and empirics are brought together. To arrive at a process tracing study with high internal validity, the following three criteria should be kept in mind: (1) a disaggregated, fine-grained pToC that captures key episodes and mechanisms; (2) highly unique evidence found for each part of the pToC; (3) trustworthy sources and full access to the empirical record. Conversely, if the pToC is too simple or abstract, if the evidence found is not unique or could equally be found under rival explanations, or if the sources are weak or not trustworthy, internal validity will be low.

For some evaluations, it is also important for the lessons drawn from process tracing to travel to other contexts. Process tracing on its own does not have high external validity, but by combining it with cross-case comparisons it is possible to explore whether similar processes also work in other cases across contexts.

V. What are the strengths and limitations of this approach compared to others?

Key strengths of the approach when well implemented: 

  • If the three quality criteria laid out above are met, then applying process tracing significantly bolsters our capacity to establish a strong causal link between interventions and outcomes, while also providing strong explanatory power about the ‘how’ and ‘why’ of processes of change.

  • Process tracing provides a clear scaffold for making transparent the process of evidence gathering and assessment, as well as for triangulating sources of evidence. In this, it goes far beyond typical case study approaches and other theory-based approaches. Process tracing makes the theory of change vividly unfold in front of the evaluator’s eyes and allows them to reach strong confidence in their impact/contribution claims.

  • It is also much easier to derive ‘practical lessons’ from a process tracing study than from many other types of evaluation approaches. Because it focuses the evaluator’s mind on causal explanations and the linkages between actions and behavior change, it helps elaborate ideas about how such activities should be tweaked or changed to improve outcomes.

  • Process tracing has a comparative advantage over other (impact) evaluation methods in assessing interventions that are not amenable to quantification or experimentation, such as policy dialogue, the contribution of research, knowledge and data work, advocacy and communication campaigns, etc.

Some (de)limitations of the approach: 

  • Process tracing is not adequate to answer ‘how much of an impact an intervention had on average on an outcome of interest’ and should not be used to fulfill this objective.

  • While it need not be overly technical, there is a steep learning curve to mastering the ropes of process tracing. Notably, evaluators need to become familiar with setting up ‘empirical tests’ to gauge the probative value (uniqueness and trustworthiness) of their evidence; they also need to become more rigorous in how they reconstruct process theories-of-change and leverage the existing literature to theorize about behavioral change linked to specific actions.

  • On its own, process tracing has weak external validity and needs to be paired with a cross-case design, which can become onerous and time-consuming.

Some bibliographical references to go further

Beach, Derek, and Rasmus Brun Pedersen. 2019. Process Tracing Methods. Ann Arbor: University of Michigan Press.

Befani, Barbara, and Gavin Stedman-Bryce. 2017. “Process Tracing and Bayesian Updating for Impact Evaluation”. Evaluation, 23(1): 42-60.

Befani, Barbara. 2021. Credible Explanations of Development Outcomes: Improving Quality and Rigour with Bayesian Theory-Based Evaluation. Report 2021:03, Expert Group for Aid Studies (EBA), Sweden.

Cartwright, Nancy, and Jeremy Hardie. 2012. Evidence-Based Policy: A Practical Guide to Doing It Better. Oxford: Oxford University Press.

D’Errico, Stefano, Barbara Befani, Francesca Booker, and Alessandra Giuliani. 2017. Influencing Policy Change in Uganda: An Impact Evaluation of the Uganda Poverty and Conservation Learning Group’s Work. PCLG Research Report. https://www.iied.org/sites/default/files/pdfs/migrate/G04157.pdf

Orth, Magdalena, Johannes Schmitt, Franziska Krisch, and Stefan Oltsch. 2017. What We Know about the Effectiveness of Budget Support. Evaluation Synthesis. Bonn: German Institute for Development Evaluation (DEval).

Raimondo, Estelle. 2020. “Getting Practical with Causal Mechanisms: The Application of Process-Tracing under Real-World Evaluation Constraints.” New Directions for Evaluation, Fall 2020: 45-58.


Policy Evaluation: Methods and Approaches Copyright © by Estelle Raimondo is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License, except where otherwise noted.