Strategic evaluation

A paradigm shift in evaluation thinking

Duignan’s strategic evaluation approach (Duignan, 1997, 2004) represents a shift in evaluation thinking away from focusing solely on evaluating individual programs towards meeting the strategic knowledge needs of an organization, sector, region or country. Evaluating individual programs, where they have been identified as high priorities for evaluation, is of course still a part of the strategic evaluation approach.

The demands of making this conceptual shift should not be underestimated. It is best seen as a decades-long process involving changes in: stakeholders’ thinking; evaluation’s theoretical underpinnings; organizational and institutional culture; organizational values; governance; strategy formulation; resource allocation; professional and sector capacity building; and evaluators’ on-the-ground practice.

The strategic evaluation approach is particularly useful for anyone working at an organizational or sector level who is charged with evaluating organization-wide, sector-wide, region-wide, country-wide or even multi-country activity composed of a number of different individual programs.

Reconceptualizing evaluation as being about meeting the strategic knowledge needs of an organization, sector, region or country

Currently much evaluation planning starts from the point of view of working out how to ‘evaluate a program’. In fact, the evaluation discipline is often referred to as ‘program evaluation’. Strategic evaluation turns this orientation on its head and starts by attempting to identify the strategic knowledge needs of the organization, sector, region or country in which any particular program is embedded. Once these knowledge needs are identified, one can work back from them and determine what type of evaluation is appropriate for each individual program. Not all programs merit the same level of evaluation effort. Programs following well-established intervention types which have been evaluated well in the past are usually a lower priority than more innovative and experimental programs.

Relationship between strategic evaluation and other evaluation approaches

There are a number of different evaluation approaches (Duignan, 2003). These include: systems thinking evaluation; utilization-focused evaluation; empowerment evaluation; stakeholder-based evaluation; realist evaluation; goal-free evaluation; naturalistic or 4th generation evaluation; theory-based evaluation; and indigenous evaluation (e.g. Kaupapa Maori evaluation). Strategic evaluation has elements in common with a number of these approaches, as described below.

In terms of its relationship to systems thinking evaluation, strategic evaluation takes a broad systems approach in the sense that it focuses on the wider system in which a particular program is located rather than just evaluating a specific program. However, systems thinking is a discipline in its own right, with its own formal language and concepts. Any aspects of systems thinking in this formal sense can be utilized within a strategic evaluation approach, but strategic evaluation is not limited to, or tied to, using systems thinking technical language and concepts.

Along with utilization-focused evaluation, strategic evaluation is preoccupied with the users of evaluation. Utilization-focused evaluation is, however, often focused on being an approach to evaluating an individual program. The approach can be used within strategic evaluation once it has been determined that a particular program is a high-priority for evaluation. 

In the same way, stakeholder-based evaluation focuses on stakeholder information needs and has a lot in common with strategic evaluation. It, too, is often focused on evaluating a specific program and can be used within the strategic evaluation approach once a particular program has been identified as a high priority for evaluation.

Realist evaluation is an approach to evaluation which puts an emphasis on finding out ‘what works, to what extent, in what contexts, and how?’ From the point of view of strategic evaluation, having this level of granularity in evaluation findings is a great ideal to strive for. Strategic evaluation focuses on identifying the priorities within this ‘shopping list’ of the ideal full set of information it would be great to have about interventions.

Goal-free evaluation is an approach which starts from the proposition that specific programs may not have captured the outcomes they should ‘really’ be seeking within their particular set of stated goals. It is then a matter of working out what the program’s ‘real’ goals should be. This approach has a lot in common with strategic evaluation in that, in order to do this, one needs to take a sector-wide perspective on the high-level outcomes that are being sought and work back from this to define the ‘real’ goals against which a program should be evaluated. Within strategic evaluation this is done by developing a model of what it is that the sector is seeking through its activity. From this overall sector model one can work back to identify the role of a particular program in achieving outcomes and evaluate whether it is, or is not, doing so.

Naturalistic or 4th generation evaluation is an evaluation approach with a particular philosophy of science about methodology (constructivist rather than positivist) and an emphasis on qualitative approaches. Strategic evaluation is open to the selection of methods to suit the evaluation task, including methods from naturalistic or 4th generation evaluation as well as those from other approaches.

Theory-based evaluation puts an emphasis on looking at the wider context in which a program is being implemented and emphasizes the importance of surfacing program theory - the theory of change that it is believed will lead to improvements in outcomes. Both of these perspectives are consistent with strategic evaluation.

Indigenous evaluation (e.g. Kaupapa Maori evaluation) takes a values-based, philosophy of science and cultural perspective grounded in indigenous views of the world. As a consequence, it looks at the context in which programs take place, for instance: power, resource allocation, institutional racism and indigenous sovereignty. These perspectives are not inconsistent with the strategic evaluation approach. The key question this perspective raises when using the strategic evaluation approach is: ‘whose knowledge needs are priorities for being answered?’

Basic toolkit we need in place to commence doing strategic evaluation

Before we turn to looking in detail at what we are trying to do in strategic evaluation, it is worthwhile itemizing the basic toolkit which is needed for working successfully within a strategic evaluation approach. This involves a mix of conceptual and practical tools. 

Conceptually we need a theoretical approach which does not see evaluation as isolated from other organizational, sector, regional and country-level tasks such as: outcomes identification, strategic planning, prioritization, indicator monitoring, performance management, commissioning, delegation and contracting. This is why strategic evaluation is best viewed as a sub-set of outcomes theory - the theory that deals in an integrated and cross-disciplinary way with these issues (Duignan, 2009). 

Secondly, from a conceptual point of view, we need an approach to evaluation which does not focus exclusively on a single type of evaluation. We need to be open to using implementation (developmental and formative), process, impact and summative evaluation as, and when, appropriate.

Lastly, from a conceptual and practical viewpoint, when looking from a sector perspective within strategic evaluation, we ideally want assurance that evaluations of individual programs follow a standardized framework. If we are going to feed information from individual evaluations up to a sector level, there should be some consistency in the way this information is collected, rather than program-level evaluations having been done in a wide range of idiosyncratic ways.

In terms of resourcing evaluation, from an administrative point of view, evaluation planners should ideally be free to allocate evaluation resources optimally across the spectrum of different types of evaluation (formative, process, impact etc.) depending on the particular developmental stage of the program and the priority sector knowledge needs of the sector in which it is located. This is in contrast to evaluation funding just being ring-fenced to a particular type of evaluation (e.g. just impact or formative evaluation). 

Given that thinking about sector knowledge needs is central to the strategic evaluation approach, it requires a way of representing sector strategy in an integrated and accessible fashion. This is necessary because thinking about possible strategic direction is the best way of identifying strategic knowledge needs for an organization, sector, region or country.

Using a strategic evaluation approach we also need other tools for knowledge management around aggregating, summarizing, storing and extracting previous findings about ‘what works’ from research and evaluation and current organizational, sector and regional intelligence.

Lastly, there also needs to be evaluation capacity-building at all levels - from a high-level overview of evaluation for governance, management and stakeholders right down to enhancing specific technical skills for those doing evaluation work.

Integrating and coordinating strategy as a first step in the strategic evaluation approach

As noted above, if we are going to use the strategic evaluation approach within an organization, sector, region or country, integrating and coordinating the underlying overall strategy through which joint outcomes are sought is a prerequisite for its successful implementation.

Often we will be working in organizations, sectors, regions or countries where this integration has not yet taken place, or has only been done in a partial, or fragmented manner.

Fortunately we do not need to delay the introduction of the strategic evaluation approach until such time as an underlying strategic integration has taken place. 

Under the umbrella of implementation evaluation, one of the first steps for anyone involved in strategic evaluation is usually to bring together, as best they can, the underlying strategy of an organization, sector, region or country. Often this will consist of snippets of strategy scattered across a large number of different places, in addition to that which is contained within any formal planning documentation.

[Amongst the tools used by Parker Duignan Consulting when it is applying a strategic evaluation approach is visual strategy modeling. This consists of bringing together a group of people within a workshop and asking them to bring all of the strategic documentation they have access to. A facilitated process is then used to build a DoView® Results Roadmap or similar outcomes model (logic model, theory of change, results chain, strategy map) summarizing the underpinning integrated strategy. The visual outcomes model shows all of the high-level outcomes being sought. Using tools like this means you can quickly develop an integrated strategy model from which the evaluator and stakeholders can get both a helicopter and a drill-down view of the overall strategy].

Identifying organizational, sector, regional or country strategic knowledge needs

Strategic evaluation requires effective knowledge management of research and evaluation findings plus organization, sector or regional intelligence. This information needs to be aggregated, stored, summarized and kept up-to-date. It then needs to be provided in an accessible format for those who are involved in deliberating on what organizational, sector, regional or country strategic knowledge needs are.

[Amongst the tools that Parker Duignan Consulting sometimes uses when doing this is mapping previous research and evaluation evidence and sector intelligence directly onto a DoView® Strategy Model or similar visual outcomes model; this is then used as the basis for discussions with stakeholders about knowledge needs].

Creating realistic expectations amongst stakeholders about what is feasible and affordable in terms of evaluation

Many people are completely unrealistic about how difficult it is to definitively prove attribution of improvements in high-level outcomes to specific programs. This leads to programs naively promising to prove impact, and to funders making demands such as ‘we will only fund what has been proven to work’, where this is interpreted to mean that every program will undertake definitive (and usually expensive) impact evaluation.

An essential part of strategic evaluation is to create realistic expectations amongst stakeholders about what is, and is not, appropriate, feasible and affordable in terms of impact evaluation. If this is not done, evaluation resources can be misallocated and wasted on individual impact evaluations of individual programs without considering the priority of that evaluation spend. This is not to diminish the importance of individual program impact evaluations, just to say that they should always be undertaken with an awareness of the current strategic knowledge needs of the sector in which the program is being run.

[The tool that Parker Duignan Consulting sometimes uses to introduce realism into stakeholders' expectations regarding evaluation is Duignan’s Impact Evaluation Feasibility Check (Duignan, 2008). This is a tool which provides a standard framework for thinking about impact evaluation, in which one considers the options for impact evaluation of any program in the light of the appropriateness, feasibility, affordability and credibility of seven possible impact evaluation designs].

Ongoing organizational, sector, regional or country-wide dialogue 

Ideally a strategic evaluation approach will seek to have in place regular settings in which parties from organizations, sectors, regions or countries can have input into identifying current strategic knowledge needs and have information on findings reported back. Typically this consists of periodic meetings, workshops, two-way communication and similar consultation processes.

Having evaluation models which can deal efficiently with 'distributed interventions'

One consequence of strategic evaluation’s shift from ‘program-centric’ thinking to thinking from an organization, sector, region or country-wide perspective, is that one starts thinking in terms of ‘distributed interventions’. This is in contrast to continuing to view interventions as just a series of separate programs. 

It is often the case that a central agency of some sort wants a large number of interventions focused on similar outcomes to be implemented in a range of different organizations, sites or localities.

What is the most efficient approach to evaluating such multi-site initiatives? It is costly, for instance, to attempt impact evaluation at all of the sites. From a strategic evaluation perspective it is useful to have an approach for undertaking evaluation cost-effectively in such situations.

[One tool sometimes used by Parker Duignan Consulting in this situation is the Group Action Planning Approach. This brings together representatives from each of the sites right at the beginning of the process and they work together to plan, implement and evaluate the overall program and their work at each site]. 


The strategic evaluation approach has much to offer those who are working at an organization, sector, region or country-wide level. This is because its viewpoint starts from that level rather than focusing only on the lower-level issue of evaluating individual programs.

Workshops and mentoring on the strategic evaluation approach

We run tailored, face-to-face and online workshops on using the strategic evaluation approach for organizations and sectors. These build skills in using the approach and use live examples from the organization or sector to link learning directly to the task at hand. We also provide mentoring for professionals involved in evaluation, outcomes and strategy work and draw on the strategic evaluation approach in this mentoring work.

Duignan, P. (2009). Using Outcomes Theory to Solve Important Conceptual and Practical Problems in Evaluation, Monitoring and Performance Management Systems. American Evaluation Association Conference 2009, Orlando, Florida, 11-14 November 2009.

Duignan, P. (2008). Encouraging Better Evaluation Design and Use Through a Standardized Approach to Evaluation Planning and Implementation. 8th European Evaluation Society Conference, Lisbon, October 2008.

Duignan, P. (2004). Outline of the Strategic Evaluation Approach. Presentation to the Annual Conference of the American Evaluation Association, Atlanta, Georgia, USA, 3-6 November 2004.

Duignan, P. (2003). Approaches and terminology in programme and policy evaluation. In Evaluating Policy and Practice: A New Zealand Reader. N. Lunt, C. Davidson and K. McKegg. Auckland, Pearson Education.

Duignan, P. (1997). Evaluating health promotion: the Strategic Evaluation Framework, D.Phil., University of Waikato, Hamilton.


Please contact us now if you have any questions about the strategic evaluation approach or any other aspects of our outcomes and evaluation work.

© Parker Duignan 2013-2018. Parker Duignan is a trading name of The Ideas Web Ltd.