Strategic evaluation


A paradigm shift in evaluation thinking

Duignan’s strategic evaluation approach (Duignan, 1997, 2004) represents a shift in evaluation thinking away from just focusing on evaluating individual programs towards meeting the strategic knowledge needs of an organization, sector, region or country. Of course, evaluating individual programs is still part of the strategic evaluation approach, but only when the programs being evaluated have been identified as answering high-priority strategic questions for the organization, sector, region or country. 

The demands of making the conceptual shift to a strategic evaluation perspective should not be underestimated. It is best seen as a multi-year process involving changes in: stakeholders’ thinking; evaluation’s implicit theoretical underpinnings; organizational and institutional culture; organizational values; governance; strategy formulation; resource allocation; professional and sector capacity building and evaluators’ on-the-ground practice. 

The strategic evaluation approach is particularly useful for anyone working at an organizational or sector level who is charged with evaluating organization-wide, sector-wide, region-wide, country-wide or even multi-country activity composed of a number of different individual programs. 

'Evaluation should focus on meeting the strategic knowledge needs of an organization/sector not just evaluating a program' 

Currently, much evaluation planning starts from the point of view of working out how to ‘evaluate a program’. In fact, the evaluation discipline is often referred to as ‘program evaluation’. Strategic evaluation turns this orientation on its head and starts from the point of view of attempting to identify the strategic knowledge needs of the organization, sector, region or country in which any particular program is embedded. Once these knowledge needs are identified, one can work back from them and determine what type of evaluation is appropriate for each individual program. Not all programs merit the same level of evaluation effort. Programs following well-established intervention types which have been evaluated well in the past are usually a lower priority than more innovative and experimental programs.

Basic toolkit needed to start doing strategic evaluation work

Before we turn to looking in detail at what we are trying to do in strategic evaluation, it is worthwhile itemizing the basic toolkit which is needed for working successfully within a strategic evaluation approach. This involves a mix of conceptual and practical tools.

First, conceptually, we need a theoretical approach which does not see evaluation as isolated from other organizational, sector, regional and country-level tasks such as: outcomes identification; strategic planning; prioritization; indicator monitoring; performance management; commissioning; delegation and contracting. This is why strategic evaluation is best viewed as a sub-set of outcomes theory - the theory that deals in an integrated and cross-disciplinary way with these issues (Duignan, 2009). (See the simple overview of outcomes theory for more information.) 

Second, from a conceptual point of view, we need an approach to evaluation which does not focus exclusively on a single ‘type’ of evaluation. We need to be open to using implementation (developmental and formative), process, impact and summative evaluation as, and when, appropriate.

Third, from a conceptual and practical viewpoint, when looking from a sector perspective within strategic evaluation, we need some assurance that evaluations of individual programs follow a standardized framework. If we are going to feed information from individual evaluations up to an organization or sector level, there should be some consistency in the way that this information is collected, rather than program-level evaluations having been done in a wide range of somewhat idiosyncratic ways.

Fourth, in terms of resourcing evaluation, from an administrative point of view, evaluation planners should ideally be free to allocate evaluation resources optimally across the spectrum of different types of evaluation (formative, process, impact etc.) depending on the particular developmental stage of the program and the priority knowledge needs of the sector in which the program is located. This is in contrast to evaluation funding being ring-fenced for a single type of evaluation (e.g. just impact evaluation).

Fifth, given that thinking about organizational or sector knowledge needs is central to the strategic evaluation approach, it requires a way of representing organizational or sector strategy in an integrated, scalable and accessible fashion. This is necessary because thinking about possible strategic direction is a prerequisite for identifying strategic knowledge needs for an organization, sector, region or country.

Sixth, using a strategic evaluation approach, we also need other tools for knowledge management around aggregating, summarizing, storing and extracting previous findings about ‘what works’ from research and evaluation, and from current organizational, sector and regional intelligence. 

Seventh, there also needs to be evaluation capacity-building at all levels - from a high-level overview of evaluation provided for governance, management and stakeholders, right down to enhancing specific technical skills for those doing evaluation work. 

Eighth, strategic evaluation is not an evaluation approach that is in competition with other evaluation approaches. Its key feature is changing how we think about evaluation, from seeing it as a program-centric activity to making it an organizational, sector or even country-level activity. It draws on, and is consistent with, aspects of a number of other evaluation approaches, ranging from utilization-focused evaluation to indigenous evaluation approaches.

The last four points are elaborated below.

Integrating and coordinating strategy as a first step in the strategic evaluation approach

As noted above, if we are going to use the strategic evaluation approach within an organization, sector, region or country, integrating and coordinating the overall strategy being pursued to achieve joint outcomes is a prerequisite for its successful implementation. 

Often we will be working in organizations, sectors, regions or countries where this integration has not yet taken place, or has only been done in a partial or fragmented manner.

Fortunately, we do not need to delay the introduction of the strategic evaluation approach until such strategic integration has taken place.

Under the umbrella of implementation evaluation, one of the first steps anyone involved in strategic evaluation usually needs to take is to bring together, as best they can, the strategic underpinnings of an organization, sector, region or country. Often these will consist of snippets of strategy scattered across a large number of different places, in addition to whatever is contained within any available formal strategic planning documentation. 

[Amongst the tools used by Parker Duignan Consulting when applying a strategic evaluation approach is visual strategy modeling. This consists of bringing together a group of people within a workshop and having them bring all of the strategic documentation they have access to regarding their organization, sector etc. A facilitated process is then used to build a visual strategy model, or a similar intervention logic or theory of change model, summarizing the underpinning strategy being pursued by the organization or sector. The visual strategy model shows all of the high-level outcomes being sought and the steps it is believed need to happen in order to achieve them. Using tools like this means you can quickly develop an integrated strategy model providing both a helicopter view and a drill-down view of your overall strategic direction.]


Identifying organizational, sector, regional or country strategic knowledge needs

Strategic evaluation requires effective knowledge management of research and evaluation findings plus organization, sector or regional intelligence. This information needs to be aggregated, stored, summarized and kept up-to-date. It then needs to be provided in an accessible format for those who are involved in deliberating on what organizational, sector, regional or country strategic knowledge needs are.

[Amongst the tools that Parker Duignan Consulting sometimes uses when doing this is mapping previous research and evaluation evidence and sector intelligence directly onto a DoView® Strategy Model or similar visual outcomes model, which is then used as the basis for discussions with stakeholders about knowledge needs]. 

Creating realistic expectations amongst stakeholders about what is feasible and affordable in terms of evaluation

Many people are completely unrealistic about how difficult it is to definitively prove attribution of improvements in high-level outcomes to specific programs. This leads to programs naively promising to prove impact, and to funders making demands such as: ‘we will only fund what has been proven to work’, where this is interpreted to mean that every program must undertake definitive (and usually expensive) impact evaluation. 

An essential part of strategic evaluation is to create realistic expectations amongst stakeholders about what is, and is not, appropriate, feasible and affordable in terms of impact evaluation. If this is not done, evaluation resources can be misallocated and wasted by being focused on impact evaluations of individual programs without considering the priority of that evaluation spend. This is not to diminish the importance of individual program impact evaluations, just to say that they should always be undertaken with an awareness of the current strategic knowledge needs of the sector in which the program is being run.

[The tool that Parker Duignan Consulting sometimes uses to introduce realism into stakeholders' expectations regarding evaluation is Duignan’s Impact Evaluation Feasibility Check (Duignan, 2008). This tool provides a standard framework for thinking about impact evaluation in which one considers the options for impact evaluation of any program in the light of the appropriateness, feasibility, affordability and credibility of seven possible impact evaluation designs.]

Ongoing organizational, sector, regional or country-wide dialogue 

Ideally, a strategic evaluation approach will have in place regular settings where parties from organizations, sectors, regions or countries can have input into identifying current strategic knowledge needs and have findings reported back to them. Typically this consists of periodic meetings, workshops, two-way communication and similar consultation processes. 

Having evaluation models which can deal efficiently with 'distributed interventions'

One consequence of strategic evaluation’s shift from ‘program-centric’ thinking to thinking from an organization, sector, region or country-wide perspective, is that one starts thinking in terms of ‘distributed interventions’. This is in contrast to continuing to view interventions as just a series of separate programs. 

It is often the case that a central agency of some sort wants a large number of interventions focused on similar outcomes to be implemented across a range of different organizations, sites or localities. 

What is the most efficient approach to evaluating such multi-site initiatives? It is costly, for instance, to attempt impact evaluation at all of the sites. From a strategic evaluation perspective it is useful to have an approach for undertaking evaluation cost-effectively in such situations. 

[One tool sometimes used by Parker Duignan Consulting in this situation is the Group Action Planning Approach. This brings together representatives from each of the sites right at the beginning of the process, and they work together to plan, implement and evaluate the overall program and their work at each site]. 

Relationship between strategic evaluation and other evaluation approaches

There are a number of different evaluation approaches (Duignan, 2003). These include: systems thinking evaluation; utilization-focused evaluation; empowerment evaluation; stakeholder-based evaluation; realist evaluation; goal-free evaluation; naturalistic or 4th generation evaluation; theory-based evaluation; and indigenous evaluation (e.g. Kaupapa Maori evaluation). Strategic evaluation has elements in common with a number of these approaches, as described below.

In terms of its relationship to systems thinking evaluation, strategic evaluation takes a broad systems approach in the sense that it focuses on the wider system in which a particular program is located, rather than limiting thinking to the evaluation of a specific program. However, systems thinking is a discipline in its own right, with its own formal language and concepts. Aspects of systems thinking in this formal sense can be used within a strategic evaluation approach, but strategic evaluation is not limited to, or tied to, systems thinking’s technical language and concepts. 

Along with utilization-focused evaluation, strategic evaluation is preoccupied with the users of evaluation. It is very much in the spirit of utilization-focused evaluation in that it is driven by how people will use evaluation results. In practice, the utilization-focused evaluation approach is often applied to evaluating a specific program, and it can be used within strategic evaluation once a particular program has been determined to be a high priority for evaluation. 

In the same way, stakeholder-based evaluation focuses on stakeholder information needs and has a lot in common with strategic evaluation. It too is often focused on evaluating a specific program, and can be used within the strategic evaluation approach once a particular program has been identified as a high priority for evaluation. 

Realist evaluation is an approach that puts an emphasis on finding out: ‘what works, to what extent, for whom, in what contexts, and how?’ From the point of view of strategic evaluation, having this level of granularity in evaluation findings is a great ideal to strive for. Realist evaluation also moves the emphasis from evaluating ‘Program X’ to evaluating the components inside it. Strategic evaluation can be seen as always attempting to identify the priorities within the large ‘shopping list’ of what could potentially be found out if one had the resources to adopt a fully realist evaluation approach.

Goal-free evaluation starts from the proposition that specific programs may not have captured the outcomes they should ‘really’ be seeking, given their stated goals or the domain in which they are working. It is then a matter of working out what the program’s ‘real’ goals should be. This approach has a lot in common with strategic evaluation, in that doing this requires a sector-wide perspective on the high-level outcomes being sought by a program. You then work back from this wider perspective to define the ‘real’ goals against which a program should be evaluated. Within strategic evaluation this is done by developing a model of what an organization or sector is seeking through its activity. From this overall organizational or sector model one can work back to identify the role of a particular program in achieving outcomes, and evaluate whether or not it is doing so.

Naturalistic or 4th generation evaluation is an evaluation approach with a particular philosophy of science regarding methodology (constructivist rather than positivist) and an emphasis on qualitative approaches. Strategic evaluation is open to selecting methods to suit the evaluation task, and methods from naturalistic or 4th generation evaluation can be used with it.

Theory-based evaluation looks at the wider context in which a program is being implemented and emphasizes the importance of surfacing program theory - the theory of change that it is believed will lead to improvements in outcomes. Both of these perspectives are consistent with strategic evaluation. 

Indigenous evaluation (e.g. Kaupapa Maori evaluation) takes a values, philosophy of science and cultural perspective based on indigenous views of the world. As a consequence, it looks at the context in which programs take place, for instance: power; resource allocation; institutional racism; and indigenous sovereignty. Taking into account these sorts of wider perspectives is the type of thing the strategic evaluation approach seeks to encourage. The key question indigenous evaluation raises when using the strategic evaluation approach is: ‘whose knowledge needs are priorities for being answered?’


The strategic evaluation approach has much to offer those who are working at an organization, sector, region or country-wide level. This is because its viewpoint starts from that level, rather than just focusing on the lower-level question of evaluating individual programs.

Workshops and mentoring on the strategic evaluation approach

We run tailored, face-to-face and online workshops on using the strategic evaluation approach for organizations and sectors. These build skills in using the approach and use live examples from the organization or sector to link learning directly to the task at hand. We also provide mentoring for professionals involved in evaluation, outcomes and strategy work, and draw on the strategic evaluation approach in this mentoring work.

Duignan, P. (2009). Using Outcomes Theory to Solve Important Conceptual and Practical Problems in Evaluation, Monitoring and Performance Management Systems. American Evaluation Association Conference 2009, Orlando, Florida, 11-14 November 2009.

Duignan, P. (2008). Encouraging Better Evaluation Design and Use Through a Standardized Approach to Evaluation Planning and Implementation. 8th European Evaluation Society Conference, Lisbon, October 2008. 

Duignan, P. (2004). Outline of the Strategic Evaluation Approach. Presentation to the Annual Conference of the American Evaluation Association, Atlanta, Georgia, USA, 3-6 November 2004.

Duignan, P. (2003). Approaches and terminology in programme and policy evaluation. In N. Lunt, C. Davidson and K. McKegg (Eds.), Evaluating Policy and Practice: A New Zealand Reader. Auckland: Pearson Education.

Duignan, P. (1997). Evaluating health promotion: the Strategic Evaluation Framework. D.Phil. thesis, University of Waikato, Hamilton.


Please contact us now if you have any questions about the strategic evaluation approach or any other aspects of our outcomes and evaluation work.

© Parker Duignan 2013-2018. Parker Duignan is a trading name of The Ideas Web Ltd.