The field of evaluation involves making judgements of quality, value, and importance to support accountability, assessment, and learning, and to improve performance. Traditional evaluation designs assume a high level of predictability and control, but complex programs and contexts challenge this basic assumption. Such programs often deal with emergent outcomes and objectives, adaptive program processes, nonlinear theories of change, and evolving stakeholder expectations. Under these complex conditions, traditional evaluation methods and tools do not yield realistic or useful representations of reality. In these instances, we need a more adaptive approach to evaluation, one that fits the environment without compromising rigor. In this paper, we articulate what we have found useful in seeing patterns in complex programs, understanding their dynamics in ways that are meaningful to stakeholders, and recommending Adaptive Actions to improve impacts over time. In our work, a synergy has emerged between complexity theory (through the lens of human systems dynamics) and evaluation practice (through a case study of a complex program of social change). What emerges at this generative intersection is an evaluation method that is simple, robust, rigorous, and flexible enough to meet the demands of twenty-first-century social change. We explore the implications of this approach for theory and practice in complexity and evaluation, and we share questions that are emerging for us as we prepare for our next cycle of theory and practice development. We begin with an overview of the challenge and previous efforts to address it, then introduce the basic theory and practice of human systems dynamics (HSD) and the theoretical foundations for a new approach to evaluation in complex environments, Adaptive Evaluation. We then demonstrate applications of this new evaluation practice in a case study. Finally, we articulate lessons learned and emerging questions.
An array of complexity-based tools and techniques is available today, but how does a practitioner select a particular approach to respond to a particular need? We present a simple taxonomy describing the landscape of complexity-derived methods for human systems dynamics. Practitioners can use this landscape to understand the diversity of tools and techniques, to foster respect for approaches different from one's own, to build an understanding of the field as a whole, and to select specific techniques to apply in specific situations.