School Leadership & Management: Developmental Evaluation


InformEd International is using developmental evaluation (DE) to facilitate the creation of Save the Children’s School Leadership and Management (SLaM) model. DE[1] supports the innovation process in organizations and projects by collecting and analyzing real-time data in ways that inform ongoing decision making throughout design, development, and implementation. DE is an especially useful evaluative approach in situations where outcomes are not pre-determined and the pathway to success is not yet clear. In these circumstances, DE can help answer questions like:

  • What key themes, priorities, and recommendations are emerging as the innovation takes shape?

  • What do initial results reveal about the design of the innovation? What are the implications for implementation plans?

  • What variations in effects are we seeing across implementation sites? What does this mean for the design of the innovation and the implementation approach?

  • How have different values, perspectives, and relationships influenced the innovation and its outcomes?

  • How is the larger education system (district/provincial) responding to the innovation?

DE differs from traditional forms of evaluation, including formative and summative evaluation, which follow a linear problem-solving approach. When the outcome is clearly defined and the problem well understood at the outset, the process for identifying the best solution and testing its effectiveness is straightforward. When the problem is complex and the potential solutions varied, however, developmental evaluation allows for continuous adaptation and improvement, using rigorous data to inform innovation.


Developmental Evaluation for School Leadership & Management

While conducting the evaluation for I’m Learning, the pilot project in Cambodia, Uganda, and Zimbabwe that informed the SLaM model development process, InformEd found that the rigid research frame did not sufficiently capture emergent themes, trends, and lessons from a project model built around contextualization and innovation. Because the SLaM project approach is similarly emergent, we recommended that the pilot use developmental evaluation for the first two years of the evaluation process. The purpose of this recommendation is to surface authentic, contextualized insights throughout the project development journey, allowing Save the Children to identify emergent themes in a timely manner and address challenges as they arise, continually adapting and improving the project as it progresses.

The DE will focus on program innovation and development. The evaluation methodology must therefore be agile, quickly adapting to the needs of project participants while continuing to probe and question them. The DE will aim to identify and support promising practices throughout the first two years.

In the first year, the focus will be on identifying stakeholders’ priorities and needs, developing interventions to address those needs, and engaging with stakeholders on needed revisions during the pilot phase. Implementation of interventions will continue into the second year, as will data collection exercises with relevant stakeholders to capture how effectively the SLaM model is meeting their needs and expectations. At the end of two years, the model will be ready for formative evaluation.


Developmental Evaluation Framework

In a traditional research approach, an intervention would be conceptualized and implemented with careful attention to maintaining high intervention fidelity. However, as SLaM is being developed over the next several years and is designed as a social innovation in a complex environment, developmental evaluation questions will be used to guide program innovation and design. The list below outlines the items we believe need to be developed during the SLaM pilot, alongside potential developmental evaluation questions linked to each item.

1. Principles for effective school leadership and management.
   Questions: What behaviors are observed in someone showing strong school leadership and management? What behaviors are not associated? What values are associated with those behaviors? How are those values translated into action?
2. Barriers to effective school leadership and management articulated and mapped to stakeholders.
   Questions: How do stakeholders evaluate their own contribution to effective school leadership and management? How would stakeholders like to improve their school leadership and management? What positively affects one’s ability to contribute? What prevents one from contributing effectively?
3. Interventions addressing barriers identified and prioritized.
   Questions: For each barrier identified, what stakeholders are involved? What are the stakeholders’ motivations? What are possible strategies for overcoming the barriers? Where do possible strategies fall when mapped by feasibility and impact?
4. Implementation approach, including technical content, timelines, partners, and milestones.
   Questions: Do all intervention schools want to prioritize the same interventions? Can all participants prioritize the same interventions? Why or why not? Do the schools want to follow the same timeline? What content is needed for prioritized interventions? Who is responsible for each component? What milestones and timelines can be set to develop content?
5. Shared consensus among all stakeholders of roles/responsibilities for SLaM.
   Questions: Are all stakeholders aligned regarding their role and responsibility for effective school leadership and management? Are further discussions/sensitizations needed to ensure there is buy-in from all stakeholders?
6. Data collection and reporting system established that amplifies stakeholder voices, opinions, and experiences, including teacher and head teacher feedback, Save Nepal operational feedback, and Save Norway feedback on managing the NORAD relationship.
   Questions: What data can be collected from all stakeholders throughout the pilot process? How will this data be gathered, analyzed, and fed back to the program? With what frequency? How will unexpected critical incidents be handled?
7. Process for operationally addressing any challenges that are identified through the data collection and reporting system.
   Questions: What categories of feedback do we expect to receive? Who should be made aware of this feedback? With what frequency? How will unexpected critical incidents be handled?
8. Documents outlining alignment of SLaM principles, curriculum, and implementation approach to MoE standards and policy.
   Questions: What documentation is needed? Who is the primary audience for the documentation?
9. Sustainability strategy.
   Questions: To what extent are interventions sustainable? What would sustainability of each intervention look like? How can sustainability be strengthened?
10. Regional applicability strategy.
   Questions: To what extent are interventions that were developed in this pilot applicable and appropriate for other contexts? How can regional applicability be strengthened?
11. SLaM Project Model and Theory of Change.
   Questions: Given the experience piloting the project, does the drafted Theory of Change accurately capture the components and relationships within the project? How can we adapt the existing Theory of Change to capture the emerging project model? How do we ensure that the Theory of Change provides space for the development of contextualized interventions?

In the next blog of this series, we’ll discuss our process of developing principles collaboratively with local stakeholders during the inception workshop held in Nepal.

Have you led a principles-focused developmental evaluation process? What are your recommendations for developing guiding principles for evaluation?

[1] Patton, Michael Quinn. 2010. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. New York: Guilford Press.