This learning and reflection brief can help teams explore the value of using a participatory ‘adaptive management’ approach within a participatory, learning- and action-oriented, whole systems programme.
Teams can start to identify if and how their programme may benefit from using adaptive management; the extent to which their programme may already incorporate aspects of adaptive management; where adaptive management principles could potentially be introduced or strengthened; and any possible actions which could support more adaptive programming.
Generally, the key lessons, skills and tools highlighted here are also useful for any team interested in exploring how intentionally and systematically using evidence and learning can shape a programme in real time.
Specifically, this brief will help teams:
Learn what an adaptive management programme involves, including what this approach looks like in development and humanitarian programmes, with some examples from CLARISSA.
Learn about some of the key methods and tools used by CLARISSA for adaptive management such as a reflexive Theory of Change.
Reflect on their own programme(s) and ways of working, and identify actions which could potentially enhance, or help shift a programme towards an adaptive management approach.
Holding periodic After Action Reviews with country and consortium partners played a central role in CLARISSA’s intentional and systematic approach to using learning and evidence in order to adapt. This is addressed in the learning and reflection brief which follows, Brief 4. After Action Reviews.
Terms used in this Brief:
Actionable learning - Learning which is designed to guide decision-making actions.
Children's research groups - Child- and youth-led research groups within CLARISSA which undertook research during COVID.
Watch the video ‘Using evidence and learning to adapt programmes in real time’ where CLARISSA team members from Bangladesh and Nepal provide some perspectives based on their experience of working using adaptive management.
After you’ve watched the video, note down how you think this approach is similar or different to how you currently work. You will discuss this later in the reflection session.
You can download a low-resolution version of this video to watch offline in this Google Drive folder.
Most practitioners will be aware that programme implementation rarely goes according to plan. There will always be programme changes or modifications which need to be made in response to unexpected events or situations, or because something didn’t work out as originally intended.
So on this level, most development and humanitarian work is already required to respond to the context and is expected to make itself ‘fit for context’. However, what does it mean to embrace participatory ‘adaptive management’, and how is this different from the inevitable programme adjustments we expect to make?
For CLARISSA, participatory adaptive management was based on the understanding that the programme would be addressing many causal interdependencies which combine to drive children into the worst forms of child labour; that this would likely involve many actors on different levels; but that there were high levels of uncertainty around what precisely drove the problem and who the key actors might be.
In response to this, the programme’s interventions were intentionally not pre-defined. Rather, the participatory nature of the programme was designed to inform responses and actions in real time. Therefore, from the outset, CLARISSA acknowledged that it was going to need to adapt itself as it went along – it didn’t have all the answers, and needed to generate evidence and learning which it could use to inform its responses.
In order to do this, CLARISSA set about establishing inclusive mechanisms within the consortium partnership to enable partners and teams at all levels to systematically and intentionally learn together about the different assumptions and strategies being used by CLARISSA.
In other words, what was working, what was not working, who did this apply to, and why was this? This reflection and learning, enabled by regular After Action Reviews (see also Learning and Reflection Brief 4. After Action Reviews) brought together learning across the different levels of work. It was then used to collectively decide how the programme would move forward, and the kinds of actions or changes which were required on various levels.
Terms used in this Brief:
Action Research groups - The groups of children or adults in CLARISSA who worked to further research, learn about and act around the different aspects of worst forms of child labour identified from the Life Stories and systems mapping.
PhotoVoice - A participatory visual method where participants use photos to tell a story or narrative about an aspect of their lives. Participatory Approaches Using Creative Methods to Strengthen Community Engagement and Ownership – Resource Pack has plenty of guidance on PhotoVoice.
Reflexivity - The ability to take a step back and think objectively and critically about something.
Qualitative - Information that cannot be counted, measured or easily expressed using numbers.
Quantitative - Information which can be counted or measured.
This type of process was different from simply identifying the changes or actions typically required to deliver a programme. After Action Reviews didn’t just focus on programme delivery, but also on the programme’s underlying assumptions and strategies, and CLARISSA continuously asked itself whether these were still relevant in light of ongoing, participatory, programme-generated learning and evidence.
CLARISSA also considered collaborative governance as core to its approach and tried to optimise governance through adaptations as part of the process. The adaptive management approach was considered participatory because it sought to avoid hierarchical or top-down decision-making.
For instance, After Action Reviews were inclusive of all partners and generally, “it was everybody’s business to learn”. Brief 5. Working with partners provides further insight into how CLARISSA worked to enable an equitable, empowered and flexible partnership to support participatory adaptive management.
If you're honest about your engagement with them (affected peoples), you don't know what the solution is. And so you have to embrace the fact that there's going to be uncertainty. - IDS CLARISSA team member
THE DIFFERENT ASPECTS OF CLARISSA’S PARTICIPATORY ADAPTIVE MANAGEMENT
Adaptive DELIVERY: Adaptive delivery is the reality of programme implementation on the ground. This is the typical kind of flexible delivery which responds to a context or event, and which is often unavoidable. Most practitioners are already familiar with this way of working and will likely make many adaptations to their delivery plans over the course of a programme.
How did CLARISSA work to adapt delivery?
Learning from Action Research groups was captured through programme-supported documentation of the research process, which included monitoring qualitative* and quantitative* indicators to assess how the Action Research groups were working (performance and facilitation) and what they were achieving (innovations and outcomes). Local implementation teams (facilitators and documenters) periodically reviewed the learning to adapt CLARISSA implementation in consultation with the country-level team. Regular management meetings and ad-hoc ‘mini’ After Action Reviews (see Learning and Reflection Brief 4. After Action Reviews) supported the piloting and adjustment of these approaches.
Examples: Brief 6. Safeguarding for Systemic Action Research describes how child-centred, participatory decisions were taken to change the GPS devices originally planned for use, as well as the way children captured data during GIS mapping in Nepal. In another example, Nepali local partners provided emergency food to families during COVID lockdowns. In a third example, the Nepal team decided to change its recruitment criteria for research documenters and field organisers from trained academic researchers to younger graduates, because the graduates were more open to embracing the participatory and child-centred approach required by CLARISSA.
Adaptive PROGRAMMING: Adaptive programming is when more fundamental decisions and changes are made around a programme’s focus areas or ways of working. For instance, deciding to put more energy into a particular issue based on what programme evidence is showing in real time, or deciding to respond to an opportunity that’s recently emerged during implementation. These types of programme adaptations tend to happen much less frequently in many development and humanitarian programmes than adaptations to delivery.
How did CLARISSA work to adapt programming?
After Action Reviews were facilitated on a six-monthly and annual basis within each country and across all the countries (see Brief 4. After Action Reviews). Monitoring data and learning from programme activities — including Action Research groups and the partnership’s self-evaluation process (see Brief 5. Working with partners) — were the main inputs. After Action Reviews examined the programme’s main assumptions and produced actionable learning* reports. After Action Reviews were also timed to ensure that learning could be communicated effectively and usefully from the country to consortium level, thereby allowing programme plans to be adapted accordingly.
Examples: During the set-up phase (very early programme implementation), it became clear that the programme should focus on worst forms of child labour in the context of informal and domestic markets, as opposed to export-oriented markets and big global corporations, as initially thought. This was the result of a collective decision-making process based on evidence drawn from what was happening on the ground. In this way, evidence had already started to challenge CLARISSA’s initial assumptions, and it motivated a significant conceptual shift and change of strategy for the programme.
In another example from Bangladesh, CLARISSA opened an unplanned ‘hub’ office and a community space so that staff from the different partners could spend more time collaborating as a team. The hub office and community space meant they would spend less time driving to different offices through Dhaka’s traffic, and more time working together and building stronger working relationships in a shared physical space nearer to where the research was taking place.
Another example was the decision to phase out the CLARISSA Children’s research group* in Nepal after it had completed an initial PhotoVoice* project. While the value of that initial project was appreciated, the programme decided that more could be derived from focusing on the research by the Action Research groups (see Brief 2. Mapping systems and taking action).
A final example was the development of a mentoring group after the Action Research groups were up and running, in response to a demand from the facilitation teams for more direct and hands-on support from the IDS team. This led to biweekly mentoring sessions that proved crucial for building the teams’ reflexive capacities and enabling a space for real-time troubleshooting.
Adaptive GOVERNANCE: Adaptive governance can include renegotiating a programme plan with a donor, perhaps to reconfigure how the programme is structured, including or excluding features of the programme, and reallocating budget. This commonly happens to varying degrees in many programmes, but often there may be donor restrictions on how radical a change or budget reallocation can be.
How did CLARISSA work to adapt governance?
Actionable learning was constantly fed upwards to the programme’s lead partners, and also annually to the donor via the programme reports. A strongly collaborative and trusting relationship with the donor, established during the co-generation phase, and maintained throughout implementation, also enabled CLARISSA to steer toward its key objectives despite working in an environment with many uncertainties, including COVID restrictions.
Major programme adaptations, designed and agreed through the adaptive management approach, were approved by the donor. Donor representatives also participated in many of the early (co-conception and set up phase) programme workshops where decisions about the programme’s design and partnership composition were collectively made. This included discussions about what would be done, by whom, where it would be done, and who with. As such, the donor was already well-acquainted with the proposed adaptations before being asked to approve them, as it had been part of the collaborative decision-making process.
Examples: Before CLARISSA could get fully underway, the military coup in Myanmar created a difficult operating environment for the programme, so it was agreed that Myanmar would be withdrawn from the programme and the budget reconfigured. The donor also decided it needed to reduce the CLARISSA budget, so a participatory budgeting process was undertaken by all the programme partners to decide how the budget should be reallocated, and which aspects of the programme needed to be modified.
In another example, the programme budget and activities were modified to accommodate the restrictions imposed on international travel by the COVID lockdowns, while also responding to input regarding how the teams were collaborating. These aspects also contributed to the decision to shift from the original workstream-led way of working (whereby teams comprised members from different countries and partners) to a country-led one.
Because here, we do not have anything fixed like other development programmes […] here we are talking about Participatory Action Research. The ethos of this programme is learning by doing. Our design itself has driven us to embrace adaptive management, and that has been from our operations, to governance, and even financial management. - CLARISSA partner team member, Bangladesh
CLARISSA RESOURCES ON ADAPTIVE MANAGEMENT
CLARISSA Blog: Why are effective feedback mechanisms in cash transfers so important? Reflections on how effective feedback mechanisms can support adaptive delivery.
CLARISSA Blog: Art, craft, and the science of facilitation in a complex partnership programme. Team member insights into the After Action Review process.
CLARISSA Blog: 5 reflections on operationalising CLARISSA to generate evidence. Reflections on how CLARISSA got its adaptive management process up and running.
3. Participatory Adaptive Management: Practical learning from CLARISSA
Learning for programme decision-making can become (almost) everybody’s business. At each level of CLARISSA there was a very collaborative approach to decision-making. This allowed diverse team members from different levels to be part of learning and decision-making, and they were involved in making conscious decisions to shift both big and small aspects of the programme.
This inclusive, collaborative and empowering approach was supported by ongoing efforts to monitor and strengthen the functionality of the partnership (see Brief 5. Working with partners); by organising After Action Reviews in different ways, for instance between partners in-country, and not just high-level or big international meetings; by promoting reflection and learning at all levels, including the uptake of mini-After Action Reviews and individual reflective journalling; and by using participatory, creative tools such as the River of Life to support reflection processes.
By the end of the programme, the extended use of After Action Reviews had been embraced by the whole team and became part of the approach of all staff. Some partners also expanded its use to other programmes beyond CLARISSA, and the approach became more central to several partners’ own operations, with corresponding budget allocations.
Despite these efforts, it was also acknowledged by CLARISSA that some team members on the ground were still excluded from decision-making at times, and that these programme power dynamics, linked to the dynamics of aid itself, were not easily or entirely overcome by the mechanisms and strategies CLARISSA used.
It is possible to design a robust, adaptive, impactful programme which doesn’t have a log frame. In many ways, CLARISSA was a radical process, because it didn’t use a log frame for its results framework.
At the beginning of the process, CLARISSA’s lead partner (IDS) explicitly negotiated with the funder to omit the typically required log frame, because it didn’t align with the programme’s adaptive management approach. Instead, the proposal included a high-level Theory of Change and a monitoring, evaluation and learning framework, approved by the funder, which included commitments on outputs and targets, as well as a commitment to monitor and evaluate implementation processes and outcomes as they emerged.
While this was accepted by the funder, the approach remained controversial within the programme itself, with one IDS team member observing “some individuals and some partners found it really hard to embrace”. With the agreed understanding that the purpose of the work was to test out the programme’s Theory of Change and adapt as more was learned in real time, CLARISSA went on to develop its own participatory, co-generated, ‘reflexive’ Theory of Change during the first phase.
This reflexive Theory of Change was designed to appropriately support an ongoing process of critical reflection around the programme assumptions and strategies. Boxes 1 and 2 explain in more detail why and how CLARISSA approached its Theory of Change differently from a typical development or humanitarian programme.
Robust adaptive management programmes require strong monitoring, evaluation and learning capacity. The CLARISSA monitoring, evaluation and learning team enabled other teams to engage with the evidence being generated, both through the After Action Review workshops and across the various decision-making structures and points in time which were built into CLARISSA.
CLARISSA significantly invested in its monitoring, evaluation and learning capacity so it could support informed and deep reflection with teams at all levels. This centred around facilitating participatory reflection by feeding back the copious and rich data which was being collected, as well as by creating an enabling environment through the various systems, mechanisms and activities described here.
When the learning feeds into decision-making, then you can really say you’re intentionally using an adaptive management approach. - IDS CLARISSA team member
BOX 1 The Theory of Change 'Straight-Jacket'
A conventional ‘linear’ approach to a Theory of Change assumes we know everything from the outset; that one set of actions leads neatly, or most likely, to certain outputs and outcomes; and it doesn’t acknowledge the context and complexity of many issues typically addressed in development and humanitarian contexts.
A conventional Theory of Change usually doesn’t allow much flexibility to adapt the programme in response to the changing needs of participants, stakeholders and partners, to emerging learning, or to a major disruption such as COVID.
Most programmes spend a lot of time developing a Theory of Change early on and only return to it at the end to evaluate their performance against the initial plans. Often, how the programme ends up doesn’t really align with the original Theory of Change. For more detail, read the CLARISSA blog: Breaking free from the theory of change straight jacket.
What happens when a Theory of Change is linear and inflexible
The evaluation of complex programmes
I was giving example of journey mapping when we were doing that we were supposed to stay there for 12 hours and fill in a small questionnaire but when we did a field test and shared with IDS colleagues this method is not sufficient to find out the deeper stories. In that case, we had to take a broader observation method […] and we could add an ethnographic observation with journey mapping. So, our methodology has changed with our context... - CLARISSA partner team member, Bangladesh
BOX 2 The reflexive Theory of Change
It is now commonly accepted that conventional Theories of Change are not useful for adaptive programmes that embrace uncertainty, and which use increasingly popular adaptive-styled monitoring, evaluation and learning systems.
A ‘reflexive’ Theory of Change is not simply there “because we have to have one” but can facilitate a critical reflection process through which programme assumptions and the strategies used can be unpacked, considered and adjusted in an ongoing manner.
Imagine you throw a pebble into a lake. You can choose a very small or a large pebble, and you can direct where the pebble goes (because you have good aim) – you can throw it close to the edge of the shore or perhaps far away into the middle of the lake. You can throw it high in the air or skim it across the surface. This is what you can control. When the pebble drops in, you know it will make some ripples in the water around your pebble, but you can’t control these ripples – they may spread evenly or there may be other things in the lake, like the shore or fallen trees, or even a strong wind, which make the ripples act differently, or go in different directions. You and your pebble have influenced these ripples. You can make an educated guess as to what might happen, based on what you can see in the lake and how you threw your pebble, but you can’t control exactly how the ripples act. By watching what happens, you’ll be able to adjust your starting guess when you see where the ripples actually did go and how they acted. Other creatures might notice the activity in the lake, become interested, and move closer to see what is happening, and further ripples might be created around other objects in the lake.
The CLARISSA interactive Theory of Change
This is an analogy of how the CLARISSA programme visualised its own reflexive Theory of Change. This approach is borrowed from the framing used by ‘outcome mapping’ (which identifies the spheres of ‘control’, ‘influence’ and ‘interest’). To do this, a workshop was organised early on with all the partners, including some local partners (but before country teams were in place), during which they explored their understanding of CLARISSA’s potential to influence whole systems change around the worst forms of child labour.
In a participatory session, they mapped all the current actors in the child labour programming system and explored the various pathways which could potentially shift how the issue of worst forms of child labour is currently framed. For example, moving from “worst forms of child labour are a result of unscrupulous business owners” to “what if business owners could become part of the solution?”. This exercise helped everyone gain more clarity about why and how current programmes are designed to address the worst forms of child labour, where there may be limitations, and how the CLARISSA programme might add value and new thinking.
The CLARISSA programme identified three main ‘impact’ pathways – the ‘pebbles’ that create the ripples of impact:
Generating participatory evidence and innovation around the worst forms of child labour through the participatory activities facilitated by the programme.
Supporting advocacy groups around the worst forms of child labour through the child-led and other activities facilitated by the programme.
Working in a bottom-up and participatory way through intentional and ongoing monitoring, evaluation and learning, and the application of CLARISSA’s principles, i.e. child- and people-centred, meaningful participation, and facilitation-driven rather than expert-driven.
It was envisaged that the ‘ripples’ produced by these core programme actions would lead to new understandings of the problem, and would influence and spark diverse and valid actions by participants and stakeholders around the worst forms of child labour. It was also envisaged that broader stakeholders would be interested in the new evidence and learning the CLARISSA programme was generating, and that some would also leverage this knowledge to help reduce the worst forms of child labour.
Although the programme couldn’t control or confidently predict what different actions these might be, it could set itself up to intentionally learn about them as it went along, and to evaluate the changes they were bringing about. In this way, CLARISSA was not bound by a set of specific activities beyond its ‘pebbles’, but it did commit to systematically integrating evidence, reflection and adaptation into the heart of its programme.
An interactive Theory of Change (above) was developed to help illustrate how CLARISSA conceptualised its actions leading to change, and it also provides real examples of what actually happened. By using this interactive tool you can learn more about the CLARISSA ‘pebbles’ and ‘ripples’.
It’s all very well talking about learning, or adaptive management, but there are certain skills that you as a team and as individuals need to learn to turn the practice of adaptive management into true learning. - IDS CLARISSA team member
Using a shared information drive and other technology was critical for CLARISSA’s adaptive management. Microsoft Teams, in particular, was a really important part of the programme’s approach.
CLARISSA generated a huge amount of data, and was committed to consistently and meaningfully using the evidence gathered. The Microsoft Teams platform, combined with a strong monitoring, evaluation and learning team, promoted high degrees of transparency and collaboration, as there was a commonly accessible information and communications platform and a solid structure for participation.
Other meeting platforms and programmes such as Zoom and the Miro whiteboard also enabled remote relationship building, reflection and learning. However, the programme also found that different team members had different capacities in relation to using these online tools, which at times did contribute to some inter-organisational strain.
RESOURCES ON ADAPTIVE MANAGEMENT AND THEORY OF CHANGE
4. Skills, methods and tools for Participatory Adaptive Management
Adaptive management programmes require individuals and teams who are skilled in learning and reflection, who are interested in testing out uncertain strategies, and who reflect and modify based on what is learned. This is often described as an ‘entrepreneurial’ mindset, which involves taking informed ‘risks’ to test a strategy.
Strong communication and collaboration across different programme teams, countries and partners were also critical aspects of a robust adaptive management process within CLARISSA, as well as clear systems for bringing team members together to undertake these processes. Brief 5. Working with partners discusses this in more detail.
This brief focuses on the concept of being a reflexive* team and programme. Reflexivity can be used individually to improve one’s own practice, or as part of a team activity, such as an After Action Review, to collectively reflect on and make decisions about a programme.
Detailed guidance on the tool ‘Rivers of Life’, After Action Reviews, and on individual reflective journalling skills is addressed in Brief 4. After Action Reviews.
Being reflexive: what it is, and what it is not
Being reflexive is the conscious act of stepping back and reflecting critically on how one approaches and implements one’s work, or how a programme is working, and then taking steps to address the aspects which need changing.
The aim of reflection is not to criticise, but to learn from experiences, avoid repeating mistakes, and to take steps to change how work is done.
On an individual level, being reflexive is closely related to self-awareness and being able to reflect on one’s own relative power and how this can affect working relationships. For example, regularly committing to identifying and exploring personal thoughts, feelings, assumptions, skills and experiences and then evaluating how they do or don’t fit in with the programme approach, and how they may influence others around them.
Being reflexive is not about judging personal values, attitudes, feelings or beliefs, or the relative power a person, group or organisation holds. It is about recognising in an objective manner how these might impact on a programme approach, and how they may help or hinder.
This is explored further in Brief 1. Working in a child- and people-centred way
The same understanding above is true at the programme level. Here, teams reflect objectively on the programme itself and identify how certain assumptions may have led to particular strategies, and how these may or may not be leading to an anticipated result.
At the programme level, being reflexive is not about criticising a programme as a failure. Reflexivity allows for failure. It does not judge that the programme has ‘failed’ because a particular strategy didn’t work out. Rather, the reflexive learning process objectively acknowledges and learns from shortcomings, and feeds into improved strategies which follow.
Any person playing a role in implementing a programme should strive to become a reflective practitioner. Being reflexive is a core part of professional (and volunteer) development.
Being reflexive is not just for senior management; anyone and everyone can learn to be more reflexive.
Reflective practice can be undertaken alone, with another person, or in a group/team, or as a whole organisation.
Reflective practice is not about reviewing the performance of a practitioner. The focus is not to supervise staff, teams or volunteers.
An important aspect of being a reflective practitioner is asking probing questions, asking “why”, constructively discussing different team members’ perspectives, assumptions and actions, and the different ways of approaching a question or issue.
Reflective practice is not about who is ‘right’ and who is ‘wrong’, nor making teams or team members feel like they are failing at their job, or that the programme is failing.
Reflective practice is about identifying what has been learned and then using this learning to take actions to improve practice or change course.
Reflective practice isn’t only identifying problems, or judging or evaluating programme outcomes.
Tips from CLARISSA for participatory adaptive management
When working to address a complex challenge, assume you don't know the answer. For instance, in the CLARISSA context, the understanding was that there is weak evidence of what leads children into worst forms of child labour, and that many responses simply don't work.
Working adaptively may not be appropriate in all circumstances. There are relatively simple problems and simple contexts in which more traditional, linear programming approaches are valid.
An adaptive programme isn’t simply vague or ‘anything goes’. It has robust mechanisms in place to collectively learn and adapt at all levels, in order to move forward towards real change.
Expect to make the most changes at earlier stages of the programme, and then stick to these decisions while they are tested.
Embrace change – don’t hold onto things that don’t serve the programme’s goal any more.
Learning is everyone’s business: build a programme which is fully inclusive and participatory so programme teams can learn collectively. Make sure this aspect is adequately funded and centred, rather than simply seen as an added burden.
Don’t be afraid to take a ‘risk’ – be entrepreneurial, test out evidence-informed and collectively monitored assumptions and strategies, and don’t be afraid to fail. Treat failing as an important part of the learning process.
Consider taking the first steps towards being adaptive by reflecting on how this approach could enhance or change the way your programme or organisation works. The team reflection exercise which follows can kick start this thinking.
Start with small actions: For instance, consider making space for being more reflexive as an organisation.
Participatory Visual Methods: a case study. An example on participatorymethods.org of how visual storytelling can open up new spaces for reflection.
Participatory Approaches Using Creative Methods to Strengthen Community Engagement and Ownership – Resource pack: Guidance on using a broad range of participatory, creative methods for facilitation and many links to different tools.
5. Tips on planning and budgeting for adaptive management
Planning and budgeting for an adaptively managed programme can look quite different from a typical programme. For instance, there will be significantly more resources allocated to monitoring, evaluation and learning. Not surprisingly, this is often one of the most challenging aspects of working in an adaptive way, both for implementing organisations and for funders.
Learning so far suggests that even where funders are interested in or committed, in theory, to working in an adaptive way, their organisational and contracting systems and mechanisms are often mismatched. For example, funding contracts may still require a log frame, clear outputs and outcomes, and numbers to be reached, and may have many conditions or restrictions attached to budget modifications or the payment of tranches.
Implementing organisations themselves may also have their own regulations, guidelines and practices which don’t create an enabling environment for an agile, adaptive programming approach. Generally, greater focus is still needed to create the right institutional and funding conditions to enable and facilitate adaptive approaches. This includes a more widespread acceptance of the inherent uncertainties and the risk of failure – an ‘entrepreneurial’ approach – involved in tackling complex challenges and issues.
CLARISSA developed a somewhat detailed budget, but in a participatory, process-oriented way, based around its Systemic Action Research approach. For example, many of the details of activities were only roughly defined, as it was assumed from the outset that activities and resource allocations would be modified over time as learning emerged and activities were decided upon.
This approach was useful when the programme had to respond to shocks such as COVID, political changes affecting the civil society space in Bangladesh, and the funder’s own restructuring and budget cuts. The CLARISSA management team maintained a strong relationship with the funder, and there were also key entrepreneurial-minded champions within the funder’s own management team. All of this contributed to successfully negotiating an adaptive programme with a sufficiently flexible budget arrangement.
For example, CLARISSA wasn’t required to stick rigidly to particular types of budget lines, or to keep within a maximum deviation of 10% on any particular budget line. Significant changes to the budget and allocations of resources were always explained to the donor and were accepted. Budget flexibility allowed the teams to focus resources where they were most needed, and on elements of the programme that were not originally budgeted for, such as the Bangladesh Office Hub.
Teams were also employed for longer than anticipated, mainly due to COVID delays, and therefore cost more than originally budgeted. Donor flexibility in this respect was critically important to the programme’s success. It is possible that the funder’s own necessary budget cuts to CLARISSA contributed to it being more flexible towards the different iterations of the budget, alongside the inclusion of funder representatives in many of the workshops where adaptations were discussed or emerged.
This reflection session is designed to be undertaken as a team. Allow up to two hours. Use your notebooks to record your answers and main points. You’ll need to refer back to these later.
Mini team/programme/organisation self-assessment: How do we work adaptively? (30 mins)
Read the descriptions of the different levels together, then discuss how you would describe your programme’s ability to adapt. Make a note of the key factors which have influenced your assessment.
Different levels of programme adaptation
The levels run along a spectrum from RIGID, through FLEXIBLE, to ADAPTIVE:

Inflexible and fixed: Plans are considered fixed, including most budget allocations. Programme reviews are exceptional and may only be allowed at specific times (such as a mid-term evaluation), provided their impact is limited. E.g. ‘The plan doesn’t work but cannot be changed. We either tweak but pretend to still follow it, or we cancel operations.’

Making ‘reactive’ repairs: Plans are expected to be followed. Even minor adaptations require ad-hoc and time-consuming requests, and explicit high-level approval. E.g. ‘COVID-19 forces us to alter our planned community engagement actions to get back on track.’

Making opportunistic adjustments: Recognises the need for flexibility and change when the context shifts, but the focus is on implementing the plan and achieving its objectives. Learning is ‘accidental’ and implementation is prioritised over learning. E.g. ‘Since travel is not allowed, let’s leverage virtual tools for our capacity development plan.’

Passive adaptive management: Some monitoring and reflective capacity is in place to detect context shifts and challenges. Plans are able to change to achieve the desired outcomes. Learning is considered a useful by-product of programme implementation. E.g. ‘Significant time and resources are wasted in travel. We need to establish a Hub Office closer to the communities.’

Active adaptive management: Intentional and systematic experimentation to validate programme assumptions and to test different strategies. The programme acknowledges it has ‘imperfect’ knowledge and tries to reduce uncertainties by capturing actionable learning. Learning is considered a central objective of management. E.g. ‘Our pilots show that earning trust from communities is harder than expected. Let’s double down on our engagement with grassroots partners.’
Discussion (45 mins)
Discuss the following questions together. Note down the key points for reference later.
Do you have any programme examples of adaptations for enhanced delivery, for programming, and for governance?
How does your programme decide to make these changes? For instance:
What information or learning was used to inform the change?
Who made the decisions? Was the decision participatory?
Is there an organisational space, mechanism or system in place to enable team learning and change? Or are changes made in an ad hoc way?
Does your programme use a Theory of Change? If so, how would you describe it? What do you feel are the strengths and weaknesses of your Theory of Change?
Action brainstorming (30 mins)
Use your discussions and the mini assessment to help you collectively identify any opportunities for actions which could enhance how your programme learns and adapts, and how it can shift closer to working in an adaptive way.