Mt Isolation (25) – Pilot-Testing Bright Futures

September, 2017

I began a new journey in May of 2016, tracing two long arcs in my life:

  • Climbing all 48 mountains in New Hampshire that are at least 4000 feet (1219m) tall – what local climbers call “peak-bagging.”  I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Working in international development during the MDG era: what it was like in the sector as it boomed and evolved, from the response to the Ethiopian crisis in the mid-1980s through to the conclusion of the Millennium Development Goals in 2015.

So, in the end, there will be 48 posts about climbing 48 mountains and about various aspects of the journey thus far…

*

Leaving Plan International after 15 years, the last four of which were spent as Country Director in Viet Nam, I was fortunate to join CCF as a consultant.  My task, over what became two great years with CCF, was to help develop a new program approach for the agency.  This was exciting and opportune for me: I had been reflecting a lot about how things had changed in the development sector, and at that point I had a lot of experience across five continents, in a wide variety of roles, under my belt.

There was probably nobody in the world better suited for the task.

*

Last time, I wrote extensively about what we came up with: the “Bright Futures” program approach.  We developed the approach through a very thorough process of reflection, benchmarking, and research, and even though the changes foreseen for CCF were very significant and disruptive, senior management and the board embraced our recommendations enthusiastically.  We were given the green light to pilot test the approach in three countries – Ecuador, the Philippines, and Uganda – and I was asked to train staff in each location, accompany the rollout, document learning, and suggest refinements.

This meant that I would continue to work with Michelle Poulton and Daniel Wordsworth, development professionals I’ve described in earlier blogs, people I admired and who were very serious about creating a first-class program organization.

What a fantastic opportunity!

In this blog, I want to describe how the pilot testing went.  But first…

*

I climbed Mt Isolation (4004ft, 1220m) on 8 June 2017.  Since getting to the top of Mt Isolation would be a long hike, I wanted to have a full day, so I drove up the previous afternoon and camped at Dolly Copp Campground in Pinkham Notch.

[Map of the route to Mt Isolation]

As you can see, I went up to the top of Mt Isolation and retraced my steps back, which made for quite a long hike.  I’ve included a large-scale map here, just so that the context for Mt Isolation can be seen: it’s in a basin to the south and east of the Presidential Range, with Mt Washington, Adams, Jefferson, and Monroe to the north, and Eisenhower, Pierce, and Jackson to the west.

Sadly, I spent an uncomfortable night at Dolly Copp, mainly because I had forgotten the bottom (that is to say, lower) half of my sleeping bag!  So I tossed and turned, and didn’t get a great night’s sleep.


But the advantage was that I was able to get an early start on what I thought might be a long climb, leaving the Rocky Branch parking lot and starting the hike at about 7:15am – at least two hours earlier than I would have started if I had driven up from Durham that morning.


The hike in the forest up the Rocky Branch Trail was uneventful, though there was lots of water along the way, and therefore lots of rock-hopping!  That trail isn’t very well maintained, and with recent heavy rains there were long sections that were more stream than path!

I reached the junction of Rocky Branch and Isolation Trail at about 9:15am, two hours from the start of the hike.  I crossed over and headed upstream.  Rocky Branch was full, as expected with all the rain, so crossing was a bit challenging.  There were four more crossings as I headed up, through the forest, before I reached the junction of Isolation Trail and Davis Path at about 11am.  I have to admit that I dipped my boots into the Rocky Branch more than once on the way up, and even had water flow into my boot (over the ankle) once!  So the rest of the hike was done with a wet left foot…


Once I got onto Isolation Trail, I found it was better maintained than Rocky Branch Trail had been.  Evidence of a strong storm was obvious near the top, where I joined Davis Path: lots of downed trees had been cut to clear the trail, but the path was still narrow in places, crowded with downed trees and shrubs on both sides.

As I hiked up Isolation Trail, still in the forest, I began to have views of the Presidential Range.  I reached the turnoff for the spur up to the summit of Mt Isolation at about 11:30am, and reached the top a few minutes later.  So it took me about 4 3/4 hours to reach the top; I didn’t see any other hikers on the way up.

The view from the top was fantastic, probably the best so far of all the hikes in this series: it was clear and dry, and I had the whole Presidential Range in front of me.

[Photo – from the right: Mt Washington, Mt Adams, Mt Jefferson]

[Photo: Mt Eisenhower]


And I had a winged visitor, looking for food.


But I also had hordes of one other particular species of visitor: for only the second time in these 25 climbs, swarms of black flies quickly descended, making things nearly intolerable.  Luckily, I had carried some insect repellent left over from our years in Australia, and once I applied generous quantities to my face, arms, and head, the black flies left me alone.  Otherwise I would have had to leave the summit immediately, which would have been a real shame, because I had walked nearly 5 hours to get there, without very many views along the way!


Happily, I was able to have a leisurely lunch at the top.  The views were glorious, and the flies left me alone.

After I left, retracing my steps down, I did meet a few hikers, including a mother and son who had come up from Glen Ellis Falls.  Descending Rocky Branch, of course, I had to cross the river again, five more times.  However, in this case, I crossed once in error and had to recross to get back to the trail… so, make that seven more times!  Happily, it seemed easier to navigate the crossings on the way back: either the water had gone down (unlikely), or I was a bit more familiar with the right spots to cross.

I arrived back at the parking lot at about 4pm, having taken almost nine hours to climb Mt Isolation.  I was tired, but it was a great day out!

*

Change is complicated and, given the nature of our values-driven organisations, changing international organisations is particularly challenging (see my 2001 article on this topic, available here: NML – Fragmentation Article).  Even though the close association that our best people make between their work and their own personal journeys is a huge advantage for our sector (leading to very high levels of commitment and motivation), this same reality also produces a culture that is often resistant to change.  When we identify ourselves so closely with our work, organisational change becomes personal change, and that’s very complicated!

And the changes implied by Bright Futures were immense, and disruptive.  We were asking pilot countries:

  • to move from: programs being based on a menu of activities defined at CCF’s headquarters, and focused on basic needs;
  • towards: programs based on a broad, localized, holistic and nuanced understanding of the causes and effects of the adversities faced by children, and of the assets that poor people draw on as they confront adversity.

The implication here was that pilot countries would need to deepen their understanding of poverty, and also learn to grapple with the complexity involved in addressing the realities of the lived experience of people living in poverty.  In a sense, staff in pilot countries were going to have to work much harder – choosing from a menu was easy!

  • to move from: programs being led by local community associations of parents, whose task was primarily administrative: choosing from the “menu” of activities, and managing funds and staff;
  • towards: programs being designed to enhance the leading role (agency) of parents, youth, and children in poor communities, by ensuring that they are the primary protagonists in program implementation.

The implication was that pilot countries could build on a good foundation of parents’ groups.  But extending this to learning to work appropriately with children and youth would be a challenge, and transforming all of these groups into authentic elements of local civil society would be very complex!  The reality was that the parents’ associations were often not really managing “their” staff – often it was the other way around – and that would have to change.  Another challenge would be for existing staff, and community members in general, to learn to work with groups of children and youth in non-tokenistic ways.

  • to move from: carrying out all activities at the local community level;
  • towards: implementing projects wherever the causes of child poverty and adversity are found, whether at child, family, community, or Area levels.

This would be a big challenge for pilot countries, because it involved understanding the complex linkages and causes of poverty beyond the local level, and then understanding how to invest funds in new contexts to achieve real, scaled, enduring impact on the causes of child poverty and adversity.

One big obstacle here would be the vested interests that had been created by the flow of funds from CCF into local communities and the parents’ groups. Not an easy task, fraught with significant risk.

And, on top of all of that, Bright Futures envisioned the consolidation of the existing local-level parents’ associations into parent “federations” that would operate at “district” level, relating to local government service provision.  Transforming the role of the existing parents’ associations – from handling (what was, to them) vast quantities of money, to answering to an entirely new body at “district” level – would be a huge challenge.

  • to move from: working in isolation from other development stakeholders;
  • towards: integrating CCF’s work with relevant efforts of other development agencies, at local and national levels.

This would require a whole new set of sophisticated relational and representational competencies that had not been prioritized before.

For example, in a sense, CCF had been operating in a mechanical way – transfer funds from headquarters to parents’ groups, which would then simply choose from a menu of activities that would take place in the local community. Simple, and effective to some extent (at least in terms of spending money!), but no longer suitable if CCF wished to have greater, longer-lasting impact, which it certainly did.

  • to move from: annual planning based on local parents’ groups choosing a set of activities from a menu of outputs, all related to basic needs;
  • towards: planning in a much more sophisticated way, with the overall objectives of building sustainable community capacity, the ability to reflect and learn, and resilience, and of achieving impact over an estimated four 3-year planning periods.

CCF would have to create an entirely new planning system, focused on the “Area” (district) level but linked to planning at Country, Regional, and International contexts.

Fundamental to this new system would be the understanding of the lived reality of people living in poverty; this would be a very new skill for CCF staff.  And pilot countries would have to learn this new planning system immediately, as it would be the foundation of pilot operations… so we had to move very quickly to develop the system, train people, and get started with the new way of planning.  (I will describe that system, the “ASP,” below…)

  • to move from: program activities taking place far from CCF’s operational structure, with visits by staff to local communities once per year;
  • towards: programs being supported much more closely, by decentralizing parts of CCF’s operational structure.

This was a huge change, involving setting up “Area” offices, staffing these offices with entirely new positions, and then shifting roles and responsibilities out from the Country Office.

There was deep institutional resistance to this move, partly because of a semi-ideological attachment to having parents make all programmatic decisions (which I sympathized with, although the evidence was clear that the program activities that resulted were often not high-quality).

But resistance also came from a more mundane, though powerful, source: showing a “massive” increase in staffing on CCF’s financial statements would look bad to charity watchdogs like Charity Navigator and Guidestar.  Even though total staffing levels would actually go down – staffing at the “parents’ associations” would decrease significantly – those association employees had never been shown on CCF’s books, because they were technically employees of the associations.  So the appearance would be a negative one, from a simple bookkeeping, ratio-driven point of view.  And this “point of view” was of the very highest priority to CCF’s senior management, because it strongly influenced donor behavior.

  • to move from: funding program activities automatically, to parents’ groups on a monthly basis, as output “subsidies”;
  • towards: projects being funded according to the pace of implementation. 

This would be another enormous, foundational change, entailing a completely new financial system and new flows of funding and data: now, the Country and Area offices would authorize fund transfers to the local parents’ (and child and youth) associations based on documented progress of approved projects.

All of this would be new, so CCF had to develop project documentation processes and funding mechanisms that provided sufficient clarity and oversight.

To properly test Bright Futures, we would need to provide a lot of support to the pilot countries as they grappled with these, and other, disruptions!

*

In this blog post, I want to describe several aspects of the year during which we piloted Bright Futures in Ecuador, the Philippines, and Uganda, as those countries moved to implement the disruptive changes outlined above: how we helped staff and leadership in the three pilot countries understand what they were going to do; how we worked with them to get ready; and how we accompanied them as they commenced working with the Bright Futures approach.  And how we developed, tested, and implemented an entirely new set of program-planning procedures: the Area Strategic Plan methodology.

As I have just noted, Bright Futures was a profoundly different approach from what these pilot countries were used to, and deeply disruptive.  So we set up what seems to me, in retrospect, to have been a careful, thorough, rigorous, and exemplary process of support and learning.  In that sense, I think it’s worth describing the process in some detail, and worth sharing a sample of the extensive documentation that was produced along the way.

*

Before beginning to pilot, we carefully identified what we would be testing and how we would measure success; we set up processes to develop the new systems and capacities that would be needed in the pilot countries and at CCF’s headquarters; and we established mechanisms to support, and learn from, the pilot countries as they pioneered a very new way of working.

In the end, I worked closely with the three pilot countries for a year – helping them understand what they were going to do, working with them to get ready, and then accompanying them as they commenced working with the Bright Futures approach.  And, along the way, I supported staff in the Richmond headquarters as they grappled with the changes demanded of them, and with the impact of the changes on headquarters systems and structures.

When CCF’s senior management agreed to the pilot testing, the organization’s president (John Schulz) decided that CCF would not make changes to key systems and structures across the agency until pilot testing was complete and full rollout of Bright Futures had been approved.  This meant that the functional departments at headquarters had to develop “work-arounds” so that pilot areas could manage the financial and donor-relations aspects of their work.

This made sense to me: why spend the time and money to develop new systems when we didn’t know if, or how, Bright Futures would work?  But it meant that much of the agency, including all three pilot Country Offices, would be using parallel basic organizational processes, especially financial processes, at the same time, just adding to the complexity!

*

First we brought key staff from each country together with staff from CCF’s headquarters in Richmond, Virginia, to develop a shared understanding of the road ahead, and to create national plans of action for piloting.  Management approved these detailed plans in late May of 2003.

I recently rediscovered several summary videos that I prepared during the creation and pilot testing of what became Bright Futures.  These videos were used to give senior management a visual sense of what was happening in the field.

Here is a short (11-minute) summary video of the preparation workshop that took place in late April of 2003:

[Embedded video]

It’s fun for me to see these images, now 14 years old: the people involved, the approaches we used to start pilot testing Bright Futures.  Staff from all three pilot countries are shown, along with Daniel and Michelle, and other senior staff from Richmond.

One important result of that launch workshop was the production of a set of management indicators which would be used to assess pilot performance: the indicators would be measured in each pilot country before and after the pilot-testing period.  The agreed indicators reflected the overall purposes of the Bright Futures program approach (see my previous blog), and can be found here: Piloting Management Indicators – From Quarterly Report #2.

Once detailed national plans of action were approved, we scheduled “Kickoff” workshops in each pilot country.  These two-day meetings were similar in each location, and included all staff in-country.  On the first day, we would review the background of the pilot, including summary presentations of CCF’s strategic plan, the Organisational Capacity Assessment, and the CCF Poverty Study.  Then the basic principles, concepts, and changes included in the pilot testing were presented and discussed, along with an outline of the pilot schedule.  At the end of the first day, we handed out relevant background documentation and asked participants to study it in preparation for the continuation of the meeting on the second day.

The second day of these Kickoff meetings was essentially an extended question and answer, discussion and reflection session, during which I (and staff from CCF’s headquarters, when they attended) would address concerns and areas where more detail was required.  Occasionally, since I was an external consultant, there were questions that needed discussion with functional departments at CCF’s headquarters, so I tracked these issues and methodically followed them up.

During these initial visits, I also worked with Country Office leadership to help them obtain critical external support in two important and sensitive areas:

  • Given the fundamental nature of the changes being introduced, and in particular noting that only part of the operations in each pilot country would be testing Bright Futures, human-resources issues were crucial.  Bright Futures would demand new competencies, new structures, new positions, and change management would be complex.  So in each country we sought external support from specialised agencies; I worked with CCF’s director of human resources in Richmond, Bill Leedom, to source this support locally;
  • One particular skill, on the program side, would be pivotal: new planning systems would require field staff to master the set of competencies and tools known as “PRA” – participatory rural appraisal.  (I had first come across PRA methods while in Tuluá, at Plan’s Field Office there, back in 1987, but somehow most CCF staff had not become familiar with this approach.  Some had, of course, but this gap in knowledge was an example of how CCF staff had been somewhat isolated from good development practices.)  Since by 2003 PRA was completely mainstream in the development world, there were well-regarded, specialised agencies in most countries that we contacted to arrange training.

Also, in this first round of visits, I worked with local staff to finalise the selection of two pilot “Areas” in each country.  I visited these locations, helped determine the details of staffing in the Areas, and reviewed and decided systems and structural issues (such as how funds would flow, and how local parents’ associations would evolve as district-level “federations” were formed).

*

Once the two “Areas” in each pilot country began working, I started to issue quarterly reports, documenting progress and concerns, and including visit reports, guidance notes issued, etc.  (I continued to visit each country frequently, which meant that I was on the road a lot during that pilot-testing year!)  These quarterly reports contain a very complete record of the pilot-testing experience – useful, at the time, for anybody wanting access to every aspect of our results, and useful, now, for anybody wanting to see what rigorous pilot-testing of an organizational change looks like.

I produced five lengthy, comprehensive quarterly reports during that year, which I am happy to share here:

*

Staff from functional departments at CCF’s headquarters also visited pilot countries, which we encouraged: support from Richmond leadership would be important, and their input was valuable.  Of course, leaders at headquarters would need to be supportive of the Bright Futures model once the pilot-testing year was concluded, if CCF were to scale up the approach, so exposing them to the reality was key, especially because things went well!

We asked these visitors to produce reports, which are included in the quarterly reports linked above.

*

Evidence of an interesting dynamic that developed during the year can be seen in a report produced by Bill Leedom, who was CCF’s HR director at the time.  Bill’s report on a visit he made to Ecuador is included in the Q1FY04 Quarterly Report (Q1FY04 – 2).  In his report, he describes a discussion he had with the Country Director:

“Carlos (Montúfar, the Country Director in Ecuador) and I had a discussion about the role of consultants in the organization. Although it appears at times that the consultant is running the organization it must be the other way around. CCF hires a consultant to help with a process and then they leave. They are a “hired gun.” If changes are recommended they cannot be implemented without his approval as he will have to live with the consequences of whatever was done. The consultant moves on to another job and does not have to suffer any consequences of a bad recommendation or decision but he and his staff have to. I think Carlos was glad to hear this and hopefully will “stand up” to and express his opinions to what he believes might not be good recommendations by consultants.”

When Bill uses the word “consultants,” I know that he is politely referring to me!  My recollection is that this comment reflects a strong dynamic that was emerging as we pilot tested Bright Futures: leadership in the three pilot countries had volunteered to pilot test a particular set of changes, perhaps without fully understanding the ramifications, or without fully understanding that headquarters (meaning, mostly, me!) would be accompanying the pilot process so closely.

Understandably, leaders like Carlos wanted to maintain authority over what was happening in their programs, while headquarters felt that, if we were going to test something, we had to test it as designed and learn what worked and what didn’t, without making changes on the fly.  Only after testing the model as proposed would we make changes or adaptations as we prepared to scale up.  Otherwise, we’d never be able to document the strengths and weaknesses of what we had agreed to pilot.

But not everything went perfectly – that’s why we were pilot testing, to discover what we needed to change!  When things didn’t go well, naturally, people like Carlos wanted to fix them.  That led to tension, particularly in Ecuador – perhaps because the program in that country was (rightly) highly esteemed.

Carlos resisted some of the guidance that I was giving, and we had some frank discussions; it helped that my Spanish was still quite fluent.  But Daniel and Michelle, program leadership in Richmond, made it clear to me, and to Carlos and his regional manager, that we needed to test Bright Futures as it had been designed.  So even though I was an external consultant, I felt that I was on strong ground when I insisted that pilot countries proceed as we had agreed at the launch workshop in April of 2003.

*

From the beginning, we understood that an entirely-new planning, monitoring, and evaluation methodology would need to be developed for Bright Futures.  Since this would be a very large piece of work, we sought additional consulting help, and were fortunate to find Jon Kurtz, who worked with me to prepare and test the Bright Futures “Area Strategic Planning” method, the “ASP.”

We wanted to take the CCF Poverty Study very seriously, which meant that a rigorous analysis of the causes of child poverty and adversity, at various levels, had to be evident in the ASP.  And we had to make sure that program planning reflected all of the principles of Bright Futures – involving, for example, children and youth in the ASP process, incorporating other stakeholders (local NGOs operating in the Area, district government), and so forth.

Area Strategic Planning was aimed at supporting CCF’s goal of achieving broader, deeper, and longer-lasting impact on child poverty.  To do this, the ASP process was guided by several key principles, which can be seen in terms of the goals that the ASP was designed to help programs achieve:

  • Understanding poverty: Programs will be based on a deep understanding of, and be responsive to, the varied nature of child poverty across the communities where CCF works.
  • Leading role: Programs will build the capacities of parents, youth and children to lead their own development. Each group will be given the space and support required to take decisions and action to improve the wellbeing of children in their communities and Areas.
  • Linkages: Programs will be linked to and strengthen the resources that poor people call upon to improve their lives. Efforts will strive to build on the existing energies in communities and on relevant efforts of other development agencies.
  • Accountability: Programs will be recognized by sponsors and donors for their value in addressing child poverty, and at the same time will be accountable to the partner communities, especially the powerless and marginalized groups.
  • Learning: Programs will be based on best practices and continuous learning from experience. Planning, action, and review processes will be linked so that lessons from past programs are reapplied to improve future efforts.

The process for conducting Area Strategic Planning was structured to reflect these principles and aims.  It was foreseen that the proposed ASP process would evolve, and be innovated upon, beyond the pilot year, as Areas discovered other ways to achieve these same goals.  For the purposes of the pilot year, however, the ASP would follow a process consisting of four stages:

  1. Community reflections on child poverty and adversity: Initial immersion and reflection in communities to gain a deep understanding of child poverty in each context, including its manifestations and causes, as well as the resources poor people rely on to address these.
  2. Area synthesis and draft program and project planning: Developing programs and projects which respond to the immediate and structural causes of child ill-being in the Area while building on the existing resources identified.
  3. Community validation, prioritization and visioning: Validating the proposed program responses in communities, prioritizing projects, and developing visions for the future for assessing program performance.
  4. Detailed project planning and ASP finalization: Designing projects together with partners and technical experts, defining capacity building goals for the Area Federation(s), and developing estimated budgets for programs and getting final input on and approval of the ASP.

We settled on a process that would look like this:

[Diagram: CCF’s Area Strategic Planning Model]

The ASP’s Stage Two was crucial: this was where we synthesized the understanding of child poverty and adversity into root causes, compared those root causes with existing resources (latent or actual) in the Area, and created draft programs and projects.

[Diagram: ASP Stage Two]

This step required a bit of “magic” – somehow matching the root causes of child poverty to local resources… and you can see Jon working hard to make it work in the video included below.  But it did work!

I really liked this ASP process – it reflected much of what I had learned in my career, at least on the program side.  It looked good, but we needed to test the ASP before training the pilot countries, so a small team of us (me, Jon, and Victoria Adams) went to The Gambia for a week and tried it out.  In this video you can see Jon working the “magic” – conjuring programs and projects by comparing the root causes of child poverty (broadly understood) with locally-available (existing or latent) resources:

[Embedded video]

I like that there was a large dose of artistry required here; development shouldn’t be linear and mechanical – it should be joyful and serendipitous – and I was proud that our ASP process made space for that.

With the learnings from that test in The Gambia, we finalized a guidance document, detailing underlying principles, the ASP process, detailed procedures, and reporting guidelines and formats.  The version we used for pilot testing can be downloaded here: ASP Guidance – 16.

Later we trained staff in each pilot country on the ASP.  Here is a video that shows some of that process:

[Embedded video]

I often tell one fun anecdote about the ASP training sessions.  Stage One of the process (see the diagram above) required CCF staff to stay for nearly a week in a village where the agency worked, to carry out a thorough investigation of the situation using PRA methods.

In one country (which I will not name!), after the initial training we moved out to the pilot Area to prepare to spend the week in a village.  When we gathered after arriving, to discuss next steps, senior national CCF staff informed me that the “village stay” would not be necessary: since they were not expatriates, they already had a clear idea of the situation in rural areas of their country.

My response was simple: as a consultant, I had no authority to force them to engage in the village stay – or anything else, for that matter – but we wouldn’t continue the training if they were not willing to participate as had been agreed…!

That got their attention, and (after some discussion) they agreed to spend much of the week in local villages.

I was delighted when, at the end of the week, they admitted that things were very different than they had expected in these villages!  They seemed genuine in their recognition that they had learned a lot.

But I wasn’t surprised – these were smart, well-trained people, but they were members of a highly-educated elite from the capital city, distant physically and culturally from rural areas.  So, I think, the village stay was very useful.

*

Along the way, across the year of pilot testing in Ecuador, the Philippines, and Uganda, I issued a series of short guidance notes, which were circulated across CCF.  These notes aimed to explain, for staff who weren’t directly involved, what we were pilot testing; they covered the following topics:

  1. What are we pilot testing?  Piloting Notes – 1.9.  This guidance note explains the basic principles of Bright Futures that we were getting ready to test.
  2. The operational structure of Bright Futures.  Piloting Notes – 2.4.  This guidance note explains how CCF was going to set up Federations and Area Offices.
  3. Recruiting new Bright Futures staff.  Piloting Notes – 3.6.  This guidance note explains how CCF was going to build up the Area structures with new staff.
  4. The CCF Poverty Study.  Piloting Notes – 4.9.  This guidance note gives a summary of the Poverty Study, which would underlie much of the Area Strategic Planning process.
  5. Monitoring and Evaluation.  Piloting Notes – 5.2.  This guidance note explains Area Strategic Planning.
  6. Area Federations.  Piloting Notes – 6.6.  This guidance note explains the ideas behind building the power of people living in poverty by federating their organizations so that they could have more influence on local government service provision.
  7. Finance Issues.  Piloting Notes – 7.3.  This guidance note explains how CCF would change funding from being a “subsidy” of money, remitted every month to parents’ associations, towards a more modern process of funding project activities according to progress.
  8. Partnering.  Piloting Notes – 8.7.  This guidance note outlines the basic concepts and processes underlying one of Bright Futures’ biggest changes: working with and through local civil society.
  9. Growing the Capacity of Area Federations.  Piloting Notes – 9.6.  This guidance note describes how the federated bodies of parents, youth, and children, could become stronger.
  10. The Bright Futures Approach.  Piloting Notes – 10.2.  This guidance note explains the approach in detail.
  11. Child and Youth Agency.  Piloting Notes – 11.  This final guidance note explains the ideas behind “agency” – enabling children and youth to take effective action on things that they find to be important in their communities.

The “Piloting Notes” series was fairly comprehensive, but purposely brief and accessible to the wide range of CCF staff across the world – busy people, with very different language abilities.  The idea was to “over-communicate” the change, so that when the time came to roll out Bright Futures, the agency would be as ready as possible.

*

There is so much more that I could share about that fantastic year.  For example, the work that Andrew Couldridge did helping us grapple with the establishment of Area “Federations” of people living in poverty.  But this blog is already quite long, so I will close it after sharing staff assessments of the pilot testing, and thanking the people who were really driving this positive change in CCF.

*

CCF carried out a formal evaluation of the pilot test of Bright Futures, using an external agency from the Netherlands (coincidentally named Better Futures, I think).  Sadly, I don’t have access to their report, but I think it was quite positive.

But I do have access to the assessment we carried out internally – the summary of that assessment is here: Management Summary – 1.  We surveyed a total of 17 people in the three pilot countries, asking them about the Bright Futures model, HR and structural aspects, the planning process (ASP), Federations, Partnership, working with children and youth, sponsor relations, and support from Richmond.

I want to share some of the findings from the first domain of assessment (the Bright Futures model) and the last domain (support from Richmond).

  • In terms of the basic Bright Futures model, staff in pilot countries felt that its positive aspects were the way it included working in partnership and linking with other development actors, how it changed funding flows, how it deepened the understanding of poverty, and how it enhanced the participation and involvement of the community in general, and of children and youth in particular.
  • On the negative side, staff felt that the Bright Futures model was too demanding on them, that there was not enough capacity in communities, that there was a high cost to community participants (I think this was related to their time), that the piloting moved too quickly, that CCF’s focus moved away from sponsored families, that Bright Futures guidelines were not complete at the beginning of the pilot period, that CCF itself became less visible, that Area staff might become dominant, and that the role of National Office staff was unclear.
  • In terms of support from CCF headquarters, staff in pilot countries felt that visits were very positive, helping clarify issues and giving a sense of accompaniment and solidarity.  The flow of materials (guidance notes, etc.) was also seen positively.
  • On the negative side, support visits were seen as too few and too short, guidelines arrived “just in time” (which caused problems), messages from CCF headquarters were sometimes contradictory, and more support was called for in the later stages of the ASP.

Piloting change is tricky, and leading it from the headquarters of any INGO is even trickier – I think we did very well.

*

Once the pilot phase was evaluated, CCF began to prepare for scaling up – preparing a “second wave” of Bright Futures rollout.  First, we thought about how countries would be “certified” to “go live” with Bright Futures – how would we know that they were “ready”?

To help, we produced a document summarizing how “certification” would be handled: Certification – 1.

Five countries were selected for the “second wave”: Angola, Honduras, Sierra Leone, Sri Lanka, and Zambia.  At this point, I was beginning to transition to another role (see below), so my involvement in the “second wave” was minimal.  But I did help facilitate a “pan-Asia” Bright Futures rollout workshop in Colombo, where I met several people I would later work closely with when I joined ChildFund Australia (Ouen Getigan and Sarah Hunt, for example!).

*

As I’ve described here, piloting the kind of disruptive, fundamental change that was envisioned in Bright Futures brings many challenges.  And once the lessons from pilot testing are incorporated, scaling up brings a different set of complexities: for example, CCF was able to provide very extensive (and expensive) support to the three Bright Futures pilots, but would not be able to cover the entire, global organisation with that same intensity.  So, often, quality drops off.

One gap that we noticed in the support we were providing to the pilot countries was very basic: attitudes, skills, and understanding of poverty and how to overcome it.  For example, as mentioned above, we had tried to partially address this gap by getting training for pilot-country staff in PRA methods.

Next time, in my final “Bright Futures” post, I will describe how we sought to build competencies, and momentum, for Bright Futures by creating and implementing a week-long immersion training, which we called “BF 101.”  In 2009, four years after I completed my time as a consultant with CCF, I was asked to briefly return and create that workshop – “Bright Futures 101” – which we conducted in the Philippines.  So, next time, I will skip ahead in time and describe that fascinating, and successful, experience.

And I will describe how Bright Futures came to a premature end!

*

But before that, I finished my work with CCF by serving as acting Regional Representative for East Africa, based in Addis Ababa.  This assignment was to fill in for the incumbent Regional Representative during her sabbatical.  Jean and I moved to Addis, and I worked with CCF’s offices in Ethiopia, Kenya, and Uganda during those fascinating months.

Then … I would move into the world of activism and human-rights campaigning, joining the Unitarian Universalist Service Committee as Executive Director in 2005.  Stay tuned for descriptions of that fascinating experience.

*

Before closing this final description of the two years I spent as a consultant with CCF, I want to thank Michelle and Daniel for giving me the opportunity to lead this process.  As I’ve said several times, they were doing exemplary work, intellectually honest and open.  It was a great pleasure working with them.

Carlos in Ecuador, Nina in the Philippines, and James in Uganda all did their best to stay true to the principles of Bright Futures, despite the headaches that came with pilot testing such disruptive change.  And they unfailingly welcomed me to their countries and their work on many occasions during those two years.  Thank you!

And I also want to mention and recognize a range of other Richmond-based CCF staff who worked very effectively with us to make the pilot testing of Bright Futures a success: Mike Raikovitz, Victoria Adams, Jason Schwartzman, Jon Kurtz, Andrew Couldridge, Dola Mohapatra, Tracy Dolan, and many others.  It was a great team, and a great effort.

*

Here are links to other blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration.