West Bond (37) – Impact Assessment in ChildFund Australia’s Development Effectiveness Framework

June, 2018

International NGOs do their best to demonstrate the impact of their work, to be accountable, to learn and improve.  But measuring change in social-justice work is challenging and complicated, and proving attribution is harder still.  At least, in affordable and participatory ways…

Twice earlier in my career, at Plan International, I had worked to develop and implement systems that would demonstrate impact; both times, we had failed.

In this article I want to describe how, in ChildFund Australia, we succeeded, and were able to build and implement a robust and participatory system for measuring and attributing impact in our work.

Call it the Holy Grail!

*

I’ve been writing a series of blog posts about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall.  And, each time, I’ve also been reflecting a bit on the journey since I joined Peace Corps, 33 years ago: on development, social justice, conflict, experiences along the way, etc.

So far, I’ve described climbing 36 of the 48 peaks, and covered my journey from Peace Corps in Ecuador (1984-86) through to my arrival in Sydney in 2009, where I joined ChildFund Australia as the first “International Program Director.”  This is my 37th post in the series.

In recent posts in this series I’ve been describing aspects of the ChildFund Australia “Development Effectiveness Framework” (“DEF”), the system that would help us make sure we were doing what we said we were going to do and, crucially, verify that we were making a difference in the lives of children and young people living in poverty.  So we could learn and improve our work…

There are three particular components of the overall DEF that I am detailing in more depth, because I think they were especially interesting and innovative.  In my previous blog I described how we used Case Studies to complement the more quantitative aspects of the system.  These Case Studies were qualitative narratives of the lived experience of people affected by changes related to ChildFund’s work, which we used to gain human insights, and to reconnect ourselves to the passions that brought us to the social-justice sector in the first place.

This time, I want to go into more depth on two final, interrelated components of the ChildFund Australia DEF: Outcome Indicator Surveys and Statements of Impact.  Together, these two components of the DEF enabled us to understand the impact that ChildFund Australia was making, consistent with our Theory of Change and organizational vision and mission.  Important stuff!

But first…

*

Last time I described climbing to the top of Mt Bond on 10 August 2017, after having gotten to the top of Bondcliff.  After Mt Bond, I continued on to West Bond (4540ft, 1384m), the last of three 4000-footers I would scale that day.  (But, since this was an up-and-back trip, I would climb Mt Bond and Bondcliff twice!  It would be a very long day.)

As I described last time, I had left the trail-head at Lincoln Woods Visitor Center just after 6:30am, an early start enabled by staying the night before at Hancock Campground on the Kancamagus Highway, just outside of Lincoln, New Hampshire.  I reached the top of Bondcliff at about 10:30am, and the summit of Mt Bond at about 11:30am.

Now I would continue to the top of West Bond, and then retrace my steps to Lincoln Woods:

Bond Map - 6c.jpeg

 

So, picking up the story from the top of Mt Bond, the Bondcliff Trail drops down fairly quickly, entering high-altitude forest, mostly pine and ferns.

IMG_1952.jpg

 

After 20 minutes I reached the junction with the spur trail that would take me to the top of West Bond.  I took a left turn here.  The spur trail continues through forest for some distance:

IMG_1955.jpg

IMG_1958.jpg

 

I reached the top of West Bond at 12:30pm.  The views were remarkable, and I was fortunate to have the summit to myself, so I took my time over lunch.

IMG_1965 (1).jpg

Bondcliff From West Bond

IMG_1972.jpg

At The Summit Of West Bond.  Franconia Ridge And Mt Garfield In The Background.  A Bit Tired!

IMG_1984.jpg

Mt Bond, On The Left, And Bondcliff On The Right

 

Here are two spectacular videos from the top of West Bond.  The first simply shows Bondcliff, with the southern White Mountains in the background:

 

And this second video is more of a full panorama, looking across to Owl’s Head, Franconia Ridge, Garfield, the Twins, Zealand, and back:

 

Isn’t that spectacular?!

After eating lunch at the top of West Bond, I left at a bit before 1pm, and began to retrace my steps towards Lincoln Woods.  To get there, I had to re-climb Mt Bond and Bondcliff.

I reached the top of Mt Bond, for the second time, at 1:20pm.  The view down towards Bondcliff was great!:

IMG_1996.jpg

Bondcliff From The Top Of Mt Bond, Now Descending…

 

Here is a view from near the saddle between Mt Bond and Bondcliff, looking up at the latter:

IMG_2005.jpg

Looking Up At Bondcliff

 

As I passed over Bondcliff, at 2:15pm, I was slowing down, and my feet were getting quite sore.  I was beginning to dread the descent down the Bondcliff, Wilderness, and Lincoln Woods Trails… it would be a long slog.

Here’s a view from there back up towards Mt Bond:

IMG_2007.jpg

A Glorious White Mountain Day – Mt Bond And West Bond, From Bondcliff

 

But there were still 8 or 9 miles to go!  And since I had declined the kind offer to ferry my car up to the Zealand trail-head, which would have saved me 3 miles, I had no other option but to walk back to Lincoln Woods.

It was nearly 5pm by the time I reached the junction where the Bondcliff Trail meets the Wilderness Trail.  By that time, I was truly exhausted, and my feet were in great pain, but (as I said) I had no option but to continue to the car: no tent or sleeping bag, no phone service here.

The Lincoln Woods Trail, as I’ve described in more detail elsewhere, is long and flat and wide, following the remnants of an old forest railway:

IMG_2024

IMG_2025

Sleepers From The Old Forestry Railway

 

Scratches from walking poles?

IMG_2026 (1).jpg

 

It was around 5:30 when I got to the intersection with the Franconia Brook Trail, which is the start of the route up Owl’s Head.

IMG_2028.jpg

IMG_2034.jpg

 

It was a very long slog down Lincoln Woods Trail – put one foot in front of the other, and repeat!  And repeat and repeat and repeat and repeat …

Finally, at 6:40pm, I reached the Lincoln Woods Visitor Center, where I had parked my car at 6:30am that morning: three 4000-footers, 22 miles, and two battered feet in just over 12 hours.

Looking back, I had accomplished a great deal, and the views from the top of three of New Hampshire’s highest and most-beautiful peaks were amazing.  But, at the time, I had little feeling of accomplishment!

IMG_2038 (1).jpg

Knackered!

 

*

Here is the diagram I’ve been using to describe the ChildFund Australia DEF:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework

 

In this article I want to describe two components of the DEF: #2, the Outcome Indicator Surveys; and #12, how we produced “Statements of Impact.”  Together, these two components enabled us to measure the impact of our work.

First, some terminology: as presented in an earlier blog article in this series, we had adopted fairly standard definitions of some related terms, consistent with the logical framework approach used in most mature INGOs:

Screen Shot 2018-05-28 at 2.16.30 PM

 

According to this way of defining things:

  • A Project is a set of Inputs (time, money, technology) producing a consistent set of Outputs (countable things delivered in a community);
  • A Program is a set of Projects producing a consistent set of Outcomes (measurable changes in human conditions related to the organization’s Theory of Change);
  • Impact is a set of Programs producing a consistent set of changes to Outcome Indicators as set forth in the organization’s Strategic Plan.

But that definition of “Impact,” though clear and correct, wasn’t nuanced enough for us to design a system to measure it.  More specifically, before figuring out how to measure “Impact,” we needed to grapple with two fundamental questions:

  • How “scientific” did we want to be in measuring impact?  In other words, were we going to build the infrastructure needed to run randomized control group trials, or would we simply measure change in our Outcome Indicators?  Or somewhere in between?
  • How would we gather data about change in the communities where we worked?  Would we run a census, surveying everybody in a community, which would be relatively costly?  If not, what sampling method would we use that would enable us to claim that our results were accurate (enough)?

*

The question “how ‘scientific’ did we want to be” when we assessed our impact was a fascinating one, getting right to the heart of the purpose of the DEF.  The “gold standard” at that time, in technical INGOs and academic institutions, was to devise “randomized control group” trials, in which you would: implement your intervention in some places, with some populations; identify ahead of time a comparable population that would serve as a “control group” where you would not implement that intervention; and then compare the two groups after the intervention had concluded.

For ChildFund Australia, we needed to decide if we would invest in the capability to run randomized control group trials.  It seemed complex and expensive but, on the other hand, it would have the virtue of putting us at the forefront of the sector and, therefore, appealing to technical donors.

When we looked at other comparable INGOs, in Australia and beyond, there were a couple that had gone that direction.  When I spoke with my peers in some of those organizations, they were generally quite cautious about the randomized control trial (“RCT”) approach: though appealing in principle, in practice it was complex, requiring sophisticated technical staff to design and oversee the measurements, and to interpret results.  So RCTs were very expensive.  Because of the cost, people with practical experience in the matter recommended using RCTs, if at all, only for particular interventions that were either expensive or were of special interest for other reasons.

For ChildFund Australia, this didn’t seem suitable, mainly because we were designing a comprehensive system that we hoped would allow us to improve the effectiveness of our development practice, while also involving our local partners, authorities, and people in communities where we worked.  Incorporating RCTs into such a comprehensive system would be very expensive, and would not involve local people in any meaningful way.

The other option we considered, and ultimately adopted, hinged upon an operational definition of “Impact.”  Building on the general definition shown above (“Impact is a set of Programs producing a consistent set of changes to Outcome Indicators as set forth in the organization’s Strategic Plan”), operationally we decided that:

Screen Shot 2018-06-18 at 3.06.57 PM.png

 

In other words, we felt that ChildFund could claim that we had made a significant impact in the lives of children in a particular area if, and only if:

  1. There had been a significant, measured, positive change in a ChildFund Australia Outcome Indicator; and
  2. Local people (community members, organizations, and government staff) determined in a rigorous manner that ChildFund had contributed to a significant degree to that positive change.

Concretely:

  • If there was no positive change in a ChildFund Australia Outcome Indicator over three years (see below for a discussion of why we chose three years), we would not be able to claim impact;
  • If there was a positive change in a ChildFund Australia Outcome Indicator over three years, and local people determined that we had contributed to that positive change, we would be able to claim impact.

(Of course, sometimes there might be a negative change in a ChildFund Australia Outcome Indicator, which would have been worse if we hadn’t been working in the community.  We were able to handle that situation in practice, in community workshops.)
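
To make the two-condition test concrete, here is a minimal sketch in Python.  It is purely illustrative: the function and variable names are hypothetical, and the cut-off for what counts as contributing “to a significant degree” on the workshop scale is my assumption, not something the DEF specified numerically.

```python
# Hypothetical encoding of the DEF's operational impact rule, for
# illustration only; none of these names come from an actual ChildFund system.
def can_claim_impact(indicator_change: float,
                     change_is_significant: bool,
                     attribution_level: str) -> bool:
    """Impact is claimed only when BOTH conditions hold:
    1. a significant, measured, positive change in a ChildFund Australia
       Outcome Indicator over the three-year survey interval; and
    2. local people, in a structured workshop, attribute a meaningful
       share of that change to ChildFund.
    """
    positive_change = indicator_change > 0 and change_is_significant
    # Assumption: "some" or higher on the workshop's five-point scale
    # ("none", "a little", "some", "a lot", "completely") counts as
    # contributing to a significant degree.
    meaningful_attribution = attribution_level in ("some", "a lot", "completely")
    return positive_change and meaningful_attribution
```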

I felt that, if we approached measuring impact in this way, it would be “good enough” for us – perhaps not as academically robust as using RCT methods, but (if we did it right) certainly good enough for us to work with local people to make informed decisions, together, about improving the effectiveness of our work, and to make public claims of the impact of our work.

So that’s what we did!

*

As a reminder, soon after I had arrived in Sydney we had agreed a “Theory of Change” which enabled us to design a set of organization-wide Outcome Indicators.  These indicators, designed to measure the status of children related to our Theory of Change, were described in a previous article, and are listed here:

Screen Shot 2018-05-28 at 3.16.59 PM

Screen Shot 2018-05-28 at 3.17.10 PM

 

These Outcome Indicators had been designed with technical rigor, and were therefore robust.  And they had been derived from the ChildFund Australia Vision, Mission, and Program Approach, so they measured changes that were organically related to the claims we were making in the world.

So we needed to set up a system to measure these Outcome Indicators; this would become component #2 in the DEF (see Figure 1, above).  And we had to design a way for local partners, authorities, and (most importantly) people from the communities where we worked to assess changes to these Outcome Indicators and reach informed conclusions about who was responsible for causing the changes.

First, let me outline how we measured the ChildFund Australia Outcome Indicators.

*

Outcome Indicator Surveys (Component #2 in Figure 1, Above)

Because impact comes rather slowly, an initial baseline survey was carried out in each location and then, three years later, the measurement was repeated.  The three-year gap was somewhat arbitrary: one year seemed too short, and five years a bit long.  So we settled on three years!

Even though we had decided not to attempt to measure impact using complex randomized control trials, these survey exercises were still quite complicated, and we wanted the measurements to be reliable.  This was why we ended up hiring a “Development Effectiveness and Learning Manager” in each Country Office – to support the overall implementation of the DEF and, in particular, to manage the Outcome Indicator Surveys.  And because the surveys were expensive and tricky to carry out, we usually hired students from local universities to do the actual surveying.

Then we needed to decide what kind of survey to carry out.  Given the number of people in the communities where we worked, we quickly determined that a “census,” that is, interviewing everybody, was not feasible.

So I contacted a colleague at the US Member of the ChildFund Alliance, who was an expert in this kind of statistical methodology.  She strongly advised me to use the survey method that they (the US ChildFund) were using, called “Lot Quality Assurance Sampling” (LQAS), which seemed to be less expensive than other survey methodologies.
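
For readers unfamiliar with the method: LQAS draws a small random sample, classically 19 respondents, from each “supervision area,” and applies a simple pass/fail decision rule derived from the binomial distribution to classify the area as reaching, or missing, a coverage benchmark.  Here is a minimal sketch of how such a decision rule can be computed.  The sample size, benchmark, and error limits shown are the commonly cited textbook defaults, not the parameters ChildFund’s surveys actually used (which I don’t have to hand).

```python
# Minimal sketch of deriving an LQAS decision rule from the binomial
# distribution.  Textbook defaults only -- not ChildFund's survey design.
from scipy.stats import binom

def lqas_decision_rule(n=19, p_target=0.80, p_low=0.50,
                       max_alpha=0.10, max_beta=0.10):
    """Return the smallest threshold d such that classifying an area as
    "reaching p_target coverage" when at least d of n respondents answer
    'yes' keeps both misclassification risks at or below ~10%."""
    for d in range(n + 1):
        alpha = binom.cdf(d - 1, n, p_target)    # risk of failing a good area
        beta = 1.0 - binom.cdf(d - 1, n, p_low)  # risk of passing a weak area
        if alpha <= max_alpha and beta <= max_beta:
            return d, alpha, beta
    return None  # no rule satisfies both risk limits for this n

d, alpha, beta = lqas_decision_rule()
print(f"Pass the area if >= {d} of 19 respondents answer yes "
      f"(alpha = {alpha:.2f}, beta = {beta:.2f})")
```

For a 19-person sample and an 80% benchmark this yields a decision rule of 13, which matches the widely published LQAS tables; it also shows why the method is relatively cheap, since 19 interviews per supervision area are enough to classify it either way.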

(In many cases, during this period, we relied on technical recommendations from ChildFund US.  They were much bigger than the Australia Member, with excellent technical staff, so this seemed logical and smart.  But, as with Plan International during my time there, the US ChildFund Member had very high turnover, which led to many changes in approach.  In practice, this meant that although ChildFund Australia had adopted several of the Outcome Indicators that ChildFund US was using, in the interests of commonality, and (as I said) had begun to use LQAS for the same reason, the US Member soon changed its Indicators and abandoned the use of LQAS, because new staff felt it wasn’t the right approach.  This led to the US Member expressing some disagreement with how we, in Australia, were measuring Impact – even though we were following their (previous) recommendations!  Sigh.)

Our next step was to carry out baseline LQAS surveys in each field location.  It took time to accomplish this, as even the relatively-simple LQAS was a more complex exercise than we were typically used to.  Surveys were supervised by the DEL Managers, and carried out usually by students from local universities.  Finally, the DEL Managers prepared baseline reports summarizing the status of each of the ChildFund Australia Outcome Indicators.

Then we waited three years and repeated the same survey in each location.

(In an earlier article I described how Plan International, where I had worked for 15 years, had failed twice to implement a DEF-like system, at great expense.  One of the several mistakes that Plan had made was that they never held their system constant enough to be comparable over time.  In other words, in the intervening years after measuring a baseline, they tinkered with [“improved”] the system so much that the second measurement couldn’t be compared to the first one!  So it was all for naught, useless.  I was determined to avoid this mistake, so I was very reluctant to change our Outcome Indicators after they were set, in 2010; we did add a few Indicators as we deepened our understanding of our Theory of Change, but that didn’t get in the way of re-surveying the Indicators that we had started with, which didn’t change.)

Once the second LQAS survey was done, three years after the baseline, the DEL Manager would analyze the differences and prepare a report, along with a translation that could be shared with local communities, partners, and government staff.  The DEL Manager, at this point, did not attempt to attribute changes to any particular development actor (local government, other NGOs, the community themselves, ChildFund, etc.), but did share the results with the communities for validation.
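
To illustrate the kind of difference analysis involved, here is a small sketch of a two-proportion z-test comparing a baseline and repeat survey on one indicator.  The numbers are invented for the example, and I should stress that this is not the DEF’s documented analysis procedure; it is simply the standard way to check whether a change in a surveyed proportion is statistically meaningful.

```python
# Sketch of testing whether an Outcome Indicator changed significantly
# between baseline and re-survey, assuming simple random samples.
# Illustrative only; the DEF's actual analysis procedure is not shown here.
from math import sqrt
from scipy.stats import norm

def two_proportion_z(yes_base, n_base, yes_end, n_end):
    """Two-sided z-test for a change in an indicator proportion."""
    p1, p2 = yes_base / n_base, yes_end / n_end
    p_pool = (yes_base + yes_end) / (n_base + n_end)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_base + 1 / n_end))
    z = (p2 - p1) / se
    return p2 - p1, 2 * (1 - norm.cdf(abs(z)))

# Invented example: primary-school completion, 99/190 at baseline
# versus 122/190 three years later (190 = 10 areas x 19 respondents).
change, p_value = two_proportion_z(99, 190, 122, 190)
print(f"change = {change:+.1%}, p = {p_value:.3f}")  # change = +12.1%, p = 0.017
```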

Attribution, instead, was the job of the final DEF component I want to describe.

*

Statements of Impact (Component #12 in Figure 1, Above)

The most exciting part of this process was how we used the changes measured over three years in the Outcome Indicators to assess Impact (defined, as described above, as change plus attribution).

The heart of this process was a several-day-long workshop at which local people would review and discuss changes in the Outcome Indicators, and attribute the changes to different actors in the area.  In other words, if a particular indicator (say, the percentage of boys and girls between 12 and 16 years of age who had completed primary school) had changed significantly, people at the workshop would discuss why the change had occurred – had the local education department done something to cause the change?  Had ChildFund had an impact?  Other NGOs?  The local community members themselves?

Finally, people in the workshop would decide the level of ChildFund’s contribution to the change (“attribution”) on a five-point scale: none, a little, some, a lot, completely.  This assessment, made by local people in an informed and considered way, would then serve as the basic content for a “Statement of Impact” that would be finalized by the DEL Manager together with his or her senior colleagues in-country, Sydney-based International Program Team staff and, finally, me.
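
Since the tabulation itself was simple, a sketch of how such ballots might be tallied is easy to give.  To be clear, this is an illustration, not the workshops’ actual procedure: the source material doesn’t record how the tally sheets were scored, so the modal-response rule, and the tie-break toward the lower level, are my assumptions.

```python
# Illustrative tally of workshop attribution ballots for one Outcome
# Indicator.  The aggregation rule (modal response, ties resolved toward
# the lower level to keep impact claims conservative) is an assumption.
from collections import Counter

SCALE = ["none", "a little", "some", "a lot", "completely"]

def tally_attribution(ballots):
    """Return the attribution level receiving the most votes."""
    counts = Counter(ballots)
    # Rank by vote count; on ties, the level earlier in SCALE wins.
    return max(SCALE, key=lambda level: (counts.get(level, 0), -SCALE.index(level)))

# Hypothetical ballots from eleven workshop participants:
votes = ["a lot"] * 5 + ["some"] * 4 + ["a little"] * 2
print(tally_attribution(votes))  # -> "a lot"
```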

*

We carried out the very first of these “Impact” workshops in Svay Rieng, Cambodia, in February 2014.  Because this was the first of these important workshops, DEL Managers from Laos and Viet Nam attended, to learn, along with three of us from Sydney.

Here are some images of the ChildFund team as we gathered and prepared for the workshop in Svay Rieng:

IMG_2151

IMG_2169

IMG_2202

 

Here are images of the workshop.  First, I’m opening the session:

IMG_8605

 

Lots of group discussion:

IMG_8758

 

The DEL Manager in Cambodia, Chan Solin, prepared a summary booklet for each participant in the workshop.  These booklets were a challenge to prepare, because they would be used by local government, partners, and community members; but Solin did an outstanding job.  (He also prepared the overall workshop, with Richard Geeves, and managed proceedings very capably.)  The booklet presented the results of the re-survey of the Outcome Indicators as compared with the baseline:

IMG_8817

IMG_8795

 

Here participants are discussing results, and attribution to different organizations that had worked in Svay Rieng District over the three years:

IMG_9612

 

Subgroups would then present their discussions and recommendations for attribution.  Note the headphones – since this was our first Impact Workshop, and ChildFund staff were attending from Laos, Viet Nam, and Australia in addition to Cambodia, we provided simultaneous translation:

IMG_9694

 

Here changes in several Outcome Indicators over the three years (in blue and red) can be seen.  The speaker is describing subgroup deliberations on attribution of impact to the plenary group:

IMG_9703

IMG_9719

IMG_9699

IMG_9701

IMG_9747

IMG_9728

IMG_9763

 

Finally, a vote was taken to agree the attribution of positive changes to Outcome Indicators.  Participants voted according to their sense of ChildFund’s contribution to the change: none, a little, some, a lot, or completely.  Here is a ballot and a tabulation sheet:

IMG_9790

 

And here is an image of the participants in that first Statement of Impact workshop: local community members, government staff, and ChildFund staff from the local area, the Country Office, Sydney, and neighboring Viet Nam:

IMG_2299

 

*

Once the community workshops were finished, our local Senior Management would review the findings and propose adjustments to our work.  Then the DEL Managers would prepare the final reports, which we called “Statements of Impact.”

Generally speaking, these reports would include:

  • An introduction from the Country Director;
  • A description of the location where the Statement of Impact was produced, and a summary of work that ChildFund had done there;
  • An outline of how the report was produced, noting the three-year gap between baseline and repeat survey;
  • Findings agreed by the community regarding changes to each Outcome Indicator along with any attribution of positive change to ChildFund Australia;
  • Concluding comments and a plan of action for improvement, agreed by the local Country Office team and me.

Examples of these reports are shared below.

*

This process took some time to get going, because of the three-year delay to allow for re-surveying, but once it commenced it was very exciting.  Seeing the “Statement of Impact” reports come through to Sydney, in draft, from different program countries, was incredible.  They showed, conclusively, that ChildFund was really making a difference in the lives of children, in ways that were consistent with our Theory of Change.

Importantly, they were credible, at least to me, because they showed some areas where we were not making a difference, either because we had chosen not to work in a particular domain (to focus on higher priorities) or because we needed to improve our work.

*

I’m able to share four ChildFund Australia Statements of Impact, downloaded recently from the organization’s website.  These were produced as described in this blog article:

*

Here are a few of the findings from that first “Statement of Impact” in Svay Chrum:

  • ChildFund made a major contribution to the increase in primary-school completion in the district:

Screen Shot 2018-06-27 at 8.49.40 AM.png

 

  • Although the understanding of diarrhea management had improved dramatically, it was concluded that ChildFund had not contributed to this, because we hadn’t implemented any related projects.  “Many development actors contributed to the change”:

Screen Shot 2018-06-27 at 8.52.47 AM.png

 

  • ChildFund had a major responsibility for the improvement in access to hygienic toilets in the district:

Screen Shot 2018-06-27 at 8.49.54 AM.png

 

  • ChildFund made a significant contribution to the increase in access to improved, affordable water in the district:

Screen Shot 2018-06-27 at 8.54.41 AM.png

 

  • ChildFund had made a major contribution to large increases in the percentage of children and youth who reported having opportunities to voice their opinions:

Screen Shot 2018-06-27 at 8.56.08 AM.png

  • Although the percentage of women of child-bearing age in the district who knew how to prevent HIV infection had increased, it was determined that ChildFund had made only a minor contribution to this improvement.  And the group made recommendations regarding youth knowledge, which had actually declined:

Screen Shot 2018-06-27 at 8.57.47 AM.png

 

To me, this is fantastic stuff, especially given that the results emerged from deep and informed consultations with the community, local partners, and local authorities.  Really, this was the Holy Grail – accountability, and lots of opportunity for learning.  The results were credible to me, because they seemed to reflect the reality of what ChildFund had worked on, and pointed out areas where we needed to improve; the report wasn’t all positive!

*

For me, the way that the Outcome Indicator Surveys and Statements of Impact worked was a big step forward, and a major accomplishment.  ChildFund Australia now had a robust and participatory way of assessing impact so that we could take steps to confidently improve our work.  With these last two components of the DEF coming online, we had managed to put in place a comprehensive development-effectiveness system, the kind of system that we had not been able to implement in Plan.

As I shared the DEF – its design, the documents and reports it produced – with our teams, partners, the Australian government, and donors, I began to get lots of positive feedback.  At least for its time, in Australia, the ChildFund Australia DEF was the most comprehensive, robust, participatory, and useful system of its kind that anybody had seen put into place.  Not the most scientific, perhaps, but something much better: usable, useful, and empowering.

*

My congratulations and thanks to the people who played central roles in creating, implementing, and supporting the DEF:

  • In Sydney: Richard Geeves and Rouena Getigan;
  • And the DEL Managers in our Country Offices: Chan Solin (Cambodia), Joe Pasen (PNG), Marieke Charlet (Laos), and Luu Ngoc Thuy and Bui Van Dung (Viet Nam).

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owls’ Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change;
  35. Bondcliff (35) – ChildFund Australia’s Development Effectiveness Framework;
  36. Mt Bond (36) – “Case Studies” in ChildFund Australia’s Development Effectiveness Framework.

 

 

Mt Bond (36) – “Case Studies” In ChildFund Australia’s Development Effectiveness Framework

June, 2018

I’ve been writing a series of blog posts about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall.  And, each time, I’ve also been reflecting a bit on the journey since I joined Peace Corps, 33 years ago: on development, social justice, conflict, experiences along the way, etc.

So far, I’ve described climbing 35 of the 48 peaks, and covered my journey from Peace Corps in Ecuador (1984-86) through to my arrival in Sydney in 2009, where I joined ChildFund Australia as the first “International Program Director.”

Last time I described the ChildFund Australia “Development Effectiveness Framework,” the system that would help us make sure we were doing what we said we were going to do and, crucially, verify that we were making a difference in the lives of children and young people living in poverty.  So we could learn and improve our work…

This time, I want to go into more depth on one component of the DEF, the “Case Studies” that described the lived experience of people that we worked with.  Next time, I’ll describe how we measured the impact of our work.

But first…

*

On 10 August, 2017, I climbed three 4000-footers in one very long day: Bondcliff (4265ft, 1300m), Mt Bond (4698ft, 1432m), and West Bond (4540ft, 1384m).  This was a tough day, covering 22 miles and climbing three very big mountains.  At the end of the hike, I felt like I was going to lose the toenails on both big toes (which, in fact, I did!) … it was a bit much!

Last time I wrote about climbing to the top of Bondcliff, the first summit of that day.  This time, I will describe the brief walk from there to the top of Mt Bond, the tallest of the three Bonds.  And next time I’ll finish describing that day, with the ascent of West Bond and the return to the trail-head at Lincoln Woods.

*

As I described last time, I arrived at the top of Bondcliff at about 10:30am, having left the trail-head at Lincoln Woods Visitor Center just after 6:30am.  I was able to get an early start because I had stayed the night before at Hancock Campground on the Kancamagus Highway, just outside of Lincoln, New Hampshire.

It was a bright and mostly-sunny day, with just a few clouds and some haze.  The path between Bondcliff and Mt Bond is quite short – really just dropping down to a saddle, and then back up again, only 1.2 miles:

Bond Map - 6b

 

It took me about an hour to cover that distance and reach the top of Mt Bond from Bondcliff at 11:30am.  The path was rocky as it descended from Bondcliff, in the alpine zone, with many large boulders as I began to go back up towards Mt Bond – some scrambling required.

This photo was taken at the saddle between Bondcliff and Mt Bond: on the left is Bondcliff, on the right is West Bond, and in the middle, in the distance, is Franconia Ridge; Mt Bond is behind me.  A glorious view on an amazing day for climbing:

IMG_1929.jpg

From the Left: Bondcliff, Franconia Ridge, West Bond

 

The climb got even steeper from the saddle to the summit, passing through some small pine shrubs until just before the top.

The views were spectacular at the summit of Mt Bond, despite the sky being slightly hazy – I could see the four 4000-footers of the Franconia Ridge to the west and Owl’s Head in the foreground, the Presidential Range to the east, and several other 4000-footers to the south and south-west:

IMG_1948 (1)

Looking To The West From The Summit Of Mt Bond

 

And I had a nice view back down the short path from the top of Bondcliff:

IMG_1943 (1)

 

There were a few people at the top, and I had a brief conversation with a couple who were walking from the Zealand trailhead across the same three mountains I was climbing, finishing at Lincoln Woods.  This one-way version of what I was doing as an up-and-back trip was possible because they had left a car at Lincoln Woods, driving to the Zealand trailhead in a second vehicle.  They would then ferry themselves back to Zealand from Lincoln Woods.

Kindly, they offered to pick up my car down at Lincoln Woods and drive it to Zealand, which would have saved me three miles.  I should have accepted, because finishing what became a 22-mile day over three 4000-foot peaks would end up hobbling me for a while, and costing me two toenails!  But I didn’t have a clear sense of how the day would go, so I declined their offer, with sincere thanks…

Getting to the top of Mt Bond was my 36th 4000-footer – just 12 more to go!

I didn’t stay too long at the top of Mt Bond on the way up, continuing towards West Bond… stay tuned for that next time!

*

Jean and I had moved to Sydney in July of 2009, where I would take up the newly-created position of International Program Director for ChildFund Australia.  It was an exciting opportunity for me to work in a part of the world I knew and loved (Southeast Asia: Cambodia, Laos, Myanmar and Viet Nam) and in a challenging new country (Papua New Guinea).  It was a great chance to work with some really amazing people – in Sydney and in our Country Offices… to use what I had learned to help build and lead effective teams.  Living in Sydney would not be a hardship post, either!  Finally, it was a priceless chance for me to put together a program approach that incorporated everything I had learned to that point, over 25 years working in poverty reduction and social justice.

In the previous article in this series, I described how we developed a “Development Effectiveness Framework” (“DEF”) for ChildFund Australia, and I went through most of the components of the DEF in great detail.

My ambition for the DEF was to bring together our work into one comprehensive system – building on our Theory of Change and organizational Vision and Mission, creating a consistent set of tools and processes for program design and assessment, and making sure to close the loop with defined opportunities for learning, reflection, and improvement.

Here is the graphic that we used to describe the system:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework (2014)

 

As I said last time, I felt that three components of the DEF were particularly innovative, and worth exploring in more detail in separate blog articles:

  • I will describe components #2 (“Outcome Indicator Surveys”) and #12 (“Statements of Impact”) in my next article.  Together, these components of the DEF were meant to enable us to measure the impact of our work in a robust, participatory way, so that we could learn and improve;
  • This time, I want to explore component #3 of the DEF: “Case Studies.”

*

It might seem strange to say it this way, but the “Case Studies” were probably my favorite of all the components of the DEF!  I loved them because they offered direct, personal accounts of the impact of projects and programs, from children, youth, men and women in the communities where ChildFund worked, and from the staff and officials of local agencies and government offices with whom ChildFund partnered.  We didn’t claim that the Case Studies were random or representative samples; rather, their value was simply as stories of human experience, offering insights that would not have been readily gained from quantitative data.

Why was this important?  Why did it appeal to me so much?

*

Over my years working with international NGOs, I had become uneasy with the trend towards exclusive reliance on linear logic and quantitative measurement in our international development sector.  This is perhaps a little ironic, since I had joined the NGO world having been educated as an engineer, schooled in the application of scientific logic and numerical analysis to practical problems in the world.

Linear logic is important, because it introduces rigor in our thinking, something that had been weak or lacking when I joined the sector in the mid-1980s.  And quantitative measurement, likewise, forced us to face evidence of what we had or had not achieved. So both of these trends were positive…

But I had come to appreciate that human development was far more complex than building a water system (for example), much more complicated than we could fully capture in linear models.  Yes, a logical, data-driven approach was helpful in many ways, perhaps nearly all of the time, but it didn’t seem to fit every situation in communities that I came to know in Latin America, Africa, and Asia.  In fact, I began to see that an over-emphasis on linear approaches to human development was blinding us to ways that more qualitative, non-linear thinking could help; we seemed to be dismissing the qualitative, narrative insights that should also have been at the heart of our reflections.  No reason not to include both quantitative and qualitative measures.  But we weren’t.

My career in international development began at a time when private-sector business culture started to influence our organizations in a big way: in the wake of the Ethiopian famine of the mid-1980s, INGOs were booming and, as a result, professionalizing.  All the big INGOs started to bring in people from the business world to help “professionalize” our work.

I’ve written elsewhere about the positive and negative effects that business culture had on NGOs: on the positive side, we benefited from systems and approaches that improved the internal management of our agencies, such as clear delegations of authority, financial planning and audit, etc.  Overall, it was a very good, and very necessary, evolution.

But there were some negatives.  In particular, the influx of private-sector culture into our organizations meant that:

  • We began increasingly to view the world as a linear, logical place;
  • We came to embrace the belief that bigger is always better;
  • “Accountability” to donors became so fundamental that sometimes it seemed to be our highest priority;
  • Our understanding of human nature, of human poverty, evolved towards the purely material, things that we could measure quantitatively.

I will attach a copy of the article I wrote on this topic here:  mcpeak-trojan-horse.

In effect, this cultural shift emphasized linear logic and quantitative measures to such a degree, with such force, that narrative, qualitative approaches were sidelined as, somehow, not business-like enough.

As I thought about the overall design of the DEF, I wanted to make 100% sure that we were able to measure the quantitative side of our work, the concrete outputs that we produced and the measurable impact that we achieved (more on that next time): the great majority of our work was amenable to that form of measurement, and being accountable for delivering the outputs (projects, funding) that we had promised was hugely important.

But I was equally determined that we would include qualitative elements that would enable us to capture the lived experience of people facing poverty.  In other words, because poverty is experienced holistically by people, including children, in ways that can be captured both quantitatively and qualitatively, we needed to incorporate both measurement approaches if we were to be truly effective.

The DEF “Case Studies” were one of the ways that we accomplished this goal.  It made me proud that we were successful in this regard.

*

There was another reason that I felt the DEF Case Studies were so valuable, perhaps just as important as the way they enabled us to measure poverty more holistically.  Observing our organizations, and seeing my own response to how we were evolving, I clearly saw that the influence of private-sector business culture was having both positive and negative effects.

One of the most negative impacts I saw was an increasing alienation of our people from the basic motivations that led them to join the NGO sector, a decline in the passion for social justice that had characterized us.  Not to exaggerate, but it seemed that we were perhaps losing our human connection with the hope and courage and justice that, when we were successful, we helped make for individual women and men, girls and boys.  The difference we were making in the lives of individual human beings was becoming obscured behind the statistics that we were using, behind the mechanical approaches we were taking to our work.

Therefore, I was determined to use the DEF Case Studies as tools for reconnecting us, ChildFund Australia staff and board, to the reason that we joined in the first place.  All of us.

*

So, what were the DEF Case Studies, and how were they produced and used?

In practice, Development Effectiveness and Learning Managers in ChildFund’s program countries worked with other program staff and partners to write up Case Studies that depicted the lived experience of people involved in activities supported by ChildFund.  The Case Studies were presented as narratives, with photos, which sought to capture the experiences, opinions and ideas of the people concerned, in their own words, without commentary.  They were not edited to fit a success-story format.  As time went by, our Country teams started to add a summary of their reflections to the Case Studies, describing their own responses to the stories told there.

Initially we found that field staff had a hard time grasping the idea, because they were so used to reporting their work in the dry, linear, quantitative ways that had become the norm.  Perhaps program staff felt that narrative reports were the territory of our Communications teams, meant for public-relations purposes, describing our successes in a way that could attract support for our work.  Nothing wrong with that, they seemed to feel, but not a program thing!

Staff seemed at a loss, unable to get going.  So we prepared a very structured template for the Case Studies, specifying length and tone and approach in detail.  This was a mistake, because we really wanted to encourage creativity while keeping the documents brief; emphasizing the “voice” of people in communities rather than our own views; covering failures as much as successes.  Use of a template tended to lead our program staff back into a structured view of our work so, once staff became more comfortable with the idea and we began to use these Case Studies, we abandoned the rigid template and encouraged innovation.

*

So these Case Studies were a primary source of qualitative information on the successes and failures of ChildFund Australia’s work, offering insights from children, youth and adults from communities where we worked and the staff of local agencies and government offices with whom ChildFund Australia partnered.

In-country staff reviewed the Case Studies, accepting or contesting the opinions of informants about ChildFund Australia’s projects.  These debates often led to adjustments to existing projects, but also triggered new thinking – at the project-activity level, at the program level, and even in the overall program approach.

Case Studies were forwarded to Sydney, where they were reviewed by the DEF Manager; some were selected for a similar process of review by International Program staff, members of the Program Review Committee and, on occasion, by the ChildFund Australia Board.

The resulting documents were stored in a simple cloud-based archive, accessible by password to anyone within the organization.  Some Case Studies were also included on ChildFund Australia’s website; we encouraged staff from our Communications team in Sydney to review the Case Studies and, if suitable, to re-purpose them for public purposes.  Of course, we were careful to obtain informed consent from people included in the documents.

*

Through Case Studies, as noted above, local informants were able to pass critical judgement on the appropriateness of ChildFund’s strategies, show how community members perceived our aims and purposes (not necessarily as we intended), and alert us to unexpected consequences (both positive and negative) of what we did.

For example, one of the first Case Studies written up in Papua New Guinea revealed that home garden vegetable cultivation not only resulted in increased family income for the villager concerned (and positive impact on children in terms of nutrition and education), but also enhanced his social standing by increasing his capacity to contribute to traditional cultural events.

Here are three images from that Case Study:

Screen Shot 2018-06-09 at 3.07.54 PM

Screen Shot 2018-06-09 at 3.07.27 PM

Screen Shot 2018-06-09 at 3.07.41 PM

 

And here is a copy of the Case Study itself:  PNG Case Study #1 Hillary Vegetable farming RG edit 260111.  Later I was able to visit Hillary at his farm!

Another Case Study came from the ChildFund Connect project, an exciting effort led by my former colleagues Raúl Caceres and Kelly Royds, who relocated from Sydney to Boston in 2016.  I climbed Mt Moriah with them in July, 2017, and also Mt Pierce and Mt Eisenhower in August of 2016.  ChildFund Connect was an innovative project that linked children across Laos, Viet Nam, Australia and Sri Lanka, providing a direct channel for them to build understanding of their differing realities.  This Case Study on their project came from Laos: LAO Case Study #3 Connect DRAFT 2012.

In a future article in this series, I plan on describing work we carried out building the power (collective action) of people living in poverty.  It can be a sensitive topic, particularly in areas of Southeast Asia without traditions of citizen engagement.  Here is a Case Study from Viet Nam describing how ChildFund helped local citizens connect productively with authorities to resolve issues related to access to potable water: VTM Case Study #21 Policy and exclusion (watsan)-FINAL.

*

Dozens of Case Studies were produced, illustrating a wide range of experiences with the development processes supported by ChildFund in all of the countries where we managed program implementation.  Reflections from many of these documents helped us improve our development practice and, at the same time, helped us stay in touch with the deeper purpose behind our choice to work for social justice, accompanying people living in poverty as they built better futures.

*

A few of the DEF Case Studies focused, to some extent, on ChildFund Australia itself.  For example, here is the story of three generations of Hmong women in Nonghet District in Xieng Khouang Province in Laos.  It describes how access to education has evolved across those generations:  LAO Case Study #5 Ethnic Girls DRAFT 2012.  It’s a powerful description of change and progress, notable also because one of the women featured in the Case Study was a ChildFund employee, along with her mother and daughter!

Two other influential Case Studies came from Cambodia, both of which touched on how ChildFund was attempting to reconcile our child-sponsorship mechanisms with our programmatic commitments.  I’ve written separately, some time ago, about the advantages of child sponsorship when it is managed well (as we did in Plan and especially in ChildFund Australia); these two Case Studies evocatively illustrated the challenge, and the ways that staff in Cambodia were making it all work well.

One Case Study describes some of the tensions implicit in the relationship between child sponsorship and programming, and the ways that we were making progress in reconciling these differing priorities: CAM Case Study 6 Sponsorship DRAFT 2012.  This Case Study was very influential, with our staff in Cambodia and beyond, with program staff in Sydney, and with our board.  It powerfully communicated a reality that our staff, and families in communities, were facing.

A second Case Study discussed how sponsorship and programs were successfully integrated in the field in Cambodia: CAM Case Study #10 Program-SR Integration Final.

*

As I mentioned last time, given the importance of the system, relying on our feeling that the DEF was a great success wasn’t good enough.  So we commissioned two independent, expert external reviews of the DEF.

The first review (attached here: External DEF Review – November 2012), which was concluded in November of 2012, took place before we had fully implemented the system.  In particular, since Outcome Indicator Surveys and Statements of Impact (to be covered in my next blog article) were implemented only after three years (and every three years thereafter), we had not yet reached that stage.  But we certainly were quite advanced in the implementation of most of the DEF, so it was a good time to reflect on how it was going.

I included an overview of the conclusions reached by both reviewers last time.  Here I want to quote from the first evaluation, with particular reference to the DEF Case Studies:

One of the primary benefits of the DEF is that it equips ChildFund Australia with an increased quantity and quality of evidence-based information for communications with key stakeholders including the Board and a public audience. In particular, there is consolidated output data that can be easily accessed by the communications team; there is now a bank of high quality Case Studies that can be drawn on for communication and reflection; and there are now dedicated resources in-country who have been trained and are required to generate information that has potential for communications purposes. The increase in quantity and quality of information equips ChildFund Australia to communicate with a wide range of stakeholders.

One of the strengths of the DEF recognized by in-country staff particularly is that the DEF provides a basis for stakeholders to share their perspectives. Stakeholders are involved in identifying benefits and their perspectives are heard through Case Studies. This has already provided a rich source of information that has prompted reflection by in-country teams, the Sydney based programs team and the ChildFund Australia Board.

This focus on building tools, systems and the overall capacity of the organization places ChildFund Australia in a strong position to tackle a second phase of the DEF which looks at how the organization will use performance information for learning and development. It has already started on this journey, with various parts of the organization using Case Studies for reflection. ChildFund Australia has already undertaken an exercise of coding the bank of Case Studies to assist further analysis and learning. There is lots of scope for next steps with this bank of Case Studies, including thematic reflections. Again, the benefits of this aspect have not been realised yet as the first stages of the DEF roll-out have been focused on data collection and embedding the system in CF practices.

In most Country Offices, Case Studies have provided a new formal opportunity for country program staff to reflect on their work and this has been used as a really constructive process. The Laos Country Office is currently in the process of translating Case Studies so that they can be used to prompt discussion and learning at the country level. In PNG, the team is also interested in using the Case Studies as a communication tool with local communities to demonstrate some of the achievements of ChildFund Australia programs.

In some cases, program staff have found Case Studies confronting when they have highlighted program challenges or weaknesses. The culture of critical reflection may take time to embed in some country offices and may be facilitated by cross-country reflection opportunities. Currently, however, Country Office staff do not know how to access Case Studies from other country programs. ChildFund Australia is exploring how the ‘bank’ of DEF Case Studies would be most accessible and useful to country office personnel.

One of the uses of Case Studies has been as a prompt for discussion and reflection by the programs team in Sydney and by the Board. Case Studies have been seen as a really useful way to provide an insight into a program, practice and ChildFund Australia achievements.

At an organizational level, an indexing and cross-referencing system has been implemented which enables Case Studies to be searched by country and by theme. The system is yet to be introduced to MEL and Program users, but has potential to be a very useful bank of qualitative data for reflection and learning. It also provides a bank of data from which to undertake thematic reflections across and between countries. One idea for consideration is that ChildFund draw on groups of Case Studies to develop practice notes.

In general Case Studies are considered to be the most ‘successful’ part of the DEF by those involved in collecting information.

The second reviewer concentrated on other components, mainly aspects I will describe in more detail in my next article, not so much the Case Studies…

*

So the Case Studies were a very important element in the overall DEF.  I tried very hard to incorporate brief reflections on selected Case Studies at every formal meeting of the International Program Team, of ChildFund Australia’s Program Review Committee, and (less frequently) at meetings of our Board of Directors.  More often than not, time pressures on the agendas of these meetings led to us dropping the Case Studies from discussion, but often enough we did spend time (usually at the beginning of the meetings) reflecting on what we saw in them.

At the beginning, when we first began to use the Case Studies, our discussion tended to be mechanical: pointing out errors in the use of English, or questioning how valid the observations might be, challenging the statistical reliability of the conclusions.  But, over time, I noticed that our teams began to use the Case Studies as they were designed: to gain insight into the lived experience of particular human beings, and to reconnect with the realities of people’s struggle for better lives for themselves and their children.

This was a great success, and really worked as I had hoped.  The Case Studies complemented the more rigorous, quantitative components of the DEF, helping the system be holistic, enabling us to see more deeply into the effect that our work was having while also enhancing our accountability.

*

Next time, I will describe getting to the top of West Bond, and all the way down the 11 miles from there to the Lincoln Woods parking lot, where I staggered back to my car with such damage to my feet that I would soon lose the toenails on both my big toes!  And I will share details of the final two components of the DEF that I want to highlight: the Outcome Indicator Surveys and Statements of Impact, probably the culmination of the whole system.

So, stay tuned!

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owl’s Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change;
  35. Bondcliff (35) – ChildFund Australia’s Development Effectiveness Framework.

Bondcliff (35) – ChildFund Australia’s Development Effectiveness Framework

June, 2018

I began a new journey just over two years ago, in May, 2016, tracing two long arcs in my life:

  • During those two years, I’ve been climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers. I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Alongside descriptions of those climbs, I’ve been sharing what it was like working in international development during the MDG era: as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980s through to the conclusion of the Millennium Development Goals in 2015.

In each article, I am writing about climbing each of those mountains and, each time, I reflect a bit on the journey since I began to work in social justice, nearly 34 years ago: on development, human rights, conflict, experiences along the way, etc.

So, when I wrap things up in this series, there should be 48 articles…

*

In 2009 Jean and I moved to Sydney, where I took up a new role as International Program Director for ChildFund Australia, a newly-created position.  On my way towards Sydney, I was thinking a lot about how to build a great program, and how I would approach building a strong team – my intention was to lead and manage with clarity, trust, and inspiration.  A few weeks ago, I wrote describing the role and staffing and structural iterations of ChildFund’s International Program Team and, last time, I outlined the foundational program approach we put in place – a Theory of Change and Outcome and Output Indicators.

Once the program approach was in place, as a strong foundation, we moved forward to build a structured approach to development effectiveness.  I am very proud of what we achieved: the resulting ChildFund Australia “Development Effectiveness Framework” (“DEF”) was, I think, state-of-the-art for international NGOs at the time.  Certainly few (if any) other INGOs in Australia had such a comprehensive, practical, useful system for ensuring the accountability and improvement of their work.

Since the DEF was so significant, I’m going to write three articles about it:

  1. In this article I will describe the DEF – its components, some examples of products generated by the DEF, and how each part of the system worked with the other parts.  I will also share results of external evaluations that we commissioned on the DEF itself;
  2. Next time, I will highlight one particular component of the DEF, the qualitative “Case Studies” of the lived experience of human change.  I was especially excited to see these Case Studies when they started arriving in Sydney from the field, so I want to take a deep dive into what these important documents looked like, and how we attempted to use them;
  3. Finally, I will describe the last two DEF components that came online (Outcome Indicator Surveys and Statements of Impact), the culmination of the system, where we assessed the impact of our work.

So there will be, in total, three articles focused on the DEF.  This is fitting, because I climbed three mountains on one day in August of 2017…

*

On 10 August, 2017, I climbed three 4000-footers in one day: Bondcliff (4265ft, 1300m), Mt Bond (4698ft, 1432m), and West Bond (4540ft, 1384m).  This was a very long, very tough day, covering 22 miles and climbing three mountains in one go.  At the end of the hike, I felt like I was going to lose the toenails on both big toes… and, in fact, that’s what happened.  As a result, for the rest of the season I would be unable to hike in boots and had to use hiking shoes instead!

Knowing that the day would be challenging, I drove up from Durham the afternoon before and camped, so I could get the earliest start possible the next morning.  I got a spot at Hancock Campground, right near the trailhead where I would start the climb:

IMG_1871.jpg

 

The East Branch of the Pemigewasset River runs alongside this campground; I spent a pleasant late afternoon there reading a book by John Paul Lederach, and when it was dark I crawled into my sleeping bag and got a good night’s sleep.

IMG_1868

IMG_1869

 

Here is a map of the long ascent that awaited me the next morning, getting to the top of Bondcliff:

Bond Map - 3.jpg

 

After Bondcliff, the plan was that I would continue on to climb Mt Bond and West Bond, and then return to Lincoln Woods… more on that in the next two articles in this series.  In this one I will describe climbing the first 4000-footer of that day, Bondcliff.

I got an early start on 10 August, packing up my tent-site and arriving at the trailhead at Lincoln Woods at about 6:30am:

IMG_1873.jpg

 

It was just two weeks earlier that I had parked here to climb Owl’s Head, which I had enjoyed a lot.  This time, I would begin the same way – walking up the old, abandoned forestry railway for about 2.6 miles on Lincoln Woods Trail, to where I had turned left up the Franconia Brook Trail towards Owl’s Head.  I arrived at that junction at about 7:30am:

IMG_1883.jpg

IMG_1891.jpg

This time I would continue straight at that intersection, onto the Wilderness Trail, which winds through forest for a short distance before opening out again along another old logging railway, complete with abandoned hardware along the way, discarded over 130 years ago:

IMG_1893.jpg

 

At the former (now abandoned) Camp 16, around 4.4 miles from the parking lot at Lincoln Woods, I took a sharp left and joined a more normal trail – no more old railway.  I began to ascend moderately, going up alongside Black Brook: now I was on the Bondcliff Trail.

 

I crossed Black Brook twice on the way up after leaving the Wilderness Trail, and then crossed two dry beds of rock, which were either rock slides or upper reaches of Black Brook that were dry that day.

IMG_1898.jpg

 

It’s a long climb up Black Brook; after the second dry crossing, Bondcliff Trail takes a sharp left turn and continues ascending steadily.  Just before reaching the alpine area, and the summit of Bondcliff, there is a short steep section, where I had to scramble up some bigger boulders.  Slow going…

But then came the reward: spectacular views to the west, across Owl’s Head to Franconia Ridge, up to Mt Garfield, and over to West Bond and Mt Bond.  Here Mt Lincoln and Mt Lafayette are on the left, above Owl’s Head, with Mt Garfield to the right:

IMG_1905

Lincoln and Lafayette In The Distance On The Left, Mt Garfield In The Distance On The Right

 

Here is a view looking to the southwest from the top of Bondcliff:

IMG_1907

From The Summit Of Bondcliff

IMG_1920

From The Summit Of Bondcliff

 

And this is the view towards Mt Bond, looking up from the top of Bondcliff:

IMG_1925

West Bond Is On The Left, And Mt Bond On The Right

 

I got to the top of Bondcliff at about 10:30am, just about four hours from the start of the hike.  Feeling good … at this point!  Here is a spectacular view back down towards Bondcliff, taken later in the day, from the top of West Bond:

IMG_1964.jpg

 

I would soon continue the climb, with a short hop from Bondcliff up to the top of Mt Bond.  Stay tuned!

*

Last time I wrote about how we built the foundations for ChildFund Australia’s new program approach: a comprehensive and robust “Theory of Change” that described what we were going to accomplish at a high level, and why; a small number of reliable, measurable, and meaningful “Outcome Indicators” that would enable us to demonstrate the impact of our work, related explicitly to that Theory of Change; and a set of “Output Indicators” that would allow us to track our activities in a consistent and comparable manner across all our programs: in Cambodia, Laos, Papua New Guinea, and Viet Nam.  (Myanmar was a slightly different story, as I will explain later…)

Next, on that foundation, we needed a way of thinking holistically about the effectiveness of our development work: a framework for planning our work in each location, each year; for tracking whether we were doing what we had planned; for understanding how well we were performing; and for improving the quality and impact of our work.  And doing all this in partnership with local communities, organizations, and governments.

This meant being able to answer five basic questions:

  1. In light of our organizational Theory of Change, what are we going to do in each location, each year?
  2. How will we know that we are doing what we planned to do?
  3. How will we know that our work makes a difference and gets results consistent with our Theory of Change?
  4. How will we learn from our experience, to improve the way we work?
  5. How can community members and local partners directly participate in the planning, implementation, and evaluation of the development projects that ChildFund Australia supports?

Looking back, I feel that what we built and implemented to answer those questions – the ChildFund Australia “Development Effectiveness Framework” (“DEF”) – was our agency’s most important system.  Because what could be more important than the answers to those five questions?

*

I mentioned last time that twice, during my career with Plan International, we had tried to produce such a system, and failed (at great expense).  We had fallen into several traps that I was determined to avoid repeating this time, in ChildFund Australia, as we developed and implemented the DEF:

  • We would build a system that our teams could use practically, with the informed participation of local partners and staff – one that was “good enough” for its purpose, instead of a system that had to be managed by experts, as we had done in Plan;
  • We would include both quantitative and qualitative information, serving the needs of head and heart, instead of building a wholly-quantitative system for scientific or academic purposes, as we had done in Plan;
  • We would not let “the best be the enemy of the good,” and I would make sure that we moved to rapidly prototype, implement, and improve the system instead of tinkering endlessly, as we had done in Plan.

I go into more detail about the reasons for Plan’s lack of success in that earlier article.

*

Here is a graphic that Caroline Pinney helped me create, which I used very frequently to explain how the DEF was designed, and how it functioned and performed:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework (2014)

 

In this article, I will describe each component of the DEF, showing how the components relate to each other and to the five questions listed above.

However, I’m going to reserve discussion of three of those components for my next two articles:

  • Next time, I will cover #3 in Figure 1, the “Case Studies” that we produced.  These documents helped us broaden our focus from the purely quantitative to include consideration of the lived experience of people touched by the programs supported by ChildFund Australia.  At the same time, the Case Studies served as valuable tools for our staff, management, and board to retain a human connection to the spirit that motivated us to dedicate our careers to social justice;
  • And, after that, I will devote an article to our “Outcome Indicator Surveys” (#2 in Figure 1, above) and Statements of Impact (#12 in Figure 1).  The approach we took to demonstrating impact was innovative, very participatory, and successful.  So I want to go into a bit of depth describing the two DEF components involved.

Note: I prepared most of what follows.  But I have included and adapted some descriptive material produced by the two DEF Managers who worked in the International Program Team: Richard Geeves and Rouena Getigan.  Many thanks to them!

*

Starting Points

The DEF was based on two fundamental statements of organizational identity.  As such, it was built to focus us on, and enable us to be accountable for, what we were telling the world we were:

  1. On the bottom left of the DEF schematic (Figure 1, above) we reference the basic documents describing ChildFund’s identity: our Vision, Mission, Strategic Plan, Program Approach, and Policies – all agreed and approved by our CEO (Nigel Spence) and Board of Directors.  The idea was that the logic underlying our approach to Development Effectiveness would therefore be grounded in our basic purpose as an organization, overall.  I was determined that the DEF would serve to bring us together around that purpose, because I had seen Plan tend to atomize, with each field location working towards rather different aims.  Sadly, Plan’s diversity seemed to be far greater than required if it were simply responding to the different conditions we worked in.  For example, two Field Offices within 20 km of each other in the same country might have very different programs.  This excessive diversity seemed to relate more to the personal preferences of Field Office leadership than to any difference in the conditions of child poverty or the local context.  The DEF would help ChildFund Australia cohere, because our starting point was our organizational identity;
  2. But each field location did need a degree of flexibility to respond to its own reality, within ChildFund’s global identity, so at the bottom of the diagram we placed the Country Strategy Paper (“CSP”), quite centrally.  This meant that, in addition to building on ChildFund Australia’s overall purpose and identity globally, we would also build our approach to Development Effectiveness on how we chose to advance that basic purpose in each particular country where we worked, with that country’s particular characteristics.

Country Strategy Paper

The purpose and outline of the CSP was included in the ChildFund Australia Program Handbook:

To clarify, define, communicate and share the role, purpose and structure of ChildFund in-country – our approach, operations and focus. The CSP aims to build a unity of purpose and contribute to the effectiveness of our organisation.

When we develop the CSP we are making choices, about how we will work and what we will focus on as an organisation. We will be accountable for the commitments we make in the CSP – to communities, partners, donors and to ourselves.

While each CSP will be different and reflect the work and priorities of the country program, each CSP will use the same format and will be consistent with ChildFund Australia’s recent program development work.

During the development of the CSP it is important that we reflect on the purpose of the document. It should be a useful and practical resource that can inform our development work. It should be equally relevant to both our internal and external stakeholders. The CSP should be clear, concise and accessible while maintaining a strategic perspective. It should reflect clear thinking and communicate our work and our mission. It should reflect the voice of children.  Our annual work plans and budgets will be drawn from the CSP and we will use it to reflect on and review our performance over the three year period.

Implementation of the DEF flowed from each country’s CSP.

More details are found in Chapter 5 of the Program Handbook, available here: Program Handbook – 3.3 DRAFT.  Two examples of actual ChildFund Australia Country Strategy Papers from my time with the organization are attached here:

For me, these are clear, concise documents that demonstrate coherence with ChildFund’s overall purpose along with choices driven by the situation in each country.

*

Beginning from the Country Strategy Paper, the DEF branches in two inter-related (in fact, nested) streams, covering programs (on the left side) and projects (on the right side).  Of course, projects form part of programs, consistent with our program framework:

Screen Shot 2018-05-28 at 2.16.30 PM

Figure 2: ChildFund Australia Program Framework

 

But it was difficult to depict this embedding on the two dimensions of a graphic!  So Figure 1 showed programs on one side and projects on the other.

Taking the “program” (left) side first:

Program Description

Moving to the left side of Figure 1: derived from, and summarized in, the Country Strategy Paper, each Country Office defined a handful of “Program Descriptions” (noted as #1 in Figure 1) – some countries had 3, others ended up with 5 – each one describing how a particular set of projects would, together, create impact, as measured using ChildFund Australia’s Outcome Indicators.  In other words, each Program Description was a “Theory of Change,” detailing how the projects included in the program linked together to create particular positive change.

The purpose and outline of the Program Description was included in the ChildFund Australia Program Handbook:

ChildFund Australia programs are documented and approved through the use of “Program Descriptions”.  All Program Descriptions must be submitted by the Country Director for review and approval by the Sydney International Program Director, via the International Program Coordinator.

For ChildFund Australia: a “program” is an integrated set of projects that, together, have direct or indirect impact on one or more of our agreed organisational outcome indicators.   Programs normally span several geographical areas, but do not need to be implemented in all locations; this will depend on the geographical context.  Programs are integrated and holistic. They are designed to achieve outcomes related to ChildFund Australia’s mission, over longer periods, while projects are meant to produce outputs over shorter timeframes.

Program Descriptions were summarized in the CSP, contained a listing of the types of projects (#5 in Figure 1) that would be implemented, and were reviewed every 3 or 4 years (Program Review, #4 in Figure 1).

To write a Program Description, ChildFund staff (usually program managers in a particular Country Office) were expected to review our program implementation to date, and to carry out extensive situational analyses: of government policies, plans and activities in the sector, and of communities’ needs in terms of assets, aspirations and ability to work productively with the local government officials responsible for service provision.  The results of ChildFund’s own Outcome Indicator Surveys and community engagement events obviously provided very useful evidence in this regard.

Staff then proposed a general approach for responding to the situation, and specific strategies which could be delivered through a set of projects.  They would also show that the approach and strategies proposed were consistent with evidence of good practice both globally and in-country, demonstrating that their choices were evidence-based.

Here are 2 examples of Program Descriptions:

Producing high-quality Program Descriptions was a surprising challenge for us, and I’m not sure we ever really got this component of the DEF right.  Probably the reason we struggled was that these documents were rather abstract, and our staff weren’t used to operating at this level of abstraction.

Most of the initial draft Program Descriptions were quite superficial, and were approved only as place-holders.  Once we started to carry out “Program Reviews” (see below), however, where more rigor was meant to be injected into the documents, we struggled.  It was a positive, productive struggle, but a struggle nonetheless!

We persisted, however, because I strongly believed that our teams should be able to articulate why they were doing what they were doing, and the Program Descriptions were the basic tool for exactly that explanation.  So we persevered, hoping that the effort would result in better programs, more sophisticated and holistic work, and more impact on children living in poverty.

*

Program Reviews

For the same reasons outlined above, in my discussion of the “Program Descriptions” component of the DEF, we also struggled with the “Program Review” (#4 in Figure 1, above).  In these workshops, our teams would consider an approved “Program Description” (#1 in Figure 1) every three or four years, subjecting the document to a formal process of peer review.

ChildFund staff from other countries visited the host country to participate in the review process and then wrote a report making recommendations for how the Program under review might be improved.  The host country accepted (or debated and adjusted) the  recommendations, acted on them and applied them to a revision of the Program Description: improving it, tightening up the logic, incorporating lessons learned from implementation, etc.

Program Reviews were therefore fundamentally about learning and improvement, so we made sure that, in addition to peers from other countries, the host Country Office invited in-country partners and relevant experts.  And International Program Coordinators from Sydney were asked to always attend Program Reviews in the countries that they were supporting, again for learning and improvement purposes.

The Program Reviews that I attended were useful and constructive, but I certainly sensed a degree of frustration.  In addition to struggling with the relatively high levels of abstraction required, our teams were not used to having outsiders (even their peers from other ChildFund offices) critique their efforts.  So, overall, this was a good and very important component of the DEF, designed correctly, but our teams needed more time to learn how to manage the process and to be open to such a public form of review.

*

Projects and Quarterly Reports

As shown on the right-hand side of Figure 1, ChildFund’s field staff and partners carried out routine monitoring of projects (#6 in the Figure) to ensure that they were on track; this monitoring was the basis of their reporting on activities and outputs.  Project staff summarized their monitoring through formal Quarterly Reports (#7) on each project, documenting progress against project plans, budgets, and targets to ensure projects were well managed.  These Quarterly Reports were reviewed in each Country Office and most were also forwarded to ChildFund’s head office in Sydney (and, often, to donors) for review.

When I arrived, ChildFund Australia’s Quarterly reporting was well-developed and of high quality, so I didn’t need to focus on this aspect of our work.  We simply incorporated it into the more-comprehensive DEF.

*

Quarterly Output Tracking

As described last time, ChildFund developed and defined a set of Outputs which became standard across the organization in FY 2011-12.  Outputs were coded and tracked from Quarter to Quarter, by project.  Some of the organizational Outputs were specific to a sector (such as education, health, or water and sanitation) or to a particular target group (such as children, youth, or adults).  Other Outputs were generic and might be found in any project: for example, training or awareness-raising, materials production, and consultation.

Organizational Outputs were summarized for all projects in each country each Quarter and country totals were aggregated in Sydney for submission to our Board of Directors (#8 in Figure 1, above).  In March 2014 there were a total of 47 organizational Outputs – they were listed in my last article in this series.

One purpose of this tracking was to enhance our accountability, so a summary was reviewed every Quarter in Sydney by the International Program Team and our Program Review Committee.

Here is an example of how we tracked outputs: this is a section of a Quarterly Report produced by the International Program Team for our Board and Program Review Committee: Output Report – Q4FY15.
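To make the mechanics concrete, here is a minimal sketch of the kind of roll-up involved: coded outputs aggregated by country, and then organization-wide, per Quarter.  The output codes and quantities are invented for illustration; this is not ChildFund’s actual tool:

```python
from collections import defaultdict

# Hypothetical coded output records, as they might be extracted from
# project Quarterly Reports: (country, quarter, output_code, quantity).
records = [
    ("Viet Nam", "Q4FY15", "EDU-01-training", 12),
    ("Viet Nam", "Q4FY15", "WAT-03-latrine", 40),
    ("Laos",     "Q4FY15", "EDU-01-training", 5),
    ("Cambodia", "Q4FY15", "WAT-03-latrine", 18),
]

# Roll up by country, then aggregate country totals organization-wide,
# per Quarter, as was done in Sydney for the Board.
country_totals = defaultdict(int)
org_totals = defaultdict(int)
for country, quarter, code, qty in records:
    country_totals[(country, quarter, code)] += qty
    org_totals[(quarter, code)] += qty

print(org_totals[("Q4FY15", "EDU-01-training")])  # -> 17
```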

*

Project Evaluations

ChildFund also conducted reviews or evaluations of all projects (#9 in Figure 1, above) – in different ways.  External evaluators were employed under detailed terms of reference to evaluate multi-year projects with more substantial budgets or which were significant for learning or to a particular donor.  Smaller projects were generally evaluated internally.  All evaluators were expected to gather evidence of results against output targets and performance indicators written against objectives.

*

All development effectiveness systems have, at their heart, mechanisms for translating operational experience into learning and program improvement.  In Figure 1, this is the central circle in the schematic, which feeds evidence from a variety of sources back into our organizational and Country Strategy Papers, Program Descriptions, and project planning and design.

Our program staff found that their most effective learning often occurred during routine monitoring through observation of project activities and conversations in communities with development partners.  Through thoughtful questioning and attentive listening, staff could make the immediate decisions and quick adjustments which kept project activities relevant and efficient.

Staff also had more formal opportunities to document and reflect on learning.  The tracking of outputs and aggregation each Quarter drew attention to progress and sometimes signaled the need to vary plans or redirect resources.

Project evaluations (#9 in Figure 1, above) provided major opportunities for learning, especially when external evaluators brought their different experiences to bear and offered fresh perspectives on a ChildFund project.

*

The reader can easily grasp that, for me, the DEF was a great success, a significant asset for ChildFund Australia that enabled us to be more accountable and effective.  Some more technically-focused agencies were busy carrying out sophisticated impact evaluations, using control groups and so forth, but that kind of effort didn’t suit the vast majority of INGOs.  We could benefit from the learnings that came from those scientific evaluations, but we didn’t have the resources to introduce such methodologies ourselves.  And so, though the DEF was not perfect, I am not aware of any comparable organization that succeeded as we did.

While the system built on what I had learned over nearly 30 years, and even though I felt that it was designed comprehensively and working very well, that was merely my opinion!

Given the importance of the system, relying on my opinion (no matter how sound!) wasn’t good enough.  So we sought expert review, commissioning two independent, expert external reviews of the DEF.

*

The first review, which was concluded in November of 2012, took place before we had fully implemented the system.  In particular, since Outcome Indicator Surveys and Statements of Impact (to be covered in an upcoming blog article) were implemented only after three years (and every three years thereafter), we had not yet reached that stage.  But we certainly were quite advanced in the implementation of most of the DEF, so it was a good time to reflect on how it was going.

In that light, this first external review of the DEF concluded the following:

The development of the DEF places ChildFund Australia in a sound position within the sector in the area of development effectiveness. The particular strength of ChildFund Australia’s framework is that it binds the whole organisation to a set of common indicators and outputs. This provides a basis for focussing the organisation’s efforts and ensuring that programming is strategically aligned to common objectives. The other particular strength that ChildFund Australia’s framework offers is that it provides a basis for aggregating its achievements across programs, thereby strengthening the organisation’s overall claims of effectiveness.

Within ChildFund Australia, there is strong support for the DEF and broad agreement among key DEF stakeholders and users that the DEF unites the agency on a performance agenda. This is in large part due to dedicated resources having been invested and the development of a data collection system has been integrated into the project management system (budgeting and planning, and reporting), thereby making DEF a living and breathing function throughout the organisation. Importantly, the definition of outcomes and outputs indicators provides clarity of expectations across ChildFund Australia.

One of the strengths of the DEF recognised by in-country staff particularly is that the DEF provides a basis for stakeholders to share their perspectives. Stakeholders are involved in identifying benefits and their perspectives are heard through case studies. This has already provided a rich source of information that has prompted reflection by in-country teams, the Sydney based programs team and the ChildFund Australia Board.

Significantly, the DEF signals a focus on effectiveness to donors and the sector. One of the benefits already felt by ChildFund Australia is that it is able to refer to its effectiveness framework in funding submissions and in communication with its major donors who have an increasing interest on performance information.

Overall, the review found that the pilot of the DEF has been implemented well, with lots of consultation and engagement with country offices, and lots of opportunity for refinement. Its features are strong, enabling ChildFund to both measure how much it is doing, and the changes that are experienced by communities over time. The first phase of the DEF has focused on integrating effectiveness measurement mechanisms within program management and broader work practices, while the second phase of the DEF will look at the analysis, reflection and learning aspects of effectiveness. This second phase is likely to assist various stakeholders involved in collecting effectiveness information better understand and appreciate the linkages between their work and broader organisational learning and development. This is an important second phase and will require ongoing investment to maximise the potential of the DEF. It places ChildFund Australia in a strong position within the Australian NGO sector to engage in the discourse around development effectiveness and demonstrate its achievements.

A full copy of this first review, removing only the name of the author, is attached here: External DEF Review – November 2012.

In early 2015 we carried out a second review.  This time, we had implemented the entire DEF, carrying out (for example) Statement of Impact workshops in several locations.  The whole system was now working.

At that point, we were very confident in the DEF – from our point of view, all components were working well, producing good and reliable information that was being used to improve our development work.  Our board, program-review committee, and donors were all enthusiastic.  More importantly, local staff and communities were positive.

The only major concern that remained related to the methodology we used in the Outcome Indicator Surveys.  I will examine this issue in more detail in an upcoming blog article in this series; but the reader will notice that this second formal, external evaluation focuses very much on the use of the LQAS methodology in gathering information for our Outcome Indicator workshops and Statements of Impact.

That’s why the external evaluator we engaged to carry out this second review was an expert in survey methodologies (in general) and in LQAS (in particular).

In that light, this second external review of the DEF concluded the following:

ChildFund Australia is to be commended for its commitment to implementing a comprehensive and rigorous monitoring and evaluation framework with learning at its centre to support and demonstrate development effectiveness. Over the past five years, DEL managers in Cambodia, Laos, Papua New Guinea and Vietnam, with support and assistance from ChildFund Australia, country directors and program managers and staff, have worked hard to pilot, refine and embed the DEF in the broader country programs.  Implementing the DEF, in particular the Outcome Indicator Survey using LQAS, has presented several challenges.  With time, many of the early issues have been resolved, tools improved and guidelines developed.  Nevertheless, a few issues remain that must be addressed if the potential benefits are to be fully realised at the organisational, country and program levels.

Overall, the DEF is well suited for supporting long-term development activities in a defined geographic area.  The methodologies, scope and tools employed to facilitate Outcome Indicator Surveys and to conduct Community Engagement and Attribution of Impact processes are mostly fit for purpose, although there is considerable room for improvement.  Not all of the outcome indicators lend themselves to assessment via survey; those that are difficult to conceptualise and measure being most problematic. For some indicators in some places, a ceiling effect is apparent limiting their value for repeated assessment. While outcome indicators may be broadly similar across countries, both the indicators and the targets with which they are to be compared should be locally meaningful if the survey results are to be useful—and used—locally.

Used properly, LQAS is an effective and relatively inexpensive probability sampling method.  Areas for improvement in its application by ChildFund include definition of the lots, identification of the sampling frame, sample selection, data analysis and interpretation, and setting targets for repeated surveys.

Community Engagement and the Attribution of Impact processes have clearly engaged the community and local stakeholders.  Experience to date suggests that they can be streamlined to some extent, reducing the burden on staff as well as communities.  These events are an important opportunity to bring local stakeholders together to discuss local development needs and set future directions and priorities.  Their major weakness lies in the quality of the survey results that are presented for discussion, and their interpretation.  This, in turn, affects the value of the Statement of Impact and other documents that are produced.

The DEF participatory processes have undoubtedly contributed to the empowerment of community members involved. Reporting survey results in an appropriate format, together with other relevant data, in a range of inviting and succinct documents that will meet the needs of program staff and partners is likely to increase their influence.

A full copy of this second review, removing only the name of the author, is attached here: DEF Evaluation – April 2015.
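Since LQAS (“Lot Quality Assurance Sampling”) may be unfamiliar: it is a classification method in which a small random sample (often 19 respondents) is drawn from each “lot” (for us, roughly, a program area), and the number of respondents meeting an indicator is compared against a pre-computed decision rule.  Here is a minimal sketch of the standard decision-rule calculation, under illustrative coverage thresholds and error levels; it is not the tool ChildFund used:

```python
import math

def binom_cdf(k: int, n: int, p: float) -> float:
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))

def lqas_decision_rule(n, p_upper, p_lower, alpha=0.10, beta=0.10):
    """Smallest decision rule d such that: a lot whose true coverage is
    p_upper is classified 'low' with probability <= alpha, and a lot whose
    true coverage is p_lower is classified 'high' with probability <= beta.
    Returns None if no rule meets both error bounds at this sample size."""
    for d in range(n + 1):
        risk_alpha = binom_cdf(d - 1, n, p_upper) if d > 0 else 0.0
        risk_beta = 1.0 - (binom_cdf(d - 1, n, p_lower) if d > 0 else 0.0)
        if risk_alpha <= alpha and risk_beta <= beta:
            return d
    return None

# e.g. the common n=19 sample, with an 80% coverage target and 50% lower
# threshold: a lot "passes" if at least 13 of 19 respondents meet the indicator.
print(lqas_decision_rule(19, 0.80, 0.50))  # -> 13
```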

*

Great credit is due to the ChildFund staff who contributed to the conceptualization, development, and implementation of the DEF.  In particular, Richard Geeves and Rouena Getigan in the International Program Team in Sydney worked very hard to translate my sometimes overly-ambitious concepts into practical guidelines, and ably supported our Country Offices.

One of the keys to the success of the DEF was that we budgeted for dedicated in-country support, with each Country Office able to hire a DEL Manager (two in Viet Nam, given the scale of our program there).

Many thanks to Solin in Cambodia, Marieke in Laos, Joe in Papua New Guinea, and Thuy and Dung in Viet Nam: they worked very hard to make the DEF function in their complex realities.  I admire how they made it work so well.

*

In this article, I’ve outlined how ChildFund Australia designed a comprehensive and very robust Development Effectiveness Framework.  Stay tuned next time, when I describe climbing Mt Bond, and then go into much more depth on one particular component (the Case Studies, #3 in Figure 1, above).

After that, in the following article, I plan to cover reaching the top of West Bond and descending back across Mt Bond and Bondcliff (and losing toenails on both big toes!) and go into some depth to describe how we carried out Outcome Indicator Surveys (#2 in Figure 1) and Statements of Impact (#12) – in many ways, the culmination of the DEF.

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owl’s Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change.


Owl’s Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change

May, 2018

I began a new journey just over two years ago (May, 2016), tracing two long arcs in my life:

  • During those two years, I’ve been climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers.  I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Alongside descriptions of those climbs, I’ve been sharing what it was like working in international development during the MDG era: as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980s through to the conclusion of the Millennium Development Goals in 2015.

So, in each article in this series, I am writing about climbing each of those mountains and, each time, I reflect a bit on the journey since I began to work in social justice, nearly 34 years ago: on development, human rights, conflict, experiences along the way, etc.

*

In 2009 Jean and I moved to Sydney, where I took up a new role as International Program Director for ChildFund Australia.  On my way towards Sydney, I was thinking a lot about how to build a great program, and how I would approach building a strong team with clarity, trust, and inspiration.  Last time I described the role and staffing and structural iterations of the International Program Team there.

This time, I want to begin to unpack the program approach that we put in place, building on what was already there, and on the lessons I had learned in the previous 25 years.

But first…

*

Owl’s Head (4025ft, 1227m) is described by many hikers as uninteresting, boring, and challenging – something that “should not be left to the end” of the 48 peaks.  I guess that’s because climbers want to finish their long voyage up so many great mountains in a blaze of glory, but they find Owl’s Head to be a letdown after the challenges and thrills of the other 47 4000-footers.

I climbed Owl’s Head on 26 July, 2017, and enjoyed every minute of it!

Yes, it’s long and mostly in the forest.  Yes, getting up the rock slide on the western side of Owl’s Head is tough going.  Yes, there are several river crossings which can be problematic when the water’s high.  And, yes, it’s not a ridge walk, so the views are (mostly) obstructed.  But on this late-July day, the walking was fantastic, the river crossings were nerve-wracking but doable, and the views going up (and coming down) the rock slide, looking across at Franconia Ridge, were fantastic.

I left Durham at about 6am, getting an early start because I calculated that the ascent alone would take over 6 hours.  Figuring on a descent of at least 4 hours made me want to get walking as soon as possible.  As has been my normal routine these days, I stopped in Tilton for coffee, and I bought a sandwich for lunch in Lincoln, very near the trailhead.

Screen Shot 2017-07-29 at 5.12.23 PM.png

 

I had brought sandals to carry with me for the river crossings, just in case.

After parking at the Lincoln Woods Visitor Center, I started the hike at 8:10am.

IMG_1546.jpg

 

It was a beautiful, cool, sunny day.  Just beyond the Visitor Center, two trails head up the East Branch of the Pemigewasset River: the Pemi East Side Trail and the Lincoln Woods Trail.  To get to the Lincoln Woods Trail, which I would take, I crossed a suspension bridge and took a right turn to head north:

IMG_1558.jpg

IMG_1561.jpg

 

The Lincoln Woods Trail runs along an old forest railway, and is wide and straight for over two miles.  Dappled, high forest, just gorgeous, crisp day.  Nervous about how long I thought it would take me to reach Owl’s Head, and return, I flew up this first easy part, almost trotting up the gentle incline:

IMG_1566.jpg

Lincoln Woods Trail – Formerly a Forest Railway, Straight and Wide

 

Old railway ties can be seen in the image, above.  Here is an image of one of the nails in a tie:

IMG_1710.jpg

 

There were a few other hikers heading up the Lincoln Woods Trail along with me, more than I expected on a summer Wednesday, but it wasn’t crowded.  I reached the junction with the Osseo Trail at 8:33am, and Black Point Trail at 8:53am.

Just before 9am, I arrived at the junction with Franconia Brook Trail – so it had taken me about 50 minutes to walk the 2.6 miles up from the Lincoln Woods Visitor Center.  It had been gently uphill the whole way so far.

Here, just after a small footbridge over Franconia Brook, I would turn left, up the Franconia Brook Trail:

IMG_1572.jpg

Footbridge Over Franconia Brook

IMG_1576.jpg

 

(A few weeks later I would come to this junction once again, but would continue straight on the Bondcliff Trail.)

Franconia Brook Trail was a real trail, at least at the beginning, but soon, as I headed north up the Franconia Brook, there were long sections that must have also been old railway – straight, and wide, and gradually uphill.  Pleasant walking!  I thought that coming down would be even faster.

From here, the water level in Franconia Brook didn’t look too high:

IMG_1574.jpg

 

I hiked up Franconia Brook Trail, 1.7 miles, and reached the junction with Lincoln Brook Trail at 9:33am.  I was still making very good time – 1.7 miles in about 30 minutes.  But I didn’t feel that I was rushing; it was very nice hiking through the woods on the wide trail!

Here I would swing west to walk around Owl’s Head in a clockwise sense, following (and repeatedly crossing) the Lincoln Brook until reaching Owl’s Head Path:

IMG_1580.jpg

 

I would cross Lincoln Brook four times going up the west side of Owl’s Head, and four times coming back down, retracing my steps.  The first crossing, at 9:44am, was the most difficult, and I almost gave my boots a good bath that time.  It was a little dicey…

Of course, as I climbed up the valley, the Brook became smaller as I walked above different streams that were feeding into it.  So the first (and last, when returning) crossing had the most water.

IMG_1583.jpg

IMG_1671.jpg

IMG_1588.jpg

IMG_1680.jpg

IMG_1593.jpg

The trail was less maintained here, certainly not an old forest railway, though I did see two trail crews working on it that day.

I reached the turnoff for Owl’s Head Path at 11:08am.  I had become nervous that I had passed it – I felt that I should have reached it some time before, and there were no signs.  By the time I reached the cairns marking the junction I was quite anxious, and was thinking vaguely about turning back.  But, luckily, as I was approaching the cairns that can be seen in the next image, a young woman came down from having gone up Owl’s Head, and she confirmed that I had reached the junction!

IMG_1656.jpg

The Junction With Owl’s Head Path – Steeply Up From Here!

 

So it had taken me nearly an hour and a half to walk Lincoln Brook Trail, from Franconia Brook Trail to Owl’s Head Path, including four stream crossings.  Since Owl’s Head Path was supposed to be quite steep for some time, up a rock slide, I decided to leave some weight here at the bottom; so I took a quart of water and my sandals out of my pack and hid them at the junction.

I started up Owl’s Head at 11:17am, a bit lighter, after having a snack.  Soon I reached the famous rock slide, which was very steep indeed – mostly gravel, so there was lots of sliding downward, which made it heavy going.

IMG_1647.jpg

 

It was slippery and challenging, and did I mention that it was very steep?  Another young person came down and we crossed paths; she was very unhappy, having turned back before reaching the summit.  Carrying a full pack, she had decided it was too dangerous and was giving up, and she was vocal about how unpleasant it was.  This would have been summit number 29 for her.  It was very heavy going, relentless and challenging!

But the views from the rock slide were fantastic: looking back towards Franconia Ridge, I could see all four of the 4000-footers there: Flume, Liberty, Lincoln and Lafayette.  The light was still good, not yet noon, so the sun shone on the ridge from the east:

IMG_1642.jpg

Flume Is On The Far Left, Then Liberty, Lincoln, And Then Lafayette.

IMG_1643.jpg

 

I also took a video of that view from the rock slide, looking over to Franconia Ridge.

The White Mountain Guide indicates that the top of Owl’s Head is not very accessible, and that the end of Owl’s Head Path, which is just short of the actual summit, qualifies as reaching the top.  Apparently, at least when my edition of the Guide was published, reaching the actual summit involved a fair amount of bush-whacking.

Owl’s Head Path began to flatten out at about 12:09pm, and I reached what (I think) was the former end of the Path at about 12:15pm.

IMG_1628.jpg

The End Of Owl’s Head Path – Not The Summit!

 

Here I was able to turn left, to the north, and there was a path heading towards the actual summit – not a very wide path, switching back and forth a lot, but certainly not bush-whacking.

I got to the actual top at about 12:30pm, and had lunch.  Though I had seen a few other climbers after I passed the discouraged young woman, I had the summit to myself for lunch – it was very pleasant!

IMG_1615

Owl’s Head Summit

IMG_1620

IMG_1617.jpg

Some Vestiges Of Lunch Are Visible!

 

I had really really enjoyed the walk so far… maybe partly because expectations had been so low?

I left the summit, after a nice lunch, still wet with sweat, at about 12:45pm.  I could see Franconia Ridge to the west, through the forest:

IMG_1630.jpg

 

And there were some views to the east, towards the Bonds, but the Owl’s Head ridge was more forested that way, so no photos were possible.  I got back to the top of Owl’s Head Path at about 1pm, and to the beginning of the rock slide about 20 minutes later.  I dropped down the slide, taking care and many photos, reaching the junction with Lincoln Brook Trail at about 2pm.  So, about an hour to descend carefully.

The walk back down Lincoln Brook Trail was pleasant:

IMG_1662

IMG_1663

 

Recrossing Lincoln Brook four times – simpler this time – and passing the trail-maintenance crews again, I got back to the junction with Franconia Brook Trail at about 3:36pm.  Here I turned south and walked back down that old railway line.

There was a bit of old railway hardware along the side of the trail:

IMG_1678.jpg

 

For much of this section, there were a few mosquitoes, but the walking was pleasant, on a soft bed of pine needles:

IMG_1688.jpg

 

I passed a young woman resting on the side of the trail, with a very full pack.  “You’re carrying a lot!” I said, and she replied: “I’m ready to let it go!” in a resigned tone of voice…

Ups and downs … mostly gently downward.  Long and level and wide.  I reached the junction with Lincoln Woods Trail at about 4:11pm, and the Trail got even wider and straighter and easier.  Funnily enough, there is a section of measured length here – 200 yards – which (of course) I had passed on the way up; the idea is to count how many paces it takes.  On the way up, I counted 41 (double) paces, and 44 on the way back: about 4.9 yards per double pace going up, versus about 4.5 coming back.  So I was walking with shorter paces on the way down!

I reached the Lincoln Woods Visitor Center, and my car, at about 5:15pm.  It had taken me almost 9 hours to climb Owl’s Head, substantially less than I had calculated: from the White Mountain Guide, the ascent alone should have been about 6 1/2 hours, and I had figured on at least 4 more to descend.

But it was a great hike on a wonderful day.  I enjoyed every minute of it!

*

As I arrived in Sydney to take up the newly-created position of International Program Director, one of my biggest priorities was to clarify our program approach.  This would involve lots of internal discussion, research and reflection, and I was determined to bring to this task the lessons I had learned in the previous 25 years of working in the sector (and described in the articles in this series!)

I understood that our program approach needed to be built on a clear understanding of what we were going to achieve, and why.  After completing the staffing of the first iteration of the International Program Team in Sydney, getting to know our programs in Cambodia, Papua New Guinea, and Viet Nam, and settling in with other Sydney-based senior managers and our board, I got going!

*

I had first heard of the concept of “Theory of Change” when I asked Alan Fowler to critique an early draft of the UUSC Strategic Plan in 2005.  He had, quite rightly, pointed out that the draft Strategy was good, but that it didn’t really clarify why we wanted to do what we were describing: how did we understand the links between our actions and our vision and mission?

Reflecting on Alan’s observation, I understood that we should put together a clear statement of causality, linking our actions with the impact we sought in the world.  So we did that, and ended up with a very important statement that really helped UUSC be clear about things:

Human rights and social justice have never advanced without struggle. It is increasingly clear that sustained, positive change is built through the work of organized, transparent and democratic civic actors, who courageously and steadfastly challenge and confront oppression. 

UUSC’s strategy derived from that statement in a powerful way.

Perhaps a better definition of the concept comes from the “Theory of Change Community”:

Theory of Change is essentially a comprehensive description and illustration of how and why a desired change is expected to happen in a particular context. It is focused in particular on mapping out or “filling in” what has been described as the “missing middle” between what a program or change initiative does (its activities or interventions) and how these lead to desired goals being achieved. It does this by first identifying the desired long-term goals and then works back from these to identify all the conditions (outcomes) that must be in place (and how these related to one another causally) for the goals to occur. These are all mapped out in an Outcomes Framework.

The Outcomes Framework then provides the basis for identifying what type of activity or intervention will lead to the outcomes identified as preconditions for achieving the long-term goal. Through this approach the precise link between activities and the achievement of the long-term goals are more fully understood. This leads to better planning, in that activities are linked to a detailed understanding of how change actually happens. It also leads to better evaluation, as it is possible to measure progress towards the achievement of longer-term goals that goes beyond the identification of program outputs.
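The “working backward” idea in that definition is essentially a recursive one, and a toy sketch may make it concrete.  Here, each outcome maps to the preconditions that must hold for it to occur, and we collect everything that must be in place for a long-term goal; the outcome names are invented for illustration, and this is not anyone’s actual Outcomes Framework:

```python
# Map each outcome to the preconditions (outcomes) it depends on.
preconditions = {
    "children complete primary school": [
        "children enroll on time",
        "schools are adequately staffed",
    ],
    "children enroll on time": ["families value education"],
    "schools are adequately staffed": [],
    "families value education": [],
}

def outcomes_framework(goal, seen=None):
    """Work backward from `goal`, collecting every condition that must hold."""
    seen = set() if seen is None else seen
    for cond in preconditions.get(goal, []):
        if cond not in seen:
            seen.add(cond)
            outcomes_framework(cond, seen)
    return seen

print(outcomes_framework("children complete primary school"))
```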

At ChildFund Australia, one of my earliest actions was to develop and finalize a Theory of Change and the associated Outcomes Framework and Outputs.  In this article, I want to describe how we did this, and what we achieved.

*

First, some definitions.  Strangely, my experience is that when we in the INGO community try to agree on a common set of definitions, we usually end up arguing intensely and never agreeing!  The concepts we seek to define can be viewed productively in different ways; for me, it seemed most useful to find definitions that we could all live with, and use them, rather than trying to reach full consensus (which, over time, seemed to be an impossible dream!)

Here are the visual framework and the definitions that we used in ChildFund Australia:

Screen Shot 2018-05-28 at 2.16.30 PM.png

 

A set of Inputs producing a consistent set of Outputs is a Project; a set of Projects producing a consistent set of Outcomes is a Program; a set of Programs producing a consistent set of Impacts is a Strategic Plan.

Note that:

  • “Inputs” are usually time or money;
  • “Outputs” are tangible and concrete products delivered by or through ChildFund: for example, a training course, a trip or meeting, a publication, rent, a latrine – see below;
  • “Outcomes” are changes in the Outcome Indicators that we developed – see below;
  • “Impact” is the highest level of organisational achievement, related directly to the achievement of our mission.

This is pretty standard stuff, nothing particularly innovative.  But ChildFund Australia hadn’t formally adopted such definitions before; once agreed, they began to provide a common language for our program work.
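Purely as an illustration of the nesting these definitions imply – the class and field names below are my own invention, not any system we actually built – the hierarchy can be sketched as a simple data model:

# Inputs roll up into a Project, Projects into a Program, and Programs
# into a Strategic Plan.  All example content is hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Project:
    name: str
    inputs: List[str] = field(default_factory=list)    # e.g. staff time, budget lines
    outputs: List[str] = field(default_factory=list)   # e.g. "training course", "latrine"

@dataclass
class Program:
    name: str
    projects: List[Project] = field(default_factory=list)
    outcomes: List[str] = field(default_factory=list)  # changes in Outcome Indicators

@dataclass
class StrategicPlan:
    name: str
    programs: List[Program] = field(default_factory=list)
    impacts: List[str] = field(default_factory=list)   # mission-level achievement

wells = Project("Community water", inputs=["budget", "staff time"], outputs=["wells drilled"])
wash = Program("Water & sanitation", projects=[wells], outcomes=["% of households using safe water"])
plan = StrategicPlan("Country Strategic Plan", programs=[wash], impacts=["reduced child deprivation"])
print(plan.programs[0].projects[0].outputs)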

*

When we began to develop ChildFund Australia’s Theory of Change, Outcomes Framework, and Outputs, I took care to bring into the process several important lessons I had learned from previous experiences:

  • As mentioned above, from my experience at UUSC I had learned that the creation of a Theory of Change had the potential to be energizing and unifying, if it was carried out in a participatory manner;
  • Along the way, as the loyal reader of this series will have seen, my own view of development and poverty had grown to incorporate elements of social justice, collective action, and human rights.  I wanted to incorporate these important elements into ChildFund Australia’s understanding of child poverty and development;
  • I recognized the significant complexity and cost associated with crafting and measuring Outcome Indicators, which would essentially articulate how we would hold ourselves accountable to our purpose.  So I felt that we should rely on the work done by technical agencies (the UNDP and UNICEF, other INGOs, and other ChildFund members) whenever possible, and on national-government measurement systems when available and credible.  That meant that using MDG-related indicators, where appropriate, would be our first priority, because of the enormous effort that had been put into creating and measuring them around most of the world;
  • From my work with CCF, especially having participated in their child-poverty study, I had learned that children experience poverty in a more complex way than we had earlier recognized: as deprivation, certainly; but also as exclusion and vulnerability.  We would now incorporate this “DEV” framework in Australia;
  • In my next blog article, I will describe how we created a “Development Effectiveness Framework” for ChildFund Australia.  The “DEF” would describe and detail the processes and products through which we would use the Theory of Change, Outcomes Framework, and Outputs to operationally improve the effectiveness of our development work.  Twice, during my career with Plan International, we had tried to produce such a system, and failed comprehensively (and at great expense).  We had failed due to several fundamental mistakes that I was determined to avoid making in Australia:
    • At Plan, we fell into the trap of designing a system whose purpose was, mostly, the demonstration of impact rather than learning and the improvement of programming.  This led to a complex, highly technical system that could never actually be implemented.  I wanted, this time, to do both – to demonstrate impact and to improve programs – but fundamentally to create a practical system that could be implemented in the reality of our organization;
    • One of the consequences of the complexity of the systems we tried to design at Plan was that community members were simply not able to participate in the system in any meaningful way, except by using the data to participate in project planning.  We would change this at ChildFund, and build in many more meaningful opportunities for community involvement;
    • Another mistake we made at Plan was to allow the creation of hundreds of “outputs.”  It seemed that everybody in that large organization felt that their work was unique, and had to have unique descriptors.  I was determined to keep the DEF as simple and practical as possible;
    • The Plan system was entirely quantitative, in keeping with its underlying (and fallacious) pseudo-scientific purpose.  But I had learned that qualitative information was just as valid as quantitative information, illustrating a range of areas for program improvement that complemented and extended the purely quantitative.  So I was going to work hard to include elements in the DEF that captured the human experience of change in narrative ways;
    • Both times we tried to create a DEF-like system in Plan, we never really finished: the result was never fully finalized and rolled out to the organization.  So, on top of the mistakes we made in developing the system, the waste was even more appalling because little good came of the effort of so many people, and the spending of so much time and money.  In ChildFund, we would not let “the best be the enemy of the good”: I would make sure we moved rapidly to prototype, implement, and improve the system;
  • Finally, I had learned of the advantages and disadvantages of introducing this kind of fundamental change quickly, or slowly:
    • Moving slowly enables more participation and ownership, but risks getting bogged down; windows of opportunity for change are often short-lived;
    • Moving quickly allows the organization to make the change and learn from it within that short window of enthusiasm and patience.  The risk is that, at least for organizations that are jaded by too many change initiatives, the process can be over before people actually take it seriously, which can lead to a perception that participation was lacking.

I decided to move quickly, and our CEO (Nigel Spence) and board of directors seemed comfortable with that choice.

*

The ChildFund Australia Theory of Change

Upon arrival in Sydney in July of 2009, I moved quickly to put in place the basic foundation of the whole system: our Theory of Change.  Once staffing in the IPT (our International Program Team) was in place, we began.  First, since we knew that effective programs address the causes of the situation they seek to change, we needed a clear definition of poverty; building on the work of Amartya Sen, we defined poverty as the deprivation of the capabilities and freedoms people need to live the life they value.

Then I began to draft and circulate versions of a Theory of Change statement, incorporating input from our board, senior managers (in Sydney and in our Country Offices in Cambodia, Papua New Guinea and Viet Nam), and program staff across the agency.

This process went very well, perhaps because it felt very new to our teams.  Quickly we settled on the following statement:

[Image: the ChildFund Australia “Theory of Change”]

Note here that we had included a sense of social justice and activism in the Theory of Change, by incorporating “power” (which, practically, would mean “collective action”) as one central pillar.  And it’s clear that the CCF “DEV” framework was also incorporated explicitly.

The four dot-points at the end of the Theory of Change would come to fundamentally underpin our new program approach.  We would:

  • Build human, capital, natural and social assets around the child, including the caregiver.  This phrasing echoed the Ford Foundation’s work on asset-based development, and clarified what we would do to address child deprivation;
  • Build the voice and agency of poor people and poor children.  This pillar incorporated elements of “empowerment,” a concept we had pioneered in Plan South America long before, along with notions of stages of child and human development; and
  • Build the power of poor people and poor children.  Here we were incorporating the sense that development is related to human rights, and that human rights don’t advance without struggle and collective action; and we would
  • Work to ensure that children and youth are protected from risks in their environments.  Our research had shown that poverty was increasingly being experienced by children as related to vulnerability, and that building their resilience and the resilience of the caregivers and communities around them was crucial in the modern context.

This Theory of Change would serve admirably, and endure unchanged, through the next five years of program development and implementation.

*

Output Indicators

Now, how would we measure our accomplishment of the lofty aims articulated in the Theory of Change?  We would need to develop a set of Outcome and Output Indicators.

Recall that, according to the definitions that we had agreed earlier, Outputs were seen as: tangible and concrete products delivered by or through ChildFund: for example, a training course, a trip or meeting, a publication, rent, a latrine.

Defining Outputs was an important step for several reasons, mostly related to accountability.  Project planning and monitoring, in a classical sense, focuses on determining the outputs that are to be delivered, tracking whether or not they are actually produced, and adjusting implementation along the way.

For ChildFund Australia, and for our public stakeholders, being able to accurately plan and track the production of outputs represented a basic test of competence: did we know what we were doing?  Did we know what we had done?  Being able to answer those questions (for example, “we planned to drill 18 wells, and train 246 new mothers, and ended up drilling 16 wells and training 279 new mothers”) would build our credibility.  Put more pointedly, if we could not answer those questions (“we wanted to do the best we could, but don’t really know where our time and the budget went…”!) our credibility would suffer.  Of course, we wanted to know much more than that – our DEF would measure much more – but tracking outputs was basic and fundamental.
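As a minimal sketch of that basic test – reusing the hypothetical numbers from the example above, and nothing more – tracking Outputs amounts to comparing what was planned with what was actually delivered:

# Planned vs. actual Output delivery, using the illustrative figures above.
planned = {"wells drilled": 18, "new mothers trained": 246}
actual = {"wells drilled": 16, "new mothers trained": 279}

for indicator, target in planned.items():
    delivered = actual.get(indicator, 0)
    print(f"{indicator}: planned {target}, delivered {delivered} ({delivered - target:+d})")

# wells drilled: planned 18, delivered 16 (-2)
# new mothers trained: planned 246, delivered 279 (+33)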

To avoid the trap we had fallen into in Plan, where we ended up with many hundreds of Outputs, I was determined to keep things simple.  We had already planned to bring all our Program Managers to Sydney in October of 2009, for another purpose, and I managed to commandeer this key group for a day.  I locked them in a meeting room with the task of listing all the outputs they were producing, and agreeing on a short but comprehensive list.  We would then work with this draft, using it as a starting point.

The process worked very well.  Our Program Managers produced a list of around 35 Output Indicators that covered, well enough, pretty much all of the work they were doing.  Over the next three years, as our programming evolved and matured, we added about 15 more Output Indicators; the final list (as of March, 2014) was as follows:

[Image: the final list of ChildFund Australia Output Indicators, March 2014]

This listing worked very well, enabling us to design, approve, monitor and manage project activities in an accountable way.  As will be seen when I describe our Development Effectiveness Framework, in the next article in this series, we incorporated processes for documenting ChildFund Australia’s planning for Output production through the project-development process, and for tracking actual Output delivery.

Outcome Indicators

Designing Outcome Indicators was a bigger challenge.  Several of our colleague ChildFund agencies (mostly the US member) had developed indicators that were Outcome-like, and I was aware of the work of several other INGOs that we could “borrow.”  Most importantly, as outlined above, I wanted to align our child-focused Outcome Indicators with the Millennium Development Goals as much as possible.  These were robust, scientific, reliable and, in most countries, measured fairly accurately.

As we drafted sets of Outcome Indicators and circulated them for comment with our Board Program Review Committee, Senior Management, and program staff, our CEO (Nigel Spence) was insistent that we keep the number of Outcome Indicators as small as possible.

I agreed with Nigel, in general (“keep things simple”) and in particular (in Plan we had been swamped by too many indicators, and never actually implemented either system).  But it was a big challenge to measure the lofty concepts included in our Theory of Change with just a few indicators!

When we finalized the first iteration, approved by our Board of Directors in June of 2010, we had only 16 Outcome Indicators:

[Images: the initial set of 16 ChildFund Australia Outcome Indicators, June 2010]

Nigel thought this was too many; I thought we had missed covering several crucial areas.  So it seemed a good compromise!

It would take some time to work out the exact mechanism for measuring these Indicators in our field work, but in the end we were able to keep things fairly simple and we began to work with communities to assess change and determine attribution (more on that in the next article in this series.)

Additional Outcome Indicators were introduced over the next few years, elaborating especially the domains of “Protection” and “Power,” which were relatively undeveloped in that initial package of 16, finalized in June of 2010.

*

So, by the time I was celebrating one year at ChildFund Australia, we had agreed and approved a clear and comprehensive Theory of Change, a coherent and concise set of robust Outcome Indicators, and a complete set of (not too many) Output Indicators.

*

Looking back, I think we got this right.  The process was very inclusive and participatory, yet agile and productive.  The results were of high quality, reflecting the state of the art of our sector, and my own learning through the years.  It was a big step forward for ChildFund Australia.

This meant that the foundation for a strong Development Effectiveness Framework was in place, a framework which would help us make our program work as effective as possible in building brighter futures for children.  This was (if I do say so myself!) a huge achievement in such a complex organization, especially as we accomplished it in only one year.

From the perspective of 2018, there is little I would change about how we took on this challenge, and what we produced.

*

My next article in this series will describe how we built the ChildFund Australia Development Effectiveness Framework on the foundation of our Theory of Change and Outcome and Output Indicators.  Stay tuned!

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (1): the ChildFund Australia International Program Team.

Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101.

October, 2017

I began a new journey in May of 2016, tracing two long arcs in my life:

  • Climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers.  I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Working in international development during the MDG era: what was it like in the sector as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980’s through to the conclusion of the Millennium Development Goals in 2015.

Since then, across 25 posts (so far), I’ve described climbing 25 4000-foot mountains in New Hampshire, and I’ve reflected on: two years as a Peace Corps Volunteer in Ecuador; my 15 years with Plan International; the deep, disruptive changes in the development sector over that time; and, most recently, the two years I spent consulting with CCF, developing a new program approach for that agency that we called “Bright Futures.”

This time I want to conclude my description of those Bright Futures years by sharing our attempt to encourage a new set of values and attitudes in CCF’s staff, through a weeklong immersion, experiential training workshop we called “Bright Futures 101.”

*

Peter Drucker is supposed to have said that “culture eats strategy for breakfast.”  This certainly seemed to be true as CCF moved into the pilot testing and rollout of Bright Futures – the agency was investing in new systems and new structures in a big way.  But Bright Futures would only realise its promise of more effective work for children living in poverty if the culture of the organisation shifted: how it viewed its work, and how it viewed the people it worked for.

*

But first… I climbed both Mt Lincoln and Mt Lafayette on 22 June, 2017, on a beautiful, mostly-sunny day.  My plan had been to do this loop back in September of 2016, with my brother, but my fall and the resulting injuries (broken rib, torn rotator cuff) forced a postponement.

That morning I left Durham at 6:45am, and drove up through Concord, stopping in Tilton for a coffee, and in Lincoln to buy a sandwich for lunch.  So I didn’t get to the trailhead until just after 9am.

The parking lot at Lafayette Place was nearly full, with lots of people arriving and getting ready to hike on what was a clear, cool day, perfect for hiking.  It was a bit surprising for a Thursday; I was glad not to be doing this climb on the weekend!

I know that I climbed both Lincoln and Lafayette in the distant past, probably in the 1980’s, but I don’t really have any clear memory of the hike.  So it was new to me, again, perhaps 30+ years later!

On this day, I had arrived at the trailhead for both the “Falling Waters” trail, and for the “Old Bridle Path.”  I planned to walk up Falling Waters, across Franconia Ridge to Mt Lincoln and Mt Lafayette, and then down the Old Bridle Path, back to Lafayette Place.

[Image: map of the loop up the Falling Waters Trail, across Franconia Ridge, and down the Old Bridle Path]

As I started out, there were many people walking along with me, so it took some time to get sorted into a fairly-stable pack.  It took me about 15 minutes to reach the beginning of the Falling Waters Trail; I would return here later in the day, coming down the Old Bridle Path.  So far, it was a beautiful day for hiking!  But lots of people…

I continued up the Falling Waters trail, along the stream with many small waterfalls (so, the trail is aptly named!)  I took lots of photos and several videos of the waterfalls.  The trail ascended steadily, moderately, along the brook.

[Image: waterfalls along the Falling Waters Trail]

The walk was typical White-Mountains rock-hopping, moderately and steadily upward in the shadow of Mt Lincoln.  I was working pretty hard, and gradually more space opened up between groups of hikers.  There were no insects during this part of the hike – indeed, there would be none until I got to Greenleaf Hut later in the afternoon.

I started to emerge from the forest into scrub pine at about 11am, and the views across to Franconia Notch became remarkable:

[Image: view across to Franconia Notch]

Then, suddenly, I was out of the trees, ascending Little Haystack, and the views were just spectacular:

[Image: Mt Lafayette and Franconia Notch]

[Image: Mt Lincoln, just north of Little Haystack]

[Image: looking north towards Mt Lincoln]

[Image: Franconia Notch, with Cannon Mountain clearly visible at the top of the notch]

[Image: North and South Kinsman, visible across Franconia Notch]

[Image: Cannon Mountain and the Kinsmans]

I reached the top of Little Haystack at 11:25am, where I joined the Franconia Ridge Trail:

[Image: joining the Franconia Ridge Trail at the top of Little Haystack]

I had been ascending the western slopes of Mt Lincoln; once I got up onto Franconia Ridge, the views to the east were just as amazing: I was above Owl’s Head, and could easily see Mt Bond, West Bond, and Bondcliff (all of which I would climb on a very long day in August, later that year), and out across the Twins to Mt Washington and the Presidential Range.  Maybe I could even see the Atlantic Ocean, far in the distance.

[Image: looking east towards Owl’s Head and the Bonds]

[Image: looking south towards Mt Liberty and Mt Flume]

[Image: looking north towards Mt Lincoln]

There were many people at the top of Little Haystack, some of whom were probably staying at the nearby Greenleaf AMC Hut, which I would pass on my way down, later.  But many were also doing the same loop that I was, across Lincoln and Lafayette.  One amazing boy, maybe 4 years old, was zipping along ahead of his mother, who kept calling him back.  He seemed full of energy, and wanted to fly ahead.  I wondered how long his energy would last, but he certainly kept it up for the whole time I saw him… weaving in and out of my path, with his mother calling out to him all the way.

The walk along Franconia Ridge, to Mt Lincoln, was spectacular.

 

I arrived at the summit of Mt Lincoln right at noon, and rested briefly.  It had taken about 2 hours and 40 minutes to the top from the Lafayette Place parking area.

[Image: at the summit of Mt Lincoln]

It was too early for lunch, so I soon left Mt Lincoln and headed north towards Mt Lafayette.  I will describe that hike, and the trek back down, next time!

*

Last time I described how we had piloted the Bright Futures program approach in CCF, further developing and testing the methods, systems, and structures that had been defined through our research and internal and external benchmarking.  It was a very exciting process, and I was lucky to be asked to accompany the pilot offices in Ecuador, the Philippines, and Uganda as they explored the disruptive changes implied in Bright Futures.  Lots of travel, and lots of learning and comradeship.

Near the end of that period, I came into contact with the Unitarian Universalist Service Committee (UUSC), a human-rights, social-justice campaigning organization based in Cambridge, Massachusetts.  In late 2004, as I was finishing my consulting time with CCF as acting Regional Representative for East Africa, based in Addis Ababa, I was offered a position at UUSC as Executive Director (initially as “Deputy Director”) working for Charlie Clements, UUSC’s dynamic and charismatic president and CEO.

Working at UUSC would be a big and exciting shift for me, out of international development and into social justice campaigning.  But the move felt like a natural extension of what we had been doing in CCF, where we had included an explicit focus on building the power of excluded people into Bright Futures.  I was able to use what I had learned across 20 years in the international development sector, leading and managing large international agencies, to lead and manage operations at UUSC, while also learning about campaigning and advocacy (and working in a unionized context!)

I’ll begin to describe my years at UUSC next time.  For now, I want to skip forward a few years, to my second, brief incarnation with CCF.

*

In early 2009, a few former colleagues at CCF, now rebranded as ChildFund International, got back in touch.  At that point I had transitioned to the 501c4 branch of UUSC, which we had created in 2008, and I had some spare time after the federal election the year before.  (More on that in a future post.)

Between 2004 and 2009, ChildFund had continued to roll out Bright Futures, but there had been major changes in leadership.  Sadly, John Schulz, CCF’s president, had taken a leave of absence to fight cancer, and had then died.  Though I had never worked directly with John, I had always appreciated his leadership and his unwavering support to Daniel Wordsworth and Michelle Poulton as they redesigned the agency’s program approach.

The internal leadership changes that took place after John’s departure led to Daniel and Michelle leaving CCF, as Anne Goddard became the agency’s new CEO in 2007.  Initially, at least, it seemed that the global transition to Bright Futures continued to be a priority for ChildFund.  (Later, that would change, as I will describe below…)

During that period, as Bright Futures was scaled up across the agency, many structural and systems-related challenges were addressed, and staff inside ChildFund’s program department were busy addressing these issues – updating their financial systems, transitioning long partnerships, training new staff in new positions.  In particular, Mike Raikovitz, Victoria Adams, Jason Schwartzman, and Dola Mohapatra were working very hard to sort out the nuts and bolts of the change.

It is a truism, attributed to Peter Drucker, that “culture eats strategy for breakfast.”  Alongside their important, practical work, Jason and Dola in particular were learning that lesson, and as a result they began to focus also on the cultural side of the change involved in Bright Futures: the attitudes and values of ChildFund staff.  Systems and structures were vital elements of Bright Futures, but nothing would work if staff retained their old attitudes toward their work, and toward the people they worked with and for.  And there was a clear need, from Jason’s and Dola’s perspective, for attitude shifts; in fact, it seemed to them that the biggest obstacle to implementing Bright Futures was the old values and attitudes of existing staff.

*

Dola, a brilliant and highly committed professional, worked as Deputy Regional Director for ChildFund Asia.  I worked closely with Dola in the design and implementation of BF101, and I enjoyed every moment of it; I admired his passion and commitment to ChildFund’s work, and his dedication to improving the effectiveness of ChildFund’s programming.

[Image: Dola Mohapatra at the BF101 workshop]

Jason managed a range of program-related special projects from ChildFund’s headquarters in Richmond, Virginia.  Jason was (and is) a gifted and insightful professional, whom I had met back during my tenure as Plan’s program director, when he had worked with CCF’s CEO on a collaboration with Plan, Save the Children, and World Vision.  Jason had rejoined ChildFund to help develop an approach to working with youth.

[Image: Jason Schwartzman, on the left, during our community immersion]

In addition to Dola and Jason, I worked closely with Evelyn Santiago, who was ChildFund Asia’s program manager.  Evelyn brought key skills and experience to the design of our workshop.

[Image: Evelyn Santiago at the BF101 workshop]

[Image: Jason, me, Dola and Evelyn]

As noted above, Dola and Jason had identified the need to reinforce the values and attitudes side of Bright Futures, and felt that a deep, experiential-learning event might help better align staff with the principles of the new program approach.  They approached me for help and, as I had some time, we worked together to design and carry out a ten-day workshop that we called “Bright Futures 101” – in other words, the basics of Bright Futures, with a big emphasis on values and attitudes.

Working with Jason, Dola and Evelyn was a privilege – they were and are smart, experienced professionals whose commitment to social justice, and to the principles and values of Bright Futures, was strong.

In this blog post, I want to describe “BF101” – our approach, the design, and how it went.

*

Rather than being just an introduction to the tools incorporated into Bright Futures, our purpose was to promote and encourage the kinds of personal transformations required to make the new program approach a reality.  So we prepared something that ChildFund had never tried before – a long, experiential workshop with a village stay.

From the beginning, we agreed that BF101 would have two overall objectives:

  1. to build a comprehensive understanding of the principles underlying ChildFund’s Bright Futures program approach; and
  2. to build a questioning, exploring, and adaptive approach to program development and implementation that was aligned with ChildFund’s value of fostering and learning from its own innovation.

So, implicitly, we wanted to shift ChildFund’s culture.  By including significant participant leadership, immersion in communities, experiential education, and pre- and post-course assignments, we wanted to promote a meaningful connection between head (understanding), heart (values and principles), and hand (concrete action), thinking that this connection would spill over into participants’ daily work when they returned home.  A 1 1/2-day immersion in a local community would be a key component of the workshop.

After a lengthy, collaborative design process, we agreed on a three-part workshop design (included here – Building Program Leaders – Immersion Workshop – Final Preworkshop Version).  The overall framework looked like this:

[Image: the overall framework of the BF101 workshop design]

Once Dola and Evelyn approved the design, they asked ChildFund Philippines to book a venue, and invitations were sent out to 3 or 4 participants from each office in Asia.  Extensive pre-reading assignments were sent to each participant, covering current trends in poverty and international development as well as the fundamental documents related to Bright Futures that I have shared in earlier posts in this series, such as the CCF Child Poverty Study, the Organisational Capacity Assessment, etc.

*

In the first workshop section, “Setting the Stage,” we would prepare participants for the experience.  A lengthy role play, adapted from a full-day exercise I had created in Viet Nam, was designed to challenge participants in an experiential, emotional manner, helping them actually feel what it was like to be a community member participating in programs implemented by ChildFund in the old way, the pre-Bright-Futures way.

We assigned various roles – community members (dressed appropriately), staff members of a fictitious NGO called “WorldChild International” (wearing formal attire), observers, etc.  I had written an extensive script (Role Play – Module 1 – Design – 4) which set up a series of interactions designed to provoke misunderstandings, conflict, moments of emotional impact, and some fun:

[Images: the role play in progress]

As usual, the most important part of any exercise like this one was the group reflection afterwards, in this case led by Lloyd McCormack:

[Image: Lloyd McCormack leading the group reflection after the role play]

This led into a session, which I led, on mind-shifts and archetypes: M2 – Archetypes – 2.  The purpose here was to build on the impact from the role play to get participants thinking about their own attitudes and values, and how they might need to shift.

Ending the first section of the workshop, Jason, who had flown in directly from the US and was quite jet-lagged, gave an excellent historical overview of CCF’s programmatic evolution.  This presentation carried an important message of continuity: Bright Futures was the next step in a long and proud programmatic history for the agency – we were building on what had been accomplished in the past, not starting over.  Jason’s presentation set the scene for our work on the changes in attitudes and values that were in store:

[Image: Jason presenting CCF’s programmatic evolution]

The next sessions outlined each of the main values and commitments articulated in Bright Futures (at least at that point in its evolution):

  • Deprived, Excluded, and Vulnerable children are our primary focus.  This session built on the CCF Poverty Study, which I described in an earlier post in this series.  At BF101 we sought to unpack what this “primary focus” would mean in practice;
  • We Build on the Stages of Child Development.  After I had concluded my tenure as consultant at CCF, program development efforts had built on Bright Futures by articulating a clear theory of child development, along with interventions related to each stage.  This was a very good development in ChildFund’s program approach which, however, had the potential to conflict with the bottom-up nature of Bright Futures.  So this section of BF101 would deepen understanding of how to resolve this seeming contradiction in practice;
  • Programs are Evidence-Based.  Again, ChildFund had continued to develop aspects of its program approach, building on Bright Futures to try to professionalize the design of projects and programs.  As with the previous commitment, this risked conflicting with the bottom-up nature of Bright Futures, so we would reflect on how to resolve that tension in practice;
  • We Build Authentic Partnerships.  This commitment flowed directly from the work we had done on Bright Futures earlier.

*

Perhaps the most crucial element of the BF101 design was a 1 1/2-day stay in communities.  We divided the participants into smaller groups, and set out to spend a night in a community near the conference center.

*

Our concluding sessions were aimed at building on the community immersion by considering a range of personal and institutional transformations required, discussing systems implications, and then breaking into National Office groups to plan for action after the workshop.

*

During the workshop, Jason was blogging regularly, and asked me to prepare one, also.  Here is one of Jason’s blogs: http://ccfinthefield.blogspot.com/2009/05/opposite-sides-time-to-reflect.html.  And here is mine: http://ccfinthefield.blogspot.com/2009/05/seeking-balance.html.

*

We used a simple tool to track participant assessments along the way:

[Image: participant assessment chart from the workshop]

As can be seen, the overwhelming majority of participants rated the workshop as very positive and helpful.  I myself felt quite happy with the workshop – I felt that we had gotten fairly deep into discussions that had the potential to transform people’s attitudes and values in a positive way.  Although it was a lot to ask people to set aside their work and families for seven full days, and to spend a night in a village, it seemed to pay off.

So, BF101 was successful, and fun.  Together with the systems work and structural shifts that were ongoing in the agency, it set the scene for the continued rollout of Bright Futures across ChildFund International, now including a positive, constructive way to promote values and attitudes consistent with the new program approach.

*

But, sadly, Bright Futures would soon be set aside by ChildFund.  In what felt like an echo of Plan International’s pathology (new leadership = starting over from scratch), despite having embraced the approach initially, ChildFund’s new leadership moved deliberately away from Bright Futures.  The global financial crisis had erupted and, like many international NGOs, ChildFund’s income was dropping.  It was felt that investment in the transition to Bright Futures was no longer affordable, so much of the investment in research, piloting, systems development, and training (for example, followup to BF101) was dropped.

As a consultant, I could only look at this decision with sadness and regret.  The dedication and resources that Michelle, Daniel, Victoria, Mike, Jon, Andrew, Jason, Dola and many others across ChildFund had invested in such a positive and disruptive shift were, to a great extent, lost.

Many years later, when I joined ChildFund Australia as International Program Director, a very senior program leader expressed similar regret to me, lamenting that Bright Futures had given the agency a clear ideology, which was now lacking.

I’ve recently been reminded of another consequence of the virtual abandonment of Bright Futures: a year later, 65% of the participants in the BF101 workshop had left ChildFund.  Perhaps we didn’t do enough to help participants operationalize the changes we were promoting, in the context of ChildFund’s reality of the time.  But that would have been quite a contradiction of the basic message of BF101: that each person needed to take the initiative to operationalize their own transformations.

My own assumption is that the personal transformations begun during our week in the Philippines led to significant disappointment when the agency didn’t follow through, when ChildFund didn’t (or wasn’t able to) invest in creating BF102, 202, etc.

*

Why is it that international NGOs so often suffer this phenomenon, that when leadership changes (at country, regional, or global levels) everything changes?  That new leaders seem to view the accomplishments of their predecessors as irrelevant or worse?

I think it comes, at least in part, from the way that we who work in the value-based economy associate ourselves, and our self-images, with our work so strongly and emotionally.  This ego-driven association can be a great motivator, but it also clouds our vision.  I saw this many times in Plan, as many (if not most) new Country Directors or Regional Directors or International Executive Directors scorned their predecessors and dismissed their accomplishments as misguided at best, quickly making fundamental changes without taking the time to appreciate what could be built upon.  And, when the next generation of leaders arrived, the cycle just repeated and repeated.

This, to me, is the biggest weakness of our sector.  Today, alongside this ego-driven pathology, the entire international-development sector is also facing severe disruptive change, which greatly complicates matters… but that’s a story for another day!

*

Meanwhile, I made the big move, joining UUSC as Executive Director, shifting from international development to social justice and human rights campaigning, internationally and domestically.  And into a strongly unionized environment.  These were the days of Bush’s Iraq invasion, torture and neoliberal economics, and I was excited to turn my work towards the grave problems affecting my own country.

Next time I will begin to tell that part of the story… stay tuned!

*

Here are links to other blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration.

Mt Isolation (25) – Pilot-Testing Bright Futures

September, 2017

I began a new journey in May of 2016, tracing two long arcs in my life:

  • Climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers.  I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Working in international development during the MDG era: what was it like in the sector as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980’s through to the conclusion of the Millennium Development Goals in 2015.

So, in the end, there will be 48 posts about climbing 48 mountains, and about various aspects of the journey thus far…

*

Leaving Plan International after 15 years, the last four of which were spent as Country Director in Viet Nam, I was fortunate to join CCF as a consultant.  My task, over what became two great years with CCF, was to help develop a new program approach for the agency.  This was exciting and opportune for me: I had been reflecting a lot about how things had changed in the development sector, and at that point I had a lot of experience across five continents, in a wide variety of roles, under my belt.

There was probably nobody in the world better suited for the task.

*

Last time, I wrote extensively about what we came up with: the “Bright Futures” program approach.  We developed the approach through a very thorough process of reflection, benchmarking, and research, and even though the changes foreseen for CCF were very significant and disruptive, senior management and board embraced our recommendations enthusiastically.  We were given the green light to pilot test the approach in three countries – Ecuador, the Philippines, and Uganda – and I was asked to train staff in each location, accompany the rollout, document learning, and suggest refinements.

This meant that I would continue to work with Michelle Poulton and Daniel Wordsworth, development professionals I’ve described in earlier blogs, people I admired and who were very serious about creating a first-class program organization.

What a fantastic opportunity!

In this blog, I want to describe how the pilot testing went.  But first…

*

I climbed Mt Isolation (4004ft, 1220m) on 8 June 2017.  Since getting to the top of Mt Isolation would be a long hike, I wanted to have a full day, so I drove up the previous afternoon and camped at Dolly Copp Campground in Pinkham Notch.

[Image: route map for the Mt Isolation hike]

As you can see, I went up to the top of Mt Isolation and retraced my steps back, which involved quite a long hike.  I’ve included a large-scale map here, just so that the context for Mt Isolation can be seen: it’s in a basin to the south and east of the Presidential range, with Mt Washington, Adams, Jefferson and Monroe to the north, and Eisenhower, Pierce, and Jackson to the west.

Sadly, I spent an uncomfortable night at Dolly Copp, mainly because I had forgotten the bottom (that is to say, lower) half of my sleeping bag!  So I tossed and turned, and didn’t get a great night’s sleep.

[Image: Dolly Copp Campground]

But the advantage was that I was able to get an early start on what I thought might be a long climb, leaving the Rocky Branch parking lot, and starting the hike at about 7:15am, at least two hours earlier than I would have started if I had driven up from Durham that morning.

[Image: setting out from the Rocky Branch trailhead]

The hike in the forest up the Rocky Branch Trail was uneventful, though there was lots of water along the way, and therefore lots of rock-hopping!  That trail isn’t very well maintained, and with recent heavy rains there were long sections that were more stream than path!

I reached the junction of Rocky Branch and Isolation Trail at about 9:15am, two hours from the start of the hike.  I crossed over and headed upstream.  Rocky Branch was full, as expected with all the rain, so crossing was a bit challenging.  There were four more crossings as I headed up, through the forest, before I reached the junction of Isolation Trail and Davis Path at about 11am.  I have to admit that I dipped my boots into the Rocky Branch more than once on the way up, and even had water flow into my boot (over the ankle) once!  So the rest of the hike was done with a wet left foot…

[Images: crossings of the Rocky Branch]

Once I got onto Isolation Trail, I found it better maintained than Rocky Branch Trail had been.  Evidence of a strong storm was obvious near the top, where I joined Davis Path: lots of downed trees had been cut, clearing the way, but the trail was still narrow in places, crowded with downed trees and shrubs on both sides.

As I hiked up Isolation Trail, still in the forest, I began to have views of the Presidential Range.  I reached the turnoff for the spur up to the summit of Mt Isolation at about 11:30am, and reached the top a few minutes later.  So it took me about 4 3/4 hours to reach the top; I didn’t see any other hikers on the way up.

The view from the top was fantastic, probably the best so far of all the hikes in this series: it was clear and dry, and I had the whole Presidential Range in front of me.

[Image: from the right, Mt Washington, Mt Adams, Mt Jefferson]

[Image: Mt Eisenhower]

[Images: views of the Presidential Range from the summit of Mt Isolation]

And I had a winged visitor, looking for food.

[Images: the winged visitor]

But I also had hordes of one other particular species of visitor: for only the second time in these 25 climbs, swarms of black flies quickly descended, making things intolerable.  Luckily, I had carried some insect repellent left over from our years in Australia, and once I applied generous quantities to my face, arms, and head, the black flies left me alone.  Otherwise I would have had to leave the summit immediately, which would have been a real shame, because I had walked nearly 5 hours to get there, without very many views along the way!


Happily, I was able to have a leisurely lunch at the top.  The views were glorious, and the flies left me alone.

After I left, retracing my steps down, I did meet a few hikers, including a mother and son who had come up from Glen Ellis Falls.  Descending Rocky Branch, of course, I had to cross the river again, five more times.  In fact, I crossed once in error and had to recross to get back to the trail… so, make that seven more times!  Happily, the crossings seemed easier to navigate on the way back: either the water had gone down (unlikely), or I was a bit more familiar with the right spots to cross.

I arrived back at the parking lot at about 4pm, having taken almost nine hours to climb Mt Isolation.  Tired, but it was a great day out!

*

Change is complicated and, given the nature of our value-driven organisations, changing international organisations is particularly challenging (see my 2001 article on this topic, available here: NML – Fragmentation Article).  Even though the close association that our best people make between their work and their own personal journeys is a huge advantage for our sector (leading to very high levels of commitment and motivation), this same reality also produces a culture that is often resistant to change.  Because when we identify ourselves so closely with our work, organisational change becomes personal change, and that’s very complicated!

And the changes implied with Bright Futures were immense, and disruptive.  We were asking pilot countries:

  • to move from: programs being based on a menu of activities defined at CCF’s headquarters, and focused on basic needs;
  • towards: programs based on a broad, localized, holistic and nuanced understanding of the causes and effects of the adversities faced by children, and of the assets that poor people draw on as they confront adversity.

The implication here was that pilot countries would need to deepen their understanding of poverty, and also learn to grapple with the complexity involved in addressing the realities of the lived experience of people living in poverty.  In a sense, staff in pilot countries were going to have to work much harder – choosing from a menu was easy!

  • to move from: programs being led by local community associations of parents, whose task was primarily administrative: choosing from the “menu” of activities, and managing funds and staff;
  • towards: programs being designed to enhance the leading role (agency) of parents, youth, and children in poor communities, by ensuring that they are the primary protagonists in program implementation.

The implication was that pilot countries could build on a good foundation of parents’ groups.  But extending this to learning to work appropriately with children and youth would be a challenge, and transforming all of these groups into authentic elements of local civil society would be very complex!  The reality was that the parents’ associations were often not really managing “their” staff – often it was the other way ’round – that would have to change.  Another of the challenges here would be for existing staff, and community members in general, to learn to work with groups of children and youth in non-tokenistic ways.

  • to move from: carrying out all activities at the local community level;
  • towards: implementing projects wherever the causes of child poverty and adversity are found, whether at child, family, community, or Area levels.

This would be a big challenge for pilot countries, because it involved understanding the complex linkages and causes of poverty beyond the local level, and then understanding how to invest funds in new contexts to achieve real, scaled, enduring impact on the causes of child poverty and adversity.

One big obstacle here would be the vested interests that had been created by the flow of funds from CCF into local communities and the parents’ groups. Not an easy task, fraught with significant risk.

And, on top of all of that, Bright Futures envisioned the consolidation of the existing local-level parents’ associations into parent “federations” that would operate at “district” level, relating to local government service provision.  Transforming the roles of the existing parents’ associations from handling (what was, to them) vast quantities of money, to answering to an entirely new body at “district” level, was a huge challenge.

  • to move from: working in isolation from other development stakeholders;
  • towards: integrating CCF’s work with relevant efforts of other development agencies, at local and national levels.

This would require a whole new set of sophisticated relational and representational competencies that had not been prioritized before.

For example, in a sense, CCF had been operating in a mechanical way – transfer funds from headquarters to parents’ groups, which would then simply choose from a menu of activities that would take place in the local community. Simple, and effective to some extent (at least in terms of spending money!), but no longer suitable if CCF wished to have greater, longer-lasting impact, which it certainly did.

  • to move from: annual planning based on local parents’ groups choosing a set of activities from a menu of outputs, all related to basic needs;
  • towards: planning in a much more sophisticated way, with the overall objective of building sustainable community capacity, the ability to reflect and learn, and resilience – achieving impact over an estimated four 3-year planning periods.

CCF would have to create an entirely new planning system, focused on the “Area” (district) level but linked to planning at Country, Regional, and International contexts.

Fundamental to this new system would be the understanding of the lived reality of people living in poverty; this would be a very new skill for CCF staff.  And pilot countries would have to learn this new planning system immediately, as it would be the foundation of pilot operations… so we had to move very quickly to develop the system, train people, and get started with the new way of planning.  (I will describe that system, the “ASP,” below…)

  • to move from: program activities taking place far from CCF’s operational structure, with visits by staff to local communities once per year;
  • towards: programs being supported much more closely, by decentralizing parts of CCF’s operational structure.

This was a huge change, involving setting up “Area” offices, staffing these offices with entirely new positions, and then shifting roles and responsibilities out from the Country Office.

There was deep institutional resistance to this move, partly because of a semi-ideological attachment to having parents make all programmatic decisions (which I sympathized with, although the evidence was clear that the program activities that resulted were often not high-quality).

But resistance also came from a more mundane, though powerful, source: showing a “massive” increase in staffing on CCF’s financial statements would look bad to charity watchdogs like Charity Navigator and GuideStar.  Even though total staffing would actually go down – staffing at the parents’ associations would decrease significantly – those employees had never appeared on CCF’s books, because they were technically employees of the associations.  So, from a simple bookkeeping, ratio-driven point of view, the appearance would be a negative one.  And that “point of view” was of the very highest priority to CCF’s senior management, because it strongly influenced donor behavior.

  • to move from: funding program activities automatically, to parents’ groups on a monthly basis, as output “subsidies”;
  • towards: projects being funded according to the pace of implementation. 

This would be another enormous, foundational change, entailing a completely new financial system and new flows of funding and data: now, the Country and Area offices would authorize fund transfers to the local parents’ (and child and youth) associations based on documented progress of approved projects.

All of this would be new, so CCF had to develop project documentation processes and funding mechanisms that provided sufficient clarity and oversight.
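To illustrate the shift, here is a hedged sketch only – the threshold, names, and rule below are my assumptions for the example, not CCF’s actual financial system – of what funding according to the pace of implementation might look like:

# Release the next tranche only when documented, verified progress on the
# approved project justifies it, instead of automatic monthly subsidies.
# The 10% buffer is an invented illustration, not an actual CCF rule.
def authorize_next_tranche(documented_progress: float,
                           funds_disbursed: float,
                           total_budget: float,
                           buffer: float = 0.10) -> bool:
    """Allow more funds only if spending has not run too far
    ahead of documented implementation progress."""
    spent_share = funds_disbursed / total_budget
    return documented_progress + buffer >= spent_share

# 40% of budget disbursed, 35% progress documented: within the buffer, so release.
print(authorize_next_tranche(0.35, 0.40, 1.0))  # True
# 60% disbursed but only 30% documented: hold the next tranche.
print(authorize_next_tranche(0.30, 0.60, 1.0))  # False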

To properly test Bright Futures, we would need to provide a lot of support to the pilot countries as they grappled with these, and other, disruptions!

*

In this blog post, I want to describe several aspects of the year that we piloted Bright Futures in Ecuador, the Philippines, and Uganda as they moved to implement the disruptive changes outlined above: how we helped staff and leadership in the three pilot countries understand what they were going to do; how we worked with them to get ready; and how we accompanied them as they commenced working with the Bright Futures approach.   And how we developed, tested, and implemented an entirely new set of program-planning procedures, the Area Strategic Plan methodology.

As I have just noted, Bright Futures was a profoundly different approach from what these pilot countries were used to – deeply disruptive.  So we set up what seems to me to have been, in retrospect, a careful, thorough, rigorous, and exemplary process of support and learning.  In that sense, I think it’s worth describing the process in some detail, and worth sharing a sample of the extensive documentation that was produced along the way.

*

Before beginning to pilot, we carefully identified what we would be testing and how we would measure success; we set up processes to develop the new systems and capacities that would be needed in the pilot countries and at CCF’s headquarters; and we established mechanisms to support, and learn from, the pilot countries as they pioneered a very new way of working.

In the end, I worked closely with the three pilot countries for a year – helping them understand what they were going to do, working with them to get ready, and then accompanying them as they commenced working with the Bright Futures approach.  And, along the way, I supported staff in the Richmond headquarters as they grappled with the changes demanded of them, and with the impact of the changes on headquarters systems and structures.

When CCF’s senior management had agreed to the pilot testing, the president (John Schulz) had decided that the organization would not make changes to key systems and structures across the agency until pilot testing was complete and full rollout of Bright Futures had been approved.  This meant that the functional departments at headquarters had to develop “work-arounds” so that pilot areas could manage the financial and donor-relations aspects of their work.

This made sense to me: why spend the time and money to develop new systems when we didn’t know if, or how, Bright Futures would work?  But it meant that much of the agency, including all three pilot Country Offices, would be using parallel basic organizational processes, especially financial processes, at the same time, just adding to the complexity!

*

First we brought key staff from each country together with staff from CCF’s headquarters in Richmond, Virginia, to develop a shared understanding of the road ahead, and to create national plans of action for piloting.  Management approved these detailed plans in late May of 2003.

I recently rediscovered several summary videos that I prepared during the creation and pilot testing of what became Bright Futures.  These videos were used to give senior management a visual sense of what was happening in the field.

Here is a short (11-minute) summary video of the preparation workshop that took place in late April of 2003:

[Video]

It’s fun for me to see these images, now 14 years ago: the people involved, the approaches we used to start pilot testing Bright Futures.  Staff from all three pilot countries are shown, along with Daniel and Michelle, and other senior staff from Richmond.

One important result of that launch workshop was the production of a set of management indicators that would be used to assess pilot performance: the indicators would be measured in each pilot country before and after the pilot-testing period.  The agreed indicators reflected the overall purposes of the Bright Futures program approach (see my previous blog), and can be found here: Piloting Management Indicators – From Quarterly Report #2.
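
Just to illustrate the evaluation design (the indicator names and values below are invented – the real indicators are in the linked document), the before-and-after measurement amounts to computing, for each pilot country, the change in each indicator over the pilot year:

```python
# Purely illustrative sketch: comparing management indicators measured
# before and after the pilot year, per country.  Indicator names and
# values are invented; the real indicators are in the document linked above.

baseline = {
    "Ecuador":     {"staff_visits_per_year": 1,  "projects_on_schedule_pct": 55},
    "Philippines": {"staff_visits_per_year": 1,  "projects_on_schedule_pct": 60},
    "Uganda":      {"staff_visits_per_year": 1,  "projects_on_schedule_pct": 45},
}
endline = {
    "Ecuador":     {"staff_visits_per_year": 12, "projects_on_schedule_pct": 80},
    "Philippines": {"staff_visits_per_year": 10, "projects_on_schedule_pct": 85},
    "Uganda":      {"staff_visits_per_year": 9,  "projects_on_schedule_pct": 70},
}

# The pilot assessment question: how did each indicator move in each country?
for country, before in baseline.items():
    after = endline[country]
    change = {indicator: after[indicator] - before[indicator] for indicator in before}
    print(country, change)
```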

Once detailed national plans of action were approved, we scheduled “Kickoff” workshops in each pilot country.  These two-day meetings were similar in each location, and included all staff in-country.  On the first day, we would review the background of the pilot, including summary presentations of CCF’s strategic plan, the Organisational Capacity Assessment, and the CCF Poverty Study.  Then the basic principles, concepts, and changes included in the pilot testing were presented and discussed, along with an outline of the pilot schedule.  At the end of the first day, we handed out relevant background documentation and asked participants to study it in preparation for the continuation of the meeting on the second day.

The second day of these Kickoff meetings was essentially an extended question and answer, discussion and reflection session, during which I (and staff from CCF’s headquarters, when they attended) would address concerns and areas where more detail was required.  Occasionally, since I was an external consultant, there were questions that needed discussion with functional departments at CCF’s headquarters, so I tracked these issues and methodically followed them up.

During these initial visits, I also worked with Country Office leadership to help them obtain critical external support in two important and sensitive areas:

  • Given the fundamental nature of the changes being introduced, and in particular noting that only part of the operations in each pilot country would be testing Bright Futures, human-resources issues were crucial.  Bright Futures would demand new competencies, new structures, new positions, and change management would be complex.  So in each country we sought external support from specialised agencies; I worked with CCF’s director of human resources in Richmond, Bill Leedom, to source this support locally;
  • One particular skill, on the program side, would be pivotal: new planning systems would require field staff to master the set of competencies and tools known as “PRA” – participatory rural appraisal.  (I had first come across PRA methods in Tuluá, at Plan’s Field Office there, back in 1987, but somehow most CCF staff had not become familiar with this approach.  Some had, of course, but this gap in knowledge was an example of how CCF staff had been somewhat isolated from good development practice.)  Since by 2003 PRA was completely mainstream in the development world, there were well-regarded, specialised agencies in most countries that we contacted to arrange training.

Also, in this first round of visits, I worked with local staff to finalise the selection of two pilot “Areas” in each country.  I visited these locations, helped determine the details of staffing in the Areas, and reviewed and decided systems and structural issues (such as how funds would flow, and how local parents’ associations would evolve as district-level “federations” were formed).

*

Once the two “Areas” in each pilot country began working, I started to issue quarterly reports, documenting progress and concerns, and including visit reports, guidance notes issued, etc.  (I continued to visit each country frequently, which meant that I was on the road a lot during that pilot-testing year!)  These quarterly reports contained a very complete record of the pilot-testing experience, useful for anybody wanting (at the time) to have access to every aspect of our results, and useful (now) for anybody wanting to see what the rigorous pilot-testing of an organizational change looks like.

I produced five lengthy, comprehensive quarterly reports during that year, which I am happy to share here:

*

Staff from functional departments at CCF’s headquarters also visited pilot countries, which we encouraged: support from Richmond leadership would be important, and their input was valuable.  Of course, leaders at headquarters would need to be supportive of the Bright Futures model once the pilot-testing year was concluded, if CCF were to scale up the approach, so exposing them to the reality was key, especially because things went well!

We asked these visitors to produce reports, which are included in the quarterly reports available in the links included above.

*

An interesting dynamic that developed during the year is reflected in a report produced by Bill Leedom, who was CCF’s HR director at the time.  Bill’s report on a visit he made to Ecuador is included in the Q1FY04 Quarterly Report (Q1FY04 – 2).  In it, he describes a discussion he had with the Country Director:

“Carlos (Montúfar, the Country Director in Ecuador) and I had a discussion about the role of consultants in the organization. Although it appears at times that the consultant is running the organization it must be the other way around. CCF hires a consultant to help with a process and then they leave. They are a “hired gun.” If changes are recommended they cannot be implemented without his approval as he will have to live with the consequences of whatever was done. The consultant moves on to another job and does not have to suffer any consequences of a bad recommendation or decision but he and his staff have to. I think Carlos was glad to hear this and hopefully will “stand up” to and express his opinions to what he believes might not be good recommendations by consultants.”

When Bill uses the word “consultants,” I know that he is politely referring to me!  My recollection is that this comment reflects a strong dynamic that was emerging as we pilot tested Bright Futures: leadership in the three pilot countries had volunteered to pilot test a particular set of changes, perhaps without fully understanding the ramifications, or without fully understanding that headquarters (meaning, mostly, me!) would be accompanying the pilot process so closely.

Understandably, leaders like Carlos wanted to maintain authority over what was happening in their programs, while headquarters felt that if we were going to test something, we had to test it as designed, learning what worked and what didn’t work without making changes on the fly.  Only after testing the model as proposed would we make changes or adaptations, as we prepared to scale up.  Otherwise, we’d never be able to document the strengths and weaknesses of what we had agreed to pilot.

But not everything went perfectly – that’s why we were pilot testing, to discover what we needed to change!  When things didn’t go well, naturally, people like Carlos wanted to fix it.  That led to tension, particularly in Ecuador – perhaps because the program in that country was (rightly) highly-esteemed.

Carlos resisted some of the guidance that I was giving, and we had some frank discussions; it helped that my Spanish was still quite fluent.  But Daniel and Michelle, program leadership in Richmond, made it clear to me, and to Carlos and his regional manager, that we needed to test Bright Futures as it had been designed.  So even though I was an external consultant, I felt that I was on strong ground when I insisted that pilot countries proceed as we had agreed at the launch workshop in April of 2003.

*

From the beginning, we understood that an entirely-new planning, monitoring, and evaluation methodology would need to be developed for Bright Futures.  Since this would be a very large piece of work, we sought additional consulting help, and were fortunate to find Jon Kurtz, who worked with me to prepare and test the Bright Futures “Area Strategic Planning” method, the “ASP.”

We wanted to take the CCF Poverty Study very seriously, which meant that a rigorous analysis of the causes of child poverty and adversity, at various levels, had to be evident in the ASP.  And we had to make sure that program planning reflected all of the principles of Bright Futures – involving, for example, children and youth in the ASP process, incorporating other stakeholders (local NGOs operating in the Area, district government), and so forth.

Area Strategic Planning was aimed at supporting CCF’s goal of achieving broader, deeper and longer-lasting impact on child poverty.  To do this, the ASP process was guided by several key principles, which can be seen in the goals that the ASP was designed to help programs achieve:

  • Understanding poverty: Programs will be based on a deep understanding of, and responsive to the varied nature of child poverty across the communities where CCF works.
  • Leading role: Programs will build the capacities of parents, youth and children to lead their own development. Each group will be given the space and support required to take decisions and action to improve the wellbeing of children in their communities and Areas.
  • Linkages: Programs will be linked to and strengthen the resources that poor people call upon to improve their lives. Efforts will strive to build on the existing energies in communities and on relevant efforts of other development agencies.
  • Accountability: Programs will be recognized by sponsors and donors for their value in addressing child poverty, and at the same time will be accountable to the partner communities, especially the powerless and marginalized groups.
  • Learning: Programs will be based on best practices and continuous learning from experience. Planning, action and review processes will be linked so that lessons from past programs are reapplied to improve future efforts.

The process for conducting Area Strategic Planning was structured to reflect these principles and aims.  We foresaw that the ASP process would evolve and be innovated upon beyond the pilot year, as Areas discovered other ways to achieve these same goals.  For the pilot year, however, the ASP process would consist of four stages, summarized in the list below (and in the illustrative sketch that follows it):

  1. Community reflections on child poverty and adversity: Initial immersion and reflection in communities to gain a deep understanding of child poverty in each context, including its manifestations and causes, as well as the resources poor people rely on to address these.
  2. Area synthesis and draft program and project planning: Developing programs and projects which respond to the immediate and structural causes of child ill-being in the Area while building on the existing resources identified.
  3. Community validation, prioritization and visioning: Validating the proposed program responses in communities, prioritizing projects, and developing visions for the future for assessing program performance.
  4. Detailed project planning and ASP finalization: Designing projects together with partners and technical experts, defining capacity building goals for the Area Federation(s), and developing estimated budgets for programs and getting final input on and approval of the ASP.
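
For the technically inclined, the four stages can be pictured as a simple ordered pipeline, each stage’s output feeding the next.  This is purely an illustrative sketch of my own – the stage names come from the list above, and everything else is invented:

```python
# Purely illustrative sketch: the four ASP stages as an ordered pipeline.
# Stage names follow the list above; the "output" summaries are my own gloss.

from dataclasses import dataclass

@dataclass
class Stage:
    number: int
    name: str
    output: str  # what this stage hands to the next one

ASP_STAGES = [
    Stage(1, "Community reflections on child poverty and adversity",
          "understanding of manifestations, causes, and local resources"),
    Stage(2, "Area synthesis and draft program and project planning",
          "draft programs and projects matched to root causes and resources"),
    Stage(3, "Community validation, prioritization and visioning",
          "validated priorities and community visions of the future"),
    Stage(4, "Detailed project planning and ASP finalization",
          "approved Area Strategic Plan with budgets and capacity goals"),
]

for stage in ASP_STAGES:
    print(f"Stage {stage.number}: {stage.name} -> {stage.output}")
```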

We settled on a process that would look like this:

[Diagram: CCF’s Area Strategic Planning Model]

The ASP’s Stage Two was crucial: this was where we synthesized the understanding of child poverty and adversity into root causes, compared those root causes with existing resources (latent or actual) in the Area, and created draft programs and projects.


This step required a bit of “magic” – somehow matching the root causes of child poverty to local resources… and you can see Jon working hard to make it work in the video included below.  But it did work!
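
If you’ll forgive a toy model of that “magic”: Stage Two can be loosely pictured as a matching exercise between root causes and local resources.  Everything in this sketch is invented, and in reality the matching was facilitated human judgment, not an algorithm – but the shape of the exercise was roughly this:

```python
# Purely illustrative sketch: Stage Two's "magic" as a matching exercise
# between root causes of child poverty and locally available resources.
# Causes, resources, and the matching rule are all invented; in reality
# this was facilitated human judgment, not an algorithm.

root_causes = {
    "low school completion": {"education"},
    "childhood illness": {"health", "water"},
    "household income insecurity": {"livelihoods"},
}

local_resources = [
    ("district education office", {"education"}),
    ("community health volunteers", {"health"}),
    ("local NGO running savings groups", {"livelihoods"}),
    ("functioning borehole committee", {"water"}),
]

# Draft a project idea wherever a local resource addresses a root cause.
for cause, needs in root_causes.items():
    partners = [name for name, offers in local_resources if offers & needs]
    if partners:
        print(f"Draft project for '{cause}' with: {', '.join(partners)}")
    else:
        print(f"Gap: no local resource yet for '{cause}'")
```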

I really liked this ASP process – it reflected much of what I had learned in my career, at least on the program side.  It looked good, but we needed to test the ASP before training the pilot countries, so a small team of us (me, Jon, and Victoria Adams) went to The Gambia for a week and tried it out.  In this video you can see Jon working the “magic” – conjuring programs and projects from comparing root causes of child poverty (broadly understood) with locally-available (existing or latent) resources:

[Video]

I like that there was a large dose of artistry required here; development shouldn’t be linear and mechanical, it should be joyful and serendipitous, and I was proud that our ASP process made space for that.

With the learnings from that test in The Gambia, we finalized a guidance document, detailing underlying principles, the ASP process, detailed procedures, and reporting guidelines and formats.  The version we used for pilot testing can be downloaded here: ASP Guidance – 16.

Later we trained staff in each pilot country on the ASP.  Here is a video that shows some of that process:

[Video]

I often tell one fun anecdote about the ASP training sessions.  Stage One of the process (see the diagram above) required CCF staff to stay for nearly a week in a village where the agency worked, to carry out a thorough investigation of the situation using PRA methods.

In one country (which I will not name!), after the initial training we moved out to the pilot Area to prepare to spend the week in a village.  When we gathered there after arriving, to discuss next steps, senior national CCF staff informed me that the “village stay” would not be necessary: since they were not expatriates, they had a clear idea of the situation in rural areas of their country.

My response was simple: as a consultant, I had no authority to force them to engage in the village stay – or anything else, for that matter – but we wouldn’t continue the training if they were not willing to participate as had been agreed…!

That got their attention, and (after some discussion) they agreed to spend much of the week in local villages.

I was delighted when, at the end of the week, they admitted that things were very different than they had expected in these villages!  They seemed genuine in their recognition that they had learned a lot.

But I wasn’t surprised – these were smart, well-trained people, but they were members of a highly-educated elite from the capital city, distant physically and culturally from rural areas.  So, I think, the village stay was very useful.

*

Along the way, across the year of pilot testing in Ecuador, the Philippines, and Uganda, I issued a series of short guidance notes, which were circulated across CCF.  These notes aimed to explain, for staff who weren’t directly involved, what we were pilot testing, covering the following topics:

  1. What are we pilot testing?  Piloting Notes – 1.9.  This guidance note explains the basic principles of Bright Futures that we were getting ready to test.
  2. The operational structure of Bright Futures.  Piloting Notes – 2.4.  This guidance note explains how CCF was going to set up Federations and Area Offices.
  3. Recruiting new Bright Futures staff.  Piloting Notes – 3.6.  This guidance note explains how CCF was going to build up the Area structures with new staff.
  4. The CCF Poverty Study.  Piloting Notes – 4.9.  This guidance note gives a summary of the Poverty Study, which would underlie much of the Area Strategic Planning process.
  5. Monitoring and Evaluation.  Piloting Notes – 5.2  This guidance note explains Area Strategic Planning.
  6. Area Federations.  Piloting Notes – 6.6.  This guidance note explains the ideas behind building the power of people living in poverty by federating their organizations so that they could have more influence on local government service provision.
  7. Finance Issues.  Piloting Notes – 7.3.  This guidance note explains how CCF would change funding from being a “subsidy” of money, remitted every month to parents’ associations, towards a more modern process of funding project activities according to implementation progress.
  8. Partnering.  Piloting Notes – 8.7.  This guidance note outlines the basic concepts and processes underlying one of Bright Futures’ biggest changes: working with and through local civil society.
  9. Growing the Capacity of Area Federations.  Piloting Notes – 9.6.  This guidance note describes how the federated bodies of parents, youth, and children, could become stronger.
  10. The Bright Futures Approach.  Piloting Notes – 10.2.  This guidance note explains the  approach in detail.
  11. Child and Youth Agency.  Piloting Notes – 11.  This final guidance note explains the ideas behind “agency” – enabling children and youth to take effective action on things that they find to be important in their communities.

The “Piloting Notes” series was fairly comprehensive, but purposely brief and accessible to the wide range of CCF staff across the world – busy people, with very different language abilities.  The idea was to “over-communicate” the change, so that when the time came to roll out Bright Futures, the agency would be as ready as possible.

*

There is so much more that I could share about that fantastic year.  For example, the work that Andrew Couldridge did helping us grapple with the establishment of Area “Federations” of people living in poverty.  But this blog is already quite long, so I will close it after sharing staff assessments of the pilot testing, and thanking the people who were really driving this positive change in CCF.

*

CCF carried out a formal evaluation of the pilot test of Bright Futures, using an external agency from the Netherlands (coincidentally named Better Futures, I think).  Sadly, I don’t have access to their report, but I think it was quite positive.

But I do have access to the assessment we carried out internally – the summary of that assessment is here: Management Summary – 1.  We surveyed a total of 17 people in the three pilot countries, asking them about the Bright Futures model, HR and structural aspects, the planning process (ASP), Federations, Partnership, working with children and youth, sponsor relations, and support from Richmond.

I want to share some of the findings from the first domain of assessment (the Bright Futures model) and the last domain (support from Richmond).

  • In terms of the basic Bright Futures model, staff in the pilot countries felt that positive aspects were the way it included working in partnership and linking with other development actors, how it changed funding flows, how it deepened the understanding of poverty, and how it enhanced the participation and involvement of the community in general, and of children and youth in particular.
  • On the negative side, staff felt that the Bright Futures model was too demanding of them, that there was not enough capacity in communities, that there was a high cost to community participants (I think this was related to their time), that the piloting moved too quickly, that CCF’s focus moved away from sponsored families, that Bright Futures guidelines were not complete at the beginning of the pilot period, that CCF itself became less visible, that Area staff might become dominant, and that the role of National Office staff was unclear.
  • In terms of support from CCF headquarters, staff appreciated the visits, which helped clarify issues and gave a sense of accompaniment and solidarity.  The flow of materials (guidance notes, etc.) was also seen positively.
  • On the negative side, support visits were seen as too few and too short, guidelines were provided “just in time,” which caused problems, messages from CCF headquarters were sometimes contradictory, and more support was called for in the later stages of the ASP.

Piloting change is tricky, and leading it from the headquarters of any INGO is even trickier – I think we did very well.

*

Once the pilot phase was evaluated, CCF began to prepare for scaling up – preparing a “second wave” of Bright Futures rollout.  First, we thought about how countries would be “certified” to “go live” with Bright Futures – how would we know that they were “ready”?

To help, we produced a document summarizing how “certification” would be handled: Certification – 1.
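
Conceptually, “certification” was a go-live checklist: a country would be certified only once every readiness criterion was met.  Here is a purely illustrative sketch – the criteria below are invented, and the real ones are in the linked document:

```python
# Purely illustrative sketch: "certification" as a go-live checklist.
# The criteria here are invented; the real criteria are in the
# Certification document linked above.

CRITERIA = [
    "Area offices staffed with trained personnel",
    "Staff trained in PRA and the ASP process",
    "Area Federations formed and oriented",
    "Progress-based financial procedures in place",
    "Sponsor-relations work-arounds operational",
]

def certified(country: str, status: dict) -> bool:
    """A country is certified only when every criterion is satisfied."""
    missing = [c for c in CRITERIA if not status.get(c, False)]
    if missing:
        print(f"{country}: not ready - {len(missing)} criteria outstanding")
        for c in missing:
            print("  -", c)
        return False
    print(f"{country}: certified to go live")
    return True

# Invented examples: one country fully ready, one with gaps remaining.
certified("Sri Lanka", {c: True for c in CRITERIA})
certified("Zambia", {CRITERIA[0]: True, CRITERIA[1]: True})
```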

Five countries were selected for the “second wave”: Angola, Honduras, Sierra Leone, Sri Lanka, and Zambia.  At this point, I was beginning to transition to another role (see below), so my involvement in the “second wave” was minimal.  But I did help facilitate a “pan-Asia” Bright Futures rollout workshop in Colombo, and met several people I would later work closely with when I joined ChildFund Australia (Ouen Getigan and Sarah Hunt, for example!).

*

As I’ve described here, piloting the kind of disruptive, fundamental change that was envisioned in Bright Futures brings many challenges.  And once the lessons from pilot testing are incorporated, scaling up brings a different set of complexities: for example, CCF was able to provide very extensive (and expensive) support to the three Bright Futures pilots, but would not be able to cover the entire, global organisation with that same intensity.  So, often, quality drops off.

One gap that we noticed in the support we were providing to the pilot countries was very basic: attitudes, skills, and understanding of poverty and how to overcome it.  For example, as mentioned above, we had tried to partially address this gap by getting training for pilot-country staff in PRA methods.

Next time, in my final “Bright Futures” post, I will describe how we sought to build competencies, and momentum, for Bright Futures by creating and implementing a week-long immersion training, which we called “BF 101.”  In 2009, four years after I completed my time as a consultant with CCF, I was asked to return briefly to create that workshop – “Bright Futures 101” – which we conducted in the Philippines.  So I will skip ahead in time and describe that fascinating, and successful, experience.

And I will describe how Bright Futures came to a premature end…

*

But before that, I finished my work with CCF by serving as acting Regional Representative for East Africa, based in Addis Ababa, filling in for the incumbent Regional Representative during her sabbatical.  Jean and I moved to Addis, and I worked with CCF’s offices in Ethiopia, Kenya, and Uganda for those fascinating months.

Then … I would move into the world of activism and human-rights campaigning, joining the Unitarian Universalist Service Committee as Executive Director in 2005.  Stay tuned for descriptions of that fascinating experience.

*

Before closing this final description of the two years I spent as a consultant with CCF, I want to thank Michelle and Daniel for giving me the opportunity to lead this process.  As I’ve said several times, they were doing exemplary work, intellectually honest and open.  It was a great pleasure working with them.

Carlos in Ecuador, Nini in the Philippines, and James in Uganda all did their best to stay true to the principles of Bright Futures, despite the headaches that came with pilot testing such disruptive change.  And they unfailingly welcomed me to their countries and their work on many occasions during those two years.  Thank you!

And I also want to mention and recognize a range of other Richmond-based CCF staff who worked very effectively with us to make the pilot testing of Bright Futures a success: Mike Raikovitz, Victoria Adams, Jason Schwartzman, Jon Kurtz, Andrew Couldridge, Dola Mohapatra, Tracy Dolan, and many others.  It was a great team, a great effort.

*

Here are links to other blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration.

 

 

Mt Jackson (24) – The Bright Futures Program Approach

August, 2017

I climbed Mt Jackson (4052ft, 1235m) on 2 June, 2017.  This was my first climb of 2017, having taken a rest over the long, cold winter of 2016-2017.  In 2016, I had been able to start hiking in early May, but this year we had much more snow, and longer and later cold spells.  So I gave May 2017 a miss, and began to tackle the 4000-footers in early June…

*

I began a new journey in May of 2016, tracing two long arcs in my life:

  • Climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers.  I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Working in international development during the MDG era: what was it like in the sector as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980’s through to the conclusion of the Millennium Development Goals in 2015.

*

Leaving Plan International after 15 years, the last 4 of which were spent as Country Director in Viet Nam, I was fortunate to join CCF as a consultant.  My task, over what became two great years, was to help develop a new program approach for the agency.  This was exciting and opportune for me: I had been reflecting a lot about how things had changed in the development sector, and at that point I had a lot of experience across five continents, in a wide variety of roles, under my belt.

So I was very ready for the challenge that CCF offered me – I felt I had a lot to offer.  Little did I know that I was also stepping into a great environment, where CCF’s senior programmatic leadership, and the CEO, were beginning a very exciting journey of reflection and discovery.

*

My first task had been to research current thinking, and best practices, across our sector.  Last time I described that research and the recommendations that had emerged.  To my delight, Daniel Wordsworth and Michelle Poulton embraced my findings enthusiastically, and senior management had endorsed them as well.

Our next step was to take the research that I had done, with its recommended themes of change, and create the specifics of CCF’s new program approach.  In this, Daniel took the lead, with me acting as a sounding board and advocate for the principles and themes of the prior research.  This was appropriate, as now we would be detailing concretely how the agency would implement programs, core stuff for CCF.  So I moved into more of an advisory role, for now.

In this blog post, I want to share the details of what we came up with, and how CCF ended up proceeding.

*

As I drove north from Durham, the weather forecast was problematic, with a strong chance of afternoon rain.  But I decided to take the chance.  This was #24 of my 48 climbs, and I hadn’t had any rain so far, on any of those climbs.  So I figured I was on a long run of good luck – couldn’t possibly rain this time, right?

I left Durham at around 7:45am, and arrived at the trailhead at just after 10am, parking just off of Rt 302 near Crawford Notch.


Even though it was June, I could see some patches of snow above me in the mountains as I approached Crawford Notch, but all was clear on the road.

My plan was to walk up the Webster Cliff Trail to Mt Webster, on to Mt Jackson, and then take the Webster-Jackson Trail to loop back to Mt Webster.  I would retrace my steps from there, on the Webster Cliff Trail, to the trailhead.


As I began the hike, it was a nice day, cool and a bit cloudy.  I crossed Rt 302 and quickly reached a pedestrian bridge over the Saco River.  The Webster Cliff Trail forms part of the Appalachian Trail here:

[Photos]

The first section of the Webster Cliff Trail was moderately steep.  Though the temperature was cool, I heated up as I ascended.  It was a beautiful day hiking, still sunny at this point:

[Photo]

Clouds gathered as I ascended, and by 11am the sun was mostly gone.  The trail was consistently steep and became rockier as I ascended the Webster Cliff Trail, passing above the tree line.  Once I was onto the ridge, the views were great, looking north up into Crawford Notch:

[Photo: Looking Across Crawford Notch, Mt Tom]

[Photo: That’s Mt Webster Up Ahead]

 

Here are two views of the ridge, taken over a year later, from across the way on Mt Willey:

[Photo: Mt Webster is on the left.  I ascended steeply up the right side, then along the ridge]

[Photo: The Ridge]

 

I ran into some snow remnants along the path as I approached Mt Webster!  Just proves, once again, that you have to be prepared for snow – even in June!

I was prepared… but the snow patches were not an issue this time:

[Photo]

The walking was good, though it was windy, and clouds were building from the west.  So far, I had not seen any other hikers…

I arrived at Mt Webster (3910ft, 1192m – not a 4000-footer) at 1:30pm.  The plan was to rejoin the trail here on my way back, via the Webster-Jackson Trail.


To the west, I could look across Crawford Notch and see Mt Tom and Mt Field and Mt Willey.  The views north towards the Presidential Range were great, though Mt Washington was in the clouds.  There were patches of blue sky above me, but darker skies to the west.

 

Just before reaching Mt Webster, I passed a thru-hiker: he was hiking north, doing the entire Appalachian Trail.  Impressive that he was this far north in early June.  Maybe in his 60s, with a grey beard.  He asked me what my “trail handle” was, assuming (I guess) that I was also a thru-hiker.  I just laughed and said: “well, my name is Mark”!

“These are some heavy hills,” I said.

“Hills?!” he exclaimed.

So I guess he was feeling the ascent, as I was.  But, having just restocked his pack with food, he was carrying much more weight than I was…

Just past Mt Webster, I began the Webster-Jackson loop that I planned to take: first continuing on to Mt Jackson, then down and around to return to Mt Webster:

[Photos]

Here I encountered the second hiker of the day.  Dan was hiking with the guy I had met earlier, and was waiting here for him.  Dan had joined him a week earlier, for part of the thru-hike.  He seemed tired and ready to get off the trail, asking me the fastest way to the road.  It seemed he had had enough, describing lots of rain and snow and ice over the preceding days.

I told him how I had run into so much ice over that way, on Mt Tom and Mt Field the year before, and how I had fallen in May on Mt Liberty.

I left Dan there, and arrived at the top of Mt Jackson at about 1:45pm, where I ate lunch – a tried-and-true “Veggie Delite” sandwich from Subway.  It began to sprinkle, light rain falling.

Here the views of the Presidential Range were great, though Mt Washington was still in the clouds.  Mizpah Spring Hut can just be seen, a speck of light in the middle left of the photo:

[Photo]

The Mt Washington Hotel, in Bretton Woods, can be seen here in the distance with distinctive red roofs, looking north through Crawford Notch:

[Photo]

From the top of Mt Jackson, the Webster Cliff Trail continues on towards Mt Pierce (which I had climbed with Raúl and Kelly earlier in the year) and the rest of the Presidential Range.  I turned left here, taking the Webster-Jackson Trail, hoping to loop back up to Mt Webster.  My hunch was that Dan was going to wait for his friend, and then follow me down, since that would be the quickest way to “civilization” and he was ready for a shower!

I began to drop steadily down Webster-Jackson, a typical White-Mountains hike, rock-hopping.  But I was a bit surprised, and became increasingly concerned, at the amount of elevation I was losing, as I went down, and down, and down… I knew I’d have to make up this elevation drop, every step of it!

 

I passed five people coming up – two young men running the trail, a mother and daughter (probably going up to stay at the Mizpah Hut), and one guy huffing and puffing.

I arrived at the bottom of the loop just before 3pm, exhausted and now regretting having taken this detour, cursing every step down – each one of which I would soon have to make up: from here, it would be a long way back up to Mt Webster, and it was beginning to rain steadily.

At the bottom of the Webster-Jackson loop, there is a beautiful waterfall, and the temperature was much lower than it had been at the top of the ridge:

[Photo]

It was a VERY LONG slog back up to the top of Mt Webster, where I arrived again at 3:45pm, very tired and very wet.  It had become much colder here since I had passed through earlier in the day, now windy and steadily raining.

Here I would walk back along the ridge.  And I began to feel quite nervous about the possibility of slipping – from here it would be all downhill, and a fall on the now-slippery rocks could be trouble!

I didn’t really stop at the top of Mt Webster – too cold and rainy.  Conditions had changed a lot since I’d passed this peak that morning!


Although it was raining steadily, some blue sky did roll by once in a while:

[Photo]

From here I began the descent back to Rt 302, and soon the trees began to grow in size, and cover me.  I never slipped on the wet granite stones, though I came close a couple of times.  I had to take it very slowly, taking care as I went across every one of the many rocks…  But I got soaked through – for the first time in 24 climbs!

[Photo: Soaking Wet, But Happy]

 

I was back at my car at about 6:15pm; it was raining hard and 49 degrees.


The Mt Jackson climb was great, despite the unwelcome rain and cold.  It was longer and harder than expected – nothing technical or super-steep, just long, due mostly to my decision to do the loop down from the summit and back up, and because I had to take care on the slick rocks coming down.

*

Once CCF’s management had endorsed my recommendations for their new program approach, Daniel and I began the design process.  Along the way, CCF’s President John Schulz had baptized the new approach as “Bright Futures,”  which was very smart: branding the change with an inspirational, catchy name that also captured the essence of what we were proposing would help open people to the idea.

[Photo: Daniel Wordsworth, 2003]

Here I will be quoting extensively from a document that Daniel and I worked on, but which was primarily his.  He boiled down the essence of Bright Futures into three fundamental objectives.  Bright Futures would:

  1. Broaden, deepen and bring about longer-lasting impact in children’s lives;
  2. Fortify sponsorship;
  3. Strengthen accountability.

Bright Futures would be based on the belief that people must be given the space to design and shape the programs that will be carried out in their communities and countries.  The fundamental principle that guided our thinking was that there was no universal strategy that CCF could apply across the complex and different contexts in which it worked.  Therefore, the emphasis was not on a framework that outlined what should be done – e.g. health, education, etc – but rather on a set of key processes that would set the tone of the agency’s work and provide coherence to its programming around the world.

There were five key work processes, qualities of work, that would characterize CCF’s Bright Futures programming.  Each of these was firmly linked to the transformational themes that my own research had identified, but Daniel managed to put things in clear and incisive terms, displaying the brilliant insights I had come to admire:

[Diagram: the five key work processes]

Grounded and Connected: Bright Futures programs would be integrated into the surrounding social environment, contributing to and drawing from the assets and opportunities that this environment provides.

To accomplish this, programs would be based in well-defined, homogeneous “Areas”, matching the level of government service provision – often the “district” level.  Program planning would be based at the community level, and program implementation would be accountable to local communities, but programs would be integrated with relevant efforts of the government and other development agencies, at local and national levels. CCF staff would be decentralized, close to communities, to ensure on-the-spot follow-up, using participatory methods and strict project management discipline to ensure effective program implementation.  By partnering with other organizations, building the capacity of local people, and seizing opportunities to replicate program methods wherever possible, impact would be expanded into other communities within the Area and beyond.

These would be big changes for CCF, on many dimensions.  Current programming was exclusively at village or community level, but it was disconnected from efforts to overcome poverty that were taking place at other levels.  Staff visited programs rarely, typically only once per year.  And notions of replication or even sustainability were rarely addressed.  Making these changes a reality would be challenging.

Achieve Long-Term Change: Bright Futures programs would be grounded in an understanding of poverty and of the causes of poverty, and designed to make a long-lasting difference in the lives of poor children.

To accomplish this, program design would begin with immersion in communities and a thorough analysis of the deeper issues of poverty confronting children and communities.  Program interventions would then take place where the causes of child poverty were found, whether at child, family, community, or area (district) levels. Programs would be designed and implemented according to a series of three-year strategic plans, and would consist of a comprehensive set of integrated “Project Activities” that had specific objectives, implementation plans and budgets.  Financial flow would follow budget and implementation.
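
To make that planning hierarchy concrete: an Area’s three-year strategic plan would contain programs, each consisting of integrated Project Activities with their own objectives and budgets, with financial flow following implementation.  Here is a purely illustrative sketch, with invented names and figures:

```python
# Purely illustrative sketch: the Bright Futures planning hierarchy as a
# nested data structure.  All names, years, and figures are invented.

from dataclasses import dataclass, field

@dataclass
class ProjectActivity:
    objective: str
    budget: float
    implemented_pct: float = 0.0  # financial flow follows implementation

@dataclass
class Program:
    cause_addressed: str          # interventions placed where the cause is
    activities: list = field(default_factory=list)

@dataclass
class AreaStrategicPlan:
    area: str
    years: tuple = (2004, 2005, 2006)  # a three-year strategic plan
    programs: list = field(default_factory=list)

plan = AreaStrategicPlan(
    area="Example District",
    programs=[Program(
        cause_addressed="poor access to safe water",
        activities=[ProjectActivity("rehabilitate 10 wells", 20000.0)],
    )],
)

total = sum(a.budget for p in plan.programs for a in p.activities)
print(f"{plan.area}: budget {total:.0f} across {len(plan.programs)} program(s)")
```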

As we began to design Bright Futures, CCF’s programming was guided by an agency-wide set of outcomes that had been articulated some years before, called “AIMES.”  These “outcomes” were really more of a set of indicators, most of which were tightly focused on basic needs such as immunization, primary-school completion, etc.  Communities seemed to view these indicators as a menu, from which they selected each year.  And, as I mentioned above, interventions were exclusively at village or community level.

With the advent of Bright Futures, the findings of the CCF Poverty Study, and of my own research, we would fundamentally change these practices.  From now on, there would be no “menu” to draw from; rather, CCF would help local organizations to grapple with the causes of child poverty, viewing that poverty in a broader way, and consulting deeply with local people and children; staff would then create an “Area Strategic Plan” (“ASP”) that outlined how programming would address these causes across the “Area.”

(Details of how the ASP would be designed will be included in my next posting, stay tuned!)

Build People: Bright Futures programs seek to build a stronger society with the ability to cooperate for the good of children and families.

To accomplish this, programs would build Federations and Associations of poor children, youth and adults that represent the interests of excluded and deprived people.  These entities would manage program implementation (mostly) through and with partners. Programs would be implemented through local bodies such as district government, NGOs, or community-based organizations, building the capacity of these groups to effectively implement solutions to issues facing poor children.  A long-term, planned approach to capacity building would be adopted, that reinforced and strengthened local competencies and organizations so that communities could continue their efforts to build bright futures for their children long after CCF had phased out of their communities.  This approach would include clearly articulated and time-bound entry and exit conditions, and specific milestones to gauge progress towards exit.

This was another big and challenging change.  CCF would continue to work with parents’ associations at community level, as it had been doing, because this was a real strength of the agency.  However, these associations tended to lack capacity, were left to fend for themselves, and did not interact with other stakeholders and “duty-bearers” around them.

All of this would change with Bright Futures.  Parents’ associations would now be “federated” to district level, and the Parents’ Federations would be the primary bodies that CCF worked with and for.  These Federations, being located at the “district” level, would interact with local government service providers (“duty bearers”), serving as interest groups on behalf of poor and excluded people.  And the Parents’ Federations would, normally, not be seen as program implementors.  Rather, they would – at least in the first instance – locate local partners that could implement the kinds of projects that were identified in the ASP.

Here we had a challenge, as we moved the existing Parents’ Associations into very different roles, where they no longer controlled funds as they had previously.  There were many vested interests involved here, and we anticipated opposition from people who had learned to extract benefits informally, especially given that in the previous model CCF’s staff had been very hands-off and remote from program implementation.  And the very idea of “federating” and influencing local duty-bearers was completely new to CCF.

Show Impact: Bright Futures programs demonstrate the impact of our work in ways that matter to us and the children and communities we work with.

To accomplish this, using CCF’s poverty framework of Deprivation, Exclusion, and Vulnerability, the National Office would clearly articulate the organization’s niche, and demonstrate its particular contribution.   The outputs of each project would be rigorously monitored to ensure effective implementation, and programs would likewise be carefully monitored to ensure relevance to enrolled children.

Before Bright Futures, CCF’s National Offices had very little influence on programming.  If a local Parents’ Association was not breaking any rules, then funding went directly from CCF’s headquarters in Richmond, Virginia to the Association, without intervention from the National Office.  Only when a serious, usually finance- or audit-related, issue was identified could the National Office intervene, and then they could only halt fund transmissions and await remedial action from Richmond.

Now, the National Office and local Area team would be monitoring project implementation on a regular basis, using techniques that ensured that the voices of local children were central to the process of monitoring and evaluation.  We would have to develop tools for this.

Recognize Each Child’s Gift: Bright Futures programs recognize and value each particular child as a unique and precious individual.

To accomplish this, programs would be designed to facilitate the development of each child in holistic ways, taking into account the different phases of development through which each child passes.  The voices of children would be heard and would shape the direction of programs.  CCF would promote children and youth as leaders in their own development, and in the development of their communities and societies.  This would now be central to program implementation.

While the local Parents’ Associations would be retained, and federated to district level, two new forms of Association and Federation would be introduced: of children, and of youth.  These new Associations and Federations would be given prominent roles in program design and project implementation, as appropriate to their age.

*

These were all big, fundamentally-disruptive changes, involving seismic shifts in every aspect of CCF’s program work.  I felt that we had incorporated much of the learning and reflection that I had done, beginning in my Peace Corps days and all the way through my 15 years with Plan – this was the best way to make a real, lasting difference!

Once Daniel and Michelle were happy with the way that we were articulating Bright Futures, our next step was to get senior-management and board approval.

I was very pleased that, in the end, CCF’s leaders were very supportive of what Daniel was proposing.  But, in a note of caution given the magnitude of the changes we were proposing, we were asked to pilot test the approach before rolling it out.

This cautious approach made sense to me, and I was delighted that Daniel asked me to continue as an outside consultant, to oversee and support the pilot National Offices, documenting their experience and our learning as the Bright Futures approach was tested.

*

We then began to consider where we should pilot test.  First, we asked for volunteers across CCF’s National Offices and then, after creating a short list of viable options, we reviewed the status of each of the National Offices remaining on the list.  We quickly came to the conclusion that we would select one National Office in each of the continents where the majority of CCF’s work took place:

  • In the Americas, we chose Ecuador.  The office there was well-run, stable, and was regarded as a model in many ways.  The National Director (Carlos Montúfar) was a strong leader, and he and his team were enthusiastic about being Bright Futures “pilots”;

    [Photo: Carlos Montúfar]

  • In Africa, we chose Uganda.  Here things were a bit different than in Ecuador: the Uganda office was considered by many in CCF as needing a bit of a shakeup.  James Ameda was a senior National Director and was supportive of the pilot, but there were some tensions in his team, and performance across CCF/Uganda was weak in some areas;

    [Photo: James Ameda]

  • For Asia, we chose the Philippines.  The office in Manila was well-run, with high morale and strong leadership in the form of Nini Hamili, a charismatic and long-tenured National Director.  Nini was a very strong leader, who also served as a mediator in violent Mindanao – I came to see how courageous Nini was…

    [Photo: Nini Hamili]

*

Soon I would begin regularly to visit the three pilot offices, training them on the methods and systems that were being developed for Bright Futures, accompanying them as they learned and adapted, documenting our experience.

It was a great privilege working with Carlos, James, and Nini and their teams – they had taken on a huge challenge: not only did Bright Futures represent a set of fundamental shifts in what they were accustomed to doing, but they were asked to continue to manage their programs the old way in the areas of their country where Bright Futures wasn’t being introduced.

And it was equally impressive working with Daniel and Michelle at CCF’s Richmond headquarters, along with staff like Victoria Adams, Mike Raikovitz, and many others, and fellow consultants Jon Kurtz and Andrew Couldridge.

Next time, I will go into much more detail on the pilot testing of Bright Futures, including how we designed and implemented perhaps the most fundamental program-related system, Area Strategic Planning.

*
