About Mark McPeak

Nonprofit leader: transformational change, organizational effectiveness, conflict management.

West Bond (37) – Impact Assessment in ChildFund Australia’s Development Effectiveness Framework

June, 2018

International NGOs do their best to demonstrate the impact of their work, to be accountable, and to learn and improve.  But it is very challenging to measure change in social-justice work, and even harder to prove attribution – at least in affordable and participatory ways…

Twice in my earlier career, at Plan International, I had worked to develop and implement systems that would demonstrate impact – and both times we had failed.

In this article I want to describe how, in ChildFund Australia, we succeeded, and were able to build and implement a robust and participatory system for measuring and attributing impact in our work.

Call it the Holy Grail!

*

I’ve been writing a series of blog posts about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall.  And, each time, I’ve also been reflecting a bit on the journey since I joined Peace Corps, 33 years ago: on development, social justice, conflict, experiences along the way, etc.

So far, I’ve described climbing 36 of the 48 peaks, and covered my journey from Peace Corps in Ecuador (1984-86) through to my arrival in Sydney in 2009, where I joined ChildFund Australia as the first “International Program Director.”  This is my 37th post in the series.

In recent posts in this series I’ve been describing aspects of the ChildFund Australia “Development Effectiveness Framework” (“DEF”), the system that would help us make sure we were doing what we said we were going to do and, crucially, verify that we were making a difference in the lives of children and young people living in poverty.  So we could learn and improve our work…

There are three particular components of the overall DEF that I am detailing in more depth, because I think they were especially interesting and innovative.  In my previous blog I described how we used Case Studies to complement the more quantitative aspects of the system.  These Case Studies were qualitative narratives of the lived experience of people affected by change related to ChildFund’s work, which we used to gain human insights, and to reconnect ourselves to the passions that brought us to the social-justice sector in the first place.

This time, I want to go into more depth on two final, interrelated components of the ChildFund Australia DEF: Outcome Indicator Surveys and Statements of Impact.  Together, these two components of the DEF enabled us to understand the impact that ChildFund Australia was making, consistent with our Theory of Change and organizational vision and mission.  Important stuff!

But first…

*

Last time I described climbing to the top of Mt Bond on 10 August 2017, after having gotten to the top of Bondcliff.  After Mt Bond, I continued on to West Bond (4540ft, 1384m), the last of three 4000-footers I would scale that day.  (But, since this was an up-and-back trip, I would climb Mt Bond and Bondcliff twice!  It would be a very long day.)

As I described last time, I had left the trail-head at Lincoln Woods Visitor Center just after 6:30am – an early start enabled by staying the night before at Hancock Campsite on the Kancamagus road, just outside of Lincoln, New Hampshire.  I had reached the top of Bondcliff at about 10:30am, and the summit of Mt Bond at about 11:30am.

Now I would continue to the top of West Bond, and then retrace my steps to Lincoln Woods:

Bond Map - 6c.jpeg

 

So, picking up the story from the top of Mt Bond, the Bondcliff Trail drops down fairly quickly, entering high-altitude forest, mostly pine and ferns.

IMG_1952.jpg

 

After 20 minutes I reached the junction with the spur trail that would take me to the top of West Bond.  I took a left turn here.  The spur trail continues through forest for some distance:

IMG_1955.jpg

IMG_1958.jpg

 

I reached the top of West Bond at 12:30pm.  The views were remarkable, and I was fortunate to have the summit to myself, so I took my time over lunch.

IMG_1965 (1).jpg

Bondcliff From West Bond

IMG_1972.jpg

At The Summit Of West Bond.  Franconia Ridge And Mt Garfield In The Background.  A Bit Tired!

IMG_1984.jpg

Mt Bond, On The Left, And Bondcliff On The Right

 

Here are two spectacular videos from the top of West Bond.  The first simply shows Bondcliff, with the southern White Mountains in the background:

 

And this second video is more of a full panorama, looking across to Owl’s Head, Franconia Ridge, Garfield, the Twins, Zealand, and back:

 

Isn’t that spectacular?!

After eating lunch at the top of West Bond, I left at a bit before 1pm, and began to retrace my steps towards Lincoln Woods.  To get there, I had to re-climb Mt Bond and Bondcliff.

I reached the top of Mt Bond, for the second time, at 1:20pm.  The view down towards Bondcliff was great!:

IMG_1996.jpg

Bondcliff From The Top Of Mt Bond, Now Descending…

 

Here is a view from near the saddle between Mt Bond and Bondcliff, looking up at the latter:

IMG_2005.jpg

Looking Up At Bondcliff

 

As I passed over Bondcliff, at 2:15pm, I was slowing down, and my feet were starting to be quite sore.  I was beginning to dread the descent down Bondcliff, Wilderness, and Lincoln Woods Trails… it would be a long slog.

Here’s a view from there back up towards Mt Bond:

IMG_2007.jpg

A Glorious White Mountain Day – Mt Bond And West Bond, From Bondcliff

 

But there were still 8 or 9 miles to go!  And since I had declined the kind offer I had received to ferry my car up to Zealand trail-head, which would have saved me 3 miles, I had no other option but to walk back to Lincoln Woods.

It was nearly 5pm by the time I reached the junction with Twinway and the Lincoln Woods Trail.  By that time, I was truly exhausted, and my feet were in great pain, but (as I said) I had no option but to continue to the car: no tent or sleeping bag, no phone service here.

The Lincoln Woods Trail, as I’ve described in more detail elsewhere, is long and flat and wide, following the remnants of an old forest railway:

IMG_2024

IMG_2025

Sleepers From The Old Forestry Railway

 

Scratches from walking poles?

IMG_2026 (1).jpg

 

It was around 5:30 when I got to the intersection with Franconia Notch Trail, which is the path up Owl’s Head.

IMG_2028.jpg

IMG_2034.jpg

 

It was a very long slog down Lincoln Woods Trail – put one foot in front of the other, and repeat!  And repeat and repeat and repeat and repeat …

Finally, at 6:40pm, I reached the Lincoln Woods Visitor Center, where I had parked my car at 6:30am that morning – having climbed three 4000-footers, walked 22 miles, and injured my feet, all in just over 12 hours.

Looking back, I had accomplished a great deal, and the views from the top of three of New Hampshire’s highest and most beautiful peaks were amazing.  But, at the time, I had little feeling of accomplishment!

IMG_2038 (1).jpg

Knackered!

 

*

Here is the diagram I’ve been using to describe the ChildFund Australia DEF:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework

 

In this article I want to describe two components of the DEF: #2, the Outcome Indicator Surveys; and #12, how we produced “Statements of Impact.”  Together, these two components enabled us to measure the impact of our work.

First, some terminology: as presented in an earlier blog article in this series, we had adopted fairly standard definitions of some related terms, consistent with the logical framework approach used in most mature INGOs:

Screen Shot 2018-05-28 at 2.16.30 PM

 

According to this way of defining things:

  • A Project is a set of Inputs (time, money, technology) producing a consistent set of Outputs (countable things delivered in a community);
  • A Program is a set of Projects producing a consistent set of Outcomes (measurable changes in human conditions related to the organization’s Theory of Change);
  • Impact is a set of Programs producing a consistent set of changes to Outcome Indicators as set forth in the organization’s Strategic Plan.

But that definition of “Impact,” though clear and correct, wasn’t nuanced enough for us to design a system to measure it.  More specifically, before figuring out how to measure “Impact,” we needed to grapple with two fundamental questions:

  • How “scientific” did we want to be in measuring impact?  In other words, were we going to build the infrastructure needed to run randomized control group trials, or would we simply measure change in our Outcome Indicators?  Or somewhere in between?;
  • How would we gather data about change in the communities where we worked?  A census, surveying everybody in a community, which would be relatively costly?  If not, what method for sampling would we use that would enable us to claim that our results were accurate (enough)?

*

The question “how ‘scientific’ did we want to be” when we assessed our impact was a fascinating one, getting right to the heart of the purpose of the DEF.  The “gold standard” at that time, in technical INGOs and academic institutions, was to devise “randomized control group” trials, in which you would: implement your intervention in some places, with some populations; identify ahead of time a comparable population that would serve as a “control group” where you would not implement that intervention; and then compare the two groups after the intervention had concluded.

For ChildFund Australia, we needed to decide whether we would invest in the capability to run randomized control group trials.  It seemed complex and expensive but, on the other hand, it would have the virtue of putting us at the forefront of the sector and, therefore, of appealing to technical donors.

When we looked at other comparable INGOs, in Australia and beyond, there were a couple that had gone that direction.  When I spoke with my peers in some of those organizations, they were generally quite cautious about the randomized control trial (“RCT”) approach: though appealing in principle, in practice it was complex, requiring sophisticated technical staff to design and oversee the measurements, and to interpret results.  So RCTs were very expensive.  Because of the cost, people with practical experience in the matter recommended using RCTs, if at all, only for particular interventions that were either expensive or were of special interest for other reasons.

For ChildFund Australia, this didn’t seem suitable, mainly because we were designing a comprehensive system that we hoped would allow us to improve the effectiveness of our development practice while also involving our local partners, authorities, and people in the communities where we worked.  Incorporating RCTs into such a comprehensive system would be very expensive, and would not involve local people in any meaningful way.

The other option we considered, and ultimately adopted, hinged upon an operational definition of “Impact.”  Building on the general definition shown above (“Impact is a set of Programs producing a consistent set of changes to Outcome Indicators as set forth in the organization’s Strategic Plan”), operationally we decided that:

Screen Shot 2018-06-18 at 3.06.57 PM.png

 

In other words, we felt that ChildFund could claim that we had made a significant impact in the lives of children in a particular area if, and only if:

  1. There had been a significant, measured, positive change in a ChildFund Australia Outcome Indicator; and
  2. Local people (community members, organizations, and government staff) determined in a rigorous manner that ChildFund had contributed to a significant degree to that positive change.

In other words:

  • If there was no positive change in a ChildFund Australia Outcome Indicator over three years (see below for a discussion of why we chose three years), we would not be able to claim impact;
  • If there was a positive change in a ChildFund Australia Outcome Indicator over three years, and local people determined that we had contributed to that positive change, we would be able to claim impact.

(Of course, sometimes there might be a negative change in a ChildFund Australia Outcome Indicator – one that would have been worse if we hadn’t been working in the community.  We were able to handle that situation in practice, in community workshops.)

I felt that, if we approached measuring impact in this way, it would be “good enough” for us – perhaps not as academically robust as using RCT methods but, if we did it right, certainly good enough for us to work with local people to make informed decisions, together, about improving the effectiveness of our work, and to make public claims about the impact of our work.
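The operational rule – change plus attribution – can be sketched in a few lines of code.  This is purely illustrative: the function name, the positive-change test, and the encoding of the attribution scale are my assumptions; in practice both judgments were made by local people in workshops, not by a formula.

```python
# Illustrative sketch of the operational impact rule (hypothetical names).
# The five-point attribution scale is the one used in the Statement of
# Impact workshops described later in this post.
ATTRIBUTION_SCALE = ["none", "a little", "some", "a lot", "completely"]

def can_claim_impact(indicator_change, attribution):
    """Claim impact only if (1) an Outcome Indicator improved over the
    three-year window AND (2) local people attributed at least some of
    that change to ChildFund."""
    positive_change = indicator_change > 0  # "significant" was judged in workshops
    meaningful_attribution = (
        ATTRIBUTION_SCALE.index(attribution) >= ATTRIBUTION_SCALE.index("some")
    )
    return positive_change and meaningful_attribution
```

Either condition failing – no measured improvement, or local people attributing the improvement to others – meant no claim of impact.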

So that’s what we did!

*

As a reminder, soon after I had arrived in Sydney we had agreed a “Theory of Change” which enabled us to design a set of organization-wide Outcome Indicators.  These indicators, designed to measure the status of children related to our Theory of Change, were described in a previous article, and are listed here:

Screen Shot 2018-05-28 at 3.16.59 PMScreen Shot 2018-05-28 at 3.17.10 PM

 

These Outcome Indicators had been designed technically, and were therefore robust.  And they had been derived from the ChildFund Australia Vision, Mission, and Program Approach, so they measured changes that would be organically related to the claims we were making in the world.

So we needed to set up a system to measure these Outcome Indicators; this would become component #2 in the DEF (see Figure 1, above).  And we had to design a way for local partners, authorities, and (most importantly) people from the communities where we worked to assess changes to these Outcome Indicators and reach informed conclusions about who was responsible for causing the changes.

First, let me outline how we measured the ChildFund Australia Outcome Indicators.

*

Outcome Indicator Surveys (Component #2 in Figure 1, Above)

Because impact comes rather slowly, an initial baseline survey was carried out in each location and then, three years later, the measurement was repeated.  A three-year gap was somewhat arbitrary: one year was too short, and five years seemed a bit long.  So we settled on three years!

Even though we had decided not to attempt to measure impact using complex randomized control trials, these survey exercises were still quite complicated, and we wanted the measurements to be reliable.  This was why we ended up hiring a “Development Effectiveness and Learning Manager” in each Country Office – to support the overall implementation of the DEF and, in particular, to manage the Outcome Indicator Surveys.  And these surveys were expensive and tricky to carry out, so we usually hired students from local universities to do the actual surveying.

Then we needed to decide what kind of survey to carry out.  Given the number of people in the communities where we worked, we quickly determined that a “census,” that is, interviewing everybody, was not feasible.

So I contacted a colleague at the US Member of the ChildFund Alliance, who was an expert in this kind of statistical methodology.  She strongly advised me to use the survey method that they (the US ChildFund) were using, called “Lot Quality Assurance Sampling.”  LQAS seemed to be less expensive than other survey methodologies, and it was highly recommended by our expert colleague.
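The appeal of LQAS is that it classifies each “lot” (a supervision area) with a small sample and a pre-computed decision rule, rather than estimating coverage precisely.  The sketch below, with hypothetical function names and thresholds of my own choosing, shows how the two misclassification risks behave for the commonly cited configuration of 19 respondents per lot; these are not ChildFund’s actual parameters.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_errors(n, d, p_target, p_low):
    """For the decision rule 'classify the lot as adequate if at least d
    of n respondents meet the indicator', return (alpha, beta):
      alpha = P(fewer than d successes | true coverage = p_target),
              i.e. the risk of failing a lot that actually meets target;
      beta  = P(at least d successes | true coverage = p_low),
              i.e. the risk of passing a genuinely low-coverage lot."""
    alpha = binom_cdf(d - 1, n, p_target)
    beta = 1 - binom_cdf(d - 1, n, p_low)
    return alpha, beta
```

For example, with 19 respondents, a decision rule of 13, an 80% coverage target, and a 50% lower threshold, both misclassification risks come out under 10% – which is why such small samples can still support defensible classifications.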

(In many cases during this period, we relied on technical recommendations from ChildFund US.  They were much bigger than the Australian Member, with excellent technical staff, so this seemed logical and smart.  But, as with Plan International during my time there, the US ChildFund Member had very high staff turnover, which led to many changes in approach.  In practice this meant that, although ChildFund Australia had adopted several of the Outcome Indicators that ChildFund US was using, in the interests of commonality – and, as I said, had begun to use LQAS for the same reason – the US Member was soon changing its Indicators and abandoning LQAS, because new staff felt it wasn’t the right approach.  This led to the US Member expressing some disagreement with how we in Australia were measuring impact – even though we were following their (previous) recommendations!  Sigh.)

Our next step was to carry out baseline LQAS surveys in each field location.  It took time to accomplish this, as even the relatively simple LQAS was a more complex exercise than we were used to.  Surveys were supervised by the DEL Managers and carried out, usually, by students from local universities.  Finally, the DEL Managers prepared baseline reports summarizing the status of each of the ChildFund Australia Outcome Indicators.

Then we waited three years and repeated the same survey in each location.

(In an earlier article I described how Plan International, where I had worked for 15 years, had failed twice to implement a DEF-like system, at great expense.  One of the several mistakes that Plan had made was that they never held their system constant enough to be comparable over time.  In other words, in the intervening years after measuring a baseline, they tinkered with [“improved”] the system so much that the second measurement couldn’t be compared to the first one!  So it was all for naught, useless.  I was determined to avoid this mistake, so I was very reluctant to change our Outcome Indicators after they were set, in 2010; we did add a few Indicators as we deepened our understanding of our Theory of Change, but that didn’t get in the way of re-surveying the Indicators that we had started with, which didn’t change.)

Once the second LQAS survey was done, three years after the baseline, the DEL Manager would analyze differences and prepare a report, along with a translation of the report that could be shared with local communities, partners, and government staff.  The DEL Manager, at this point, did not attempt to attribute changes to any particular development actor (local government, other NGOs, the community themselves, ChildFund, etc.), but did share the results with the communities for validation.

Instead, attribution was determined through the final DEF component I want to describe.

*

Statements of Impact (Component #12 in Figure 1, Above)

The most exciting part of this process was how we used the changes measured over three years in the Outcome Indicators to assess Impact (defined, as described above, as change plus attribution).

The heart of this process was a several-day-long workshop at which local people would review and discuss changes in the Outcome Indicators, and attribute the changes to different actors in the area.  In other words, if a particular indicator (say, the percentage of boys and girls between 12 and 16 years of age who had completed primary school) had changed significantly, people at the workshop would discuss why the change had occurred – had the local education department done something to cause the change?  Had ChildFund had an impact?  Other NGOs?  The local community members themselves?

Finally, people in the workshop would decide the level of ChildFund’s contribution to the change (“attribution”) on a five-point scale: none, a little, some, a lot, or completely.   This assessment, made by local people in an informed and considered way, would then serve as the basic content for a “Statement of Impact” that would be finalized by the DEL Manager together with his or her senior colleagues in-country, Sydney-based IPT staff and, finally, me.

*

We carried out the very first of these “Impact” workshops in Svay Rieng, Cambodia, in February 2014.  Because this was the first of these important workshops, DEL Managers from Laos and Viet Nam attended, to learn, along with three of us from Sydney.

Here are some images of the ChildFund team as we gathered and prepared for the workshop in Svay Rieng:

IMG_2151

IMG_2169

IMG_2202

 

Here are images of the workshop.  First, I’m opening the session:

IMG_8605

 

Lots of group discussion:

IMG_8758

 

The DEL Manager in Cambodia, Chan Solin, prepared a summary booklet for each participant in the workshop.  These booklets were a challenge to prepare, because they would be used by local government, partners, and community members; but Solin did an outstanding job.  (He also prepared the overall workshop, with Richard Geeves, and managed proceedings very capably.)  The booklet presented the results of the re-survey of the Outcome Indicators as compared with the baseline:

IMG_8817

IMG_8795

 

Here participants are discussing results, and attribution to different organizations that had worked in Svay Rieng District over the three years:

IMG_9612

 

Subgroups would then present their discussions and recommendations for attribution.  Note the headphones – since this was our first Impact Workshop, and ChildFund staff were attending from Laos, Viet Nam, and Australia in addition to Cambodia, we provided simultaneous translation:

IMG_9694

 

Here changes in several Outcome Indicators over the three years (in blue and red) can be seen.  The speaker is describing subgroup deliberations on attribution of impact to the plenary group:

IMG_9703

IMG_9719

IMG_9699

IMG_9701

IMG_9747

IMG_9728

IMG_9763

 

Finally, a vote was taken to agree the attribution of positive changes to Outcome Indicators.  Participants voted according to their sense of ChildFund’s contribution to the change: none, a little, some, a lot, or completely.  Here is a ballot and a tabulation sheet:

IMG_9790
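The tabulation step can be sketched as a simple tally.  The vote counts and the summary rule here (modal rating, with ties broken toward the lower attribution) are my illustration; the post doesn’t specify exactly how ties or split votes were resolved in the workshops.

```python
from collections import Counter

# The five-point attribution scale used on the ballots, low to high.
SCALE = ["none", "a little", "some", "a lot", "completely"]

def summarize_ballots(votes):
    """Tally ballot choices for one Outcome Indicator.  Returns the modal
    rating (breaking ties toward the lower, more conservative attribution)
    together with the raw counts."""
    counts = Counter(votes)
    modal = max(SCALE, key=lambda label: (counts[label], -SCALE.index(label)))
    return modal, counts
```

A conservative tie-break like this keeps the claimed attribution from overstating ChildFund’s contribution when the room is split.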

 

Finally, here is an image of the participants in that first Statement of Impact Workshop: Local Community Members, Government Staff, ChildFund Staff (From The Local Area, Country Office, Sydney, and From Neighboring Viet Nam):

IMG_2299

 

*

Once the community workshops were finished, our local Senior Management would review the findings and propose adjustments to our work.  Then the DEL Managers would prepare a final report, which we described as “Statements of Impact.”

Generally speaking, these reports would include:

  • An introduction from the Country Director;
  • A description of the location where the Statement of Impact was produced, and a summary of work that ChildFund had done there;
  • An outline of how the report was produced, noting the three-year gap between baseline and repeat survey;
  • Findings agreed by the community regarding changes to each Outcome Indicator along with any attribution of positive change to ChildFund Australia;
  • Concluding comments and a plan of action for improvement, agreed by the local Country Office team and myself.

Examples of these reports are shared below.

*

This process took some time to get going, because of the three-year delay to allow for re-surveying, but once it commenced it was very exciting.  Seeing the “Statement of Impact” reports come through to Sydney, in draft, from different program countries, was incredible.  They showed, conclusively, that ChildFund was really making a difference in the lives of children, in ways that were consistent with our Theory of Change.

Importantly, they were credible, at least to me, because they showed some areas where we were not making a difference, either because we had chosen not to work in a particular domain (to focus on higher priorities) or because we needed to improve our work.

*

I’m able to share four ChildFund Australia Statements of Impact, downloaded recently from the organization’s website.  These were produced as described in this blog article:

*

Here are a few of the findings from that first “Statement of Impact” in Svay Chrum:

  • ChildFund made a major contribution to the increase in primary-school completion in the district:

Screen Shot 2018-06-27 at 8.49.40 AM.png

 

  • Although the understanding of diarrhea management had improved dramatically, it was concluded that ChildFund had not contributed to this, because we hadn’t implemented any related projects.  “Many development actors contributed to the change”:

Screen Shot 2018-06-27 at 8.52.47 AM.png

 

  • ChildFund had a major responsibility for the improvement in access to hygienic toilets in the district:

Screen Shot 2018-06-27 at 8.49.54 AM.png

 

  • ChildFund made a significant contribution to the increase in access to improved, affordable water in the district:

Screen Shot 2018-06-27 at 8.54.41 AM.png

 

  • ChildFund had made a major contribution to large increases in the percentage of children and youth who reported having opportunities to voice their opinions:

Screen Shot 2018-06-27 at 8.56.08 AM.png

  • Although the percentage of women of child-bearing age in the district who were knowledgeable about how to prevent HIV infection had improved, it was determined that ChildFund had made only a minor contribution to this improvement.  The group also made recommendations regarding youth knowledge, which had actually declined:

Screen Shot 2018-06-27 at 8.57.47 AM.png

 

To me, this is fantastic stuff, especially given that the results emerged from deep and informed consultations with the community, local partners, and local authorities.  Really, this was the Holy Grail – accountability, and lots of opportunity for learning.  The results were credible to me, because they seemed to reflect the reality of what ChildFund had worked on, and pointed out areas where we needed to improve; the report wasn’t all positive!

*

For me, the way that the Outcome Indicator Surveys and Statements of Impact worked was a big step forward, and a major accomplishment.  ChildFund Australia now had a robust and participatory way of assessing impact so that we could take steps to confidently improve our work.  With these last two components of the DEF coming online, we had managed to put in place a comprehensive development-effectiveness system, the kind of system that we had not been able to implement in Plan.

As I shared the DEF – its design, and the documents and reports it produced – with our teams, partners, the Australian government, and donors, I began to get lots of positive feedback.   At least for its time, in Australia, the ChildFund Australia DEF was the most comprehensive, robust, participatory, and useful system of its kind that anybody had seen.  Not the most scientific, perhaps, but something much better: usable, useful, and empowering.

*

My congratulations and thanks to the people who played central roles in creating, implementing, and supporting the DEF:

  • In Sydney: Richard Geeves and Rouena Getigan;
  • And the DEL Managers in our Country Offices: Chan Solin (Cambodia), Joe Pasen (PNG), Marieke Charlet (Laos), and Luu Ngoc Thuy and Bui Van Dung (Viet Nam).

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owls’ Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change;
  35. Bondcliff (35) – ChildFund Australia’s Development Effectiveness System;
  36. Mt Bond (36) – “Case Studies” in ChildFund Australia’s Development Effectiveness System.

 

 

Mt Bond (36) – “Case Studies” In ChildFund Australia’s Development Effectiveness Framework

June, 2018

I’ve been writing a series of blog posts about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall.  And, each time, I’ve also been reflecting a bit on the journey since I joined Peace Corps, 33 years ago: on development, social justice, conflict, experiences along the way, etc.

So far, I’ve described climbing 35 of the 48 peaks, and covered my journey from Peace Corps in Ecuador (1984-86) through to my arrival in Sydney in 2009, where I joined ChildFund Australia as the first “International Program Director.”

Last time I described the ChildFund Australia “Development Effectiveness Framework,” the system that would help us make sure we were doing what we said we were going to do and, crucially, verifying that we were making a difference in the lives of children and young people living in poverty.  So we could learn and improve our work…

This time, I want to go into more depth on one component of the DEF, the “Case Studies” that described the lived experience of people that we worked with.  Next time, I’ll describe how we measured the impact of our work.

But first…

*

On 10 August, 2017, I climbed three 4000-footers in one very long day: Bondcliff (4265ft, 1300m), Mt Bond (4698ft, 1432m), and West Bond (4540ft, 1384m).  This was a tough day, covering 22 miles and climbing three very big mountains.  At the end of the hike, I felt like I was going to lose the toenails on both big toes (which, in fact, I did!) … it was a bit much!

Last time I wrote about climbing to the top of Bondcliff, the first summit of that day.  This time, I will describe the brief walk from there to the top of Mt Bond, the tallest of the three Bonds.  And next time I’ll finish describing that day, with the ascent of West Bond and the return to the trail-head at Lincoln Woods.

*

As I described last time, I arrived at the top of Bondcliff at about 10:30am, having left the trail-head at Lincoln Woods Visitor Center just after 6:30am.  I was able to get an early start because I had stayed the night before at Hancock Campsite on the Kancamagus road, just outside of Lincoln, New Hampshire.

It was a bright and mostly-sunny day, with just a few clouds and some haze.  The path between Bondcliff and Mt Bond is quite short – really just dropping down to a saddle, and then back up again, only 1.2 miles:

Bond Map - 6b

 

It took me about an hour to cover that distance, reaching the top of Mt Bond at 11:30am.  The path was rocky as it descended from Bondcliff, in the alpine zone, with many large boulders as I began to climb back up towards Mt Bond – some scrambling required.

This photo was taken at the saddle between Bondcliff and Mt Bond: on the left is Bondcliff, on the right is West Bond, and in the middle, in the distance, is Franconia Ridge; Mt Bond is behind me.  A glorious view on an amazing day for climbing:

IMG_1929.jpg

From the Left: Bondcliff, Franconia Ridge, West Bond

 

It got even steeper climbing up from the saddle to the summit, passing through some small pine shrubs, until just before the top.

The views were spectacular at the summit of Mt Bond, despite the sky being slightly hazy – I could see the four 4000-footers of the Franconia Ridge to the west and Owl’s Head in the foreground, the Presidential Range to the east, and several other 4000-footers to the south and south-west:

IMG_1948 (1)

Looking To The West From The Summit Of Mt Bond

 

And I had a nice view back down the short path from the top of Bondcliff:

IMG_1943 (1)

 

There were a few people at the top, and I had a brief conversation with a couple who were walking from the Zealand trailhead across the same three mountains I was climbing, finishing at Lincoln Woods.  This one-way version of my out-and-back route was possible because they had left a car at Lincoln Woods before driving to the Zealand trailhead in a second vehicle; they would then ferry themselves back to Zealand at the end of the day.

Kindly, they offered to pick up my car down at Lincoln Woods and drive it to Zealand, which would have saved me three miles.  I should have accepted, because finishing what became 22 miles, and three 4000-foot peaks, would hobble me for a while, and cost me two toenails!  But I didn’t have a clear sense of how the day would go, so I declined their offer, with sincere thanks…

Getting to the top of Mt Bond was my 36th 4000-footer – just 12 more to go!

I didn’t stay too long at the top of Mt Bond on the way up, continuing towards West Bond… stay tuned for that next time!

*

Jean and I had moved to Sydney in July of 2009, where I would take up the newly-created position of International Program Director for ChildFund Australia.  It was an exciting opportunity for me to work in a part of the world I knew and loved (Southeast Asia: Cambodia, Laos, Myanmar and Viet Nam) and in a challenging new country (Papua New Guinea).  It was a great chance to work with some really amazing people – in Sydney and in our Country Offices… to use what I had learned to help build and lead effective teams.  Living in Sydney would not be a hardship post, either!  Finally, it was a priceless chance for me to put together a program approach that incorporated everything I had learned to that point, over 25 years working in poverty reduction and social justice.

In the previous article in this series, I described how we developed a “Development Effectiveness Framework” (“DEF”) for ChildFund Australia, and I went through most of the components of the DEF in great detail.

My ambition for the DEF was to bring together our work into one comprehensive system – building on our Theory of Change and organizational Vision and Mission, creating a consistent set of tools and processes for program design and assessment, and making sure to close the loop with defined opportunities for learning, reflection, and improvement.

Here is the graphic that we used to describe the system:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework (2014)

 

As I said last time, I felt that three components of the DEF were particularly innovative, and worth exploring in more detail in separate blog articles:

  • I will describe components #2 (“Outcome Indicator Surveys”) and #12 (“Statement of Impact”) in my next article.  Together, these components of the DEF were meant to enable us to measure the impact of our work in a robust, participatory way, so that we could learn and improve;
  • this time, I want to explore component #3 of the DEF: “Case Studies.”

*

It might seem strange to say it this way, but the “Case Studies” were probably my favorite of all the components of the DEF!  I loved them because they offered direct, personal accounts of the impact of projects and programs from children, youth, men and women in the communities in which ChildFund worked, and from the staff and officials of local agencies and government offices with whom ChildFund partnered.  We didn’t claim that the Case Studies were random or representative samples; rather, their value was simply as stories of human experience, offering insights that would not have been readily gained from quantitative data.

Why was this important?  Why did it appeal to me so much?

*

Over my years working with international NGOs, I had become uneasy with the trend towards exclusive reliance on linear logic and quantitative measurement in our international development sector.  This is perhaps a little ironic, since I had joined the NGO world having been educated as an engineer, schooled in applying scientific logic and numerical analysis to practical problems in the world.

Linear logic is important, because it introduces rigor in our thinking, something that had been weak or lacking when I joined the sector in the mid-1980s.  And quantitative measurement, likewise, forced us to face evidence of what we had or had not achieved. So both of these trends were positive…

But I had come to appreciate that human development was far more complex than building a water system (for example), much more complicated than we could fully capture in linear models.  Yes, a logical, data-driven approach was helpful in many ways, perhaps nearly all of the time, but it didn’t seem to fit every situation in communities that I came to know in Latin America, Africa, and Asia.  In fact, I began to see that an over-emphasis on linear approaches to human development was blinding us to ways that more qualitative, non-linear thinking could help; we seemed to be dismissing the qualitative, narrative insights that should also have been at the heart of our reflections.  There was no reason not to include both quantitative and qualitative measures.  But we weren’t doing both.

My career in international development began at a time when private-sector business culture started to influence our organizations in a big way: in the wake of the Ethiopian famine of the mid-1980’s, INGOs were booming and, as a result, were professionalizing.  All the big INGOs started to bring in people from the business world, helping “professionalize” our work by introducing business practices.

I’ve written elsewhere about the positive and negative effects that business culture had on NGOs: on the positive side, we benefited from systems and approaches that improved the internal management of our agencies, such as clear delegations of authority, financial planning and audit, etc.  Overall, it was a very good, and very necessary, evolution.

But there were some negatives.  In particular, the influx of private-sector culture into our organizations meant that:

  • We began increasingly to view the world as a linear, logical place;
  • We came to embrace the belief that bigger is always better;
  • “Accountability” to donors became so fundamental that sometimes it seemed to be our highest priority;
  • Our understanding of human nature, of human poverty, evolved towards the purely material, things that we could measure quantitatively.

I will attach a copy of the article I wrote on this topic here:  mcpeak-trojan-horse.

In effect, this cultural shift emphasized linear logic and quantitative measures to such a degree, and with such force, that narrative, qualitative approaches were sidelined as, somehow, not business-like enough.

As I thought about the overall design of the DEF, I wanted to make 100% sure that we were able to measure the quantitative side of our work, the concrete outputs that we produced and the measurable impact that we achieved (more on that next time).  After all, the great majority of our work was amenable to that form of measurement, and being accountable for delivering the outputs (projects, funding) that we had promised was hugely important.

But I was equally determined that we would include qualitative elements that would enable us to capture the lived experience of people facing poverty.  In other words, because poverty is experienced holistically by people, including children, in ways that can be captured both quantitatively and qualitatively, we needed to incorporate both measurement approaches if we were to be truly effective.

The DEF “Case Studies” were one of the ways that we accomplished this goal.  It made me proud that we were successful in this regard.

*

There was another reason that I felt that the DEF Case Studies were so valuable, perhaps just as important as the way that they enabled us to measure poverty more holistically.  Observing our organizations, and seeing my own response to how we were evolving, I clearly saw that the influence of private-sector, business culture was having positive and negative effects.

One of the most negative impacts I saw was an increasing alienation of our people from the basic motivations that led them to join the NGO sector, a decline in the passion for social justice that had characterized us.  Not to exaggerate, but it seemed that we were perhaps losing our human connection with the hope and courage and justice that, when we were successful, we helped create for individual women and men, girls and boys.  The difference we were making in the lives of individual human beings was becoming obscured behind the statistics that we were using, behind the mechanical approaches we were taking to our work.

Therefore, I was determined to use the DEF Case Studies as tools for reconnecting us, ChildFund Australia staff and board, to the reason that we joined in the first place.  All of us.

*

So, what were the DEF Case Studies, and how were they produced and used?

In practice, Development Effectiveness and Learning Managers in ChildFund’s program countries worked with other program staff and partners to write up Case Studies that depicted the lived experience of people involved in activities supported by ChildFund.  The Case Studies were presented as narratives, with photos, which sought to capture the experiences, opinions and ideas of the people concerned, in their own words, without commentary.  They were not edited to fit a success-story format.  As time went by, our Country teams started to add a summary of their reflections to the Case Studies, describing their own responses to the stories told there.

Initially we found that field staff had a hard time grasping the idea, because they were so accustomed to reporting their work in the dry, linear, quantitative ways that had become the norm.  Perhaps program staff felt that narrative reports were the territory of our Communications teams, meant for public-relations purposes, describing our successes in a way that could attract support for our work.  Nothing wrong with that, they seemed to feel, but not a program thing!

Staff seemed at a loss, unable to get going.  So we prepared a very structured template for the Case Studies, specifying length and tone and approach in detail.  This was a mistake, because we really wanted to encourage creativity while keeping the documents brief, emphasizing the “voice” of people in communities rather than our own views, and covering failures as much as successes.  The template tended to lock our program staff into a structured view of our work, so once staff became more comfortable with the approach and we began to use these Case Studies, we abandoned the rigid template and encouraged innovation.

*

So these Case Studies were a primary source of qualitative information on the successes and failures of ChildFund Australia’s work, offering insights from children, youth and adults from communities where we worked and the staff of local agencies and government offices with whom ChildFund Australia partnered.

In-country staff reviewed the Case Studies, accepting or contesting the opinions of informants about ChildFund Australia’s projects.  These debates often led to adjustments to existing projects, but also triggered new thinking – at the project-activity level, at the program level, and sometimes even in the overall program approach.

Case Studies were forwarded to Sydney, where they were reviewed by the DEF Manager; some were selected for a similar process of review by International Program staff, members of the Program Review Committee and, on occasion, by the ChildFund Australia Board.

The resulting documents were stored in a simple cloud-based archive, accessible by password to anyone within the organization.  Some Case Studies were also included on ChildFund Australia’s website; we encouraged staff from our Communications team in Sydney to review the Case Studies and, if suitable, to re-purpose them for public purposes.  Of course, we were careful to obtain informed consent from people included in the documents.

*

Through Case Studies, as noted above, local informants were able to pass critical judgement on the appropriateness of ChildFund’s strategies and on how community members perceived our aims and purposes (not necessarily as we intended); and they could alert us to unexpected consequences (both positive and negative) of what we did.

For example, one of the first Case Studies written up in Papua New Guinea revealed that home garden vegetable cultivation not only resulted in increased family income for the villager concerned (and positive impact on children in terms of nutrition and education), it also enhanced his social standing through increasing his capacity to contribute to traditional cultural events.

Here are three images from that Case Study:

Screen Shot 2018-06-09 at 3.07.54 PM

Screen Shot 2018-06-09 at 3.07.27 PM

Screen Shot 2018-06-09 at 3.07.41 PM

 

And here is a copy of the Case Study itself:  PNG Case Study #1 Hillary Vegetable farming RG edit 260111.  Later I was able to visit Hillary at his farm!

Another Case Study came from the ChildFund Connect project, an exciting effort led by my former colleagues Raúl Caceres and Kelly Royds, who relocated from Sydney to Boston in 2016.  I climbed Mt Moriah with them in July, 2017, and also Mt Pierce and Mt Eisenhower in August of 2016.  ChildFund Connect was an innovative project that linked children across Laos, Viet Nam, Australia and Sri Lanka, providing a channel for them directly to build understanding of their differing realities.   This Case Study on their project came from Laos: LAO Case Study #3 Connect DRAFT 2012.

In a future article in this series, I plan on describing work we carried out building the power (collective action) of people living in poverty.  It can be a sensitive topic, particularly in areas of Southeast Asia without traditions of citizen engagement.  Here is a Case Study from Viet Nam describing how ChildFund helped local citizens connect productively with authorities to resolve issues related to access to potable water: VTM Case Study #21 Policy and exclusion (watsan)-FINAL.

*

Dozens of Case Studies were produced, illustrating a wide range of experiences with the development processes supported by ChildFund in all of the countries where we managed program implementation.  Reflections from many of these documents helped us improve our development practice, and at the same time helped us stay in touch with the deeper purpose of our having chosen to work to promote social justice, accompanying people living in poverty as they built better futures.

*

A few of the DEF Case Studies focused, to some extent, on ChildFund Australia itself.  For example, here is the story of three generations of Hmong women in Nonghet District in Xieng Khoung Province in Laos.  It describes how access to education has evolved across those generations:  LAO Case Study #5 Ethnic Girls DRAFT 2012.  It’s a powerful description of change and progress, notable also because one of the women featured in the Case Study was a ChildFund employee, along with her mother and daughter!

Two other influential Case Studies came from Cambodia, both of which touched on how ChildFund was attempting to reconcile our child-sponsorship mechanisms with our programmatic commitments.  I’ve written separately, some time ago, about the advantages of child sponsorship when managed well (as we did in Plan and especially in ChildFund Australia); these two Case Studies evocatively illustrated the challenge, and the ways that staff in Cambodia were making it all work well.

One Case Study describes some of the tensions implicit in the relationship between child sponsorship and programming, and the ways that we were making progress in reconciling these differing priorities: CAM Case Study 6 Sponsorship DRAFT 2012.  This Case Study was very influential, with our staff in Cambodia and beyond, with program staff in Sydney, and with our board.  It powerfully communicated a reality that our staff, and families in communities, were facing.

A second Case Study discussed how sponsorship and programs were successfully integrated in the field in Cambodia: CAM Case Study #10 Program-SR Integration Final.

*

As I mentioned last time, given the importance of the system, relying on our own feeling that the DEF was a great success wasn’t good enough.  So we commissioned two independent external expert reviews of the DEF.

The first review (attached here: External DEF Review – November 2012), which was concluded in November of 2012, took place before we had fully implemented the system.  In particular, since Outcome Indicator Surveys and Statements of Impact (to be covered in my next blog article) were implemented only after three years (and every three years thereafter), we had not yet reached that stage.  But we certainly were quite advanced in the implementation of most of the DEF, so it was a good time to reflect on how it was going.

I included an overview of the conclusions reached by both reviewers last time.  Here I want to quote from the first evaluation, with particular reference to the DEF Case Studies:

One of the primary benefits of the DEF is that it equips ChildFund Australia with an increased quantity and quality of evidence-based information for communications with key stakeholders including the Board and a public audience. In particular, there is consolidated output data that can be easily accessed by the communications team; there is now a bank of high quality Case Studies that can be drawn on for communication and reflection; and there are now dedicated resources in-country who have been trained and are required to generate information that has potential for communications purposes. The increase in quantity and quality of information equips ChildFund Australia to communicate with a wide range of stakeholders.

One of the strengths of the DEF recognized by in-country staff particularly is that the DEF provides a basis for stakeholders to share their perspectives. Stakeholders are involved in identifying benefits and their perspectives are heard through Case Studies. This has already provided a rich source of information that has prompted reflection by in-country teams, the Sydney based programs team and the ChildFund Australia Board.

This focus on building tools, systems and the overall capacity of the organization places ChildFund Australia in a strong position to tackle a second phase of the DEF which looks at how the organization will use performance information for learning and development. It has already started on this journey, with various parts of the organization using Case Studies for reflection. ChildFund Australia has already undertaken an exercise of coding the bank of Case Studies to assist further analysis and learning. There is lots of scope for next steps with this bank of Case Studies, including thematic reflections. Again, the benefits of this aspect have not been realised yet as the first stages of the DEF roll-out have been focused on data collection and embedding the system in CF practices.

In most Country Offices, Case Studies have provided a new formal opportunity for country program staff to reflect on their work and this has been used as a really constructive process. The Laos Country Office is currently in the process of translating Case Studies so that they can be used to prompt discussion and learning at the country level. In PNG, the team is also interested in using the Case Studies as a communication tool with local communities to demonstrate some of the achievements of ChildFund Australia programs.

In some cases, program staff have found Case Studies confronting when they have highlighted program challenges or weaknesses. The culture of critical reflection may take time to embed in some country offices and may be facilitated by cross-country reflection opportunities. Currently, however, Country Office staff do not know how to access Case Studies from other country programs. ChildFund Australia is exploring how the ‘bank’ of DEF Case Studies would be most accessible and useful to country office personnel.

One of the uses of Case Studies has been as a prompt for discussion and reflection by the programs team in Sydney and by the Board. Case Studies have been seen as a really useful way to provide an insight into a program, practice and ChildFund Australia achievements.

At an organizational level, an indexing and cross-referencing system has been implemented which enables Case Studies to be searched by country and by theme. The system is yet to be introduced to MEL and Program users, but has potential to be a very useful bank of qualitative data for reflection and learning. It also provides a bank of data from which to undertake thematic reflections across and between countries. One idea for consideration is that ChildFund draw on groups of Case Studies to develop practice notes.

In general Case Studies are considered to be the most ‘successful’ part of the DEF by those involved in collecting information.

The second reviewer concentrated on other components, mainly aspects I will describe in more detail in my next article, not so much the Case Studies…

*

So the Case Studies were a very important element in the overall DEF.  I tried very hard to incorporate brief reflections on selected Case Studies at every formal meeting of the International Program Team, of ChildFund Australia’s Program Review Committee, and (less frequently) at meetings of our Board of Directors.  More often than not, time pressures on the agendas of these meetings led us to drop the Case Studies from discussion, but often enough we did spend time (usually at the beginning of the meetings) reflecting on what we saw in them.

At the beginning, when we first began to use the Case Studies, our discussion tended to be mechanical: pointing out errors in the use of English, or questioning how valid the observations might be, challenging the statistical reliability of the conclusions.  But, over time, I noticed that our teams began to use the Case Studies as they were designed: to gain insight into the lived experience of particular human beings, and to reconnect with the realities of people’s struggle for better lives for themselves and their children.

This was a great success, and really worked as I had hoped.  The Case Studies complemented the more rigorous, quantitative components of the DEF, helping the system be holistic, enabling us to see more deeply into the effect that our work was having while also enhancing our accountability.

*

Next time, I will describe getting to the top of West Bond, and all the way down the 11 miles from there to the Lincoln Woods parking lot, where I staggered back to my car with such damage to my feet that I soon would lose the toenails on both my big toes!  And I will share details of the final two components of the DEF that I want to highlight: the Outcome Indicator Surveys and Statements of Impact, probably the culmination of the whole system.

So, stay tuned!

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owl’s Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change;
  35. Bondcliff (35) – ChildFund Australia’s Development Effectiveness Framework.

 

 

Bondcliff (35) – ChildFund Australia’s Development Effectiveness Framework

June, 2018

I began a new journey just over two years ago, in May, 2016, tracing two long arcs in my life:

  • During those two years, I’ve been climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers. I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Alongside descriptions of those climbs, I’ve been sharing what it was like working in international development during the MDG era: as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980’s through to the conclusion of the Millennium Development Goals in 2015.

In each article, I am writing about climbing each of those mountains and, each time, I reflect a bit on the journey since I began to work in social justice, nearly 34 years ago: on development, human rights, conflict, experiences along the way, etc.

So, when I wrap things up in this series, there should be 48 articles…

*

In 2009 Jean and I moved to Sydney, where I took up a new role as International Program Director for ChildFund Australia, a newly-created position.  On my way towards Sydney, I was thinking a lot about how to build a great program, and how I would approach building a strong team – my intention was to lead and manage with clarity, trust, and inspiration.  A few weeks ago, I wrote describing the role and staffing and structural iterations of ChildFund’s International Program Team and, last time, I outlined the foundational program approach we put in place – a Theory of Change and Outcome and Output Indicators.

Once the program approach was in place, as a strong foundation, we moved forward to build a structured approach to development effectiveness.  I am very proud of what we achieved: the resulting ChildFund Australia “Development Effectiveness Framework” (“DEF”) was, I think, state-of-the-art for international NGOs at the time.  Certainly few (if any) other INGOs in Australia had such a comprehensive, practical, useful system for ensuring the accountability and improvement of their work.

Since the DEF was so significant, I’m going to write three articles about it:

  1. In this article I will describe the DEF – its components, some examples of products generated by the DEF, and how each part of the system worked with the other parts.  I will also share results of external evaluations that we commissioned on the DEF itself;
  2. Next time, I will highlight one particular component of the DEF, the qualitative “Case Studies” of the lived experience of human change.  I was especially excited to see these Case Studies when they started arriving in Sydney from the field, so I want to take a deep dive into what these important documents looked like, and how we attempted to use them;
  3. Finally, I will describe the last two DEF components that came online (Outcome Indicator Surveys and Statements of Impact), the culmination of the system, where we assessed the impact of our work.

So there will be, in total, three articles focused on the DEF.  This is fitting, because I climbed three mountains on one day in August of 2017…

*

On 10 August, 2017, I climbed three 4000-footers in one day: Bondcliff (4265ft, 1300m), Mt Bond (4698ft, 1432m), and West Bond (4540ft, 1384m).  This was a very long, very tough day, covering 22 miles and climbing three mountains in one go.  At the end of the hike, I felt like I was going to lose the toenails on both big toes… and, in fact, that’s what happened.  As a result, for the rest of the season I would be unable to hike in boots and had to use hiking shoes instead!

Knowing that the day would be challenging, I drove up from Durham the afternoon before and camped, so I could get the earliest start possible the next morning.  I got a spot at Hancock Campground, right near the trailhead where I would start the climb:

IMG_1871.jpg

 

The East Branch of the Pemigewasset River runs alongside this campground; I spent a pleasant late afternoon there reading a book by John Paul Lederach, and when it was dark I crawled into my sleeping bag and got a good night’s sleep.

IMG_1868

IMG_1869

 

Here is a map of the long ascent that awaited me the next morning, getting to the top of Bondcliff:

Bond Map - 3.jpg

 

After Bondcliff, the plan was that I would continue on to climb Mt Bond and West Bond, and to then return to Lincoln Woods… more on that in the next two articles in this series.  In this one I will describe climbing the first 4000-footer of that day, Bondcliff.

I got an early start on 10 August, packing up my tent-site and arriving at the trailhead at Lincoln Woods at about 6:30am:

IMG_1873.jpg

 

It was just two weeks earlier that I had parked here to climb Owl’s Head, which I had enjoyed a lot.  This time, I would begin the same way – walking up the old, abandoned forestry railway for about 2.6 miles on Lincoln Woods Trail, to where I had turned left up the Franconia Brook Trail towards Owl’s Head.  I arrived at that junction at about 7:30am:

IMG_1883.jpg

IMG_1891.jpg

 

 

This time I would continue straight at that intersection, onto the Wilderness Trail, which winds through forest for a short distance before opening out along another old logging railway, complete with abandoned hardware discarded along the way more than 130 years ago:

IMG_1893.jpg

 

At the abandoned Camp 16 (around 4.4 miles from the parking lot at Lincoln Woods), I took a sharp left and joined a more conventional trail – no more old railway.  Now on the Bondcliff Trail, I began a moderate ascent alongside Black Brook.

 

I crossed Black Brook twice on the way up after leaving the Wilderness Trail, and then crossed two dry beds of rock, which were either rock slides or upper reaches of Black Brook that were dry that day.

IMG_1898.jpg

 

It’s a long climb up Black Brook; after the second dry crossing, Bondcliff Trail takes a sharp left turn and continues ascending steadily.  Just before reaching the alpine area, and the summit of Bondcliff, there is a short steep section, where I had to scramble up some bigger boulders.  Slow going…

But then came the reward: spectacular views to the west, across Owl’s Head to Franconia Ridge, up to Mt Garfield, and over to West Bond and Mt Bond.  Here Mt Lincoln and Mt Lafayette are on the left, above Owl’s Head, with Mt Garfield to the right:

IMG_1905

Lincoln and Lafayette In The Distance On The Left, Mt Garfield In The Distance On The Right

 

Here is a view looking to the southwest from the top of Bondcliff:

IMG_1907

From The Summit Of Bondcliff

IMG_1920

From The Summit Of Bondcliff

 

And this is the view towards Mt Bond, looking up from the top of Bondcliff:

IMG_1925

West Bond Is On The Left, And Mt Bond On The Right

 

I got to the top of Bondcliff at about 10:30am, just about four hours from the start of the hike.  Feeling good … at this point!  Here is a spectacular view back down towards Bondcliff, taken later in the day, from the top of West Bond:

IMG_1964.jpg

 

I would soon continue the climb, with a short hop from Bondcliff up to the top of Mt Bond.  Stay tuned!

*

Last time I wrote about how we built the foundations for ChildFund Australia’s new program approach: a comprehensive and robust “Theory of Change” that described, at a high level, what we were going to accomplish and why; a small number of reliable, measurable, and meaningful “Outcome Indicators” that would enable us to demonstrate the impact of our work; and a set of “Output Indicators” that would allow us to track our activities consistently and comparably across all our programs: in Cambodia, Laos, Papua New Guinea, and Viet Nam.  (Myanmar was a slightly different story, as I will explain later…)

Next, on that foundation, we needed a way of thinking holistically about the effectiveness of our development work: a framework for planning our work in each location, each year; for tracking whether we were doing what we had planned; for understanding how well we were performing; and for improving the quality and impact of our work.  And we needed to do all this in partnership with local communities, organizations, and governments.

This meant being able to answer five basic questions:

  1. In light of our organizational Theory of Change, what are we going to do in each location, each year?
  2. How will we know that we are doing what we planned to do?
  3. How will we know that our work makes a difference and gets results consistent with our Theory of Change?
  4. How will we learn from our experience, to improve the way we work?
  5. How can community members and local partners directly participate in the planning, implementation, and evaluation of the development projects that ChildFund Australia supports?

Looking back, I feel that what we built and implemented to answer those questions – the ChildFund Australia “Development Effectiveness Framework” (“DEF”) – was our agency’s most important system.  Because what could be more important than the answers to those five questions?

*

I mentioned last time that twice, during my career with Plan International, we had tried to produce such a system, and failed (at great expense).  We had fallen into several traps that I was determined to avoid repeating this time, in ChildFund Australia, as we developed and implemented the DEF:

  • We would build a system that could be used by our teams with the informed participation of local partners and staff, practically – that was “good enough” for its purpose, instead of a system that had to be managed by experts, as we had done in Plan;
  • We would include both quantitative and qualitative information, serving the needs of head and heart, instead of building a wholly-quantitative system for scientific or academic purposes, as we had done in Plan;
  • We would not let “the best be the enemy of the good,” and I would make sure that we moved to rapidly prototype, implement, and improve the system instead of tinkering endlessly, as we had done in Plan.

I go into more detail about the reasons for Plan’s lack of success in that earlier article.

*

Here is a graphic that Caroline Pinney helped me create, which I used very frequently to explain how the DEF was designed and how it functioned:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework (2014)

 

In this article, I will describe each component of the DEF, outlining how the components relate to one another and to the five questions listed above.

However, I’m going to reserve discussion of three of those components for my next two articles:

  • Next time, I will cover #3 in Figure 1, the “Case Studies” that we produced.  These documents helped us broaden our focus from the purely quantitative to include consideration of the lived experience of people touched by the programs supported by ChildFund Australia.  At the same time, the Case Studies served as valuable tools for our staff, management, and board to retain a human connection to the spirit that motivated us to dedicate our careers to social justice;
  • And, after that, I will devote an article to our “Outcome Indicator Surveys” (#2 in Figure 1, above) and our Statements of Impact (#12 in Figure 1).  The approach we took to demonstrating impact was innovative, highly participatory, and successful.  So I want to go into a bit of depth describing the two DEF components involved.

Note: I prepared most of what follows.  But I have included and adapted some descriptive material produced by the two DEF Managers who worked in the International Program Team: Richard Geeves and Rouena Getigan.  Many thanks to them!

*

Starting Points

The DEF was based on two fundamental statements of organizational identity.  As such, it was built to focus us on, and enable us to be accountable for, what we were telling the world we were:

  1. On the bottom left of the DEF schematic (Figure 1, above) we reference the basic documents describing ChildFund’s identity: our Vision, Mission, Strategic Plan, Program Approach, and Policies – all agreed and approved by our CEO (Nigel Spence) and Board of Directors.  The idea was that the logic underlying our approach to Development Effectiveness would therefore be grounded in our basic purpose as an organization, overall.  I was determined that the DEF would serve to bring us together around that purpose, because I had seen Plan tend to atomize, with each field location working towards rather different aims.  Sadly, Plan’s diversity seemed to be far greater than required if it were simply responding to the different conditions we worked in.  For example, two Field Offices within 20 km of each other in the same country might have very different programs.  This excessive diversity seemed to relate more to the personal preferences of Field Office leadership than to any difference in the conditions of child poverty or the local context.  The DEF would help ChildFund Australia cohere, because our starting point was our organizational identity;
  2. But each field location did need a degree of flexibility to respond to its reality, within ChildFund’s global identity, so at the bottom of the diagram we placed the Country Strategy Paper (“CSP”), quite centrally.  This meant that, in addition to building on ChildFund Australia’s overall purpose and identity globally, we would also build our approach to Development Effectiveness on how we chose to advance that basic purpose in each particular country where we worked, with that country’s particular characteristics.

Country Strategy Paper

The purpose and outline of the CSP was included in the ChildFund Australia Program Handbook:

To clarify, define, communicate and share the role, purpose and structure of ChildFund in-country – our approach, operations and focus. The CSP aims to build a unity of purpose and contribute to the effectiveness of our organisation.

When we develop the CSP we are making choices, about how we will work and what we will focus on as an organisation. We will be accountable for the commitments we make in the CSP – to communities, partners, donors and to ourselves.

While each CSP will be different and reflect the work and priorities of the country program, each CSP will use the same format and will be consistent with ChildFund Australia’s recent program development work.

During the development of the CSP it is important that we reflect on the purpose of the document. It should be a useful and practical resource that can inform our development work. It should be equally relevant to both our internal and external stakeholders. The CSP should be clear, concise and accessible while maintaining a strategic perspective. It should reflect clear thinking and communicate our work and our mission. It should reflect the voice of children.  Our annual work plans and budgets will be drawn from the CSP and we will use it to reflect on and review our performance over the three year period.

Implementation of the DEF flowed from each country’s CSP.

More details are found in Chapter 5 of the Program Handbook, available here: Program Handbook – 3.3 DRAFT.  Two examples of actual ChildFund Australia Country  Strategy Papers from my time with the organization are attached here:

For me, these are clear, concise documents that demonstrate coherence with ChildFund’s overall purpose along with choices driven by the situation in each country.

*

Beginning from the Country Strategy Paper, the DEF branches in two inter-related (in fact, nested) streams, covering programs (on the left side) and projects (on the right side).  Of course, projects form part of programs, consistent with our program framework:

Screen Shot 2018-05-28 at 2.16.30 PM

Figure 2: ChildFund Australia Program Framework

 

But it was difficult to depict this embedding on the two dimensions of a graphic!  So Figure 1 showed programs on one side and projects on the other.

Taking the “program” (left) side first:

Program Description

Moving to the left side of Figure 1: derived from the Country Strategy Paper, and summarized in it, each Country Office defined a handful of “Program Descriptions” (#1 in Figure 1) – some countries had three, others ended up with five.  Each one described how a particular set of projects would, together, create impact as measured using ChildFund Australia’s Outcome Indicators – in other words, each was a “Theory of Change,” detailing how the projects included in the program linked together to create particular positive change.

The purpose and outline of the Program Description was included in the ChildFund Australia Program Handbook:

ChildFund Australia programs are documented and approved through the use of “Program Descriptions”.  All Program Descriptions must be submitted by the Country Director for review and approval by the Sydney International Program Director, via the International Program Coordinator.

For ChildFund Australia: a “program” is an integrated set of projects that, together, have direct or indirect impact on one or more of our agreed organisational outcome indicators.   Programs normally span several geographical areas, but do not need to be implemented in all locations; this will depend on the geographical context.  Programs are integrated and holistic. They are designed to achieve outcomes related to ChildFund Australia’s mission, over longer periods, while projects are meant to produce outputs over shorter timeframes.

Program Descriptions were summarized in the CSP, contained a listing of the types of projects (#5 in Figure 1) that would be implemented, and were reviewed every 3 or 4 years (Program Review, #4 in Figure 1).

To write a Program Description, ChildFund staff (usually program managers in a particular Country Office) were expected to review our program implementation to date, and to carry out extensive situational analyses: of government policies, plans, and activities in the sector, and of communities’ needs in terms of their assets, their aspirations, and their ability to work productively with the local government officials responsible for service provision.  The results of ChildFund’s own Outcome Indicator surveys and community engagement events obviously provided very useful evidence in this regard.

Staff then proposed a general approach for responding to the situation, along with specific strategies that could be delivered through a set of projects.  They would also show that the proposed approach and strategies were consistent with evidence of good practice both globally and in-country, demonstrating that their choices were evidence-based.

Here are two examples of Program Descriptions:

Producing high-quality Program Descriptions was a surprising challenge for us, and I’m not sure we ever really got this component of the DEF right.  Probably we struggled because these documents were rather abstract, and our staff weren’t used to operating at that level of abstraction.

Most of the initial draft Program Descriptions were quite superficial, and were approved only as place-holders.  Once we started to carry out “Program Reviews” (see below), which were meant to inject more rigor into the documents, we struggled.  It was a positive, productive struggle, but a struggle nonetheless!

We persisted, however, because I strongly believed that our teams should be able to articulate why they were doing what they were doing, and the Program Descriptions were the basic tool for exactly that explanation.  We hoped that the effort would result in better programs, more sophisticated and holistic work, and more impact on children living in poverty.

*


Program Reviews

For the same reasons outlined above, in my discussion of the “Program Descriptions” component of the DEF, we also struggled with the “Program Review” (#4 in Figure 1, above).  In these workshops, our teams would consider an approved “Program Description” (#1 in Figure 1) every three or four years, subjecting the document to a formal process of peer review.

ChildFund staff from other countries visited the host country to participate in the review process, and then wrote a report recommending how the Program under review might be improved.  The host country accepted (or debated and adjusted) the recommendations, acted on them, and applied them to a revision of the Program Description: improving it, tightening the logic, incorporating lessons learned from implementation, etc.

Program Reviews were therefore fundamentally about learning and improvement, so we made sure that, in addition to peers from other countries, the host Country Office invited in-country partners and relevant experts.  And International Program Coordinators from Sydney were asked to always attend Program Reviews in the countries that they were supporting, again for learning and improvement purposes.

The Program Reviews that I attended were useful and constructive, but I certainly sensed a degree of frustration.  In addition to struggling with the relatively high levels of abstraction required, our teams were not used to having outsiders (even their peers from other ChildFund offices) critique their efforts.  So, overall, this was a good and very important component of the DEF, designed correctly, but our teams needed more time to learn how to manage this process and to be open to such a public form of review.

*

Projects and Quarterly Reports

As shown on the right-hand side of Figure 1, ChildFund’s field staff and partners carried out routine monitoring of projects (#6 in the Figure) to ensure that they were on track, and this monitoring formed the basis of their reporting on activities and outputs.  Project staff summarized their monitoring in formal Quarterly Reports (#7) on each project, documenting progress against project plans, budgets, and targets to ensure projects were well managed.  These Quarterly Reports were reviewed in each Country Office, and most were also forwarded to ChildFund’s head office in Sydney (and, often, to donors) for review.

When I arrived, ChildFund Australia’s Quarterly reporting was well-developed and of high quality, so I didn’t need to focus on this aspect of our work.  We simply incorporated it into the more-comprehensive DEF.

*

Quarterly Output Tracking

As described last time, ChildFund developed and defined a set of Outputs which became standard across the organization in FY 2011-12.  Outputs in each project were coded and tracked from Quarter to Quarter.  Some of the organizational Outputs were specific to a sector (such as education, health, or water and sanitation) or to a particular target group (such as children, youth, or adults).  Other Outputs were generic and might be found in any project: for example, training or awareness-raising, materials production, and consultation.

Organizational Outputs were summarized for all projects in each country each Quarter and country totals were aggregated in Sydney for submission to our Board of Directors (#8 in Figure 1, above).  In March 2014 there were a total of 47 organizational Outputs – they were listed in my last article in this series.

One purpose of this tracking was to enhance our accountability, so a summary was reviewed every Quarter in Sydney by the International Program Team and our Program Review Committee.

Here is an example of how we tracked outputs: this is a section of a Quarterly Report produced by the International Program Team for our Board and Program Review Committee: Output Report – Q4FY15.
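To make the mechanics concrete, here is a minimal sketch in Python of the kind of roll-up this Output tracking involved: project-level counts summed into country totals, and country totals aggregated into an organization-wide view.  The output codes and figures below are invented for illustration; they are not ChildFund’s actual data or systems.

```python
from collections import defaultdict

# Hypothetical records for one Quarter: (country, project, output_code, quantity).
# Codes and numbers are illustrative only.
quarter_records = [
    ("Cambodia", "CB-014", "TRAINING_EVENT", 12),
    ("Cambodia", "CB-021", "TRAINING_EVENT", 5),
    ("Laos", "LA-003", "WATER_POINT", 7),
    ("Laos", "LA-003", "TRAINING_EVENT", 4),
]

def aggregate_outputs(records):
    """Roll project-level output counts up to country totals,
    and country totals up to an organization-wide summary."""
    by_country = defaultdict(lambda: defaultdict(int))
    org_total = defaultdict(int)
    for country, _project, code, qty in records:
        by_country[country][code] += qty
        org_total[code] += qty
    return by_country, org_total

by_country, org_total = aggregate_outputs(quarter_records)
```

Applied each Quarter, this same shape of aggregation produced the country and organizational summaries that were reviewed in Sydney.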

*

Project Evaluations

ChildFund also conducted reviews or evaluations of all projects (#9 in Figure 1, above) – in different ways.  External evaluators were engaged, under detailed terms of reference, to evaluate multi-year projects that had more substantial budgets, or that were significant for learning or to a particular donor.  Smaller projects were generally evaluated internally.  All evaluators were expected to gather evidence of results against output targets and against the performance indicators written for each objective.

*

All development effectiveness systems have, at their heart, mechanisms for translating operational experience into learning and program improvement.  In Figure 1, this is the central circle of the schematic, which feeds evidence from a variety of sources back into our organizational and Country Strategy Papers, our Program Descriptions, and our project planning and design.

Our program staff found that their most effective learning often occurred during routine monitoring through observation of project activities and conversations in communities with development partners.  Through thoughtful questioning and attentive listening, staff could make the immediate decisions and quick adjustments which kept project activities relevant and efficient.

Staff also had more formal opportunities to document and reflect on learning.  The tracking of outputs and aggregation each Quarter drew attention to progress and sometimes signaled the need to vary plans or redirect resources.

Project evaluations (#9 in Figure 1, above) provided major opportunities for learning, especially when external evaluators brought their different experiences to bear and offered fresh perspectives on a ChildFund project.

*

The reader can easily grasp that, for me, the DEF was a great success: a significant asset that enabled ChildFund Australia to be more accountable and effective.  Some more technically-focused agencies were busy carrying out sophisticated impact evaluations, using control groups and so forth, but that kind of effort didn’t suit the vast majority of INGOs.  We could benefit from the learnings that came from those scientific evaluations, but we didn’t have the resources to introduce such methodologies ourselves.  And so, though the DEF was not perfect, I am not aware of any comparable organization that succeeded as we did.

While the system built on what I had learned over nearly 30 years, and even though I felt that it was designed comprehensively and working very well, that was merely my opinion!

Given the importance of the system, relying on my opinion (no matter how sound!) wasn’t good enough.  So we sought expert review, commissioning two independent, expert external reviews of the DEF.

*

The first review, which was concluded in November of 2012, took place before we had fully implemented the system.  In particular, since Outcome Indicator Surveys and Statements of Impact (to be covered in an upcoming blog article) were implemented only after three years (and every three years thereafter), we had not yet reached that stage.  But we certainly were quite advanced in implementing most of the DEF, so it was a good time to reflect on how it was going.

In that light, this first external review of the DEF concluded the following:

The development of the DEF places ChildFund Australia in a sound position within the sector in the area of development effectiveness. The particular strength of ChildFund Australia’s framework is that it binds the whole organisation to a set of common indicators and outputs. This provides a basis for focussing the organisation’s efforts and ensuring that programming is strategically aligned to common objectives. The other particular strength that ChildFund Australia’s framework offers is that it provides a basis for aggregating its achievements across programs, thereby strengthening the organisation’s overall claims of effectiveness.

Within ChildFund Australia, there is strong support for the DEF and broad agreement among key DEF stakeholders and users that the DEF unites the agency on a performance agenda. This is in large part due to dedicated resources having been invested and the development of a data collection system has been integrated into the project management system (budgeting and planning, and reporting), thereby making DEF a living and breathing function throughout the organisation. Importantly, the definition of outcomes and outputs indicators provides clarity of expectations across ChildFund Australia.

One of the strengths of the DEF recognised by in-country staff particularly is that the DEF provides a basis for stakeholders to share their perspectives. Stakeholders are involved in identifying benefits and their perspectives are heard through case studies. This has already provided a rich source of information that has prompted reflection by in-country teams, the Sydney based programs team and the ChildFund Australia Board.

Significantly, the DEF signals a focus on effectiveness to donors and the sector. One of the benefits already felt by ChildFund Australia is that it is able to refer to its effectiveness framework in funding submissions and in communication with its major donors who have an increasing interest on performance information.

Overall, the review found that the pilot of the DEF has been implemented well, with lots of consultation and engagement with country offices, and lots of opportunity for refinement. Its features are strong, enabling ChildFund to both measure how much it is doing, and the changes that are experienced by communities over time. The first phase of the DEF has focused on integrating effectiveness measurement mechanisms within program management and broader work practices, while the second phase of the DEF will look at the analysis, reflection and learning aspects of effectiveness. This second phase is likely to assist various stakeholders involved in collecting effectiveness information better understand and appreciate the linkages between their work and broader organisational learning and development. This is an important second phase and will require ongoing investment to maximise the potential of the DEF. It places ChildFund Australia in a strong position within the Australian NGO sector to engage in the discourse around development effectiveness and demonstrate its achievements.

A full copy of this first review, removing only the name of the author, is attached here: External DEF Review – November 2012.

In early 2015 we carried out a second review.  This time, we had implemented the entire DEF, carrying out (for example) Statement of Impact workshops in several locations.  The whole system was now working.

At that point, we were very confident in the DEF – from our point of view, all components were working well, producing good and reliable information that was being used to improve our development work.  Our board, program-review committee, and donors were all enthusiastic.  More importantly, local staff and communities were positive.

The only major concern that remained related to the methodology we used in the Outcome Indicator Surveys.  I will examine this issue in more detail in an upcoming blog article in this series; but the reader will notice that this second formal, external evaluation focuses very much on the use of the LQAS methodology in gathering information for our Outcome Indicator workshops and Statements of Impact.

That’s why the external evaluator we engaged to carry out this second review was an expert in survey methodologies in general, and in LQAS in particular.

In that light, this second external review of the DEF concluded the following:

ChildFund Australia is to be commended for its commitment to implementing a comprehensive and rigorous monitoring and evaluation framework with learning at its centre to support and demonstrate development effectiveness. Over the past five years, DEL managers in Cambodia, Laos, Papua New Guinea and Vietnam, with support and assistance from ChildFund Australia, country directors and program managers and staff, have worked hard to pilot, refine and embed the DEF in the broader country programs.  Implementing the DEF, in particular the Outcome Indicator Survey using LQAS, has presented several challenges.  With time, many of the early issues have been resolved, tools improved and guidelines developed.  Nevertheless, a few issues remain that must be addressed if the potential benefits are to be fully realised at the organisational, country and program levels.

Overall, the DEF is well suited for supporting long-term development activities in a defined geographic area.  The methodologies, scope and tools employed to facilitate Outcome Indicator Surveys and to conduct Community Engagement and Attribution of Impact processes are mostly fit for purpose, although there is considerable room for improvement.  Not all of the outcome indicators lend themselves to assessment via survey; those that are difficult to conceptualise and measure being most problematic. For some indicators in some places, a ceiling effect is apparent limiting their value for repeated assessment. While outcome indicators may be broadly similar across countries, both the indicators and the targets with which they are to be compared should be locally meaningful if the survey results are to be useful—and used—locally.

Used properly, LQAS is an effective and relatively inexpensive probability sampling method.  Areas for improvement in its application by ChildFund include definition of the lots, identification of the sampling frame, sample selection, data analysis and interpretation, and setting targets for repeated surveys.

Community Engagement and the Attribution of Impact processes have clearly engaged the community and local stakeholders.  Experience to date suggests that they can be streamlined to some extent, reducing the burden on staff as well as communities.  These events are an important opportunity to bring local stakeholders together to discuss local development needs and set future directions and priorities.  Their major weakness lies in the quality of the survey results that are presented for discussion, and their interpretation.  This, in turn, affects the value of the Statement of Impact and other documents that are produced.

The DEF participatory processes have undoubtedly contributed to the empowerment of community members involved. Reporting survey results in an appropriate format, together with other relevant data, in a range of inviting and succinct documents that will meet the needs of program staff and partners is likely to increase their influence.

A full copy of this second review, removing only the name of the author, is attached here: DEF Evaluation – April 2015.
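For readers unfamiliar with LQAS (Lot Quality Assurance Sampling), the method the reviewer examined, here is a minimal illustrative sketch in Python of its core decision rule.  The function names and thresholds are my own, for illustration only – this is not ChildFund’s actual tooling.  The idea: draw a small sample from each “lot” (a supervision area), and classify the lot as reaching a coverage benchmark only if the number of positive responses meets or exceeds a decision rule chosen so that both misclassification risks stay small.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_decision_rule(n, p_upper, p_lower, max_risk=0.10):
    """Find the smallest decision rule d such that, classifying a lot
    as 'reaching the target' when successes >= d, both risks are small:
      alpha: failing a lot whose true coverage is p_upper
      beta:  passing a lot whose true coverage is only p_lower
    Returns None if no d satisfies both risks for this sample size."""
    for d in range(n + 1):
        alpha = binom_cdf(d - 1, n, p_upper)     # P(X < d | p_upper)
        beta = 1 - binom_cdf(d - 1, n, p_lower)  # P(X >= d | p_lower)
        if alpha <= max_risk and beta <= max_risk:
            return d
    return None

# Classic LQAS setup: 19 interviews per lot, 80% coverage benchmark,
# 50% lower threshold.
print(lqas_decision_rule(19, 0.80, 0.50))  # prints 13
```

With the classic sample size of 19 and thresholds of 80% (benchmark) and 50% (lower), this search returns 13 – the decision rule found in standard LQAS tables.  The method’s appeal, as the reviewer noted, is that it is a probability sampling approach that remains relatively inexpensive in the field.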

*

Great credit is due to the ChildFund staff who contributed to the conceptualization, development, and implementation of the DEF.  In particular, Richard Geeves and Rouena Getigan in the International Program Team in Sydney worked very hard to translate my sometimes overly-ambitious concepts into practical guidelines, and ably supported our Country Offices.

One of the keys to the success of the DEF was that we budgeted for dedicated in-country support, with each Country Office able to hire a DEL Manager (two in Viet Nam, given the scale of our program there).

Many thanks to Solin in Cambodia, Marieke in Laos, Joe in Papua New Guinea, and Thuy and Dung in Viet Nam: they worked very hard to make the DEF function in their complex realities.  I admire how they made it work so well.

*

In this article, I’ve outlined how ChildFund Australia designed a comprehensive and very robust Development Effectiveness Framework.  Stay tuned next time, when I describe climbing Mt Bond, and then go into much more depth on one particular component (the Case Studies, #3 in Figure 1, above).

After that, in the following article, I plan to cover reaching the top of West Bond and descending back across Mt Bond and Bondcliff (and losing toenails on both big toes!) and go into some depth to describe how we carried out Outcome Indicator Surveys (#2 in Figure 1) and Statements of Impact (#12) – in many ways, the culmination of the DEF.

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owl’s Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change.

 

 

Owl’s Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change

May, 2018

I began a new journey just over two years ago (May, 2016), tracing two long arcs in my life:

  • During those two years, I’ve been climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers.  I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Alongside descriptions of those climbs, I’ve been sharing what it was like working in international development during the MDG era: as it boomed and evolved, from the response to the Ethiopian crisis in the mid-1980s through to the conclusion of the Millennium Development Goals in 2015.

So, in each article in this series, I am writing about climbing each of those mountains and, each time, I reflect a bit on the journey since I began to work in social justice, nearly 34 years ago: on development, human rights, conflict, experiences along the way, etc.

*

In 2009 Jean and I moved to Sydney, where I took up a new role as International Program Director for ChildFund Australia.  On my way towards Sydney, I was thinking a lot about how to build a great program, and how I would approach building a strong team with clarity, trust, and inspiration.  Last time, I described the role, staffing, and structural iterations of the International Program Team there.

This time, I want to begin to unpack the program approach that we put in place, building on what was already there, and on the lessons I had learned in the previous 25 years.

But first…

*

Owl’s Head (4025ft, 1227m) is described by many hikers as uninteresting, boring, and challenging – something that “should not be left to the end” of the 48 peaks.  I guess that’s because climbers want to finish their long voyage up so many great mountains in a blaze of glory, but they find Owl’s Head to be a letdown after the challenges and thrills of the other 47 4000-footers.

I climbed Owl’s Head on 26 July, 2017, and enjoyed every minute of it!

Yes, it’s long and mostly in the forest.  Yes, getting up the rock slide on the western side of Owl’s Head is tough going.  Yes, there are several river crossings which can be problematic when the water’s high.  And, yes, it’s not a ridge walk, so the views are (mostly) obstructed.  But on this late-July day, the walking was fantastic, the river crossings were nerve-wracking but doable, and the views going up (and coming down) the rock slide, looking across at Franconia Ridge, were fantastic.

I left Durham at about 6am, getting an early start because, by my calculations, the ascent alone would take over 6 hours.  Figuring on a descent of at least 4 hours, I wanted to get walking as soon as possible.  As has been my normal routine these days, I stopped in Tilton for coffee, and I bought a sandwich for lunch in Lincoln, very near the trailhead.


I had brought sandals to carry with me for the river crossings, just in case.

After parking at the Lincoln Woods Visitor Center, I started the hike at 8:10am.


It was a beautiful, cool, sunny day.  Just beyond the Visitor Center, two trails head up the East Branch of the Pemigewasset River: the Pemi East Side Trail and the Lincoln Woods Trail.  To get to the Lincoln Woods Trail, which I would take, I crossed a suspension bridge and took a right turn to head north.

The Lincoln Woods Trail runs along an old forest railway, and is wide and straight for over two miles.  Dappled high forest on a gorgeous, crisp day.  Nervous about how long I thought it would take me to reach Owl’s Head, and return, I flew up this first easy part, almost trotting up the gentle incline:

Lincoln Woods Trail – Formerly a Forest Railway, Straight and Wide

Old railway ties can be seen in the image, above.  I also photographed one of the nails in a tie.

There were a few other hikers heading up the Lincoln Woods Trail along with me, more than I expected on a summer Wednesday, but it wasn’t crowded.  I reached the junction with the Osseo Trail at 8:33am, and the Black Pond Trail at 8:53am.

Just before 9am, I arrived at the junction with Franconia Brook Trail.  So it had taken me about 50 minutes to walk up the 2.6 miles from the Lincoln Woods Visitor Center.  It had been gently uphill the whole way so far.

Here, just after a small footbridge over Franconia Brook, I would turn left, up the Franconia Brook Trail:

Footbridge Over Franconia Brook

(A few weeks later I would come to this junction once again, but would continue straight on the Bondcliff Trail.)

Franconia Brook Trail was a real trail, at least at the beginning, but soon, as I headed north up the Franconia Brook, there were long sections that must have also been old railway – straight, and wide, and gradually uphill.  Pleasant walking!  I thought that coming down would be even faster.

From here, the water level in Franconia Brook didn’t look too high.

I hiked up Franconia Brook Trail, 1.7 miles, and reached the junction with Lincoln Brook Trail at 9:33am.  I was still making very good time – 1.7 miles in about 30 minutes.  But I didn’t feel that I was rushing; it was very nice hiking through the woods on the wide trail!

Here I would swing west to walk around Owl’s Head in a clockwise direction, following (and repeatedly crossing) the Lincoln Brook until reaching Owl’s Head Path.

I would cross Lincoln Brook four times going up the west side of Owl’s Head, and four times coming back down, retracing my steps.  The first crossing, at 9:44am, was the most difficult, and I almost gave my boots a good bath that time.  It was a little dicey…

Of course, as I climbed up the valley, the Brook became smaller as I passed, one by one, the streams feeding into it.  So the first (and last, when returning) crossing had the most water.


The trail was less maintained here, certainly not an old forest railway, though I did see two trail crews working on it that day.

I reached the turnoff for Owl’s Head Path at 11:08am.  I had become nervous that I had passed it, feeling that I should have reached the turnoff some time before, and there were no signs.  By the time I reached the cairns marking the turnoff I was quite anxious and was thinking vaguely about turning back.  But, luckily, as I was approaching the cairns that can be seen in the next image, a young woman came down from having gone up Owl’s Head, and she confirmed that I had reached the junction!

The Junction With Owl’s Head Path – Steeply Up From Here!

So it had taken me nearly an hour and a half to walk Lincoln Brook Trail, from Franconia Brook Trail to Owl’s Head Path, including four stream crossings.  Since Owl’s Head Path was supposed to be quite steep for some time, up a rock slide, I decided to leave some weight here at the bottom; so I took a quart of water and my sandals out of my pack and hid them at the junction.

I started up Owl’s Head at 11:17am, a bit lighter, after having a snack.  Soon I reached the famous rock slide, which was very steep, indeed – mostly gravel, so there was a lot of sliding backward, which made for heavy going.


It was slippery and challenging, and did I mention that it was very steep?  Another young hiker came down and we crossed paths; she was very unhappy, having turned back before reaching the summit.  It was too dangerous, she said, and she was giving up, vocal about how unpleasant the going was.  This would have been summit number 29 for her, but with a full pack it just wasn’t possible – relentless, heavy going!

But the views from the rock slide were fantastic: looking back towards Franconia Ridge, I could see all four of the 4000-footers there – Flume, Liberty, Lincoln and Lafayette.  The light was still good, not yet noon, so the sun shone on the ridge from the east:

Flume Is On The Far Left, Then Liberty, Lincoln, And Then Lafayette.

I also took a video of that view from the rock slide, looking over to Franconia Ridge.

The White Mountain Guide indicates that the top of Owl’s Head is not very accessible, and that the end of Owl’s Head Path, which is just short of the actual summit, qualifies as reaching the top.  Apparently, at least when my edition of the Guide was published, reaching the actual summit involved a fair amount of bush-whacking.

Owl’s Head Path began to flatten out at about 12:09pm, and I reached what (I think) was the former end of the Path at about 12:15pm.

The End Of Owl’s Head Path – Not The Summit!

Here I was able to turn left, to the north, and there was a path heading towards the actual summit – not a very wide path, switching back and forth a lot, but certainly not bush-whacking.

I got to the actual top at about 12:30pm, and had lunch.  Though I had seen a few other climbers after I passed the discouraged young woman, I had the summit to myself for lunch – it was very pleasant!

Owl’s Head Summit

Some Vestiges Of Lunch Are Visible!

I had really, really enjoyed the walk so far… maybe partly because expectations had been so low?

I left the summit, after a nice lunch, still wet with sweat, at about 12:45pm.  I could see Franconia Ridge to the west, through the forest:


And there were some views to the east, towards the Bonds, but the Owl’s Head ridge was more forested that way, so no photos were possible.  I got back to the top of Owl’s Head Path at about 1pm, and to the beginning of the rock slide about 20 minutes later.  I dropped down the slide, taking care (and many photos), reaching the junction with Lincoln Brook Trail at about 2pm.  So, about an hour to descend carefully.

The walk back down Lincoln Brook Trail was pleasant.

Recrossing Lincoln Brook four times – simpler this time – and passing the trail-maintenance crews again, I got back to the junction with Franconia Brook Trail at about 3:36pm.  Here I turned south and walked back down that old railway line.

There was a bit of old railway hardware along the side of the trail.

For much of this section, there were a few mosquitoes, but the walking was pleasant, on a soft bed of pine needles.

I passed a young woman resting on the side of the trail, with a very full pack.  “You’re carrying a lot!” I said, and she replied: “I’m ready to let it go!” in a resigned tone of voice…

Ups and downs… mostly gently downward.  Long and level and wide.  I reached the junction with Lincoln Woods Trail at about 4:11pm, and the Trail got even wider and straighter and easier.  Funnily enough, there is a measured section here, which (of course) I had passed on the way up: 200 yards, with the idea of counting how many paces it takes you.  On the way up, I counted 41 (double) paces, and 44 on the way back.  So I was taking shorter paces on the way down!

 

I reached the Lincoln Woods Visitor Center, and my car, at about 5:15pm.  It had taken me about 9 hours to climb Owl’s Head, substantially less than I had calculated: according to the White Mountain Guide, the ascent alone should have taken about 6 1/2 hours.

But it was a great hike on a wonderful day.  I enjoyed every minute of it!

*

As I arrived in Sydney to take up the newly-created position of International Program Director, one of my biggest priorities was to clarify our program approach.  This would involve lots of internal discussion, research and reflection, and I was determined to bring to this task the lessons I had learned in the previous 25 years of working in the sector (and described in the articles in this series!)

I understood that our program approach needed to be built on a clear understanding of what we were going to achieve, and why.  After completing the staffing of the first iteration of the International Program Team in Sydney, getting to know our programs in Cambodia, Papua New Guinea, and Viet Nam, and settling in with other Sydney-based senior managers and our board, I got going!

*

I had first heard of the concept of “Theory of Change” when I asked Alan Fowler to critique an early draft of the UUSC Strategic Plan in 2005.  He had, quite rightly, pointed out that the draft Strategy was good, but that it didn’t really clarify why we wanted to do what we were describing: how did we understand the links between our actions and our vision and mission?

Reflecting on Alan’s observation, I understood that we should put together a clear statement of causality, linking our actions with the impact we sought in the world.  So we did that, and ended up with a very important statement that really helped UUSC be clear about things:

Human rights and social justice have never advanced without struggle. It is increasingly clear that sustained, positive change is built through the work of organized, transparent and democratic civic actors, who courageously and steadfastly challenge and confront oppression. 

UUSC’s strategy derived from that statement in a powerful way.

Perhaps a better definition of the concept comes from the “Theory of Change Community”:

Theory of Change is essentially a comprehensive description and illustration of how and why a desired change is expected to happen in a particular context. It is focused in particular on mapping out or “filling in” what has been described as the “missing middle” between what a program or change initiative does (its activities or interventions) and how these lead to desired goals being achieved. It does this by first identifying the desired long-term goals and then works back from these to identify all the conditions (outcomes) that must be in place (and how these related to one another causally) for the goals to occur. These are all mapped out in an Outcomes Framework.

The Outcomes Framework then provides the basis for identifying what type of activity or intervention will lead to the outcomes identified as preconditions for achieving the long-term goal. Through this approach the precise link between activities and the achievement of the long-term goals are more fully understood. This leads to better planning, in that activities are linked to a detailed understanding of how change actually happens. It also leads to better evaluation, as it is possible to measure progress towards the achievement of longer-term goals that goes beyond the identification of program outputs.

At ChildFund Australia, one of my earliest actions was to develop and finalize a Theory of Change and the associated Outcomes Framework and Outputs.  In this article, I want to describe how we did this, and what we achieved.

*

First, some definitions.  Strangely, my experience is that when we in the INGO community try to agree on a common set of definitions, we usually end up arguing intensely and never agreeing!  The concepts we seek to define can be viewed productively in different ways; for me, it seemed most useful to find definitions that we could all live with, and use them, rather than trying to reach full consensus (which, over time, seemed to be an impossible dream!)

Here is the visual framework and definitions that we used in ChildFund Australia:


A set of Inputs producing a consistent set of Outputs is a Project; a set of Projects producing a consistent set of Outcomes is a Program; a set of Programs producing a consistent set of Impacts is a Strategic Plan.

Note that:

  • “Inputs” are usually time or money;
  • “Outputs” are tangible and concrete products delivered by or through ChildFund: for example, a training course, a trip or meeting, a publication, rent, a latrine – see below;
  • “Outcomes” are changes in the Outcome Indicators that we developed – see below;
  • “Impact” is the highest-level of organisational achievement, related directly to the achievement of our mission.

This is pretty standard stuff, nothing particularly innovative.  But ChildFund Australia hadn’t formally adopted these definitions, which now began to provide a common language for our program work.

*

When we began to develop ChildFund Australia’s Theory of Change, Outcomes Framework, and Outputs, I took care to bring into the process several important lessons I had learned from previous experiences:

  • As mentioned above, from my experience at UUSC I had learned that the creation of a Theory of Change had the potential to be energizing and unifying, if it was carried out in a participatory manner;
  • Along the way, as the loyal reader of this series will have seen, my own view of development and poverty had grown to incorporate elements of social justice, collective action, and human rights.  I wanted to build these important elements into ChildFund Australia’s understanding of child poverty and development;
  • I recognized the significant complexity and cost associated with crafting and measuring Outcome Indicators, which would essentially articulate how we would hold ourselves accountable to our purpose.  Outcome Indicators are complex to use and expensive to measure, so I felt that we should rely on the work done by technical agencies (the UNDP and UNICEF, other INGOs, and other ChildFund members) whenever possible, and on national-government measurement systems when available and credible.  That meant that using MDG-related indicators, where appropriate, would be our first priority, because of the enormous effort that had been put into creating and measuring them around most of the world;
  • From my work with CCF, especially having participated in their child-poverty study, I had learned that children experience poverty in a more complex way than we had earlier recognized: as deprivation, certainly, but also as exclusion and vulnerability.  We would now incorporate this “DEV” framework in Australia;
  • In my next blog article, I will describe how we created a “Development Effectiveness Framework” for ChildFund Australia.  The “DEF” would describe and detail the processes and products through which we would use the Theory of Change, Outcomes Framework, and Outputs to improve the effectiveness of our development work.  Twice, during my career with Plan International, we had tried to produce such a system, and failed comprehensively (and at great expense).  We had failed due to several fundamental mistakes that I was determined to avoid making in Australia:
    • At Plan, we fell into the trap of designing a system whose purpose was mostly the demonstration of impact, rather than learning and the improvement of programming.  This led to a complex, highly technical system that could never actually be implemented.  This time, I wanted to do both – to demonstrate impact and to improve programs – but fundamentally to create a practical system that could be implemented in the reality of our organization;
    • One of the consequences of the complexity of the systems we tried to design at Plan was that community members were simply not able to participate in the system in any meaningful way, except by using the data to participate in project planning.  We would change this at ChildFund, and build in many more, meaningful areas for community involvement;
    • Another mistake we made at Plan was to allow the creation of hundreds of “outputs.”  It seemed that everybody in that large organization felt that their work was unique, and had to have unique descriptors.  I was determined to keep the DEF as simple and practical as possible;
    • The Plan system was entirely quantitative, in keeping with its underlying (and fallacious) pseudo-scientific purpose.  But I had learned that qualitative information was just as valid as quantitative information, illustrating a range of areas for program improvement that complemented and extended the purely quantitative.  So I was going to work hard to include elements in the DEF that captured the human experience of change in narrative ways;
    • Both times we tried to create a DEF-like system in Plan, we never quite finished: the result was never fully finalized and rolled out to the organization.  So, on top of the mistakes we made in developing the system, at great expense, the waste was even more appalling, because little good came of the effort of so many people and the spending of so much time and money.  In ChildFund, we would not let “the best be the enemy of the good”; I would make sure we moved rapidly to prototype, implement, and improve the system;
  • Finally, I had learned of the advantages and disadvantages of introducing this kind of fundamental change quickly, or slowly:
    • Moving slowly enables more participation and ownership, but risks getting bogged down – and windows of opportunity for change are often short-lived;
    • Moving quickly allows the organization to make the change and learn from it within that short window of enthusiasm and patience.  The risk is that, at least for organizations that are jaded by too many change initiatives, the process can be over before people actually take it seriously, which can lead to a perception that participation was lacking.

I decided to move quickly, and our CEO (Nigel Spence) and board of directors seemed comfortable with that choice.

*

The ChildFund Australia Theory of Change

Upon arrival in Sydney in July of 2009, I moved quickly to put in place the basic foundation of the whole system: our Theory of Change.  Once staffing in the IPT was in place, we began.  First, knowing that effective programs address the causes of the situation they seek to change, and building on the work of Amartya Sen, we defined poverty as the deprivation of the capabilities and freedoms people need to live the life they value.

Then I began to draft and circulate versions of a Theory of Change statement, incorporating input from our board, senior managers (in Sydney and in our Country Offices in Cambodia, Papua New Guinea and Viet Nam), and program staff across the agency.

This process went very well, perhaps because it felt very new to our teams.  Quickly we settled on the following statement:

The ChildFund Australia “Theory of Change”

Note here that we had included a sense of social justice and activism in the Theory of Change, by incorporating “power” (which, practically, would mean “collective action”) as one central pillar.  And it’s clear that the CCF “DEV” framework was also incorporated explicitly.

The four dot-points at the end of the Theory of Change would come to fundamentally underpin our new program approach.  We would:

  • Build human, capital, natural and social assets around the child, including the caregiver.  This phrasing echoed the Ford Foundation’s work on asset-based development, and clarified what we would do to address child deprivation;
  • Build the voice and agency of poor people and poor children.  This pillar incorporated elements of “empowerment,” a concept we had pioneered in Plan South America long before, along with notions of stages of child and human development; and
  • Build the power of poor people and poor children.  Here we were incorporating the sense that development is related to human rights, and that human rights don’t advance without struggle and collective action; and we would
  • Work to ensure that children and youth are protected from risks in their environments.  Our research had shown that poverty was increasingly being experienced by children as related to vulnerability, and that building their resilience and the resilience of the caregivers and communities around them was crucial in the modern context.

This Theory of Change would serve admirably, and endure unchanged, through the next five years of program development and implementation.

*

Output Indicators

Now, how would we measure our accomplishment of the lofty aims articulated in the Theory of Change?  We would need to develop a set of Outcome and Output Indicators.

Recall that, according to the definitions that we had agreed earlier, Outputs were seen as: tangible and concrete products delivered by or through ChildFund: for example, a training course, a trip or meeting, a publication, rent, a latrine.

Defining Outputs was an important step for several reasons, mostly related to accountability.  Project planning and monitoring, in a classical sense, focuses on determining the outputs that are to be delivered, tracking whether or not they are actually produced, and adjusting implementation along the way.

For ChildFund Australia, and for our public stakeholders, being able to accurately plan and track the production of outputs represented a basic test of competence: did we know what we were doing?  Did we know what we had done?  Being able to answer those questions (for example, “we planned to drill 18 wells, and train 246 new mothers, and ended up drilling 16 wells and training 279 new mothers”) would build our credibility.  Perhaps more pointedly, if we could not answer those questions (“we wanted to do the best we could, but don’t really know where our time and the budget went…”!), our credibility would suffer.  Of course, we wanted to know much more than that – our DEF would measure much more – but tracking outputs was basic and fundamental.

To avoid the trap we had fallen into at Plan, where we ended up with many hundreds of Outputs, I was determined to keep things simple.  We had already planned to bring all our Program Managers to Sydney in October of 2009, for another purpose, and I managed to commandeer this key group, locking them in a meeting room for a day with the task of listing all the outputs they were producing and agreeing on a short but comprehensive list.  We would then work with this draft, using it as a starting point.

The process worked very well.  Our Program Managers produced a list of around 35 Output Indicators that covered, well enough, pretty much all the work they were doing.  Over the next three years, as our programming evolved and matured, we ended up adding about 15 more Output Indicators; the final list was settled in March, 2014.

This listing worked very well, enabling us to design, approve, monitor and manage project activities in an accountable way.  As will be seen when I describe our Development Effectiveness Framework, in the next article in this series, we incorporated processes for documenting ChildFund Australia’s planning for Output production through the project-development process, and for tracking actual Output delivery.

Outcome Indicators

Designing Outcome Indicators was a bigger challenge.  Several of our colleague ChildFund agencies (mostly the US member) had developed indicators that were Outcome-like, and I was aware of the work of several other INGOs that we could “borrow.”  Most importantly, as outlined above, I wanted to align our child-focused Outcome Indicators with the Millennium Development Goals as much as possible.  These were robust, scientific, reliable and, in most countries, measured fairly accurately.

As we drafted sets of Outcome Indicators and circulated them for comment with our Board Program Review Committee, Senior Management, and program staff, our CEO (Nigel Spence) insisted that we keep the number of Outcome Indicators as small as possible.

I agreed with Nigel, in general (“keep things simple”) and in particular (in Plan we had been swamped by too many indicators, and never actually implemented either system).  But it was a big challenge to measure the lofty concepts included in our Theory of Change with just a few indicators!

When we finalized the first iteration, approved by our Board of Directors in June of 2010, we had only 16 Outcome Indicators.

Nigel thought this was too many; I thought we had missed covering several crucial areas.  So it seemed a good compromise!

It would take some time to work out the exact mechanism for measuring these Indicators in our field work, but in the end we were able to keep things fairly simple and we began to work with communities to assess change and determine attribution (more on that in the next article in this series.)

Additional Outcome Indicators were introduced over the next few years, elaborating especially the domains of “Protection” and “Power,” which were relatively undeveloped in that initial package of 16, finalized in June of 2010.

*

So, by the time I was celebrating one year at ChildFund Australia, we had agreed and approved a clear and comprehensive Theory of Change, a coherent and concise set of robust Outcome Indicators, and a complete set of (not too many) Output Indicators.

*

Looking back, I think we got this right.  The process was very inclusive and participatory, yet agile and productive.  The results were of high quality, reflecting the state of the art of our sector, and my own learning through the years.  It was a big step forward for ChildFund Australia.

This meant that the foundation for a strong Development Effectiveness Framework was in place, a framework which would help us make our program work as effective as possible in building brighter futures for children.  This was (if I do say so myself!) a huge achievement in such a complex organization, especially given that we accomplished it in only one year.

From the perspective of 2018, there is little I would change about how we took on this challenge, and what we produced.

*

My next article in this series will describe how we built the ChildFund Australia Development Effectiveness Framework on the foundation of our Theory of Change and Outcome and Output Indicators.  Stay tuned!

*


Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team

May, 2018

I began a new journey two years ago (May, 2016), tracing two long arcs in my life:

  • Climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers.  I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Working in international development during the MDG era: what was it like in the sector as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980’s through to the conclusion of the Millennium Development Goals in 2015.

*

Picking up the story in July of 2009, I flew to Sydney for what would become 6 great years as ChildFund Australia’s first International Program Director; Jean would join me there in a few weeks.  In two previous blog entries of this series, I described how I was thinking about programming, and about putting together a strong set of teams in Sydney and overseas, as I approached this exciting new challenge.

In many ways, as I headed towards Sydney, I was hoping to put it all together: 25 years in international development and social justice, designing and implementing programs and partnerships, building cohesive and high-performing teams – this was my chance to start afresh with all of the lessons learned over those decades.  To really get it right.

In this article, I want to introduce the program team at ChildFund’s head office in Sydney – the “International Program Team” – share a bit about the great people I worked with, and describe how the team’s staffing and structure evolved.  I would approach this task very mindful of what I had learned in Plan International, especially how we refocused and restructured the agency, while also keeping in mind the lessons about building strong teams in complex situations that I had learned at UUSC.

But first: I climbed Mt Moriah (4049ft, 1234m) on July 22, 2017, with Kelly Royds and Raúl Caceres, friends from Australia who, coincidentally, had worked at ChildFund (but not on the IPT.)  We had hiked up Mt Pierce and Mt Eisenhower a year earlier, in August of 2016…

*

Raúl and Kelly came up from Cambridge the day before, and we left Durham at about 7:45am, and stopped for sandwiches and coffee in Ossipee on the way north.  Traffic was heavy on this summer weekend, so it wasn’t until 10:45am that we reached the trailhead in Gorham, in the northern reaches of the White Mountains.

IMG_1489.jpg

Raúl and Kelly at the Beginning of the Climb

 

Mt Moriah is the northernmost of the six 4000-foot peaks in the Wildcat-Carter range.  I had climbed Wildcat “D” and Wildcat Mountain (on one day), and Middle and South Carter (on the next day), in September of 2016, solo; and I had summited Carter Dome on 9 July, 2017, with our friend Draco.  So Moriah was the last of these six that I would climb.

Moriah is an “easy to moderate” hike, and we had a nice day for the climb: not too hot, partly cloudy and a bit misty.   It’s 4.5 miles up, and we retraced our steps from the top.

Screen Shot 2017-07-24 at 10.44.47 AM.png

 

 

 

The trail climbs moderately from the trailhead just outside of Gorham, reaching a ledge outlook at Mt Surprise.  There, views opened to the west towards the Presidential range:

IMG_1495.jpg

 

This ledge area seemed unusual to me – not so high in altitude, but quite alpine in nature: low pines and large areas of lichens, as if we were at a much greater elevation:

IMG_1524

IMG_1523

 

I guess the conditions here were affected by a combination of elevation and latitude: since we were at the northern end of the White Mountains, perhaps the winter weather would be a bit more severe?

It was a bit misty, with views that were not quite as dramatic as last time I was up on this range, but still very impressive.  There is a short boggy area near the top.

IMG_1497.jpg

Nearing the Top – Raúl, With Kelly in the Background

IMG_1498.jpg

 

We reached the spur path to the summit of Mt Moriah at 2pm.  From here it wasn’t too far to the top.

IMG_1508.jpg

 

The summit itself is a small, rocky clearing, and on this day it was quite crowded, so we ate a late lunch at an outlook a short distance from the top, with great views to the north:

IMG_1503.jpg

The (Crowded) Summit Of Mt Moriah

 

An outcropping was visible to the south, without anybody on it.  It looked like there would be great views from there, so after lunch I decided to go ahead and try to reach it; maybe the view would be worth the extra walk.  When we reached the intersection with the Kenduskeag Trail, Kelly and Raúl decided to wait for me there; I kept going, hoping to reach that outcropping.

IMG_1509.jpg

 

Taking first the right-hand turn (along the Carter-Moriah Trail, coincident with the Appalachian Trail here) and then doubling back to explore briefly along the Kenduskeag Trail, I just couldn’t find that outcrop.  After poking around a bit, I headed back to where I had last seen Raúl and Kelly, but they had gone.  So I began the descent back to the trailhead.

Soon I passed a couple who were climbing up, and I asked them if they had seen my friends.  No!  Whoops!  Clearly we had gotten separated at the top, so I asked the couple to tell Raúl and Kelly that I had begun the descent.  A few minutes later I was able to get reception on my cellphone, and rang Raúl: sure enough, they were waiting for me back at the summit!  In retrospect, I should have thought of that – of course they would want to wait where there was a view! – but I had been too tired to climb back up there, and assumed that they were feeling similarly.

From their perspective: as I was descending, Raúl and Kelly became worried that I was lost, and perhaps injured.  Finally they decided that it would be best to walk down to the trailhead – and then my call to Raúl came through, and we realized what had happened.

I descended as we had ascended, a beautiful day for a hike in the White Mountains:

IMG_1519

IMG_1520

IMG_1521

Presidential Range, Now Backlit In The Afternoon

 

Nearing the trailhead, I came across part of an old car that I hadn’t seen on the way up:

IMG_1528.jpg

 

I arrived back at the trailhead at 5pm, and Kelly and Raúl finished at about 6:30pm.  Despite our inadvertent (and temporary!) separation, all turned out OK and it was a pleasant and enjoyable day.  Mt Moriah, peak number 33, was climbed – 15 more to go!

*

When I arrived in Sydney, one of my first tasks was to finalize the structure of the new International Program Team (“IPT”), and complete its staffing.  Having spent lots of time and energy worrying about structure in previous roles (particularly at the International Headquarters of Plan International – see this blog), I was thinking about this in two ways:

  1. Because structure has a strong influence on behavior, I wanted to keep the IPT’s structure lean, flat and close to the field, and efficient, cost-wise;
  2. Because with the right people, structure wasn’t the most important thing, I wanted to not worry about it too much: just get the right people, make as good a structural decision as I could, and then get on with the work and let things evolve through intentional, restorative, reflective learning.

It turns out that these two aims are fairly consistent.  Yes, structure does have a strong influence on behavior, so it’s important not to get it wrong.  And, within reason, flatter structures are better: fewer levels of bureaucracy between field and senior management keep things a bit more grounded in the reality of our NGO work.  Flatter structures also help keep head-office costs down.  But I had also learned an important lesson along the way: hire great people, make roles very clear and connected to the organization’s mission and people’s passions, and then let things evolve, reflecting and learning-by-doing.  Don’t obsess too much about structure.

So that’s what I did.

*

When I arrived in Sydney, the IPT was in flux.  Even though ChildFund Australia had been working in Papua New Guinea for fifteen years, and in Viet Nam for a decade, the main role of Sydney-based program staff at that point was to oversee projects funded through the Australia-NGO Cooperation Program (“ANCP”) of AusAID (the Australian Government’s overseas aid agency), implemented by the US member of the ChildFund Alliance, the confusingly named ChildFund International.  This meant that the IPT had little role in ChildFund Australia’s own programming…

In 2009, ChildFund Australia was preparing for growth: our private income was growing strongly, the new Labor government was promising to increase overseas development assistance substantially, in line with international commitments, and we had just become top-tier ANCP “Partners” with AusAID – so it looked like that income stream was also going to grow quite rapidly.  Part of that preparation for growth was the creation of my new role as International Program Director, which would assume the management of ChildFund Australia’s three (soon to be five) Country Directors.

ChildFund Australia’s organizational structure as I arrived in Sydney looked something like this:

IPD Structure - 1.002

 

  • Five Department Directors worked with Nigel: Bandula Gonsalkorale (Finance and IT); Jan Jackson (HR); Lynne Joseph (Sponsor Relations); Di Mason (Fundraising and Marketing); and me;
  • Initially we had three Country Directors, handling program implementation and reporting to me: Carol Mortensen (Cambodia); Smokey Dawson (PNG); and Peter Walton (Viet Nam).  Peter also handled regional responsibilities for the Mekong, supervising ChildFund’s research into setting up operations in Laos.  I will share more about these Country Directors, and their successors and teams, in upcoming articles in this series…

And in the Sydney Program Department, five positions were in the FY2010 budget (in addition to my own):

  • Veronica Bell had just left ChildFund, taking up a position at the Human Rights Council for New South Wales.  So her International Program Manager position was vacant;
  • Richard Geeves had just joined, only a few days before my arrival, as International Program Coordinator for the Mekong programs (Cambodia and Viet Nam).  Richard had long experience in the education sector in Australia (including in indigenous areas), and had recently returned to Australia after many years working in Cambodia;
  • Rouena (“Ouen”) Getigan had joined ChildFund several years earlier, and therefore was our repository of wisdom and knowledge; the rest of us were new, but Ouen knew how things worked!  She handled relations with our ChildFund partners in Africa and Asia that were funded through the ANCP program, and did an outstanding job of building and maintaining these partnerships.  In addition, to support a large regional HIV and AIDS project in Africa, Ouen supervised a very capable Kampala-based project coordinator, Evas Atwine;
  • Terina Stibbard, like Richard, had just joined ChildFund, only a few days before my arrival, as International Program Coordinator for Papua New Guinea.  Overflowing with passion for the work, and with a tireless commitment, Terina took on what was perhaps our biggest challenge: building a strong program in PNG.  I will write much more about PNG in a future blog post in this series.  Also, among other things, Terina introduced us to the concept of “critical friend,” which perfectly captured the IPC role with our Country Offices: without direct authority, but able to advise and speak truth directly without harming relationships;
  • And Nigel had left one position undefined, for me to consider.

Interviews for the Mekong and PNG roles had begun before I was hired, but before finalizing things with Richard and Terina, Nigel and Jan had consulted me, asking if I wanted them to wait until I got to Sydney before finalizing these hires.  But Richard and Terina looked great to me, on paper, and I saw no reason to delay.

In terms of the program-team’s structure, I didn’t see any reason, at this point, for the extra structural level implied by the “International Program Manager” role.  Over time, I thought things might evolve across three general domains:

IPD Structure - 1.001

 

In the Program Support domain, one group of staff in Sydney would accompany Country Directors and, most directly, the Program Managers in our program countries, helping develop projects and programs with the greatest impact on the causes of child poverty in each location.  In the Program Development area, Sydney staff would provide technical and systems support, establishing standards and helping measure results.  Finally, of course, we had a general function of Program Implementation – our Country Directors.

As we will see, the IPT structure did in fact evolve in this way.

*

So here is the first iteration of the IPT structure, put in place soon after my arrival:

IPD Structure - 1.003

 

Richard, Ouen, and Terina focused mainly on “Program Support” duties, working directly with our Program Managers in Cambodia, PNG, and Viet Nam, and with ChildFund partner offices in Asia and Africa to help them develop and implement, and learn from, increasingly sophisticated programming.  Two new hires, Jackie Robertson and Cory Steinhauer, joined ChildFund to support program development: Jackie was focused on developing the policies and standards that would govern our work; and Cory would focus on building a development-effectiveness framework through which we would design our programs and measure our results.

Here are some images of that first IP team:

Copy of IMG_1483.jpg

Terina, Richard, Cory, Jackie, Me and Ouen

OLYMPUS DIGITAL CAMERA

Richard, Terina, Me

OLYMPUS DIGITAL CAMERA

Ouen

*

This structure worked very well.  In terms of how I managed the team, from the beginning I tried to put in place a range of “restorative” practices, aimed at keeping the team together, keeping us grounded and motivated:

  • Every Monday morning at 10am, we had a team check-in.  I had learned how to do this from Atema Eclai at UUSC, though I had to adapt it quite a bit: Australians weren’t enthusiastic about the “touchy-feely” aspects of check-ins like Atema’s.  So we limited things to a brief general chat followed by a discussion of priorities for the week.  This seemed to work very well, settling us into the week smoothly, and was replicated by the team even when I was away;
  • Every month or two, we had a formal IPT Meeting.  These events had agendas, minutes, pending-action-items lists, etc.  They were business meetings, which I would chair, meant to be efficient fora for decision-making and accountability.  They worked very well.  For example, I had learned how to use “pending-action-items” lists from Max van der Schalk while working at Plan’s international headquarters, and the introduction of this tool was very important at ChildFund: decisions that required action went onto the list, organized in order by date, and stayed on the list until they were completed or the IPT agreed to remove them.  This provided a strong element of accountability and was a helpful irritant that kept us from neglecting decisions and becoming less accountable.  Once the ChildFund Australia “Development Effectiveness Framework” (“DEF”) was completed, we introduced a short reflection from a field case study at the beginning of each IPT Meeting, to help ground us in the reality of our work; much more detail about the DEF will come in a future blog in this series;
  • My intention was to complement the formal IPT meetings with periodic reflection meetings about a topic related to our work.  These sessions wouldn’t have agendas or minutes, much less structured and more relaxed than the IPT Meetings, and would be chaired by different members of the IPT who had expressed an interest in a particular topic – micro-finance, human rights, direct giving, etc.  These sessions were always interesting and useful, and energizing, so I regret not organizing more of them.  Somehow they seemed to drop off the agenda with the fast pace of work, over time, and I don’t think that I fully realized the potential of this concept;
  • I tried to maintain an open-door policy, making myself available to IPT members at any time.  I made sure to close my door only when necessary, and to invite any team member to come in and sit down whenever they dropped by.  I think this was helpful in creating and sustaining a culture of caring and support, clearly communicating to everybody that helping the team was my main job.  Of course, there were times when my office door needed to be closed – for the discussion of sensitive matters, particularly on the phone with our Country Directors – but I had learned at UUSC to be quite mindful of asking permission to close my door, to enhance transparency and make sure people were comfortable.  As with the team check-ins, it seemed that our mostly-Australian staff viewed this habit of mine – asking permission to close my door – as a bit silly.  But I think it was helpful;
  • Of course, I carried out an annual performance review of each member of the team, and spent lots of time preparing these documents.  I tried to be balanced, but to always include areas for improvement – loyal readers of this blog will remember my experience with Pham Thu Ba, back in Plan Viet Nam: when I finished her first performance review, which was stellar, she told me I wasn’t doing my job if I couldn’t help her improve!  This made a big impression on me, and even though western culture these days seems to only value praise, I wanted to honor Thu Ba’s example in my work in Australia.  This worked well, most of the time!;
  • In addition to the yearly performance review process, I tried to have some less-formal, one-on-one time with each IPT member every year.  I’d invite them for coffee or lunch, and have an open, unstructured chat about how things were going. I wasn’t able to make this happen as often as I wanted, but it was a very useful mechanism, helping surface concerns and opportunities that I might not have appreciated otherwise;
  • Finally, also dating from my time in Viet Nam, I adapted and used a “Team Effectiveness Assessment” for use with the IPT, and was able to use this tool to formally assess how we were progressing.   The framework I used came from a great workbook that I had discovered at Asia Books in the Bangkok Airport, back when I worked in Hanoi in the late 1990’s: “The Team-Building Workshop,” by Vivette Payne.  The approach included in the book outlined eight elements of team effectiveness, and a survey was included that could be used to measure the status of a particular team.  Starting with how I had used the survey in Hanoi, I now adapted the survey and used it four times with the IPT, tracking results and identifying areas that we could focus on to improve (in yellow):

Screen Shot 2018-05-10 at 1.26.17 PM

You can see that our overall score, a measure of team effectiveness, improved from 197.1 in March, 2011 to 230.6 in December of 2011, and then moved back down to 204.1 in January of 2014.  I think that the decrease in score reflects the arrival of several new IPT members, and the corresponding need to settle the team down into new roles and relationships.

Each time we used this tool, we identified areas for focus, which are indicated in yellow: for example, in October of 2012 we looked to focus on “Roles,” “Team Relationships,” and “Skills & Learning.”  I found the tool to be practical and very useful, though not to be taken too literally; discussion of results and team reflection on next steps were more important than the numerical scoring… and the fact that I was using this tool periodically gave the team a message that I was taking our effectiveness seriously, and investing my time, and all of our time, in improving the team environment.
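For readers curious about the mechanics, a survey tracker of this kind can be sketched in a few lines of code.  This is an illustrative sketch only: the element names, the 1-to-5 rating scale, and the scoring arithmetic are assumptions for the example, not Vivette Payne’s actual instrument or the spreadsheet we used at ChildFund.

```python
# Hypothetical team-effectiveness survey tracker (illustrative only).
# The element names and 1-5 rating scale are assumptions for this sketch.

ELEMENTS = [
    "Mission", "Goals", "Roles", "Procedures",
    "Team Relationships", "Leadership", "Skills & Learning", "Climate",
]

def element_averages(responses):
    """Average each element's ratings across all respondents.

    `responses` is a list of per-person dicts mapping element -> rating.
    """
    return {e: sum(r[e] for r in responses) / len(responses) for e in ELEMENTS}

def overall_score(averages):
    """Sum the per-element averages into a single team score."""
    return sum(averages.values())

def focus_areas(averages, n=3):
    """Flag the n lowest-scoring elements as candidates for team focus."""
    return sorted(ELEMENTS, key=lambda e: averages[e])[:n]

# Two survey rounds for a fictional seven-person team; in the first round
# "Roles" and "Team Relationships" score lower than the other elements.
march = [{e: (2 if e in ("Roles", "Team Relationships") else 3)
          for e in ELEMENTS} for _ in range(7)]
december = [{e: 4 for e in ELEMENTS} for _ in range(7)]

m_avg, d_avg = element_averages(march), element_averages(december)
print(overall_score(m_avg), overall_score(d_avg))  # the total rises between rounds
print(focus_areas(m_avg))  # the weakest elements surface as focus areas
```

As in our practice, the numbers themselves matter less than the conversation: the point of the scoring is simply to surface the weakest areas so the team can discuss them.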

*

In one of my blogs about UUSC, I described how I had created the “UUSC Handbook,” to enhance clarity of how things would be done in that agency.  From my perspective, as Executive Director there, the UUSC Handbook was a big success, notwithstanding its large size.  Given the tensions that existed in that agency, having an agreed, approved set of standards and procedures was helpful, and since it mainly simply codified and clarified existing practices, it didn’t create too much bureaucracy.

I replicated this approach at ChildFund Australia, creating the ChildFund “Program Handbook.”  Like its UUSC predecessor, the Program Handbook was quite complex, and bulky to produce and update – which we did periodically, since both were meant to be living documents.  And much of its content already existed, simply needing to be codified.

But, unlike the UUSC Handbook, ChildFund’s document contained much that was new: our Theory of Change and our Development Effectiveness Framework, and a range of program policies – these were new, developed by the new IPT, and represented the ongoing maturing of ChildFund’s programming.

A copy of a version of the ChildFund Australia Program Handbook is here (Program Handbook – 3.3 DRAFT ); even though this is marked as “Draft,” I think it was the final update that we issued before I left Sydney in 2015.

*

Two years later, in 2011, ChildFund Australia was growing strongly, and we had commenced operations in Laos.  The IPT structure in Sydney evolved consistently with this growth:

IPD Structure - 1.004

 

Carol Mortensen continued as CD in Cambodia, but changes had been made in PNG and Viet Nam, and we had started operations in Lao PDR:

  • Andrew Ikupu, a very experienced Papua New Guinean, had replaced Smokey Dawson as CD in PNG.  Andrew had long experience working in development in his country, and had a PhD from the University of South Australia in Adelaide;
  • Deb Leaver had taken over from Peter Walton in Viet Nam.  I had first met Deb in late 2009, when I visited ActionAid Australia, where Deb was Program Director, and she had been probably the most welcoming of my peers in Sydney.  We were lucky to hire Deb to follow Peter;
  • Chris Mastaglio, with his able colleague Keoamphone Souvannaphoum, had helped ChildFund with the initial research into why, how, and where we should work in Laos.  Once we made the decision to start working there, we were fortunate that both Chris and Keo were available to join ChildFund: Chris as CD, and Keo as Program Manager (and, later, as CD when Chris transitioned to head up a regional sport-for-development program).

We were very lucky to have Andrew, Chris, Deb and Keo join ChildFund Australia.

In Sydney, things had also evolved.  Cory Steinhauer had departed, and Richard Geeves had moved over from Program Support (where he had served as IPC for the Mekong) to work on Development Effectiveness.  He was quite good at this role: while I was the primary architect of ChildFund’s Development Effectiveness Framework, which I will describe in detail in a future blog post in this series, Richard was an able foil, working to keep things simple and practical, and he had a good touch with the field, keeping the implementation of what was a new, challenging system on track, with good humor.

John Fenech joined the Program Development team, helping our Country Offices prepare grant proposals.  Relative to our size, ChildFund Australia had a lower proportion of income from technical grants (bilateral, multi-lateral, foundation) than our peer organizations, and John’s role was to build our portfolio.  Although John was one of the younger members of the IPT, he brought a vivid countercultural sense, sometimes seeming to date more from the 1970’s than from the 2010’s.  In a good way…

Terina remained engaged with PNG, and she was doing a fantastic job working with Andrew Ikupu and his Program Manager Manish Joshi (later becoming CD there).  As a result, our programs in PNG were really taking off – growing in size, impact, and sophistication, and diversifying in income source.  And Ouen continued to work with our ChildFund International partners across Africa and Asia as they implemented an increasing number of ANCP-funded projects.

As our programs were expanding, two new IPCs had joined, working with our programs in the Mekong: Caroline Pinney took over support for Cambodia and Laos, and Maria Attard worked with our team in Viet Nam, while also coordinating research and initial engagement in Myanmar.  Caroline brought long experience in Asia, with AVI (the Australian volunteer-sending agency), and a very strong level of dedication and passion for our work.  Maria’s work had been in Cambodia (working with women and children who had suffered from domestic violence) and the Pacific (in the disability sector), before returning to Australia (continuing in the disability sector).  Maria brought a welcome sense of activism to the team, building on her advocacy work in the disability sector.  Both Caroline and Maria showed remarkable dedication to the heavy workload and complicated realities of the programs that they supported.

Finally, in this second iteration of the IPT structure, we decided that the scale of operations was large enough to merit a program officer to provide a range of support services to the team.  Initially we wanted to hire an indigenous Australian, accessing subsidy programs offered by the government.  This was Terina’s idea, and was a very good one, but we were never able to make it work due to complicated and dysfunctional bureaucracy on the government side.

So we shifted concepts, and decided instead to look towards recent graduates in international development.  Given how many people were finishing degrees in the field, and how few jobs there were, we thought it would be good to make the position time-limited – giving new graduates some real work experience, and some income, while taking some administrative load off of the rest of the IPT.  And then booting them out into the real world.

We recruited externally, and were able to hire a very smart, extremely hard-working new graduate, Mai Nguyen.  From then on, Mai handled a range of administrative and program-support duties with great efficiency and good humor.

Here are images of that iteration of the IPT:

Cropped Team Photo

IPT in February, 2012.  From Left To Right: John Fenech, Ouen Getigan, Me, Maria Attard, Terina Stibbard, Mai Nguyen, Caroline Pinney, and Richard Geeves.  Missing: Jackie Robertson

IPT in March 2012.jpg

IPT in March, 2012.  From Left To Right: Ouen Getigan, Maria Attard, Terina Stibbard (seated), Caroline Pinney, Jackie Robertson, Me (seated), John Fenech, Mai Nguyen, and Richard Geeves

 

*

In 2014 we introduced the IPT’s third structural evolution, the last version of my time as International Program Director.  At this point, our scale had grown further with the addition of Myanmar, and with 11 direct reports I was having trouble providing proper individual attention to everybody.  So we introduced a new management level, partly to reduce my “span of control”: Ouen and Richard became “Managers”:

 

IPD Structure - 1.001

 

Ouen would be handling Program Development and the support of our Development Effectiveness Framework, and Richard moved to manage Program Support while also serving as IPC for PNG (after Terina Stibbard departed.)  This allowed me to give priority attention to the five Country Directors now reporting to me, and to Ouen and Richard.

Here is an image of that final iteration of the Sydney-based team:

IMG_3483

IPT in November, 2014: John Fenech, Sanwar Ali, Caroline Pinney, Richard Geeves, Maria Attard, Me, Manasi Kogekar, Mai Nguyen, Sarah Hunt, and Ouen Getigan.  Missing: Jackie Robertson.

 

We had upped our technical support capacity by recruiting Sanwar Ali from Oxfam Australia; he would head support for our increasing Disaster Risk Reduction and Emergency Response efforts.  John Fenech had moved to serve as IPC for Cambodia, allowing Caroline Pinney to focus on Laos, and Mai Nguyen had moved to serve as IPC for Myanmar, allowing Maria Attard to focus on Viet Nam.  To replace John in the grant-development role, we (re)hired Sarah Hunt, who was quickly very successful in bringing in additional resources to the program; Sarah had served on the IPT before my arrival, and we were lucky to bring her back, thanks to Ouen’s strong recommendation.  Sarah made grant development look easy, which it certainly isn’t!

This team worked very well, and seemed harmonious and effective.  Ouen and Richard were good, supportive managers of their teams, and I was able to spend much more time with our Country Directors.

*

In my last article in this series, I shared a framework that I developed over time, for thinking about effective teams in NGO settings:

Clarity Trust Inspiration - 1.001

 

In that article, I said that “… our INGO teams will perform strongly if:

  • their task is clear, accountability is clear, what we are supposed to do, and why, is clear, and if how to carry out our tasks is clear;
  • we operate in a context of high trust;
  • the inspiration that we bring to our work is refreshed periodically.  And:
  • the normal wear-and-tear on our human relationships, the harm done over time, is restored intentionally.”

How did we do in ChildFund Australia’s IPT?

  • Clarity: We did fairly well here.  I was careful to engage with the IPT to make sure that their roles and jobs were clear, and the work we did to develop a programmatic Theory of Change and Development Effectiveness Framework also greatly enhanced clarity.  The preparation and frequent updating of the Program Handbook provided clarity as well, though it was perhaps viewed as a bit bureaucratic by some.  But, overall, I’d say things were clear;
  • Trust: this is a bit harder to judge, for me, because it was my job to create and maintain an environment of trust.  Trust comes from a combination of competence and honesty, and I feel that IPT members viewed me as quite competent and honest.  For example, I decided to share minutes of all Senior Management Team meetings with IPT members at our IPT Meetings – orally, in summary, and omitting any confidential content.  I think that sharing this information helped reinforce a sense of transparency.  But of course many factors were beyond my control, and I was imperfect in my communications skills;
  • Inspiration: I think we did fairly well here: I tried to bring a sense of the realities in the field into all our meetings, and into board and Senior-Management meetings, using (for example) case studies from our Development Effectiveness Framework to reconnect us with the deeper motivations that brought us into the NGO sector.  Again, I was imperfect in this, but I think we did pretty well;
  • Restorative Practices: earlier in this article I described my efforts to build restorative practices into the ongoing context of the IPT, and I think these worked very well.

Overall, perhaps a solid B+.

*

That’s some of the story of ChildFund Australia’s International Program Team, from 2009 through 2015.  ChildFund’s work expanded enormously during that time, and the IPT managed to support that expansion smoothly, with increasing attention to the quality and sophistication of our programming.

It did come at a financial cost: program support increased from around 4% of funds remitted to international programming in 2010, to 6.7% in 2015.  My sense is that the gains in effectiveness and impact were well worth this investment – I will explore this in more depth in an upcoming post in this series.

I enjoyed working with the IPT, and learned a lot from them.  Morale was good, consistently, and though I can’t take sole credit for that success, I think that the approach we took helped.

With gratitude and warm appreciation to:

  • Sanwar Ali
  • Maria Attard
  • John Fenech
  • Richard Geeves
  • Rouena Getigan
  • Sarah Hunt
  • Manasi Kogekar
  • Mai Nguyen
  • Caroline Pinney
  • Jackie Robertson
  • Cory Steinhauer
  • Terina Stibbard

*

Stay tuned for more blog posts about ChildFund Australia: our Theory of Change and Development Effectiveness Framework, our work and great teams in Cambodia, Laos, Myanmar, Papua New Guinea, and Viet Nam, and much more…

*

Here are links to blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration.

Why Activism? Inspiration from Ta-Nehisi Coates

April, 2018

I’ve just finished “Between the World and Me,” by Ta-Nehisi Coates.  An important book, highly recommended.

I found myself stopping and re-reading many times, just to drill down into something that took me by surprise, or really reached me.  Here is one quote that I like very much.

“History is not solely in our hands. And still you are called to struggle, not because it assures you victory but because it assures you an honorable and sane life.”

Ta-Nehisi Coates, “Between the World and Me,” page 97.

Coates’s insight is consistent with a thought from Wendell Berry, but from a slightly different angle:

“Protest that endures, I think, is moved by a hope far more modest than that of public success: namely, the hope of preserving qualities in one’s own heart and spirit that would be destroyed by acquiescence.” 

Wendell Berry, from “A Poem of Difficult Hope,” 1990 

We could think of these messages as pessimistic.  My sense is that Coates and Berry are encouraging us to reflect on a deeper sense of how we can sustain meaning for ourselves.  Being outwardly directed is important: we have to keep up our work to transform the world.  Seeing this work as a way of being true to who we are, and who we want to be, helps us keep at it when the climb so often seems too steep.

*

Check out my series about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall (1219m) and, each time, reflecting a bit on the journey since I began to work in social justice, 30 years ago: on development, human rights, conflict, experiences along the way, etc.


Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration

April, 2018

I began a new journey nearly two years ago (May, 2016), tracing two long arcs in my life:

  • Climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers.  I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Working in international development during the MDG era: what was it like in the sector as it boomed and evolved, from the response to the Ethiopian crisis in the mid-1980s through to the conclusion of the Millennium Development Goals in 2015?

*

Picking up the story as I arrived in Sydney in July, 2009, to take up the newly-created position of “International Program Director” for ChildFund Australia, I was thinking a lot about how to build great programs for children and youth.  I wrote about that last time.

And I was also thinking about the other big part of my new job: building strong teams.  Next time I will introduce some of the people I worked with in those teams – in Sydney, Port Moresby, Hanoi, Phnom Penh, Vientiane, and Yangon.

This time I want to share thoughts about how to build teams, in particular in the context of international non-governmental organizations.  Through my career in the INGO sector, I was fortunate to work in, and lead, teams across the world, and I learned a lot about how to build strong, high-performing teams: learning by doing, from watching others, and from my own mistakes.

I was determined to bring this learning to ChildFund Australia.  But before diving into that topic…

*

I climbed both Galehead Mountain and Mount Garfield on 19 July, 2017.  My plan that day was to walk up Gale River Trail to join the Garfield Ridge Trail, and then take the Frost Trail to reach the top of Galehead Mountain, which would be number 31 of the 48 4000-footers.  Then I would loop around Garfield Ridge to go up Mt Garfield, and return to meet up with Jean at the bottom of Garfield Trail.

Jean had driven up from Durham with me, and left me at the trailhead of the Gale River Trail.  She would spend the day with an old friend from high school, planning to pick me up at the end of the day.

I reached the top of Galehead Mountain at a little after noon.  When I had arrived at the Garfield Ridge Trail, going up, it seemed that I was making great time.  But by the time I dropped down from Galehead, and left Galehead Hut to head towards Mt Garfield, I was much less optimistic: to reach the trailhead by 5-6pm, as arranged with Jean, I thought I needed to leave Mt Garfield by 3pm, at the very latest.  I had less than three hours to get to the next peak.

So I headed down from Galehead and tried to keep up a good pace.


I got back to the junction with Twinway and Garfield Ridge at about 1pm, and continued towards Garfield.  The walking was, at first, quite pleasant as I retraced my steps down to where I had come up Gale River:


From there, it was pleasant walking along Garfield Ridge.  Continuing along the ridge in a westerly direction, I reached the junction with the Franconia Brook Trail (at the saddle of Garfield Ridge Trail, between Galehead and Garfield) at about 2:15pm.

[Photo: View Looking Down Franconia Brook]

[Photo: Looking Back, Galehead Hut Is Just Visible In The Saddle, With South Twin Above It To The Left, And Galehead Mountain Above It To The Right]

I was getting nervous: I had calculated that I needed to start descending from the summit of Mt Garfield by 3pm, in order to reach the trailhead, where Jean would be waiting, by 5-6pm.  But from the saddle, well below the summit, at 2:15pm, Mt Garfield towered over me, and the next section of the hike looked to be very steep.  VERY steep.

In all of these climbs, all 32 of them thus far, I don’t think I have ever been as tired as I was now.  The climb up from the saddle between Galehead Mountain and Mt Garfield felt unrelenting, up up up.  It was very hot, very humid, and I was down to one liter of water, of the 2.5 liters I had started with.  Luckily, I passed by Garfield Ridge campsite, and there is a wonderful spring there, so I drank a full liter of cool, clean mountain water – a great relief!  Fantastic!

But, even so, the climb was unrelenting.  It was very challenging, a really tough climb up 0.7 miles from the saddle to the top.

I reached the junction with the Garfield Trail at just after 3pm, and decided to drop my backpack there, and finish the climb to the summit with just a bottle of water and my walking stick.

At least I had water.

Luckily, though the last section was very steep, I got there at about 3:15pm.  Though I was exhausted, the views from the top of Mt Garfield were stunning, with just enough clouds to produce a nice contrast as I looked around.  I could see Owl’s Head in front of me, and the peaks of Flume, Liberty, Lincoln and Lafayette to the west.

[Photo: Summit of Mt Garfield – Foundation of the Former Fire Lookout Tower]

[Photo: From The Summit Of Mt Garfield: Galehead Mountain Is In The Foreground, South Twin In The Background]

[Photo: Franconia Ridge, On The Right, and Owl’s Head Below, To The Left]

[Photo: Looking Back Towards Galehead, and The Twins]

Sadly, my camera seriously fogged up at the top of Mt Garfield, so the photos I took towards Franconia Ridge were spoiled.  This video panorama of the view is also fogged up, but perhaps the beauty of the day can be inferred here?

 

I couldn’t stay too long at the top, though it was beautiful, because I was worried about reaching the parking lot too late.  So I headed back down to the junction with Garfield Trail, picked up my backpack, and started down from there at 3:30pm, a half hour later than I had hoped.  Here I’m looking back up at the junction as I began the descent down Garfield Trail:

[Photo: Looking Back Up At The Junction With Garfield Trail]

Luckily, exhausted as I was, the 4.8 miles down Garfield Trail were not challenging, just long, long, long.  By about 4pm, I hadn’t seen anybody at all, which was quite a change from the steady stream of hikers, and through-hikers, up on the ridge.  But, at a very awkward moment, a young hiker passed by me, walking quickly, just saying hello.  If she had been just a few moments earlier, it would have been quite embarrassing (probably for us both!)

The walking was fairly easy, gently downward, on a beautiful White Mountains day.

My feet were sore and I was very ready to finish the hike by the time I arrived at the end of Garfield Trail, at 5:30pm – nicely within the range I had predicted.  The descent had taken two hours, and Jean was waiting there!  Happily, she had only been waiting a few minutes!

[Photo: 5:33pm At The Trailhead!  I Look Fresher Than I Felt!]

 

What a great day – two 4000-footers on a beautiful day.  But far more challenging than I had expected!

*

As I flew towards Sydney in mid-July, 2009 (Jean would join me there two months later), I was thinking a lot about two aspects of my new role.  On the one hand, my role was “International Program Director,” which meant that I was expected to lead the thinking and strategy related to ChildFund Australia’s development and humanitarian work.  In my last blog entry I outlined some of what I was thinking about when I was thinking about great INGO programming…

At the same time, I would lead several teams and be a member of others.  In Sydney, I would lead the “International Program Team” (“IPT” – I will write more about this team next time), and I would be a member of the two “Senior Management” teams that Nigel Spence, ChildFund Australia’s CEO, had recently established.  First, there was the Sydney-based “Business Support Leadership Team” (“BSLT,” chaired by Nigel), which comprised Nigel and the five Department Directors based in Sydney.  The BSLT was focused on leading the functions that made our programs possible: fundraising, finance, IT, human resources, sponsor relations, governance support, etc.  The role of the BSLT was described in the team’s charter:

The Business Support Leadership Team is responsible and accountable for developing and implementing systems, policies, procedures, guidelines and controls that enable the organisation to meet strategic and business objectives. The Business Support Team is also responsible and accountable for securing resources and determining resource allocation. 

And then there was my relationship with ChildFund Australia’s overseas teams in Hanoi, Port Moresby, and Phnom Penh.  As Nigel and I had discussed my new role, we looked at two possibilities:

  • Nigel could continue to directly manage ChildFund’s three Country Directors (located in Cambodia, Papua New Guinea and Viet Nam), as he had been doing.  This option would put me in a “staff” role in relation to overseas operations, “line” managing only IPT members in Sydney.  This would be similar in some ways to my role at Plan’s headquarters;
  • I could take over Nigel’s “line” management of the overseas CDs in addition to managing IPT members in Sydney.

Loyal readers of this blog will recall an earlier discussion of the tradeoffs involved here: as I moved from being Plan’s Regional Director for South America to the post of Program Director for the global organization, Max van der Schalk (Plan’s CEO at the time) and I had looked at two similar options.

In that case, we decided that I would not manage Plan’s Regional Directors, leaving him as their “line” manager; this left me in a “staff” role.  This would keep the organization’s structure a little bit flatter, but would burden Max with a broader span of control.  But that’s the way we went, and we made my new title reflect the difference: instead of following Marjorie Smit as “Program Director,” we decided my title would be “Director of Planning and Program Support.”  A rose by any other name…

So I was free to focus on strategy and structure, without being distracted by the daily dramas involved in line management – spending pressures, audit responses, personnel issues, etc.  It felt right at the time, and I certainly had more than enough power to get my job done; but later I did feel that the additional clout that line management would have given my role might have been helpful in making the transformational changes (in Plan’s goals, structure, and resource allocation) we achieved.  But I was happy with the choice we made, and we did make those changes.

I described the tradeoffs as I saw them to Nigel, and left the decision to him; I felt that I could go either way.  But I was delighted when he decided that I would become the line manager of ChildFund Australia’s three Country Directors … though I quickly discovered that the CDs felt quite differently about what they saw as a loss of status.

So I would also lead and manage those three people, which became five as we expanded into Laos and Myanmar in the next few years.  The second “Senior Management” team that Nigel had recently formed was the “Program Operations Team” (“POT”), which comprised him, me, and the three Country Directors; I would chair that team.  The role of the POT was described in its charter:

The Program Operations Team is responsible and accountable for operations: individually in their countries and head office; and collectively for the wider organization.  The Program Operations Team is focused on program strategy, managing the daily operations of the organization and furthering the achievement of ChildFund Australia’s programmatic goals.

This meant that I was going to be in three teams in my new role, leading two and joining the third as a member.  (I’d also co-chair the ChildFund Alliance Program Committee, but that’s a different story…)

*

Over the previous 25 years, I had learned a lot about working in, and leading, teams.  I had learned that people working in INGOs, generally speaking, are intrinsically motivated.  We join our agencies because we feel driven to help improve the world, with a passion for making a difference – not everybody was like that in my experience, but most were.  I saw this across all the organizations I had worked in, and all the locations where I had worked – we could almost take motivation for granted.  This was a luxury, something that many private-sector organizations work very hard to produce.

And that intrinsic motivation is a gift that could be spoiled if not handled correctly.  For example, my sense was that if a team leader managed as if motivation were a problem, and put in place mechanisms of control based (in part) on distrust, that kind of management culture would clash with the nature of our people, and would demotivate staff.  This accounted for some of the trouble that Alberto Neri got himself into in Plan.

As I have discussed in an earlier blog post in this series, I had also learned that leading teams of INGO people did not mean that everything was going to be positive and nice.  Our organizations have plenty of internal complexities and might even have more-pervasive politics and ego than some for-profit environments.  There were dishonest people in our agencies.

In that earlier article I noted that:

… there is no inherent, inevitable contradiction between being clear and firm about roles, being fair but strict about adherence to procedures and performance, and the ideals of a nonprofit organization dedicated to social justice.  

And, for me, the way to successfully navigate the terrain between principle and pragmatism is to learn how to manage conflict while developing a deep sense of humility and self-awareness, mindfulness and equanimity, and engaged non-attachment.

*

Looking back, it seems to me that it boils down to four key domains that I would try to focus on during those years in Australia:

  • Teams, and team members, needed to be completely clear (1) about their task, their role, and the way that they were meant to carry out their duties;
  • They needed to work in an environment of trust (2), where they felt motivated, and
  • Inspired (3) to achieve their best in an important endeavor.  And, finally,
  • The whole effort needed to be founded on maintaining and restoring (4) relationships.  The most fundamental aspect of INGO management, in this model, is building and preserving authentic relationships in a context of clear accountability.

The rest of this blog post will describe how I tried to draw from what I had learned to make things clear, build trust, inspire, and restore relationships in the teams I worked with at ChildFund Australia.  It worked much (but certainly not all) of the time…

*

One aspect of team leadership that seemed to be essential when dealing with INGO people was establishing a clear aim, clear strategy, clear logic, and a clear way of measuring progress.

So the first element I thought about was clarity.  Clarity, in practical terms, meant building a shared understanding of what our teams were going to do, why we were going to do that, how we were going to do it, and how we would track what we accomplished to be accountable for our use of time and resources, and to learn from it.

[Figure: Building Strong INGO Teams: An Emerging Venn Diagram (1)]

Building clarity was probably my biggest focus during my first year or two in Sydney.  I was lucky that I was able to build on the solid, existing statements of vision and mission for the overall organization:

ChildFund Australia’s vision is of a global community, free from poverty, where children are protected and have the opportunity to reach their full potential.

ChildFund Australia works in partnership with children and their communities to create lasting and meaningful change by supporting long-term community development and promoting children’s rights. 

 

These statements were great foundations, but they weren’t detailed enough to provide the clear, measurable foundation for our program work that I was looking for, the clarity that would be needed to foster high-performing program teams.

So we moved quickly, in the first few months of my tenure at ChildFund Australia, to develop a Theory of Change, outcome indicators, and a measurement framework.  In future blog posts in this series I will describe each of these elements of our program design in much more detail, because I think that they were state-of-the-art at the time; I mention them in passing here, because they created a clear and shared understanding of our program work.  The resulting “Theory of Change” (that I will unpack in a later blog entry in this series) was:

[Figure: ChildFund Australia Theory of Change]

This Theory of Change draws in particular from two sources: the CCF Child Poverty Study, and from my own learning from the development of the UUSC Strategic Plan.

The overall program framework (which, again, I will describe in detail later) looked like this:

[Figure: ChildFund Australia Development Effectiveness Framework (DEF)]

Once programmatic clarity began to emerge, in those first months, I started to assemble another key element of clarity and accountability: the ChildFund Australia “Program Handbook.”  Here I built on the “UUSC Handbook” that I had created several years earlier.  The Program Handbook ended up being a very long, complex document, but to me it seemed vital – an unambiguous reference that I could point to whenever I felt that things were starting to diverge in an unnecessary way.

These, and other, elements of clarity were put in place fairly quickly, and we spent a lot of time over the next five years using that framework as a basis for planning, learning, and accountability.

*

Along with clarity, I was thinking a lot about trust.  Knowing the character of our INGO people, and the culture of our organizations, it seemed to me that once we had a strong sense of clarity, the next essential ingredient in making a high-performance team was trust.  If people were motivated (which, as I said above, was something we could count on, at least until we harmed it!), clear about their purpose, learning from their work, and accountable for their behavior, then I had learned that they would get on with the job and fly.

But trust was essential: without it, the old management tools – management-by-objective, tight job descriptions, payment for performance, etc. – would be necessary, and the culture would surely shift in the wrong direction.  Motivation would drop, because those old tools were developed for, and (in my view) are suitable only in, contexts where people fit into simpler, more-linear processes such as manufacturing or bookkeeping.

[Figure: Building Strong INGO Teams: An Emerging Venn Diagram (2)]

That’s a major lesson I had learned from watching Alberto Neri’s work in Plan long before: what he wanted to do was right and good, but the way that he put his initiatives in place destroyed motivation and led him to failure as Plan’s CEO.

How to build trust in a team?  It’s a truism that trust takes years to develop, but only an instant to destroy.  I had learned how to build trust, and how I had damaged trust, along the way:

  • Trust has two elements:
    • You know that the person you trust knows what they are talking about.  They are competent;
    • You know that the person you trust is honest with you, has your best interests at heart, and works to maintain an authentic, human relationship with you.

If either of those two elements is not in place, then trust will be very elusive.  If both are in place, over time, trust can build.

As I thought about my new position at ChildFund Australia, it seemed to me that my own competence was probably unquestioned.  I had worked in the field for over 20 years, in similar, larger, organizations, across the world, and I had done a very similar job (in Plan) before.  I had served as Executive Director of an INGO.  I was very familiar with working in globally-federated organizations (as ChildFund Australia was), and had even been very involved in creating the program approach used by a key member of the ChildFund Alliance.  So even though I would be new to ChildFund Australia, I felt confident that my own competence would be recognized.

So, to build trust, I had to build on that sense of competence by being honest and straight with people on my teams, in a way that demonstrated that I had their best interests at heart, while trying to build and maintain an authentic relationship with them.  This didn’t mean that I would always agree with them, or that I would never discipline people, but that I would strive to be clear and honest and authentic in my management actions.

*

I had a feeling, as I flew towards Sydney, that if I could build clarity and trust, anything would be possible.  But there was one element missing: inspiration.  Given the motivation that is intrinsic in our INGO people, even if they were clear about the task and worked in a culture with high levels of trust, as time went by I felt that they would still need to be inspired to do their very best.

[Figure: Building Strong INGO Teams: An Emerging Venn Diagram (3)]

Inspiration would be necessary because much of our work in INGOs isn’t particularly exciting.  Yes, it’s an honor to visit the field and work alongside people fighting for justice, for better futures.  Real inspiration comes from those visits.  But we also have to compete for funding, deal with reports and other paperwork, participate in performance reviews, deal with difficult people, (often) cut budgets, change plans, etc.  And we spend most of our time on those mundane tasks, which can create a sense of alienation from the source of our motivation.

That means that our motivation needs to be refreshed periodically.  When I worked with ChildFund Australia I tried to make that happen in various ways.  In the Sydney office I organized occasional, open reflection meetings at which we would consider, in a freewheeling way, a range of topics related to our program work.  For example, one time we discussed the notion of direct cash transfers, something that challenged our program approach.

Another way of keeping us connected with the source of our motivation involved using the “case studies” that were produced frequently as part of our Development Effectiveness Framework – see element 3 in the diagram included above.  At our regular, formal IPT meetings, and even (when possible) at board committee meetings, I started our work with a quick reflection on one of those “case studies” to ground our work in the real, lived experience of people who faced poverty and injustice.  I will describe the DEF, and the “case studies,” in much more detail in a future blog, but for now I think that these, and other, elements of my approach helped to keep up our teams’ levels of motivation and inspiration.

*

Finally, even with clarity, trust, and inspiration, over time, harm is done.  That’s because the normal, natural interaction in any team produces friction, and that friction takes a toll on the human beings within the team.  Luckily there is a range of principles and practices that are designed to repair that harm.

[Figure: Building Strong INGO Teams: An Emerging Venn Diagram (4)]

Late in my time at ChildFund Australia, as I worked through my Masters in Dispute Resolution at the University of New South Wales, I would study restorative justice in detail, which would help gel this topic for me.  But at this point my intention was to model some of the practices that I had seen Atema Eclai use at UUSC: frequent check-ins with the team, and with each member; considering how people on the team were doing not just in their work lives, but as human beings; working in circles instead of around square tables; rotating the chairing of meetings around the team.  Atema had clearly achieved very high levels of morale and loyalty, motivation and trust, which in part seemed to come from having spent lots of time building real, caring relationships with her team.

(At UUSC this seemed to veer into a sense of disunity, of aloofness and separation of Atema’s team from the rest of the organization, which was not a positive result.  But, overall, her team was very high-performing and, in part, this was due to Atema’s management approach.)

So I tried to put some of those mechanisms in place, and they worked pretty well.  Some of them ended up clashing with the very straightforward culture that is common in Australia, and which I came to appreciate.  But I tried to adapt things.

*

That’s what I was thinking about as I began to plan for my new post.  It makes sense to me, and reflects lots of learning over the years: our INGO teams will perform strongly if:

  • our task is clear – what we are supposed to do, why, and how to carry it out – and accountability is clear;
  • we operate in a context of high trust;
  • the inspiration that we bring to our work is refreshed periodically.  And:
  • the normal wear-and-tear on our human relationships, the harm done over time, is restored intentionally.

Yes, we needed formality, controls, and firm management.  But I had learned that too much control, too many private-sector management tools, would harm team performance in INGOs.  If I could create a management culture of clarity, trust, inspiration, and authentic human relationships, we might achieve a lot.

I’m sure there’s more to it, but that’s what I was thinking about as I flew towards Sydney!

*


Next time I will introduce the teams I worked with during my six years in Australia:

  • The Sydney-based International Program Team;
  • The Country Directors I worked with, in Papua New Guinea, Viet Nam, Cambodia, Laos, and Myanmar;
  • The senior managers in Sydney, at ChildFund Australia’s head office.

Imperfectly, doing the best I could, I tried to live up to an ambition to make sure that these teams were clear, trusted, and inspired.  Stay tuned!

*

Here are links to other blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration.