Mt Bond (36) – “Case Studies” In ChildFund Australia’s Development Effectiveness Framework

June, 2018

I’ve been writing a series of blog posts about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall.  And, each time, I’ve also been reflecting a bit on the journey since I joined Peace Corps, 33 years ago: on development, social justice, conflict, experiences along the way, etc.

So far, I’ve described climbing 35 of the 48 peaks, and covered my journey from Peace Corps in Ecuador (1984-86) through to my arrival in Sydney in 2009, where I joined ChildFund Australia as the first “International Program Director.”

Last time I described the ChildFund Australia “Development Effectiveness Framework,” the system that would help us make sure we were doing what we said we were going to do and, crucially, verify that we were making a difference in the lives of children and young people living in poverty – so that we could learn and improve our work…

This time, I want to go into more depth on one component of the DEF, the “Case Studies” that described the lived experience of people that we worked with.  Next time, I’ll describe how we measured the impact of our work.

But first…

*

On 10 August, 2017, I climbed three 4000-footers in one very long day: Bondcliff (4265ft, 1300m), Mt Bond (4698ft, 1432m), and West Bond (4540ft, 1384m).  This was a tough day, covering 22 miles and climbing three very big mountains.  At the end of the hike, I felt like I was going to lose the toenails on both big toes (which, in fact, I did!) … it was a bit much!

Last time I wrote about climbing to the top of Bondcliff, the first summit of that day.  This time, I will describe the brief walk from there to the top of Mt Bond, the tallest of the three Bonds.  And next time I’ll finish describing that day, with the ascent of West Bond and the return to the trail-head at Lincoln Woods.

*

As I described last time, I arrived at the top of Bondcliff at about 10:30am, having left the trail-head at Lincoln Woods Visitor Center just after 6:30am.  I was able to get an early start because I had stayed the night before at Hancock Campground on the Kancamagus Highway, just outside of Lincoln, New Hampshire.

It was a bright and mostly-sunny day, with just a few clouds and some haze.  The path between Bondcliff and Mt Bond is quite short – really just dropping down to a saddle, and then back up again, only 1.2 miles:

Bond Map - 6b

 

It took me about an hour to cover that distance and reach the top of Mt Bond from Bondcliff at 11:30am.  The path was rocky as it descended from Bondcliff, in the alpine zone, with many large boulders as I began to go back up towards Mt Bond – some scrambling required.

This photo was taken at the saddle between Bondcliff and Mt Bond: on the left is Bondcliff, on the right is West Bond, and in the middle, in the distance, is Franconia Ridge; Mt Bond is behind me.  A glorious view on an amazing day for climbing:

IMG_1929.jpg

From the Left: Bondcliff, Franconia Ridge, West Bond

 

The climb got even steeper from the saddle up towards the summit, passing through some small pine shrubs until just before the top.

The views were spectacular at the summit of Mt Bond, despite the sky being slightly hazy – I could see the four 4000-footers of the Franconia Ridge to the west and Owl’s Head in the foreground, the Presidential Range to the east, and several other 4000-footers to the south and south-west:

IMG_1948 (1)

Looking To The West From The Summit Of Mt Bond

 

And I had a nice view back down the short path from the top of Bondcliff:

IMG_1943 (1)

 

There were a few people at the top, and I had a brief conversation with a couple who were walking from the Zealand trailhead across the same three mountains I was climbing, finishing at Lincoln Woods.  They could do this one-way version of my out-and-back route because they had left a car at Lincoln Woods and driven to the Zealand trailhead in a second vehicle; at the end of the day they would ferry themselves back to Zealand from Lincoln Woods.

Kindly, they offered to pick up my car down at Lincoln Woods and drive it to Zealand, which would have saved me three miles.  I should have accepted: finishing what became a 22-mile day over three 4000-foot peaks ended up hobbling me for a while, and cost me two toenails!  But I didn’t have a clear sense of how the day would go, so I declined their offer, with sincere thanks…

Getting to the top of Mt Bond was my 36th 4000-footer – just 12 more to go!

I didn’t stay too long at the top of Mt Bond on the way up, continuing towards West Bond… stay tuned for that next time!

*

Jean and I had moved to Sydney in July of 2009, where I would take up the newly-created position of International Program Director for ChildFund Australia.  It was an exciting opportunity for me to work in a part of the world I knew and loved (Southeast Asia: Cambodia, Laos, Myanmar and Viet Nam) and in a challenging new country (Papua New Guinea).  It was a great chance to work with some really amazing people – in Sydney and in our Country Offices – and to use what I had learned to help build and lead effective teams.  And Sydney would hardly be a hardship post!  Finally, it was a priceless chance for me to put together a program approach that incorporated everything I had learned to that point, over 25 years working in poverty reduction and social justice.

In the previous article in this series, I described how we developed a “Development Effectiveness Framework” (“DEF”) for ChildFund Australia, and I went through most of the components of the DEF in great detail.

My ambition for the DEF was to bring together our work into one comprehensive system – building on our Theory of Change and organizational Vision and Mission, creating a consistent set of tools and processes for program design and assessment, and making sure to close the loop with defined opportunities for learning, reflection, and improvement.

Here is the graphic that we used to describe the system:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework (2014)

 

As I said last time, I felt that three components of the DEF were particularly innovative, and worth exploring in more detail in separate blog articles:

  • I will describe components #2 (“Outcome Indicator Surveys”) and #12 (“Statements of Impact”) in my next article.  Together, these components of the DEF were meant to enable us to measure the impact of our work in a robust, participatory way, so that we could learn and improve;
  • This time, I want to explore component #3 of the DEF: the “Case Studies.”

*

It might seem strange to say it this way, but the “Case Studies” were probably my favorite of all the components of the DEF!  I loved them because they offered direct, personal accounts of the impact of projects and programs from children, youth, men and women in the communities where ChildFund worked, and from the staff and officials of local agencies and government offices with whom ChildFund partnered.  We didn’t claim that the Case Studies were random or representative samples; rather, their value was simply as stories of human experience, offering insights that would not have been readily gained from quantitative data.

Why was this important?  Why did it appeal to me so much?

*

Over my years working with international NGOs, I had become uneasy with our sector’s trend towards exclusive reliance on linear logic and quantitative measurement.  This is perhaps a little ironic, since I had joined the NGO world having been educated as an engineer, schooled in the application of scientific logic and numerical analysis to practical problems in the world.

Linear logic is important, because it introduces rigor in our thinking, something that had been weak or lacking when I joined the sector in the mid-1980s.  And quantitative measurement, likewise, forced us to face evidence of what we had or had not achieved. So both of these trends were positive…

But I had come to appreciate that human development was far more complex than building a water system (for example), much more complicated than we could fully capture in linear models.  Yes, a logical, data-driven approach was helpful in many ways, perhaps nearly all of the time, but it didn’t seem to fit every situation in the communities that I came to know in Latin America, Africa, and Asia.  In fact, I began to see that an over-emphasis on linear approaches to human development was blinding us to ways that more qualitative, non-linear thinking could help; we seemed to be dismissing the qualitative, narrative insights that should also have been at the heart of our reflections.  There was no reason not to include both quantitative and qualitative measures.  But we weren’t.

My career in international development began at a time when private-sector business culture started to influence our organizations in a big way: in the wake of the Ethiopian famine of the mid-1980s, INGOs were booming and, as a result, professionalizing and introducing business practices.  All the big INGOs started to bring in people from the business world to help “professionalize” our work.

I’ve written elsewhere about the positive and negative effects that business culture had on NGOs: on the positive side, we benefited from systems and approaches that improved the internal management of our agencies, such as clear delegations of authority, financial planning and audit, etc.  Overall, it was a very good, and very necessary, evolution.

But there were some negatives.  In particular, the influx of private-sector culture into our organizations meant that:

  • We began increasingly to view the world as a linear, logical place;
  • We came to embrace the belief that bigger is always better;
  • “Accountability” to donors became so fundamental that sometimes it seemed to be our highest priority;
  • Our understanding of human nature, of human poverty, evolved towards the purely material, things that we could measure quantitatively.

I will attach a copy of the article I wrote on this topic here:  mcpeak-trojan-horse.

In effect, this cultural shift emphasized linear logic and quantitative measures to such a degree, and with such force, that narrative, qualitative approaches were sidelined as, somehow, not business-like enough.

As I thought about the overall design of the DEF, I wanted to make 100% sure that we were able to measure the quantitative side of our work – the concrete outputs that we produced and the measurable impact that we achieved (more on that next time).  The great majority of our work was amenable to that form of measurement, and being accountable for delivering the outputs (projects, funding) that we had promised was hugely important.

But I was equally determined that we would include qualitative elements that would enable us to capture the lived experience of people facing poverty.  In other words, because poverty is experienced holistically by people, including children, in ways that can be captured both quantitatively and qualitatively, we needed to incorporate both kinds of measurement if we were to be truly effective.

The DEF “Case Studies” were one of the ways that we accomplished this goal.  It made me proud that we were successful in this regard.

*

There was another reason that I felt that the DEF Case Studies were so valuable, perhaps just as important as the way that they enabled us to measure poverty more holistically.  Observing our organizations, and seeing my own response to how we were evolving, I clearly saw that the influence of private-sector, business culture was having positive and negative effects.

One of the most negative impacts I saw was an increasing alienation of our people from the basic motivations that led them to join the NGO sector, a decline in the passion for social justice that had characterized us.  Not to exaggerate, but it seemed that we were perhaps losing our human connection with the hope and courage and justice that, when we were successful, we helped make for individual women and men, girls and boys.  The difference we were making in the lives of individual human beings was becoming obscured behind the statistics that we were using, behind the mechanical approaches we were taking to our work.

Therefore, I was determined to use the DEF Case Studies as tools for reconnecting us, ChildFund Australia staff and board, to the reason that we joined in the first place.  All of us.

*

So, what were the DEF Case Studies, and how were they produced and used?

In practice, Development Effectiveness and Learning Managers in ChildFund’s program countries worked with other program staff and partners to write up Case Studies that depicted the lived experience of people involved in activities supported by ChildFund.  The Case Studies were presented as narratives, with photos, which sought to capture the experiences, opinions and ideas of the people concerned, in their own words, without commentary.  They were not edited to fit a success-story format.  As time went by, our Country teams started to add a summary of their reflections to the Case Studies, describing their own responses to the stories told there.

Initially we found that field staff had a hard time grasping the idea, because they were so used to reporting their work in the dry, linear, quantitative style the sector had become accustomed to.  Perhaps program staff felt that narrative reports were the territory of our Communications teams, meant for public-relations purposes, describing our successes in a way that could attract support for our work.  Nothing wrong with that, they seemed to feel, but not a program thing!

Staff seemed at a loss, unable to get going.  So we prepared a very structured template for the Case Studies, specifying length and tone and approach in detail.  This was a mistake, because we really wanted to encourage creativity while keeping the documents brief; emphasizing the “voice” of people in communities rather than our own views; covering failures as much as successes.  The rigid template tended to lead our program staff back into a structured view of our work, so once staff became more comfortable and we had gained some experience using these Case Studies, we abandoned the template and encouraged innovation.

*

So these Case Studies were a primary source of qualitative information on the successes and failures of ChildFund Australia’s work, offering insights from children, youth and adults from communities where we worked and the staff of local agencies and government offices with whom ChildFund Australia partnered.

In-country staff reviewed the Case Studies, accepting or contesting the opinions of informants about ChildFund Australia’s projects.  These debates often led to adjustments to existing projects, but also triggered new thinking – at the project activity level, at the program level, and even in the overall program approach.

Case Studies were forwarded to Sydney, where they were reviewed by the DEF Manager; some were selected for a similar process of review by International Program staff, members of the Program Review Committee and, on occasion, by the ChildFund Australia Board.

The resulting documents were stored in a simple cloud-based archive, accessible by password to anyone within the organization.  Some Case Studies were also included on ChildFund Australia’s website; we encouraged staff from our Communications team in Sydney to review the Case Studies and, if suitable, to re-purpose them for public purposes.  Of course, we were careful to obtain informed consent from people included in the documents.
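For readers who are curious how such an index might hang together in practice, here is a minimal, purely illustrative sketch (in Python) of a searchable Case Study record; the field names, search logic, and sample entries are my own invented example, not a description of ChildFund Australia’s actual archive or schema:

```python
# Purely illustrative: a toy index of Case Study records, searchable by
# country and theme.  Field names and sample entries are hypothetical.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CaseStudyRecord:
    title: str
    country: str                                     # e.g. "Laos", "Papua New Guinea"
    themes: List[str] = field(default_factory=list)  # e.g. ["education", "sponsorship"]
    year: int = 0
    consent_obtained: bool = False                   # informed consent from people featured
    file_path: str = ""                              # location in the shared archive

def search(records: List[CaseStudyRecord],
           country: Optional[str] = None,
           theme: Optional[str] = None) -> List[CaseStudyRecord]:
    """Return the Case Studies matching an optional country and/or theme."""
    results = []
    for r in records:
        if country and r.country != country:
            continue
        if theme and theme not in r.themes:
            continue
        results.append(r)
    return results

# Example usage, with made-up entries:
archive = [
    CaseStudyRecord("Vegetable farming and social standing", "Papua New Guinea",
                    ["livelihoods", "nutrition"], 2011, True, "png/case_01.pdf"),
    CaseStudyRecord("Three generations of Hmong women", "Laos",
                    ["education", "gender"], 2012, True, "lao/case_05.pdf"),
]
print([c.title for c in search(archive, theme="education")])
```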

*

Through Case Studies, as noted above, local informants were able to pass critical judgement on the appropriateness of ChildFund’s strategies, show us how community members perceived our aims and purposes (not necessarily as we intended), and alert us to unexpected consequences (both positive and negative) of what we did.

For example, one of the first Case Studies written up in Papua New Guinea revealed that home garden vegetable cultivation not only increased family income for the villager concerned (with positive impacts on his children’s nutrition and education), but also enhanced his social standing by increasing his capacity to contribute to traditional cultural events.

Here are three images from that Case Study:

Screen Shot 2018-06-09 at 3.07.54 PM

Screen Shot 2018-06-09 at 3.07.27 PM

Screen Shot 2018-06-09 at 3.07.41 PM

 

And here is a copy of the Case Study itself:  PNG Case Study #1 Hillary Vegetable farming RG edit 260111.  Later I was able to visit Hillary at his farm!

Another Case Study came from the ChildFund Connect project, an exciting effort led by my former colleagues Raúl Caceres and Kelly Royds, who relocated from Sydney to Boston in 2016.  I climbed Mt Moriah with them in July, 2017, and also Mt Pierce and Mt Eisenhower in August of 2016.  ChildFund Connect was an innovative project that linked children across Laos, Viet Nam, Australia and Sri Lanka, providing a channel for them to directly build understanding of their differing realities.  This Case Study on their project came from Laos: LAO Case Study #3 Connect DRAFT 2012.

In a future article in this series, I plan on describing work we carried out building the power (collective action) of people living in poverty.  It can be a sensitive topic, particularly in areas of Southeast Asia without traditions of citizen engagement.  Here is a Case Study from Viet Nam describing how ChildFund helped local citizens connect productively with authorities to resolve issues related to access to potable water: VTM Case Study #21 Policy and exclusion (watsan)-FINAL.

*

Dozens of Case Studies were produced, illustrating a wide range of experiences with the development processes supported by ChildFund in all of the countries where we managed program implementation.  Reflections from many of these documents helped us improve our development practice, and at the same time helped us stay in touch with the deeper purpose of our having chosen to work to promote social justice, accompanying people living in poverty as they built better futures.

*

A few of the DEF Case Studies focused, to some extent, on ChildFund Australia itself.  For example, here is the story of three generations of Hmong women in Nonghet District in Xieng Khouang Province in Laos.  It describes how access to education has evolved across those generations:  LAO Case Study #5 Ethnic Girls DRAFT 2012.  It’s a powerful description of change and progress, notable also because one of the women featured in the Case Study was a ChildFund employee, along with her mother and daughter!

Two other influential Case Studies came from Cambodia, both of which touched on how ChildFund was attempting to balance our child-sponsorship mechanisms with our programmatic commitments.  I’ve written separately, some time ago, about the advantages of child sponsorship when it is managed well (as we did in Plan and especially in ChildFund Australia), and these two Case Studies evocatively illustrated the challenge, and the ways that staff in Cambodia were making it all work well.

One Case Study describes some of the tensions implicit in the relationship between child sponsorship and programming, and the ways that we were making progress in reconciling these differing priorities: CAM Case Study 6 Sponsorship DRAFT 2012.  This Case Study was very influential, with our staff in Cambodia and beyond, with program staff in Sydney, and with our board.  It powerfully communicated a reality that our staff, and families in communities, were facing.

A second Case Study discussed how sponsorship and programs were successfully integrated in the field in Cambodia: CAM Case Study #10 Program-SR Integration Final.

*

As I mentioned last time, given the importance of the system, relying on our feeling that the DEF was a great success wasn’t good enough.  So we commissioned two independent, external expert reviews of the DEF.

The first review (attached here: External DEF Review – November 2012), which was concluded in November of 2012, took place before we had fully implemented the system.  In particular, since Outcome Indicator Surveys and Statements of Impact (to be covered in my next blog article) were implemented only after three years (and every three years thereafter), we had not yet reached that stage.  But we certainly were quite advanced in the implementation of most of the DEF, so it was a good time to reflect on how it was going.

I included an overview of the conclusions reached by both reviewers last time.  Here I want to quote from the first evaluation, with particular reference to the DEF Case Studies:

One of the primary benefits of the DEF is that it equips ChildFund Australia with an increased quantity and quality of evidence-based information for communications with key stakeholders including the Board and a public audience. In particular, there is consolidated output data that can be easily accessed by the communications team; there is now a bank of high quality Case Studies that can be drawn on for communication and reflection; and there are now dedicated resources in-country who have been trained and are required to generate information that has potential for communications purposes. The increase in quantity and quality of information equips ChildFund Australia to communicate with a wide range of stakeholders.

One of the strengths of the DEF recognized by in-country staff particularly is that the DEF provides a basis for stakeholders to share their perspectives. Stakeholders are involved in identifying benefits and their perspectives are heard through Case Studies. This has already provided a rich source of information that has prompted reflection by in-country teams, the Sydney based programs team and the ChildFund Australia Board.

This focus on building tools, systems and the overall capacity of the organization places ChildFund Australia in a strong position to tackle a second phase of the DEF which looks at how the organization will use performance information for learning and development. It has already started on this journey, with various parts of the organization using Case Studies for reflection. ChildFund Australia has already undertaken an exercise of coding the bank of Case Studies to assist further analysis and learning. There is lots of scope for next steps with this bank of Case Studies, including thematic reflections. Again, the benefits of this aspect have not been realised yet as the first stages of the DEF roll-out have been focused on data collection and embedding the system in CF practices.

In most Country Offices, Case Studies have provided a new formal opportunity for country program staff to reflect on their work and this has been used as a really constructive process. The Laos Country Office is currently in the process of translating Case Studies so that they can be used to prompt discussion and learning at the country level. In PNG, the team is also interested in using the Case Studies as a communication tool with local communities to demonstrate some of the achievements of ChildFund Australia programs.

In some cases, program staff have found Case Studies confronting when they have highlighted program challenges or weaknesses. The culture of critical reflection may take time to embed in some country offices and may be facilitated by cross-country reflection opportunities. Currently, however, Country Office staff do not know how to access Case Studies from other country programs. ChildFund Australia is exploring how the ‘bank’ of DEF Case Studies would be most accessible and useful to country office personnel.

One of the uses of Case Studies has been as a prompt for discussion and reflection by the programs team in Sydney and by the Board. Case Studies have been seen as a really useful way to provide an insight into a program, practice and ChildFund Australia achievements.

At an organizational level, an indexing and cross-referencing system has been implemented which enables Case Studies to be searched by country and by theme. The system is yet to be introduced to MEL and Program users, but has potential to be a very useful bank of qualitative data for reflection and learning. It also provides a bank of data from which to undertake thematic reflections across and between countries. One idea for consideration is that ChildFund draw on groups of Case Studies to develop practice notes.

In general Case Studies are considered to be the most ‘successful’ part of the DEF by those involved in collecting information.

The second reviewer concentrated on other components, mainly aspects I will describe in more detail in my next article, not so much the Case Studies…

*

So the Case Studies were a very important element in the overall DEF.  I tried very hard to incorporate brief reflections on selected Case Studies at every formal meeting of the International Program Team, of ChildFund Australia’s Program Review Committee, and (less frequently) at meetings of our Board of Directors.  More often than not, time pressures on the agendas of these meetings led to us dropping the Case Studies from discussion, but often enough we did spend time (usually at the beginning of the meetings) reflecting on what we saw in them.

At the beginning, when we first began to use the Case Studies, our discussion tended to be mechanical: pointing out errors in the use of English, or questioning how valid the observations might be, challenging the statistical reliability of the conclusions.  But, over time, I noticed that our teams began to use the Case Studies as they were designed: to gain insight into the lived experience of particular human beings, and to reconnect with the realities of people’s struggle for better lives for themselves and their children.

This was a great success, and really worked as I had hoped.  The Case Studies complemented the more rigorous, quantitative components of the DEF, helping the system be holistic, enabling us to see more deeply into the effect that our work was having while also enhancing our accountability.

*

Next time, I will describe getting to the top of West Bond, and all the way down the 11 miles from there to the Lincoln Woods parking lot, where I staggered back to my car with such damage to my feet that I would soon lose the toenails on both my big toes!  And I will share details of the final two components of the DEF that I want to highlight: the Outcome Indicator Surveys and Statements of Impact, which were probably the culmination of the whole system.

So, stay tuned!

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owl’s Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change;
  35. Bondcliff (35) – ChildFund Australia’s Development Effectiveness Framework.


Bondcliff (35) – ChildFund Australia’s Development Effectiveness Framework

June, 2018

I began a new journey just over two years ago, in May, 2016, tracing two long arcs in my life:

  • During those two years, I’ve been climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers. I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Alongside descriptions of those climbs, I’ve been sharing what it was like working in international development during the MDG era: as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980’s through to the conclusion of the Millennium Development Goals in 2015.

In each article, I write about climbing one of those mountains and, each time, reflect a bit on the journey since I began to work in social justice, nearly 34 years ago: on development, human rights, conflict, experiences along the way, etc.

So, when I wrap things up in this series, there should be 48 articles…

*

In 2009 Jean and I moved to Sydney, where I took up a new role as International Program Director for ChildFund Australia, a newly-created position.  On my way to Sydney, I was thinking a lot about how to build a great program, and how I would approach building a strong team – my intention was to lead and manage with clarity, trust, and inspiration.  A few weeks ago, I wrote about the role, staffing, and structural iterations of ChildFund’s International Program Team and, last time, I outlined the foundational program approach we put in place – a Theory of Change and Outcome and Output Indicators.

Once the program approach was in place, as a strong foundation, we moved forward to build a structured approach to development effectiveness.  I am very proud of what we achieved: the resulting ChildFund Australia “Development Effectiveness Framework” (“DEF”) was, I think, state-of-the-art for international NGOs at the time.  Certainly few (if any) other INGOs in Australia had such a comprehensive, practical, useful system for ensuring the accountability and improvement of their work.

Since the DEF was so significant, I’m going to write three articles about it:

  1. In this article I will describe the DEF – its components, some examples of products generated by the DEF, and how each part of the system worked with the other parts.  I will also share results of external evaluations that we commissioned on the DEF itself;
  2. Next time, I will highlight one particular component of the DEF, the qualitative “Case Studies” of the lived experience of human change.  I was especially excited to see these Case Studies when they started arriving in Sydney from the field, so I want to take a deep dive into what these important documents looked like, and how we attempted to use them;
  3. Finally, I will describe the last two DEF components that came online (Outcome Indicator Surveys and Statements of Impact), the culmination of the system, where we assessed the impact of our work.

So there will be, in total, three articles focused on the DEF.  This is fitting, because I climbed three mountains on one day in August of 2017…

*

On 10 August, 2017, I climbed three 4000-footers in one day: Bondcliff (4265ft, 1300m), Mt Bond (4698ft, 1432m), and West Bond (4540ft, 1384m).  This was a very long, very tough day, covering 22 miles and climbing three mountains in one go.  At the end of the hike, I felt like I was going to lose the toenails on both big toes… and, in fact, that’s what happened.  As a result, for the rest of the season I would be unable to hike in boots and had to use hiking shoes instead!

Knowing that the day would be challenging, I drove up from Durham the afternoon before and camped, so I could get the earliest start possible the next morning.  I got a spot at Hancock Campground, right near the trailhead where I would start the climb:

IMG_1871.jpg

 

The East Branch of the Pemigewasset River runs alongside this campground; I spent a pleasant late afternoon there reading a book by John Paul Lederach, and when it got dark I crawled into my sleeping bag and got a good night’s sleep.

IMG_1868

IMG_1869

 

Here is a map of the long ascent that awaited me the next morning, getting to the top of Bondcliff:

Bond Map - 3.jpg

 

After Bondcliff, the plan was that I would continue on to climb Mt Bond and West Bond, and to then return to Lincoln Woods… more on that in the next two articles in this series.  In this one I will describe climbing the first 4000-footer of that day, Bondcliff.

I got an early start on 10 August, packing up my tent-site and arriving at the trailhead at Lincoln Woods at about 6:30am:

IMG_1873.jpg

 

It was just two weeks earlier that I had parked here to climb Owl’s Head, which I had enjoyed a lot.  This time, I would begin the same way – walking up the old, abandoned forestry railway for about 2.6 miles on Lincoln Woods Trail, to where I had turned left up the Franconia Brook Trail towards Owl’s Head.  I arrived at that junction at about 7:30am:

IMG_1883.jpg

IMG_1891.jpg

 

 

This time I would continue straight at that intersection, onto the Wilderness Trail, which winds through forest for a short distance before opening out again along another old logging railway, complete with abandoned hardware along the way, discarded over 130 years ago:

IMG_1893.jpg

 

At the former (and now abandoned) Camp 16 (around 4.4 miles from the parking lot at Lincoln Woods), I took a sharp left and joined a more normal trail – no more old railway.  I began to ascend moderately, going up alongside Black Brook: now I was on the Bondcliff Trail.

 

I crossed Black Brook twice on the way up after leaving the Wilderness Trail, and then crossed two dry beds of rock, which were either rock slides or upper reaches of Black Brook that were dry that day.

IMG_1898.jpg

 

It’s a long climb up Black Brook; after the second dry crossing, Bondcliff Trail takes a sharp left turn and continues ascending steadily.  Just before reaching the alpine area, and the summit of Bondcliff, there is a short steep section, where I had to scramble up some bigger boulders.  Slow going…

But then came the reward: spectacular views to the west, across Owl’s Head to Franconia Ridge, up to Mt Garfield, and over to West Bond and Mt Bond.  Here Mt Lincoln and Mt Lafayette are on the left, above Owl’s Head, with Mt Garfield to the right:

IMG_1905

Lincoln and Lafayette In The Distance On The Left, Mt Garfield In The Distance On The Right

 

Here is a view looking to the southwest from the top of Bondcliff:

IMG_1907

From The Summit Of Bondcliff

IMG_1920

From The Summit Of Bondcliff

 

And this is the view towards Mt Bond, looking up from the top of Bondcliff:

IMG_1925

West Bond Is On The Left, And Mt Bond On The Right

 

I got to the top of Bondcliff at about 10:30am, just about four hours from the start of the hike.  Feeling good … at this point!  Here is a spectacular view back down towards Bondcliff, taken later in the day, from the top of West Bond:

IMG_1964.jpg

 

I would soon continue the climb, with a short hop from Bondcliff up to the top of Mt Bond.  Stay tuned!

*

Last time I wrote about how we built the foundations for ChildFund Australia’s new program approach: a comprehensive and robust “Theory of Change” that described what we were going to accomplish at a high level, and why; a small number of reliable, measurable, and meaningful “Outcome Indicators” that would enable us to demonstrate the impact of our work, consistent with that Theory of Change; and a set of “Output Indicators” that would allow us to track our activities in a consistent and comparable manner across all our programs: in Cambodia, Laos, Papua New Guinea, and Viet Nam.  (Myanmar was a slightly different story, as I will explain later…)

Next, on that foundation, we needed a way of thinking holistically about the effectiveness of our development work: a framework for planning our work in each location, each year; for tracking whether we were doing what we had planned; for understanding how well we were performing; and for improving the quality and impact of our work.  And we needed to do all this in partnership with local communities, organizations, and governments.

This meant being able to answer five basic questions:

  1. In light of our organizational Theory of Change, what are we going to do in each location, each year?
  2. How will we know that we are doing what we planned to do?
  3. How will we know that our work makes a difference and gets results consistent with our Theory of Change?
  4. How will we learn from our experience, to improve the way we work?
  5. How can community members and local partners directly participate in the planning, implementation, and evaluation of the development projects that ChildFund Australia supports?

Looking back, I feel that what we built and implemented to answer those questions – the ChildFund Australia “Development Effectiveness Framework” (“DEF”) – was our agency’s most important system.  Because what could be more important than the answers to those five questions?

*

I mentioned last time that twice, during my career with Plan International, we had tried to produce such a system, and failed (at great expense).  We had fallen into several traps that I was determined to avoid repeating this time, in ChildFund Australia, as we developed and implemented the DEF:

  • We would build a practical system that could be used by our teams, with the informed participation of local partners and staff – one that was “good enough” for its purpose, instead of a system that had to be managed by experts, as we had done in Plan;
  • We would include both quantitative and qualitative information, serving the needs of head and heart, instead of building a wholly-quantitative system for scientific or academic purposes, as we had done in Plan;
  • We would not let “the best be the enemy of the good,” and I would make sure that we moved to rapidly prototype, implement, and improve the system instead of tinkering endlessly, as we had done in Plan.

I go into more detail about the reasons for Plan’s lack of success in that earlier article.

*

Here is a graphic that Caroline Pinney helped me create, which I used very frequently to explain how the DEF was designed, functioned, and performed:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework (2014)

 

In this article, I will describe each component of the DEF, outlining how the components relate to each other and to the five questions outlined above.

However, I’m going to reserve discussion of three of those components for my next two articles:

  • Next time, I will cover #3 in Figure 1, the “Case Studies” that we produced.  These documents helped us broaden our focus from the purely quantitative to include consideration of the lived experience of people touched by the programs supported by ChildFund Australia.  In the same way, the Case Studies served as valuable tools for our staff, management, and board to retain a human connection to the spirit that motivated us to dedicate our careers to social justice;
  • And, after that, I will devote an article to our “Outcome Indicator Surveys” (#2 in Figure 1, above) and Statements of Impact (#12 in Figure 1). The approach we took to demonstrating impact was innovative and very participatory, and successful.  So I want to go into a bit of depth describing the two DEF components involved.

Note: I prepared most of what follows, but I have included and adapted some descriptive material produced by the two DEF Managers who worked in the International Program Team: Richard Geeves and Rouena Getigan.  Many thanks to them!

*

Starting Points

The DEF was based on two fundamental statements of organizational identity.  As such, it was built to focus us on, and enable us to be accountable for, what we were telling the world we were:

  1. On the bottom left of the DEF schematic (Figure 1, above) we reference the basic documents describing ChildFund’s identity: our Vision, Mission, Strategic Plan, Program Approach, and Policies – all agreed and approved by our CEO (Nigel Spence) and Board of Directors.  The idea was that the logic underlying our approach to Development Effectiveness would therefore be grounded in our basic purpose as an organization, overall.  I was determined that the DEF would serve to bring us together around that purpose, because I had seen Plan tend to atomize, with each field location working towards rather different aims.  Sadly, Plan’s diversity seemed to be far greater than required if it were simply responding to the different conditions we worked in.  For example, two Field Offices within 20 km of each other in the same country might have very different programs.  This excessive diversity seemed to relate more to the personal preferences of Field Office leadership than to any difference in the conditions of child poverty or the local context.  The DEF would help ChildFund Australia cohere, because our starting point was our organizational identity;
  2. But each field location did need a degree of flexibility to respond to its own reality, within ChildFund’s global identity, so at the bottom of the diagram we placed the Country Strategy Paper (“CSP”), quite centrally.  This meant that, in addition to building on ChildFund Australia’s overall purpose and identity globally, we would also build our approach to Development Effectiveness on how we chose to advance that basic purpose in each particular country where we worked, with that country’s particular characteristics.

Country Strategy Paper

The purpose and outline of the CSP was included in the ChildFund Australia Program Handbook:

To clarify, define, communicate and share the role, purpose and structure of ChildFund in-country – our approach, operations and focus. The CSP aims to build a unity of purpose and contribute to the effectiveness of our organisation.

When we develop the CSP we are making choices, about how we will work and what we will focus on as an organisation. We will be accountable for the commitments we make in the CSP – to communities, partners, donors and to ourselves.

While each CSP will be different and reflect the work and priorities of the country program, each CSP will use the same format and will be consistent with ChildFund Australia’s recent program development work.

During the development of the CSP it is important that we reflect on the purpose of the document. It should be a useful and practical resource that can inform our development work. It should be equally relevant to both our internal and external stakeholders. The CSP should be clear, concise and accessible while maintaining a strategic perspective. It should reflect clear thinking and communicate our work and our mission. It should reflect the voice of children.  Our annual work plans and budgets will be drawn from the CSP and we will use it to reflect on and review our performance over the three year period.

Implementation of the DEF flowed from each country’s CSP.

More details are found in Chapter 5 of the Program Handbook, available here: Program Handbook – 3.3 DRAFT.  Two examples of actual ChildFund Australia Country  Strategy Papers from my time with the organization are attached here:

For me, these are clear, concise documents that demonstrate coherence with ChildFund’s overall purpose along with choices driven by the situation in each country.

*

Beginning from the Country Strategy Paper, the DEF branches into two inter-related (in fact, nested) streams, covering programs (on the left side) and projects (on the right side).  Of course, projects form part of programs, consistent with our program framework:

Screen Shot 2018-05-28 at 2.16.30 PM

Figure 2: ChildFund Australia Program Framework

 

But it was difficult to depict this embedding on the two dimensions of a graphic!  So Figure 1 showed programs on one side and projects on the other.

Taking the “program” (left) side first:

Program Description

Moving to the left side of Figure 1: derived from, and summarized in, the Country Strategy Paper, each Country Office defined a handful of “Program Descriptions” (#1 in Figure 1) – some countries had three, others ended up with five – each one describing how a particular set of projects would create impact, together, as measured using ChildFund Australia’s Outcome Indicators.  In other words, each was a “Theory of Change,” detailing how the projects included in the program linked together to create particular positive change.

The purpose and outline of the Program Description was included in the ChildFund Australia Program Handbook:

ChildFund Australia programs are documented and approved through the use of “Program Descriptions”.  All Program Descriptions must be submitted by the Country Director for review and approval by the Sydney International Program Director, via the International Program Coordinator.

For ChildFund Australia: a “program” is an integrated set of projects that, together, have direct or indirect impact on one or more of our agreed organisational outcome indicators.   Programs normally span several geographical areas, but do not need to be implemented in all locations; this will depend on the geographical context.  Programs are integrated and holistic. They are designed to achieve outcomes related to ChildFund Australia’s mission, over longer periods, while projects are meant to produce outputs over shorter timeframes.

Program Descriptions were summarized in the CSP, contained a listing of the types of projects (#5 in Figure 1) that would be implemented, and were reviewed every 3 or 4 years (Program Review, #4 in Figure 1).

To write a Program Description, ChildFund staff (usually program managers in a particular Country Office) were expected to review our program implementation to-date, carry out extensive situational analyses of government policies, plans and activities in the sector and of communities’ needs in terms of assets, aspirations and ability to work productively with local government officials responsible for service provision. The results of ChildFund’s own Outcome Indicator surveys and community engagement events obviously provided very useful evidence in this regard.

Staff then proposed a general approach for responding to the situation, and specific strategies which could be delivered through a set of projects.  They would also show that the approach and strategies proposed were consistent with evidence of good practice, both globally and in-country, demonstrating that their choices were evidence-based.

Here are 2 examples of Program Descriptions:

Producing good, high-quality Program Descriptions was a surprising challenge for us, and I’m not sure we ever really got this component of the DEF right.  Probably the reason that we struggled was that these documents were rather abstract, and our staff weren’t used to operating at this level of abstraction.

Most of the initial draft Program Descriptions were quite superficial, and were approved only as place-holders.  Once we started to carry out “Program Reviews” (see below), however, where more rigor was meant to be injected into the documents, we struggled.  It was a positive, productive struggle, but a struggle nonetheless!

We persisted, however, because I strongly believed that our teams should be able to articulate why they were doing what they were doing, and the Program Descriptions were the basic tool for exactly that explanation.  So we persevered, hoping that the effort would result in better programs, more sophisticated and holistic work, and more impact on children living in poverty.

*


Program Reviews

For the same reasons outlined above, in my discussion of the “Program Descriptions” component of the DEF, we also struggled with the “Program Review” (#4 in Figure 1, above).  In these workshops, our teams would consider an approved “Program Description” (#1 in Figure 1) every three or four years, subjecting the document to a formal process of peer review.

ChildFund staff from other countries visited the host country to participate in the review process and then wrote a report making recommendations for how the Program under review might be improved.  The host country accepted (or debated and adjusted) the  recommendations, acted on them and applied them to a revision of the Program Description: improving it, tightening up the logic, incorporating lessons learned from implementation, etc.

Program Reviews were therefore fundamentally about learning and improvement, so we made sure that, in addition to peers from other countries, the host Country Office invited in-country partners and relevant experts.  And International Program Coordinators from Sydney were asked to always attend Program Reviews in the countries that they were supporting, again for learning and improvement purposes.

The Program Reviews that I attended were useful and constructive, but I certainly sensed a degree of frustration.  In addition to struggling with the relatively high levels of abstraction required, our teams were not used to having outsiders (even their peers from other ChildFund offices) critique their efforts.  So, overall, this was a good and very important component of the DEF, designed correctly, but our teams needed more time to learn how to manage this process and to be open to such a public form of review.

*

Projects and Quarterly Reports

As shown on the right-hand side of Figure 1, ChildFund’s field staff and partners carried out routine monitoring of projects (#6 in the Figure) to ensure that they were on track, and this monitoring was the basis of their reporting on activities and outputs.  Project staff summarized their monitoring in formal Quarterly Reports (#7) on each project, documenting progress against project plans, budgets, and targets to ensure that projects were well managed.  These Quarterly Reports were reviewed in each Country Office, and most were also forwarded to ChildFund’s head office in Sydney (and, often, to donors) for review.

When I arrived, ChildFund Australia’s Quarterly reporting was well-developed and of high quality, so I didn’t need to focus on this aspect of our work.  We simply incorporated it into the more-comprehensive DEF.

*

Quarterly Output Tracking

As described last time, ChildFund developed and defined a set of Outputs which became standard across the organization in FY 2011-12.  Outputs in each project were coded and tracked from Quarter to Quarter.  Some of the organizational Outputs were specific to a sector, such as education, health, or water and sanitation, or to a particular target group, such as children, youth or adults.  Other Outputs were generic and might be found in any project – for example, training or awareness raising, materials production, and consultation.

Organizational Outputs were summarized for all projects in each country each Quarter and country totals were aggregated in Sydney for submission to our Board of Directors (#8 in Figure 1, above).  In March 2014 there were a total of 47 organizational Outputs – they were listed in my last article in this series.

One purpose of this tracking was to enhance our accountability, so a summary was reviewed every Quarter in Sydney by the International Program Team and our Program Review Committee.

Here is an example of how we tracked outputs: this is a section of a Quarterly Report produced by the International Program Team for our Board and Program Review Committee: Output Report – Q4FY15.
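To make the aggregation step concrete, here is a minimal, purely illustrative sketch of how coded output counts can be rolled up by country and Quarter; the output descriptions, project names, and figures are invented for the example, not taken from ChildFund Australia’s actual Output list or reports:

```python
# Purely illustrative: rolling project-level counts of coded Outputs up to
# country totals for one Quarter.  All codes and numbers are made up.
from collections import defaultdict
from typing import Dict, List, Tuple

# Each row: (country, project, quarter, output_code, count)
project_outputs: List[Tuple[str, str, str, str, int]] = [
    ("Laos",     "WASH-01",  "Q4FY15", "training sessions delivered", 8),
    ("Laos",     "EDU-03",   "Q4FY15", "training sessions delivered", 5),
    ("Viet Nam", "WATSAN-2", "Q4FY15", "water points constructed",    3),
]

def aggregate_quarter(rows: List[Tuple[str, str, str, str, int]],
                      quarter: str) -> Dict[Tuple[str, str], int]:
    """Sum output counts by (country, output_code) for the given Quarter."""
    totals: Dict[Tuple[str, str], int] = defaultdict(int)
    for country, _project, q, code, count in rows:
        if q == quarter:
            totals[(country, code)] += count
    return dict(totals)

for (country, code), total in aggregate_quarter(project_outputs, "Q4FY15").items():
    print(f"{country}: {code} = {total}")
```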

*

Project Evaluations

ChildFund also conducted reviews or evaluations of all projects (#9 in Figure 1, above) – in different ways.  External evaluators were employed under detailed terms of reference to evaluate multi-year projects with more substantial budgets or which were significant for learning or to a particular donor.  Smaller projects were generally evaluated internally.  All evaluators were expected to gather evidence of results against output targets and performance indicators written against objectives.

*

All development effectiveness systems have, at their heart, mechanisms for translating operational experience into learning and program improvement.  In ChildFund’s DEF, this was represented by the central circle in the Figure 1 schematic, which feeds evidence from a variety of sources back into our organizational and Country Strategy Papers, Program Descriptions, and project planning and design.

Our program staff found that their most effective learning often occurred during routine monitoring through observation of project activities and conversations in communities with development partners.  Through thoughtful questioning and attentive listening, staff could make the immediate decisions and quick adjustments which kept project activities relevant and efficient.

Staff also had more formal opportunities to document and reflect on learning.  The tracking of outputs and aggregation each Quarter drew attention to progress and sometimes signaled the need to vary plans or redirect resources.

Project evaluations (#9 in Figure 1, above) provided major opportunities for learning, especially when external evaluators brought their different experiences to bear and offered fresh perspectives on a ChildFund project.

*

The reader can easily grasp that, for me, the DEF was a great success, a significant asset for ChildFund Australia that enabled us to be more accountable and effective.  Some more technically-focused agencies were busy carrying out sophisticated impact evaluations, using control groups and so forth, but that kind of effort didn’t suit the vast majority of INGOs.  We could benefit from the learnings that came from those scientific evaluations, but we didn’t have the resources to introduce such methodologies ourselves.  And so, though the DEF was not perfect, I am not aware of any comparable organization that succeeded as we did.

While the system built on what I had learned over nearly 30 years, and even though I felt that it was designed comprehensively and working very well, that was merely my opinion!

Given the importance of the system, relying on my opinion (no matter how sound!) wasn’t good enough.  So we commissioned two independent, external expert reviews of the DEF.

*

The first review, which was concluded in November of 2012, took place before we had fully implemented the system.  In particular, since Outcome Indicator Surveys and Statements of Impact (to be covered in an upcoming blog article) were implemented only after three years (and every three years thereafter), we had not yet reached that stage.  But we certainly were quite advanced in the implementation of most of the DEF, so it was a good time to reflect on how it was going.

In that light, this first external review of the DEF concluded the following:

The development of the DEF places ChildFund Australia in a sound position within the sector in the area of development effectiveness. The particular strength of ChildFund Australia’s framework is that it binds the whole organisation to a set of common indicators and outputs. This provides a basis for focussing the organisation’s efforts and ensuring that programming is strategically aligned to common objectives. The other particular strength that ChildFund Australia’s framework offers is that it provides a basis for aggregating its achievements across programs, thereby strengthening the organisation’s overall claims of effectiveness.

Within ChildFund Australia, there is strong support for the DEF and broad agreement among key DEF stakeholders and users that the DEF unites the agency on a performance agenda. This is in large part due to dedicated resources having been invested and the development of a data collection system has been integrated into the project management system (budgeting and planning, and reporting), thereby making DEF a living and breathing function throughout the organisation. Importantly, the definition of outcomes and outputs indicators provides clarity of expectations across ChildFund Australia.

One of the strengths of the DEF recognised by in-country staff particularly is that the DEF provides a basis for stakeholders to share their perspectives. Stakeholders are involved in identifying benefits and their perspectives are heard through case studies. This has already provided a rich source of information that has prompted reflection by in-country teams, the Sydney based programs team and the ChildFund Australia Board.

Significantly, the DEF signals a focus on effectiveness to donors and the sector. One of the benefits already felt by ChildFund Australia is that it is able to refer to its effectiveness framework in funding submissions and in communication with its major donors who have an increasing interest on performance information.

Overall, the review found that the pilot of the DEF has been implemented well, with lots of consultation and engagement with country offices, and lots of opportunity for refinement. Its features are strong, enabling ChildFund to both measure how much it is doing, and the changes that are experienced by communities over time. The first phase of the DEF has focused on integrating effectiveness measurement mechanisms within program management and broader work practices, while the second phase of the DEF will look at the analysis, reflection and learning aspects of effectiveness. This second phase is likely to assist various stakeholders involved in collecting effectiveness information better understand and appreciate the linkages between their work and broader organisational learning and development. This is an important second phase and will require ongoing investment to maximise the potential of the DEF. It places ChildFund Australia in a strong position within the Australian NGO sector to engage in the discourse around development effectiveness and demonstrate its achievements.

A full copy of this first review, removing only the name of the author, is attached here: External DEF Review – November 2012.

In early 2015 we carried out a second review.  This time, we had implemented the entire DEF, carrying out (for example) Statement of Impact workshops in several locations.  The whole system was now working.

At that point, we were very confident in the DEF – from our point of view, all components were working well, producing good and reliable information that was being used to improve our development work.  Our board, program-review committee, and donors were all enthusiastic.  More importantly, local staff and communities were positive.

The only major concern that remained related to the methodology we used in the Outcome Indicator Surveys.  I will examine this issue in more detail in an upcoming blog article in this series; but the reader will notice that this second formal, external evaluation focuses very much on the use of the LQAS methodology in gathering information for our Outcome Indicator workshops and Statements of Impact.

That’s why the external evaluator we engaged to carry out this second review was an expert in survey methodologies in general, and in LQAS in particular.

In that light, this second external review of the DEF concluded the following:

ChildFund Australia is to be commended for its commitment to implementing a comprehensive and rigorous monitoring and evaluation framework with learning at its centre to support and demonstrate development effectiveness. Over the past five years, DEL managers in Cambodia, Laos, Papua New Guinea and Vietnam, with support and assistance from ChildFund Australia, country directors and program managers and staff, have worked hard to pilot, refine and embed the DEF in the broader country programs.  Implementing the DEF, in particular the Outcome Indicator Survey using LQAS, has presented several challenges.  With time, many of the early issues have been resolved, tools improved and guidelines developed.  Nevertheless, a few issues remain that must be addressed if the potential benefits are to be fully realised at the organisational, country and program levels.

Overall, the DEF is well suited for supporting long-term development activities in a defined geographic area.  The methodologies, scope and tools employed to facilitate Outcome Indicator Surveys and to conduct Community Engagement and Attribution of Impact processes are mostly fit for purpose, although there is considerable room for improvement.  Not all of the outcome indicators lend themselves to assessment via survey; those that are difficult to conceptualise and measure being most problematic. For some indicators in some places, a ceiling effect is apparent limiting their value for repeated assessment. While outcome indicators may be broadly similar across countries, both the indicators and the targets with which they are to be compared should be locally meaningful if the survey results are to be useful—and used—locally.

Used properly, LQAS is an effective and relatively inexpensive probability sampling method.  Areas for improvement in its application by ChildFund include definition of the lots, identification of the sampling frame, sample selection, data analysis and interpretation, and setting targets for repeated surveys.

Community Engagement and the Attribution of Impact processes have clearly engaged the community and local stakeholders.  Experience to date suggests that they can be streamlined to some extent, reducing the burden on staff as well as communities.  These events are an important opportunity to bring local stakeholders together to discuss local development needs and set future directions and priorities.  Their major weakness lies in the quality of the survey results that are presented for discussion, and their interpretation.  This, in turn, affects the value of the Statement of Impact and other documents that are produced.

The DEF participatory processes have undoubtedly contributed to the empowerment of community members involved. Reporting survey results in an appropriate format, together with other relevant data, in a range of inviting and succinct documents that will meet the needs of program staff and partners is likely to increase their influence.

A full copy of this second review, removing only the name of the author, is attached here: DEF Evaluation – April 2015.
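Since LQAS features so prominently in this review, a short aside may help readers unfamiliar with it.  In LQAS (Lot Quality Assurance Sampling), each “lot” – for example, a supervision area within a program Area – is assessed with a small random sample, commonly 19 respondents, and a simple decision rule: if at least d respondents report the desired outcome, the lot is classified as having reached its coverage benchmark; otherwise it is flagged for attention.  Here is a minimal sketch of how such a decision rule can be derived from binomial probabilities so that both misclassification risks stay at or below roughly 10%.  This is only an illustration of the generic method – the 19-person sample, the 80% benchmark, and the 50% lower threshold are assumptions for the example, not ChildFund’s actual survey parameters.

  import math

  def binom_pmf(k, n, p):
      """Probability of exactly k 'yes' responses in a sample of n, if true coverage is p."""
      return math.comb(n, k) * p**k * (1 - p)**(n - k)

  def lqas_decision_rule(n, benchmark, lower, max_risk=0.10):
      """Smallest decision rule d such that classifying a lot as 'reaches the benchmark'
      when at least d of n respondents report the outcome keeps both
      misclassification risks at or below max_risk."""
      for d in range(n + 1):
          # Risk of wrongly failing a lot whose true coverage equals the benchmark
          alpha = sum(binom_pmf(k, n, benchmark) for k in range(d))
          # Risk of wrongly passing a lot whose true coverage equals the lower threshold
          beta = sum(binom_pmf(k, n, lower) for k in range(d, n + 1))
          if alpha <= max_risk and beta <= max_risk:
              return d, alpha, beta
      return None

  # Example with assumed thresholds: sample 19 per lot, 80% benchmark, 50% lower threshold
  print(lqas_decision_rule(19, 0.80, 0.50))   # -> decision rule of 13, with both risks under 10%

The arithmetic is the easy part; the issues the reviewer flags – defining the lots, building the sampling frame, selecting the sample, and analysing and interpreting the results – are where careful application matters.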

*

Great credit is due to the ChildFund staff who contributed to the conceptualization, development, and implementation of the DEF.  In particular, Richard Geeves and Rouena Getigan in the International Program Team in Sydney worked very hard to translate my sometimes overly-ambitious concepts into practical guidelines, and ably supported our Country Offices.

One of the keys to the success of the DEF was that we budgeted for dedicated in-country support, with each Country Office able to hire a DEL Manager (two in Viet Nam, given the scale of our program there).

Many thanks to Solin in Cambodia, Marieke in Laos, Joe in Papua New Guinea, and Thuy and Dung in Viet Nam: they worked very hard to make the DEF function in their complex realities, and I admire how they made it work so well.

*

In this article, I’ve outlined how ChildFund Australia designed a comprehensive and very robust Development Effectiveness System.  Stay tuned next time, when I describe climbing Mt Bond, and then go into much more depth on one particular component (the Case Studies, #3 in Figure 1, above).

After that, in the following article, I plan to cover reaching the top of West Bond and descending back across Mt Bond and Bondcliff (and losing toenails on both big toes!) and go into some depth to describe how we carried out Outcome Indicator Surveys (#2 in Figure 1) and Statements of Impact (#12) – in many ways, the culmination of the DEF.

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owl’s Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change.

 

 

Trust

April, 2018

I’ve been reading about trust these days, partly as I prepare my next “4000-footer” blog.  I came across this quote, that I like very much:

“‘Management,’ in most of its incarnations, is an institutionalized form of distrust.”(1)

That’s not to say that ‘management’ isn’t necessary.  Rather, in contexts of high trust, traditional ways of ‘managing’ (job descriptions, management by objectives, for example) aren’t appropriate or needed.  In fact, I think that in the context of our INGOs, a very different form of ‘management’ is called for.

This seems right to me.  If so, then the question of how to create contexts of high trust becomes very important.

Interesting food for thought!  Stay tuned for more on this topic in my next article.

 

(1) Robert C. Solomon and Fernando Flores, Building Trust: In Business, Politics, Relationships, and Life, Oxford University Press, 2001, page 43.

Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101.

October, 2017

I began a new journey in May of 2016, tracing two long arcs in my life:

  • Climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers.  I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Working in international development during the MDG era: what was it like in the sector as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980s through to the conclusion of the Millennium Development Goals in 2015.

Since then, across 25 posts (so far), I’ve described climbing 25 4000-foot mountains in New Hampshire, and I’ve reflected on: two years as a Peace Corps Volunteer in Ecuador; my 15 years with Plan International; the deep, disruptive changes in the development sector over that time; and, most recently, the two years I spent consulting with CCF, developing a new program approach for that agency that we called “Bright Futures.”

This time I want to conclude my description of those Bright Futures years by sharing our attempt to encourage a new set of values and attitudes in CCF’s staff, through a weeklong immersive, experiential training workshop we called “Bright Futures 101.”

*

Peter Drucker is supposed to have said that “culture eats strategy for breakfast.”  This certainly seemed to be true as CCF moved into the pilot testing and rollout of Bright Futures – the agency was investing in new systems and new structures in a big way.  But Bright Futures would only realise its promise of more effective work for children living in poverty if the culture of the organisation shifted in how it viewed its work, and how it viewed the people it worked for.

*

But first… I climbed both Mt Lincoln and Mt Lafayette on 22 June, 2017, on a beautiful, mostly-sunny day.  My plan had been to do this loop back in September of 2016, with my brother, but my fall and the resulting injuries (broken rib, torn rotator cuff) forced a postponement.

That morning I left Durham at 6:45am, and drove up through Concord, stopping in Tilton for a coffee, and in Lincoln to buy a sandwich for lunch.  So I didn’t get to the trailhead until just after 9am.

The parking lot at Lafayette Place was nearly full, with lots of people arriving and getting ready to hike on what was a clear, cool day – perfect for hiking.  It was a bit surprising for a Thursday; I was glad not to be doing this climb on the weekend!

I know that I climbed both Lincoln and Lafayette in the distant past, probably in the 1980’s, but I don’t really have any clear memory of the hike.  So it was new to me, again, perhaps 30+ years later!

The trailhead at Lafayette Place serves both the “Falling Waters” Trail and the “Old Bridle Path.”  I planned to walk up Falling Waters, across Franconia Ridge to Mt Lincoln and Mt Lafayette, and then down the Old Bridle Path, back to Lafayette Place.

Screen Shot 2017-07-13 at 12.11.06 PM.png

 

As I started out, there were many people walking along with me, so it took some time to get sorted into a fairly-stable pack.  It took me about 15 minutes to reach the beginning of the Falling Waters Trail; I would return here later in the day, coming down the Old Bridle Path.  So far, it was a beautiful day for hiking!  But lots of people…

I continued up the Falling Waters trail, along the stream with many small waterfalls (so, the trail is aptly named!)  I took lots of photos and several videos of the waterfalls.  The trail ascended steadily, moderately, along the brook.

IMG_0888

 

The walk was typical White-Mountains rock-hopping, moderately and steadily upward in the shadow of Mt Lincoln.  I was working pretty hard, and gradually more space opened up between groups of hikers.  There were no insects during this part of the hike – indeed, there would be none until I got to Greenleaf Hut later in the afternoon.

I started to emerge from the forest into scrub pine at about 11am, and the views across to Franconia Notch became remarkable:

IMG_0888

 

Then, suddenly, I was out of the trees, ascending Little Haystack, and the views were just spectacular:

IMG_0907

Mt Lafayette and Franconia Notch

IMG_0902

Mt Lincoln Just North Of Mt Haystack

IMG_0901

Looking North Towards Mt Lincoln

IMG_0896

Franconia Notch.  Cannon Mountain is Clearly Visible At The Top Of Franconia Notch

IMG_0894

North and South Kinsman Visible Across Franconia Notch

IMG_0891

Cannon Mountain and the Kinsmans

 

I reached the top of Little Haystack at 11:25am, where I joined the Franconia Ridge Trail:

IMG_0908.jpg

 

I had been ascending the western slopes of Mt Lincoln; once I got up onto Franconia Ridge, views to the east were just as amazing: I was above Owl’s Head, and could easily see Bond Mountain, West Bond, and Bondcliff (all of which I would climb on a very long day in September, later that year), and out across the Twins to Washington and the Presidential Range in the distance.  Maybe I could see the Atlantic Ocean far in the distance.

IMG_0911

Looking East Towards Owl’s Head and the Bonds

IMG_0915

Looking South Towards Mt Liberty and Mt Flume

IMG_0913

Looking North Towards Mt Lincoln

 

There were many people at the top of Little Haystack, some of whom were probably staying at the nearby Greenleaf AMC Hut, which I would pass on my way down later.  But many were also doing the same loop that I was, across Lincoln and Lafayette.  One amazing boy, maybe 4 years old, was zipping along ahead of his mother, who kept calling him back.  He seemed full of energy, and wanted to fly ahead.  I wondered how long his energy would last, but he certainly kept it up for the whole time I saw him… weaving in and out of my path, with his mother calling out to him all the way.

The walk along Franconia Ridge, to Mt Lincoln, was spectacular.

 

I arrived at the summit of Mt Lincoln right at noon, and rested briefly.  It had taken about 2 hours and 40 minutes to the top from the Lafayette Place parking area.

IMG_0927.jpg

 

It was too early for lunch, so I soon left Mt Lincoln and headed north towards Mt Lafayette.  I will describe that hike, and the trek back down, next time!

*

Last time I described how we had piloted the Bright Futures program approach in CCF, further developing and testing the methods, systems, and structures that had been defined through our research and internal and external benchmarking.  It was a very exciting process, and I was lucky to be asked to accompany the pilot offices in Ecuador, the Philippines, and Uganda as they explored the disruptive changes implied in Bright Futures.  Lots of travel, and lots of learning and comradeship.

Near the end of that period, I came into contact with the Unitarian Universalist Service Committee (UUSC), a human-rights, social-justice campaigning organization based in Cambridge, Massachusetts.  In late 2004, as I was finishing my consulting time with CCF as acting Regional Representative for East Africa, based in Addis Ababa, I was offered a position at UUSC as Executive Director (initially as “Deputy Director”) working for Charlie Clements, UUSC’s dynamic and charismatic president and CEO.

Working at UUSC would be a big and exciting shift for me, out of international development and into social justice campaigning.  But the move felt like a natural extension of what we had been doing in CCF, where we had included an explicit focus on building the power of excluded people into Bright Futures.  I was able to use what I had learned across 20 years in the international development sector, leading and managing large international agencies, to lead and manage operations at UUSC, while also learning about campaigning and advocacy (and working in a unionized context!)

I’ll begin to describe my years at UUSC next time.  For now, I want to skip forward a few years, to my second, brief incarnation with CCF.

*

In early 2009, a few former colleagues at CCF – by then rebranded as ChildFund International – got back in touch.  At that point I had transitioned to the 501(c)(4) branch of UUSC, which we had created in 2008, and I had some spare time after the federal election the year before.  (More on that in a future post.)

Between 2004 and 2009, ChildFund had continued to roll out Bright Futures, but there had been major changes in leadership.  Sadly, John Schulz, CCF’s president, had taken a leave of absence to fight cancer, and had then died.  Though I had never worked directly with John, I had always appreciated his leadership and his unwavering support to Daniel Wordsworth and Michelle Poulton as they redesigned the agency’s program approach.

The internal leadership changes that took place after John’s departure led to Daniel and Michelle leaving CCF, as Anne Goddard became the agency’s new CEO in 2007.  Initially, at least, it seemed that the global transition to Bright Futures continued to be a priority for ChildFund.  (Later, that would change, as I will describe below…)

During that period, as Bright Futures was scaled up across the agency, many structural and systems-related challenges were addressed, and staff inside ChildFund’s program department were busy addressing these issues – updating their financial systems, transitioning long partnerships, training new staff in new positions.  In particular, Mike Raikovitz, Victoria Adams, Jason Schwartzman, and Dola Mohapatra were working very hard to sort out the nuts and bolts of the change.

As I noted above, it is a truism, attributed to Peter Drucker, that “culture eats strategy for breakfast.”  Alongside their important, practical work, Jason and Dola in particular were learning that lesson, and as a result they began to focus also on the cultural side of the change involved in Bright Futures: the attitudes and values of ChildFund staff.  Systems and structures were vital elements of Bright Futures, but nothing would work if staff retained their old attitudes toward their work, and toward the people they worked with and for.  And there was a clear need, from Jason’s and Dola’s perspective, for attitude shifts; in fact, it seemed to them that the biggest obstacle to implementing Bright Futures was the old values and attitudes of existing staff.

*

Dola, a brilliant and highly-committed professional, worked as Deputy Regional Director for ChildFund Asia.  I worked closely with Dola in the design and implementation of BF101, and I enjoyed every moment of it; I admired his passion and commitment to ChildFund’s work, and his dedication to improving the effectiveness of ChildFund’s programming.

DSC01210

Dola Mohapatra, at the BF101 workshop

 

Jason managed a range of program-related special projects from ChildFund’s headquarters in Richmond, Virginia.  Jason was (and is) a gifted and insightful professional, whom I had met back during my tenure as Plan’s program director, when he had worked with CCF’s CEO on a collaboration with Plan, Save the Children, and World Vision.  Jason had rejoined ChildFund to help develop an approach to working with youth.

Screen Shot 2017-10-25 at 11.16.09 AM.png

Jason Schwartzman, on the left, during our community immersion

 

In addition to Dola and Jason, I worked closely with Evelyn Santiago, who was ChildFund Asia’s program manager.  Evelyn brought key skills and experience to the design of our workshop.

DSC01519 (1024x768)

Evelyn Santiago at the BF101 Workshop

Screen Shot 2017-10-25 at 4.21.43 PM.png

Jason, Me, Dola and Evelyn

 

As noted above, Dola and Jason had identified the need to reinforce the values and attitudes side of Bright Futures, and felt that a deep, experiential-learning event might help better align staff with the principles of the new program approach.  They approached me for help and, as I had some time, we worked together to design and carry out a ten-day workshop that we called “Bright Futures 101” – in other words, the basics of Bright Futures, with a big emphasis on values and attitudes.

Working with Jason, Dola and Evelyn was a privilege – they were, and are, smart, experienced professionals whose commitment to social justice, and to the principles and values of Bright Futures, was strong.

In this blog post, I want to describe “BF101” – our approach, the design, and how it went.

*

Rather than simply introducing the tools incorporated into Bright Futures, our purpose was to promote and encourage the kinds of personal transformation required to make the new program approach a reality.  So we prepared something that ChildFund had never tried before – a long, experiential workshop with a village stay.

From the beginning, we agreed that BF101 would have two overall objectives:

  1. to build a comprehensive understanding of the principles underlying ChildFund’s Bright Futures program approach; and
  2. to build a questioning, exploring, and adaptive approach to program development and implementation that was aligned with ChildFund’s value of fostering and learning from its own innovation.

So, implicitly, we wanted to shift ChildFund’s culture.  By including significant participant leadership, immersion in communities, experiential education, and pre- and post-course assignments, we wanted to promote a meaningful connection between head (understanding), heart (values and principles), and hand (concrete action) – hoping that this connection would spill over into participants’ daily work when they returned home.  A 1 1/2-day immersion in a local community would be a key component of the workshop.

After a lengthy, collaborative design process, we agreed on a three-part workshop design (included here – Building Program Leaders – Immersion Workshop – Final Preworkshop Version).  The overall framework looked like this:

Screen Shot 2017-10-25 at 1.53.26 PM.png

Once Dola and Evelyn approved the design, they asked ChildFund Philippines to book a venue, and invitations were sent out to 3 or 4 participants from each office in Asia.  Extensive pre-reading assignments were sent to each participant, covering current trends in poverty and international development as well as the fundamental documents related to Bright Futures that I have shared in earlier posts in this series, such as the CCF Child Poverty Study, the Organisational Capacity Assessment, etc.

*

In the first workshop section, “Setting the Stage,” we would prepare participants for the experience.  A lengthy role play, adapted from a full-day exercise I had created in Viet Nam, was designed to challenge participants in an experiential, emotional manner, helping them actually feel what it was like to be a community member participating in programs implemented by ChildFund in the old way, the pre-Bright-Futures way.

We assigned various roles – community members (dressed appropriately), staff members of a fictitious NGO called “WorldChild International” (wearing formal attire), observers, etc.  I had written an extensive script (Role Play – Module 1 – Design – 4) which set up a series of interactions designed to provoke misunderstandings, conflict, moments of emotional impact, and some fun:

Role Play 3 (1024x768)

DSC00544

DSC01079

 

As usual, the most important part of any exercise like this one was the group reflection afterwards, in this case led by Lloyd McCormack:

DSC01089

 

This led into a session, which I led, on mind-shifts and archetypes: M2 – Archetypes – 2.  The purpose here was to build on the impact from the role play to get participants thinking about their own attitudes and values, and how they might need to shift.

Ending the first section of the workshop, Jason, who had flown in directly from the US and was quite jet-lagged, gave an excellent historical overview of CCF’s programmatic evolution.  This presentation contained an important message of continuity: Bright Futures was the next step in a long and proud programmatic history for the agency – we were building on what had been accomplished in the past, not starting over.  Jason’s presentation set the scene for our work on the changes in attitudes and values that were in store:

Jason4.jpg

 

The next sessions outlined each of the main values and commitments articulated in Bright Futures (at least at that point in its evolution):

  • Deprived, Excluded, and Vulnerable children are our primary focus.  This session built on the CCF Poverty Study, which I described in an earlier post in this series.  At BF101 we sought to unpack what this “primary focus” would mean in practice;
  • We Build on the Stages of Child Development.  After I had concluded my tenure as a consultant at CCF, program development efforts had built on Bright Futures by articulating a clear theory of child development, along with interventions related to each stage.  This was a very good development in ChildFund’s program approach which, however, had the potential to conflict with the bottom-up nature of Bright Futures.  So this section of BF101 would deepen understanding of how to resolve this seeming contradiction in practice;
  • Programs are Evidence-Based.  Again, ChildFund had continued to develop aspects of its program approach, building on Bright Futures to professionalize the design of projects and programs.  As above, this was very positive, but it also had the potential to conflict with the bottom-up nature of Bright Futures.  So we would reflect on how to resolve this seeming contradiction in practice;
  • We Build Authentic Partnerships.  This commitment flowed directly from the work we had done on Bright Futures earlier.

*

Perhaps the most important element of the BF101 design was a 1 1/2-day stay in communities.  We divided the participants into smaller groups, and set out to spend a night in communities near the conference center:

 

*

Our concluding sessions were aimed at building on the community immersion by considering a range of personal and institutional transformations required, discussing systems implications, and then breaking into National Office groups to plan for action after the workshop.

*

During the workshop, Jason was blogging regularly, and asked me to prepare one, also.  Here is one of Jason’s blogs: http://ccfinthefield.blogspot.com/2009/05/opposite-sides-time-to-reflect.html.  And here is mine: http://ccfinthefield.blogspot.com/2009/05/seeking-balance.html.

*

We used a simple tool to track participant assessments along the way:

IMG_1047.jpg

 

As can be seen, the overwhelming majority of participants rated the workshop as very positive and helpful.  I myself felt quite happy with the workshop – I felt that we had gotten fairly deep into discussions that had the potential to transform people’s attitudes and values in a positive way.  Although it was a lot to ask people to set aside their work and families for seven full days, and to spend a night in a village, it seemed to pay off.

So, BF101 was successful, and fun.  Together with the systems work and structural shifts that were ongoing in the agency, it set the scene for the continued rollout of Bright Futures across ChildFund International, now including a positive, constructive way to promote values and attitudes consistent with the new program approach.

*

But, sadly, Bright Futures would soon be set aside by ChildFund.  In what felt like an echo of Plan International’s pathology (new leadership = starting over from scratch), despite having embraced the approach initially, ChildFund’s new leadership moved deliberately away from Bright Futures.  The global financial crisis had erupted and, like many international NGOs, ChildFund’s income was dropping.  It was felt that investment in the transition to Bright Futures was no longer affordable, so much of the investment in research, piloting, systems development, and training (for example, followup to BF101) was dropped.

As a consultant, I could only look at this decision with sadness and regret.  The dedication and resources that Michelle, Daniel, Victoria, Mike, Jon, Andrew, Jason, Dola and many others across ChildFund had invested in such a positive and disruptive shift were, to a great extent, lost.

Many years later, when I joined ChildFund Australia as International Program Director, a very senior program leader expressed similar regret to me, lamenting that Bright Futures had given the agency a clear ideology – something it now lacked.

I’ve recently been reminded of another consequence of the virtual abandonment of Bright Futures: a year later, 65% of the participants in the BF101 workshop had left ChildFund.  Perhaps we didn’t do enough to help participants operationalize the changes we were promoting, in the context of ChildFund’s reality of the time.  But that would have been quite a contradiction of the basic message of BF101: that each person needed to take the initiative to operationalize their own transformations.

My own assumption is that the personal transformations begun during our week in the Philippines led to significant disappointment when the agency didn’t follow through, when ChildFund didn’t (or wasn’t able to) invest in creating BF102, 202, etc.

*

Why is it that international NGOs so often suffer this phenomenon, that when leadership changes (at country, regional, or global levels) everything changes?  That new leaders seem to view the accomplishments of their predecessors as irrelevant or worse?

I think it comes, at least in part, from the way that we who work in the value-based economy associate ourselves, and our self-images, with our work so strongly and emotionally.  This ego-driven association can be a great motivator, but it also clouds our vision.  I saw this many times in Plan, as many (if not most) new Country Directors or Regional Directors or International Executive Directors scorned their predecessors and dismissed their accomplishments as misguided at best, quickly making fundamental changes without taking the time to appreciate what could be built upon.  And, when the next generation of leaders arrived, the cycle just repeated and repeated.

This, to me, is the biggest weakness of our sector.  Today, alongside this ego-driven pathology, the entire international-development sector is also facing severe disruptive change, which greatly complicates matters… but that’s a story for another day!

*

Meanwhile, I made the big move, joining UUSC as Executive Director, shifting from international development to social justice and human rights campaigning, internationally and domestically.  And into a strongly unionized environment.  These were the days of Bush’s Iraq invasion, torture and neoliberal economics, and I was excited to turn my work towards the grave problems affecting my own country.

Next time I will begin to tell that part of the story… stay tuned!

*

Here are links to other blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration.

Mt Isolation (25) – Pilot-Testing Bright Futures

September, 2017

I began a new journey in May of 2016, tracing two long arcs in my life:

  • Climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers.  I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Working in international development during the MDG era: what was it like in the sector as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980s through to the conclusion of the Millennium Development Goals in 2015.

So, in the end, there will be 48 posts about climbing 48 mountains, and about various aspects of the journey thus far…

*

Leaving Plan International after 15 years, the last four of which were spent as Country Director in Viet Nam, I was fortunate to join CCF as a consultant.  My task, over what became two great years with CCF, was to help develop a new program approach for the agency.  This was exciting and opportune for me: I had been reflecting a lot about how things had changed in the development sector, and at that point I had a lot of experience across five continents, in a wide variety of roles, under my belt.

There was probably nobody in the world better suited for the task.

*

Last time, I wrote extensively about what we came up with: the “Bright Futures” program approach.  We developed the approach through a very thorough process of reflection, benchmarking, and research, and even though the changes foreseen for CCF were very significant and disruptive, senior management and board embraced our recommendations enthusiastically.  We were given the green light to pilot test the approach in three countries – Ecuador, the Philippines, and Uganda – and I was asked to train staff in each location, accompany the rollout, document learning, and suggest refinements.

This meant that I would continue to work with Michelle Poulton and Daniel Wordsworth, development professionals I’ve described in earlier blogs, people I admired and who were very serious about creating a first-class program organization.

What a fantastic opportunity!

In this blog, I want to describe how the pilot testing went.  But first…

*

I climbed Mt Isolation (4004ft, 1220m) on 8 June 2017.  Since getting to the top of Mt Isolation would be a long hike, I wanted to have a full day, so I drove up the previous afternoon and camped at Dolly Copp Campground in Pinkham Notch.

Screen Shot 2017-07-08 at 4.00.36 PM.png

 

As you can see, I went up to the top of Mt Isolation and retraced my steps back, which involved quite a long hike.  I’ve included a large-scale map here, just so that the context for Mt Isolation can be seen: it’s in a basin to the south and east of the Presidential range, with Mt Washington, Adams, Jefferson and Monroe to the north, and Eisenhower, Pierce, and Jackson to the west.

Sadly, I spent an uncomfortable night at Dolly Copp, mainly because I had forgotten the bottom (that is to say, lower) half of my sleeping bag!  So I tossed and turned, and didn’t get a great night’s sleep.

IMG_0667.jpg

 

But the advantage was that I was able to get an early start on what I thought might be a long climb, leaving the Rocky Branch parking lot, and starting the hike at about 7:15am, at least two hours earlier than I would have started if I had driven up from Durham that morning.

IMG_0670.jpg

 

The hike in the forest up the Rocky Branch Trail was uneventful, though there was lots of water along the way, and therefore lots of rock-hopping!  That trail isn’t very well maintained, and with recent heavy rains there were long sections that were more stream than path!

I reached the junction of Rocky Branch and Isolation Trail at about 9:15am, two hours from the start of the hike.  I crossed over and headed upstream.  Rocky Branch was full, as expected with all the rain, so crossing was a bit challenging.  There were four more crossings as I headed up, through the forest, before I reached the junction of Isolation Trail and Davis Path at about 11am.  I have to admit that I dipped my boots into the Rocky Branch more than once on the way up, and even had water flow into my boot (over the ankle) once!  So the rest of the hike was done with a wet left foot…

IMG_0791IMG_0694

IMG_0679

 

 

Once I got onto Isolation Trail, I found it was better maintained than Rocky Branch Trail had been.  Evidence of a strong storm was obvious near the top, where I joined Davis Path: lots of downed trees had been cut to clear the trail, but the path was still narrow in places, crowded with downed trees and shrubs on both sides.

As I hiked up Isolation Trail, still in the forest, I began to have views of the Presidential Range.  I reached the turnoff for the spur up to the summit of Mt Isolation at about 11:30am, and reached the top a few minutes later.  So it took me about 4 3/4 hours to reach the top; I didn’t see any other hikers on the way up.

The view from the top was fantastic, probably the best so far of all the hikes in this series: it was clear and dry, and I had the whole Presidential Range in front of me.

IMG_0757

From the Right: Mt Washington, Mt Adams, Mt Jefferson

IMG_0728

Mt Eisenhower

IMG_0722

IMG_0719

IMG_0727

 

 

And I had a winged visitor, looking for food.

IMG_0761

IMG_0740

 

But I also had hordes of one other species of visitor: for only the second time in these 25 climbs, swarms of black flies quickly descended, making things intolerable.  Luckily, I had carried some insect repellent left over from our years in Australia, and once I applied generous quantities to my face, arms and head, the black flies left me alone.  Otherwise I would have had to leave the summit immediately, which would have been a real shame, because I had walked nearly 5 hours to get there, without very many views along the way!

IMG_0814

 

Happily, I was able to have a leisurely lunch at the top.  The views were glorious, and the flies left me alone.

After I left, retracing my steps down, I did meet a few hikers, including a mother and son who had come up from Glen Ellis Falls.  Descending Rocky Branch, of course, I had to cross the river again, five more times.  In fact, I crossed once in error and had to recross to get back to the trail… so, make that seven more times!  Happily, the crossings seemed easier to navigate on the way back: either the water had gone down (unlikely), or I was a bit more familiar with the right spots to cross.

I arrived back at the parking lot at about 4pm, having taken almost nine hours to climb Mt Isolation.  Tired, but it was a great day out!

*

Change is complicated and, given the nature of our value-driven organisations, changing international organisations is particularly challenging (see my 2001 article on this topic, available here: NML – Fragmentation Article).  Even though the close association that our best people make between their work and their own personal journeys is a huge advantage for our sector (leading to very high levels of commitment and motivation), this same reality also produces a culture that is often resistant to change.  Because when we identify ourselves so closely with our work, organisational change becomes personal change, and that’s very complicated!

And the changes implied with Bright Futures were immense, and disruptive.  We were asking pilot countries:

  • to move from: programs being based on a menu of activities defined at CCF’s headquarters, and focused on basic needs;
  • towards: programs based on a broad, localized, holistic and nuanced understanding of the causes and effects of the adversities faced by children, and of the assets that poor people draw on as they confront adversity.

The implication here was that pilot countries would need to deepen their understanding of poverty, and also learn to grapple with the complexity involved in addressing the realities of the lived experience of people living in poverty.  In a sense, staff in pilot countries were going to have to work much harder – choosing from a menu was easy!

  • to move from: programs being led by local community associations of parents, whose task was primarily administrative: choosing from the “menu” of activities, and managing funds and staff;
  • towards: programs being designed to enhance the leading role (agency) of parents, youth, and children in poor communities, by ensuring that they are the primary protagonists in program implementation.

The implication was that pilot countries could build on a good foundation of parents’ groups.  But extending this to learning to work appropriately with children and youth would be a challenge, and transforming all of these groups into authentic elements of local civil society would be very complex!  The reality was that the parents’ associations were often not really managing “their” staff – often it was the other way ’round – that would have to change.  Another of the challenges here would be for existing staff, and community members in general, to learn to work with groups of children and youth in non-tokenistic ways.

  • to move from: carrying out all activities at the local community level;
  • towards: implementing projects wherever the causes of child poverty and adversity are found, whether at child, family, community, or Area levels.

This would be a big challenge for pilot countries, because it involved understanding the complex linkages and causes of poverty beyond the local level, and then understanding how to invest funds in new contexts to achieve real, scaled, enduring impact on the causes of child poverty and adversity.

One big obstacle here would be the vested interests that had been created by the flow of funds from CCF into local communities and the parents’ groups. Not an easy task, fraught with significant risk.

And, on top of all of that, Bright Futures envisioned the consolidation of the existing local-level parent’s associations into parent “federations” that would operate at “district” level, relating to local government service provision.  Transforming the roles of the existing parents’ associations from handling (what was, to them) vast quantities of money, to answering to an entirely new body at “district level” was a huge challenge.

  • to move from: working in isolation from other development stakeholders;
  • towards: integrating CCF’s work with relevant efforts of other development agencies, at local and national levels.

This would require a whole new set of sophisticated relational and representational competencies that had not been prioritized before.

For example, in a sense, CCF had been operating in a mechanical way – transfer funds from headquarters to parents’ groups, which would then simply choose from a menu of activities that would take place in the local community. Simple, and effective to some extent (at least in terms of spending money!), but no longer suitable if CCF wished to have greater, longer-lasting impact, which it certainly did.

  • to move from: annual planning based on local parents’ groups choosing a set of activities from a menu of outputs, all related to basic needs;
  • towards: planning in a much more sophisticated way, with the overall objective of building sustainable community capacity, the ability to reflect and learn, and resilience, and of achieving impact over an estimated four 3-year planning periods.

CCF would have to create an entirely new planning system, focused on the “Area” (district) level but linked to planning at Country, Regional, and International contexts.

Fundamental to this new system would be the understanding of the lived reality of people living in poverty; this would be a very new skill for CCF staff.  And pilot countries would have to learn this new planning system immediately, as it would be the foundation of pilot operations… so we had to move very quickly to develop the system, train people, and get started with the new way of planning.  (I will describe that system, the “ASP,” below…)

  • to move from: program activities taking place far from CCF’s operational structure, with visits by staff to local communities once per year;
  • towards: programs being supported much more closely, by decentralizing parts of CCF’s operational structure.

This was a huge change, involving setting up “Area” offices, staffing these offices with entirely new positions, and then shifting roles and responsibilities out from the Country Office.

There was deep institutional resistance to this move, partly because of a semi-ideological attachment to having parents make all programmatic decisions (which I sympathized with, although the evidence was clear that the program activities that resulted were often not high-quality).

But resistance also came from a more mundane, though powerful, source: showing a “massive” increase in staffing on CCF’s financial statements would look bad to charity watchdogs like Charity Navigator and GuideStar.  Total staffing across the system would actually go down, because staffing at the “parents’ associations” would decrease significantly; but those employees had never appeared on CCF’s books, since they were technically employees of the associations, whereas the new Area-office staff would.  So the appearance would be a negative one, from a simple bookkeeping, ratio-driven point of view.  And this “point of view” was of the very highest priority to CCF’s senior management, because it strongly influenced donor behavior.

  • to move from: funding program activities automatically, to parents’ groups on a monthly basis, as output “subsidies”;
  • towards: projects being funded according to the pace of implementation. 

This would be another enormous, foundational change, entailing a completely-new financial system and new flows of funding and data: now, the Country and Area offices would authorize fund transfers to the local parents’ (and child and youth) associations based on documented progress of approved projects.

All of this would be new, so CCF had to develop project documentation processes and funding mechanisms that provided sufficient clarity and oversight.
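Purely to illustrate the kind of rule involved – this is my own toy sketch, not CCF’s actual financial system, and the 75% threshold is an arbitrary assumption – the shift was from a fixed monthly subsidy to something like this: release the next tranche for a project only when enough of the previous disbursement has been documented.

  from dataclasses import dataclass

  @dataclass
  class Project:
      name: str
      approved_budget: float      # total approved for the project
      disbursed: float            # funds transferred so far
      documented_spend: float     # expenditure verified by progress documentation

  def next_tranche(project: Project, tranche: float, min_documented_share: float = 0.75) -> float:
      """Toy rule: release the next tranche only if enough of the previous
      disbursements has been documented; real systems would also track milestones."""
      if project.disbursed == 0:
          return min(tranche, project.approved_budget)      # first tranche
      if project.documented_spend / project.disbursed < min_documented_share:
          return 0.0                                        # hold funds until progress is documented
      remaining = project.approved_budget - project.disbursed
      return min(tranche, remaining)

  # Example: a project that has documented most of its first disbursement
  p = Project("community water points", approved_budget=12000, disbursed=4000, documented_spend=3500)
  print(next_tranche(p, tranche=4000))   # -> 4000.0, since 3500/4000 of the spend is documented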

To properly test Bright Futures, we would need to provide a lot of support to the pilot countries as they grappled with these, and other, disruptions!

*

In this blog post, I want to describe several aspects of the year that we piloted Bright Futures in Ecuador, the Philippines, and Uganda as they moved to implement the disruptive changes outlined above: how we helped staff and leadership in the three pilot countries understand what they were going to do; how we worked with them to get ready; and how we accompanied them as they commenced working with the Bright Futures approach.   And how we developed, tested, and implemented an entirely new set of program-planning procedures, the Area Strategic Plan methodology.

As I have just noted, Bright Futures was a profoundly different approach than what these pilot countries were used to, deeply disruptive.  So we set up what seems to me to have been, in retrospect, a careful, thorough, rigorous, and exemplary process of support and learning.  In that sense, I think it’s worth describing the process in some detail, and worth sharing a sample of the extensive documentation that was produced along the way.

*

Before beginning to pilot, we carefully identified what we would be testing and how we would measure success; we set up processes to develop the new systems and capacities that would be needed in the pilot countries and at CCF’s headquarters; and we established mechanisms to support, and learn from, the pilot countries as they pioneered a very new way of working.

In the end, I worked closely with the three pilot countries for a year – helping them understand what they were going to do, working with them to get ready, and then accompanying them as they commenced working with the Bright Futures approach.  And, along the way, I supported staff in the Richmond headquarters as they grappled with the changes demanded of them, and with the impact of the changes on headquarters systems and structures.

When CCF’s senior management had agreed to the pilot testing, the president (John Schulz) had decided that the organization would not make changes to key systems and structures across the agency until pilot testing was complete and a full rollout of Bright Futures had been approved.  This meant that the functional departments at headquarters had to develop “work-arounds” so that the pilot areas could manage the financial and donor-relations aspects of their work.

This made sense to me: why spend the time and money to develop new systems when we didn’t know if, or how, Bright Futures would work?  But it meant that much of the agency, including all three pilot Country Offices, would be running parallel basic organizational processes – especially financial processes – at the same time, just adding to the complexity!

*

First we brought key staff from each country together with staff from CCF’s headquarters in Richmond, Virginia, to develop a shared understanding of the road ahead, and to create national plans of action for piloting.  Management approved these detailed plans in late May of 2003.

I recently rediscovered several summary videos that I prepared during the creation and pilot testing of what became Bright Futures.  These videos were used to give senior management a visual sense of what was happening in the field.

Here is a short (11-minute) summary video of the preparation workshop that took place in late April of 2003:

 

It’s fun for me to see these images, now 14 years old: the people involved, and the approaches we used to start pilot testing Bright Futures.  Staff from all three pilot countries are shown, along with Daniel and Michelle, and other senior staff from Richmond.

One important result of that launch workshop was the production of a set of management indicators that would be used to assess pilot performance: the indicators would be measured in each pilot country before and after the pilot-testing period.  The agreed indicators reflected the overall purposes of the Bright Futures program approach (see my previous blog), and can be found here: Piloting Management Indicators – From Quarterly Report #2.

Once the detailed national plans of action were approved, we scheduled “Kickoff” workshops in each pilot country.  These two-day meetings were similar in each location, and included all staff in-country.  On the first day, we would review the background of the pilot, including summary presentations of CCF’s strategic plan, the Organisational Capacity Assessment, and the CCF Poverty Study.  Then the basic principles, concepts, and changes included in the pilot testing were presented and discussed, along with an outline of the pilot schedule.  At the end of the first day, we handed out relevant background documentation and asked participants to study it in preparation for the continuation of the meeting on the second day.

The second day of these Kickoff meetings was essentially an extended question and answer, discussion and reflection session, during which I (and staff from CCF’s headquarters, when they attended) would address concerns and areas where more detail was required.  Occasionally, since I was an external consultant, there were questions that needed discussion with functional departments at CCF’s headquarters, so I tracked these issues and methodically followed them up.

During these initial visits, I also worked with Country Office leadership to help them obtain critical external support in two important and sensitive areas:

  • Given the fundamental nature of the changes being introduced, and in particular noting that only part of the operations in each pilot country would be testing Bright Futures, human-resources issues were crucial.  Bright Futures would demand new competencies, new structures, new positions, and change management would be complex.  So in each country we sought external support from specialised agencies; I worked with CCF’s director of human resources in Richmond, Bill Leedom, to source this support locally;
  • One particular skill, on the program side, would be pivotal: new planning systems would require field staff to master the set of competencies and tools known as “PRA” – participatory rural appraisal.  (I had first come across PRA methods at Plan’s Field Office in Tuluá, back in 1987, but somehow most CCF staff had not become familiar with this approach.  Some had, of course, but this gap in knowledge was an example of how CCF staff had been somewhat isolated from good development practice.)  Since by 2003 PRA was completely mainstream in the development world, there were well-regarded, specialised agencies in most countries that we contacted to arrange training.

Also, in this first round of visits, I worked with local staff to finalise the selection of two pilot “Areas” in each country.  I visited these locations, helping to determine the details of staffing in the Areas and to decide on systems and structural issues (such as how funds would flow, and how local parents’ associations would evolve as district-level “federations” were formed).

*

Once the two “Areas” in each pilot country began working, I started to issue quarterly reports, documenting progress and concerns, and including visit reports, guidance notes issued, etc.  (I continued to visit each country frequently, which meant that I was on the road a lot during that pilot-testing year!)  These quarterly reports contained a very complete record of the pilot-testing experience, useful for anybody wanting (at the time) to have access to every aspect of our results, and useful (now) for anybody wanting to see what the rigorous pilot-testing of an organizational change looks like.

I produced five lengthy, comprehensive quarterly reports during that year, which I am happy to share here:

*

Staff from functional departments at CCF’s headquarters also visited pilot countries, which we encouraged: support from Richmond leadership would be important, and their input was valuable.  Of course, leaders at headquarters would need to be supportive of the Bright Futures model once the pilot-testing year was concluded, if CCF were to scale up the approach, so exposing them to the reality was key, especially because things went well!

We asked these visitors to produce reports, which are included in the quarterly reports available in the links included above.

*

An interesting dynamic that developed during the year is reflected in a report produced by Bill Leedom, who was CCF’s HR director at the time.  Bill’s visit report for a visit he made to Ecuador is included in the Q1FY04 Quarterly Report (Q1FY04 – 2).  In his report, he describes a discussion he had with the Country Director:

“Carlos (Montúfar, the Country Director in Ecuador) and I had a discussion about the role of consultants in the organization. Although it appears at times that the consultant is running the organization it must be the other way around. CCF hires a consultant to help with a process and then they leave. They are a “hired gun.” If changes are recommended they cannot be implemented without his approval as he will have to live with the consequences of whatever was done. The consultant moves on to another job and does not have to suffer any consequences of a bad recommendation or decision but he and his staff have to. I think Carlos was glad to hear this and hopefully will “stand up” to and express his opinions to what he believes might not be good recommendations by consultants.”

When Bill uses the word “consultants,” I know that he is politely referring to me!  My recollection is that this comment reflects a strong dynamic that was emerging as we pilot tested Bright Futures: leadership in the three pilot countries had volunteered to pilot test a particular set of changes, perhaps without fully understanding the ramifications, or without fully understanding that headquarters (meaning, mostly, me!) would be accompanying the pilot process so closely.

Understandably, leaders like Carlos wanted to maintain authority over what was happening in their programs, while headquarters felt that if we were going to test something, we had to test it as designed, and learn what worked and what didn’t work, without making changes on the fly.  Only after testing the model as proposed would we make changes or adaptations as we prepared to scale up.  Otherwise, we’d never be able to document the strengths and weaknesses of what we had agreed to pilot.

But not everything went perfectly – that’s why we were pilot testing, to discover what we needed to change!  When things didn’t go well, naturally, people like Carlos wanted to fix it.  That led to tension, particularly in Ecuador – perhaps because the program in that country was (rightly) highly-esteemed.

Carlos resisted some of the guidance that I was giving, and we had some frank discussions; it helped that my Spanish was still quite fluent.  But Daniel and Michelle, program leadership in Richmond, made it clear to me, and to Carlos and his regional manager, that we needed to test Bright Futures as it had been designed.  So even though I was an external consultant, I felt that I was on strong ground when I insisted that pilot countries proceed as we had agreed at the launch workshop in April of 2003.

*

From the beginning, we understood that an entirely-new planning, monitoring, and evaluation methodology would need to be developed for Bright Futures.  Since this would be a very large piece of work, we sought additional consulting help, and were fortunate to find Jon Kurtz, who worked with me to prepare and test the Bright Futures “Area Strategic Planning” method, the “ASP.”

We wanted to take the CCF Poverty Study very seriously, which meant that a rigorous analysis of the causes of child poverty and adversity, at various levels, had to be evident in the ASP.  And we had to make sure that program planning reflected all of the principles of Bright Futures – involving, for example, children and youth in the ASP process, incorporating other stakeholders (local NGOs operating in the Area, district government), and so forth.

Area Strategic Planning was aimed at supporting CCF’s goal of achieving broader, deeper and longer-lasting impact on child poverty.  To do this, the ASP process was guided by several key principles.  These principles can be seen in terms of the goals that ASP was designed to help programs to achieve:

  • Understanding poverty: Programs will be based on a deep understanding of, and responsive to the varied nature of child poverty across the communities where CCF works.
  • Leading role: Programs will build the capacities of parents, youth and children to lead their own development. Each group will be given the space and support required to take decisions and action to improve the wellbeing of children in their communities and Areas.
  • Linkages: Programs will be linked to and strengthen the resources that poor people call upon to improve their lives. Efforts will strive to build on the existing energies in communities and on relevant efforts of other development agencies.
  • Accountability: Programs will be recognized by sponsors and donors for their value in addressing child poverty, and at the same time will be accountable to the partner communities, especially the powerless and marginalized groups.
  • Learning: Programs will be based on best practices and continuous learning from experiences. Planning, action and review processes will be linked so that lessons from past programs are reapplied to improve future efforts.

The process for conducting Area Strategic Planning was structured to reflect these principles and aims.  It was foreseen that the ASP process would evolve and be innovated upon beyond the pilot year, as Areas discovered other ways to achieve these same goals.  However, for the purposes of the pilot year, the ASP would follow a process consisting of four stages:

  1. Community reflections on child poverty and adversity: Initial immersion and reflection in communities to gain a deep understanding of child poverty in each context, including its manifestations and causes, as well as the resources poor people rely on to address these.
  2. Area synthesis and draft program and project planning: Developing programs and projects which respond to the immediate and structural causes of child ill-being in the Area while building on the existing resources identified.
  3. Community validation, prioritization and visioning: Validating the proposed program responses in communities, prioritizing projects, and developing visions for the future for assessing program performance.
  4. Detailed project planning and ASP finalization: Designing projects together with partners and technical experts, defining capacity building goals for the Area Federation(s), and developing estimated budgets for programs and getting final input on and approval of the ASP.

We settled on a process that would look like this:

Screen Shot 2017-09-10 at 1.37.11 PM.png

CCF’s Area Strategic Planning Model

 

The ASP’s Stage Two was crucial: this was where we synthesized the understanding of child poverty and adversity into root causes, compared those root causes with existing resources (latent or actual) in the Area, and created draft programs and projects.

Screen Shot 2017-09-10 at 1.55.03 PM.png

 

This step required a bit of “magic” – somehow matching the root causes of child poverty to local resources… and you can see Jon working hard to make it work in the video included below.  But it did work!

I really liked this ASP process – it reflected much of what I had learned in my career, at least on the program side.  It looked good, but we needed to test the ASP before training the pilot countries, so a small team of us (me, Jon, and Victoria Adams) went to The Gambia for a week and tried it out.  In this video you can see Jon working the “magic” – conjuring programs and projects from comparing root causes of child poverty (broadly understood) with locally-available (existing or latent) resources:

 

I like that there was a large dose of artistry required here; development shouldn’t be linear and mechanical, it should be joyful and serendipitous, and I was proud that our ASP process made space for that.

With the learnings from that test in The Gambia, we finalized a guidance document, detailing underlying principles, the ASP process, detailed procedures, and reporting guidelines and formats.  The version we used for pilot testing can be downloaded here: ASP Guidance – 16.

Later we trained staff in each pilot country on the ASP.  Here is a video that shows some of that process:

 

I often tell one fun anecdote about the ASP training sessions.  Stage One of the process (see the diagram above) required CCF staff to stay for nearly a week in a village where the agency worked, to carry out a thorough investigation of the situation using PRA methods.

In one country (which I will not name!), after the initial training we moved out to the pilot Area to prepare to spend the week in a village.  When we gathered there after arriving, to discuss next steps, senior national CCF staff informed me that the “village stay” would not be necessary: since they were not expatriates, they had a clear idea of the situation in rural areas of their country.

My response was simple: as a consultant, I had no authority to force them to engage in the village stay, or anything else for that matter, but we wouldn’t continue the training if they were not willing to participate as had been agreed…!

That got their attention, and (after some discussion) they agreed to spend much of the week in local villages.

I was delighted when, at the end of the week, they admitted that things were very different than they had expected in these villages!  They seemed genuine in their recognition that they had learned a lot.

But I wasn’t surprised – these were smart, well-trained people, but they were members of a highly-educated elite from the capital city, distant physically and culturally from rural areas.  So, I think, the village stay was very useful.

*

Along the way, across the year of pilot testing in Ecuador, the Philippines, and Uganda, I issued a series of short guidance notes, which were circulated across CCF.  These notes aimed to explain what we were pilot testing for staff who weren’t directly involved, covering the following topics:

  1. What are we pilot testing?  Piloting Notes – 1.9.  This guidance note explains the basic principles of Bright Futures that we were getting ready to test.
  2. The operational structure of Bright Futures.  Piloting Notes – 2.4.  This guidance note explains how CCF was going to set up Federations and Area Offices.
  3. Recruiting new Bright Futures staff.  Piloting Notes – 3.6.  This guidance note explains how CCF was going to build up the Area structures with new staff.
  4. The CCF Poverty Study.  Piloting Notes – 4.9  This guidance note gives a summary of the Poverty Study, which would underlie much of the Area Strategic Planning process.
  5. Monitoring and Evaluation.  Piloting Notes – 5.2  This guidance note explains Area Strategic Planning.
  6. Area Federations.  Piloting Notes – 6.6.  This guidance note explains the ideas behind building the power of people living in poverty by federating their organizations so that they could have more influence on local government service provision.
  7. Finance Issues.  Piloting Notes – 7.3.  This guidance note explains how CCF would change funding from being a “subsidy” of money, remitted every month to parents’ associations, towards a more modern process of funding project activities through advances against approved plans and budgets.
  8. Partnering.  Piloting Notes – 8.7.  This guidance note outlines the basic concepts and processes underlying one of Bright Futures’ biggest changes: working with and through local civil society.
  9. Growing the Capacity of Area Federations.  Piloting Notes – 9.6.  This guidance note describes how the federated bodies of parents, youth, and children, could become stronger.
  10. The Bright Futures Approach.  Piloting Notes – 10.2.  This guidance note explains the approach in detail.
  11. Child and Youth Agency.  Piloting Notes – 11.  This final guidance note explains the ideas behind “agency” – enabling children and youth to take effective action on things that they find to be important in their communities.

The “Piloting Notes” series was fairly comprehensive, but purposely brief and accessible to the wide range of CCF staff across the world – busy people, with very different language abilities.  The idea was to “over-communicate” the change, so that when the time came to roll out Bright Futures, the agency would be as ready as possible.

*

There is so much more that I could share about that fantastic year.  For example, the work that Andrew Couldridge did helping us grapple with the establishment of Area “Federations” of people living in poverty.  But this blog is already quite long, so I will close it after sharing staff assessments of the pilot testing, and thanking the people who were really driving this positive change in CCF.

*

CCF carried out a formal evaluation of the pilot test of Bright Futures, using an external agency from the Netherlands (coincidentally named Better Futures, I think).  Sadly, I don’t have access to their report, but I think it was quite positive.

But I do have access to the assessment we carried out internally – the summary of that assessment is here: Management Summary – 1.  We surveyed a total of 17 people in the three pilot countries, asking them about the Bright Futures model, HR and structural aspects, the planning process (ASP), Federations, Partnership, working with children and youth, sponsor relations, and support from Richmond.

I want to share some of the findings from the first domain of assessment (the Bright Futures model) and the last domain (support from Richmond).

  • In terms of the basic Bright Futures model, staff in pilot countries felt that positive aspects were the way that it included working in partnership and linking with other development actors, how it changed funding flows, how it deepened the understanding of poverty, and how it enhanced the participation and involvement of the community in general, and children and youth in particular.
  • On the negative side, staff felt that the Bright Futures model was too demanding on them, that there was not enough capacity in communities, that there was a high cost to community participants (I think this was related to their time), that the piloting moved too quickly, that CCF’s focus moved away from sponsored families, that Bright Futures guidelines were not complete at the beginning of the pilot period, that CCF itself became less visible, that Area staff might become dominant, and that the role of National Office staff was unclear.
  • In terms of support from CCF headquarters, staff in pilot countries felt that the visits were very helpful, clarifying issues and giving a sense of accompaniment and solidarity.  The flow of materials (guidance notes, etc.) was also seen positively.
  • On the negative side, support visits were seen as too few and too short, guidelines were provided “just in time” which caused problems, messages from CCF headquarters were contradictory, and more support was called for in later stages of the ASP.

Piloting change is tricky, and leading it from headquarters of any INGO is even trickier – I think we did very well.

*

Once the pilot phase was evaluated, CCF began to prepare for scaling up – preparing a “second wave” of Bright Futures rollout.  Firstly we thought about how countries would be “certified” to “go-live” in Bright Futures – how would we know that they were “ready”?

To help, we produced a document summarizing how “certification” would be handled: Certification – 1.

Five countries were selected for the “second wave”: Angola, Honduras, Sierra Leone, Sri Lanka, and Zambia.  At this point, I was beginning to transition to another role (see below), so my involvement in the “second wave” was minimal.  But I did help facilitate a “pan-Asia” Bright Futures rollout workshop in Colombo, and met several people I would later work closely with when I joined ChildFund Australia (Ouen Getigan and Sarah Hunt, for example!)

*

As I’ve described here, piloting the kind of disruptive, fundamental change that was envisioned in Bright Futures brings many challenges.  And once the lessons from pilot testing are incorporated, scaling up brings a different set of complexities: for example, CCF was able to provide very extensive (and expensive) support to the three Bright Futures pilots, but would not be able to cover the entire, global organisation with that same intensity.  So, often, quality drops off.

One gap that we noticed in the support we were providing to the pilot countries was very basic: attitudes, skills, and understanding of poverty and how to overcome it.  For example, as mentioned above, we had tried to partially address this gap by getting training for pilot-country staff in PRA methods.

Next time, in my final “Bright Futures” post, I will describe how we sought to build competencies, and momentum, for Bright Futures by creating and implementing a week-long immersion training, which we called “BF 101”.   And I’ll share how Bright Futures came to a premature end…

In 2009, four years after I completed my time as a consultant with CCF, I was asked to briefly return and create a week-long training workshop which we called “Bright Futures 101.”  We conducted this workshop in the Philippines and, next time, I will skip ahead in time and describe that fascinating, and successful experience.

And I will describe how Bright Futures ended!

*

But before that, I finished my work with CCF by serving as acting Regional Representative for East Africa, based in Addis Ababa.  This assignment was to fill in for the incumbent Regional Representative during her sabbatical.  Jean and I moved to Addis, and I worked with CCF’s offices in Ethiopia, Kenya, and Uganda for those fascinating months.

Then … I would move into the world of activism and human-rights campaigning, joining the Unitarian Universalist Service Committee as Executive Director in 2005.  Stay tuned for descriptions of that fascinating experience.

*

Before closing this final description of the two years I spent as a consultant with CCF, I want to thank Michelle and Daniel for giving me the opportunity to lead this process.  As I’ve said several times, they were doing exemplary work, intellectually honest and open.  It was a great pleasure working with them.

Carlos, in Ecuador, Nini in the Philippines, and James in Uganda, all did their best to stay true to the principles of Bright Futures, despite the headaches that came with pilot testing such disruptive change.  And they unfailingly welcomed me to their countries and work on many occasions during those two years.  Thank you!

And I also want to mention and recognize a range of other Richmond-based CCF staff who worked very effectively with us to make the pilot testing of Bright Futures a success: Mike Raikovitz, Victoria Adams, Jason Schwartzman, Jon Kurtz, Andrew Couldridge, Dola Mohapatra, Tracy Dolan, and many others.  It was a great team, a great effort.

*

Here are links to other blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration.

 

 

Mt Jackson (24) – The Bright Futures Program Approach

August, 2017

I climbed Mt Jackson (4052ft, 1235m) on 2 June, 2017.  This was my first climb of 2017, having taken a rest over the long, cold winter of 2016-2017.  In 2016, I had been able to start hiking in early May, but this year we had much more snow, and longer and later cold spells.  So I gave May 2017 a miss, and began to tackle the 4000-footers in early June…

*

I began a new journey in May of 2016, tracing two long arcs in my life:

  • Climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers.  I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Working in international development during the MDG era: what was it like in the sector as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980’s through to the conclusion of the Millennium Development Goals in 2015.

*

Leaving Plan International after 15 years, the last 4 of which were spent as Country Director in Viet Nam, I was fortunate to join CCF as a consultant.  My task, over what became two great years, was to help develop a new program approach for the agency.  This was exciting and opportune for me: I had been reflecting a lot about how things had changed in the development sector, and at that point I had a lot of experience across five continents, in a wide variety of roles, under my belt.

So I was very ready for the challenge that CCF offered me – I felt I had a lot to offer.  Little did I know that I was also stepping into a great environment, where CCF’s senior programmatic leadership, and the CEO, were beginning a very exciting journey of reflection and discovery.

*

My first task had been to research current thinking, and best practices, across our sector.  Last time I described that research and the recommendations that had emerged.  To my delight, Daniel Wordsworth and Michelle Poulton embraced my findings enthusiastically, and senior management had endorsed them as well.

Our next step was to take the research that I had done, with its recommended themes of change, and create the specifics of CCF’s new program approach.  In this, Daniel took the lead, with me acting as a sounding board and advocate for the principles and themes of the prior research.  This was appropriate, as now we would be detailing concretely how the agency would implement programs, core stuff for CCF.  So I moved into more of an advisory role, for now.

In this blog post, I want to share the details of what we came up with, and how CCF ended up proceeding.

*

As I drove north from Durham, the weather forecast was problematic, with a strong chance of afternoon rain.  But I decided to take the chance.  This was #24 of my 48 climbs, and I hadn’t had any rain so far, on any of those climbs.  So I figured I was on a long run of good luck – couldn’t possibly rain this time, right?

I left Durham at around 7:45am, and arrived at the trailhead at just after 10am, parking just off of Rt 302 near Crawford Notch.

IMG_0553.jpg

 

Even though it was June, I could see some patches of snow above me in the mountains as I approached Crawford Notch, but all was clear on the road.

My plan was to walk up the Webster Cliff Trail to Mt Webster, on to Mt Jackson, and then take the Webster-Jackson Trail to loop back to Mt Webster.  I would retrace my steps from there, on Webster Cliff Trail, to the trailhead.

Screen Shot 2017-07-10 at 3.06.18 PM.png

 

As I began the hike, it was a nice day, cool and a bit cloudy.  I crossed Rt 302 and quickly reached a pedestrian bridge over the Saco River.  The Webster Cliff Trail forms part of the Appalachian Trail here:

IMG_0557.jpg

IMG_0561.jpg

 

The first section of the Webster Cliff Trail was moderately steep.  Though the temperature was cool, I heated up as I ascended.  It was a beautiful day hiking, still sunny at this point:

IMG_0572.jpg

 

Clouds gathered as I ascended, and by 11am the sun was mostly gone.  The trail was consistently steep and became rockier as I ascended the Webster Cliff Trail, passing above the tree line.  Once I was onto the ridge, the views were great, looking north up into Crawford Notch:

IMG_0576

Looking Across Crawford Notch, Mt Tom

IMG_0589.jpg

That’s Mt Webster Up Ahead

 

Here are two views of the ridge, taken over a year later, from across the way on Mt Willey:

IMG_1135

Mt Webster is on the left.  I ascended steeply up the right side, then along the ridge

IMG_1157

The Ridge

 

I ran into some snow remnants along the path as I approached Mt Webster!  It just proves, once again, that you have to be prepared for snow – even in June!

I was prepared this time… but the snow patches turned out not to be an issue:

IMG_0594.jpg

 

The walking was good, but windy, and clouds were building from the west.  So far, I had not seen any other hikers…

I arrived at Mt Webster (3910ft, 1192m – not a 4000-footer) at 1:30pm.  The plan was to rejoin the trail here on my way back, via the Webster-Jackson Trail.

IMG_0600.jpg

 

To the west, I could look across Crawford Notch and see Mt Tom and Mt Field and Mt Willey.  The views north towards the Presidential Range were great, though Mt Washington was in the clouds.  There were patches of blue sky above me, but darker skies to the west.

 

Just before reaching Mt Webster, I passed a through hiker: he was hiking north, doing the entire Appalachian Trail.  Impressive, since it was only early June, that he was this far north.  Maybe in his 60’s, with a grey beard.  He asked me what my “trail handle” was, assuming (I guess) that I was also a through hiker.  I just laughed and said: “well, my name is Mark”!

“These are some heavy hills” I said.

“Hills?!” he exclaimed.

So I guess he was feeling the ascent, as I was.  But, having just restocked his pack with food, he was carrying much more weight than I was…

Just past Mt Webster, I began the Webster-Jackson loop that I planned to take; first, continuing on to Mt Jackson, then down and around to return to Mt Webster:

IMG_0630.jpg

IMG_0632.jpg

 

Here I encountered the second hiker of the day.  Dan was hiking with the guy I had met earlier, and was waiting here for him.  Dan had joined the other guy a week ago, for part of the through hike.  Dan seemed tired and ready to get off the trail, asking me what was the fastest way to the road.  Seemed like he had had enough, describing lots of rain and snow and ice over the last days.

I told him how I had run into so much ice over that way, on Mt Tom and Mt Field the year before, and how I had fallen in May on Mt Liberty.

I left Dan there and arrived at the top of Mt Jackson at about 1:45pm, where I ate lunch – a tried-and-true “Veggie Delite” sandwich from Subway.  It began to sprinkle, light rain falling.

Here the views of the Presidential Range were great, though Mt Washington was still in the clouds.  Mizpah Spring Hut can just be seen, a speck of light in the middle left of the photo:

IMG_0605.jpg

 

The Mt Washington Hotel, in Bretton Woods, can be seen here in the distance with distinctive red roofs, looking north through Crawford Notch:

IMG_0603.jpg

 

From the top of Mt Jackson, the Webster Cliff Trail continues on towards Mt Pierce (which I had climbed with Raúl and Kelly earlier in the year) and the rest of the Presidential Range.  I turned left here, taking the Webster-Jackson Trail, hoping to loop back up to Mt Webster.  My hunch was that Dan was going to wait for his friend, and then follow me down, since that would be the quickest way to “civilization” and he was ready for a shower!

I began to drop steadily down Webster-Jackson, a typical White-Mountains hike, rock-hopping.  But I was a bit surprised, and became increasingly concerned, at the amount of elevation I was losing, as I went down, and down, and down… I knew I’d have to make up this elevation drop, every step of it!

 

I passed five people coming up – two young men running the trail, a mother and daughter (probably going up to stay at the Mizpah Hut), and one guy huffing and puffing.

I arrived at the bottom of the loop at just before 3pm, exhausted and now regretting having taken this detour.  I was cursing every step down, which I would soon have to make up: from here, it would be a long way back up to Mt Webster, and it was beginning to rain steadily.

IMG_0615.jpg

 

At the bottom of the Webster-Jackson loop, there is a beautiful waterfall, and the temperature was much lower than it had been at the top of the ridge.

It was a VERY LONG slog back up to the top of Mt Webster, where I arrived again at 3:45pm, very tired and very wet.  It had become much colder here since I had passed through earlier in the day, now windy and steadily raining.

Here I would walk back along the ridge.  And I began to feel quite nervous about the possibility of slipping on the slick rocks – from here it would be all downhill, and a fall on the now-slippery rocks could be trouble!

I didn’t really stop at the top of Mt Webster – too cold and rainy.  Conditions had changed a lot since I’d passed this peak that morning!

IMG_0635

IMG_0637

IMG_0642

 

Although it was raining steadily, some blue sky did roll by once in a while:

IMG_0640

 

From here I began the descent back to Rt 302, and soon the trees began to grow in size, and cover me.  I never slipped on the wet granite stones, though I came close a couple of times.  I had to take it very slowly, taking care as I went across every one of the many rocks…  But I got soaked through – for the first time in 24 climbs!

IMG_0643

IMG_0645

Soaking Wet, But Happy

 

I was back at my car at about 6:15pm; it was raining hard and 49 degrees.

IMG_0647

 

The Mt Jackson climb was great, despite the unwelcome rain and cold.  It was longer and harder than expected – nothing technical or super-steep, just long, due mostly to my decision to do the loop down from the summit and back up, and because I had to take care on the slick rocks coming down.

*

Once CCF’s management had endorsed my recommendations for their new program approach, Daniel and I began the design process.  Along the way, CCF’s President John Schulz had baptized the new approach as “Bright Futures,” which was very smart: branding the change with an inspirational, catchy name that also captured the essence of what we were proposing would help open people to the idea.

Gesture 5.jpg

Daniel Wordsworth, 2003

Here I will be quoting extensively from a document that Daniel and I worked on, but which was primarily his.  He boiled down the essence of Bright Futures into three fundamental objectives.  Bright Futures would:

  1. Broaden, deepen and bring about longer-lasting impact in children’s lives;
  2. Fortify sponsorship;
  3. Strengthen accountability.

Bright Futures would be based on the belief that people must be given the space to design and shape the programs that will be carried out in their communities and countries.  The fundamental principle that guided our thinking was that there was no universal strategy that CCF could apply across the complex and different contexts in which it worked.  Therefore, the emphasis was not on a framework that outlined what should be done – e.g. health, education, etc – but rather on a set of key processes that would set the tone of the agency’s work and provide coherence to its programming around the world.

There were five key work processes, qualities of work, that would characterize CCF’s Bright Futures programming.  Each of these was firmly linked to the transformational themes that my own research had identified, but Daniel managed to put things in clear and incisive terms, displaying the brilliant insights I had come to admire:

Screen Shot 2017-07-31 at 1.51.32 PM

Grounded and Connected: Bright Futures programs would be integrated into the surrounding social environment, contributing to and drawing from the assets and opportunities that this environment provides.

To accomplish this, programs would be based in well-defined, homogeneous “Areas”, matching the level of government service provision – often the “district” level.  Program planning would be based at the community level, and program implementation would be accountable to local communities, but programs would be integrated with relevant efforts of the government and other development agencies, at local and national levels. CCF staff would be decentralized, close to communities, to ensure on-the-spot follow-up, using participatory methods and strict project management discipline to ensure effective program implementation.  By partnering with other organizations, building the capacity of local people, and seizing opportunities to replicate program methods wherever possible, impact would be expanded into other communities within the Area and beyond.

These would be big changes for CCF, on many dimensions.  Current programming was exclusively at village or community level, but it was disconnected from efforts to overcome poverty that were taking place at other levels.  Staff visited programs rarely, typically only once per year.  And notions of replication or even sustainability were rarely addressed.  Making these changes a reality would be challenging.

Achieve Long-Term Change: Bright Futures programs would be grounded in an understanding of poverty and of the causes of poverty, and designed to make a long-lasting difference in the lives of poor children.

To accomplish this, program design would begin with immersion in communities and a thorough analysis of the deeper issues of poverty confronting children and communities.  Program interventions would then take place where the causes of child poverty were found, whether at child, family, community, or area (district) levels. Programs would be designed and implemented according to a series of three-year strategic plans, and would consist of a comprehensive set of integrated “Project Activities” that had specific objectives, implementation plans and budgets.  Financial flow would follow budget and implementation.

As we began to design Bright Futures, CCF’s programming was guided by an agency-wide set of outcomes that had been articulated some years before, called “AIMES.”  These “outcomes” were really more of a set of indicators, most of which were tightly focused on basic needs such as immunization, primary-school completion, etc.  Communities seemed to view these indicators as a menu, from which they selected each year.  And, as I mentioned above, interventions were exclusively at village or community level.

With the advent of Bright Futures, the findings of the CCF Poverty Study, and of my own research, we would fundamentally change these practices.  From now on, there would be no “menu” to draw from; rather, CCF would help local organizations to grapple with the causes of child poverty, viewing that poverty in a broader way, and consulting deeply with local people and children; staff would then create an “Area Strategic Plan” (“ASP”) that outlined how programming would address these causes across the “Area.”

(Details of how the ASP would be designed will be included in my next posting, stay tuned!)

Build People: Bright Futures programs seek to build a stronger society with the ability to cooperate for the good of children and families.

To accomplish this, programs would build Federations and Associations of poor children, youth and adults that represent the interests of excluded and deprived people.  These entities would manage program implementation (mostly) through and with partners. Programs would be implemented through local bodies such as district government, NGOs, or community-based organizations, building the capacity of these groups to effectively implement solutions to issues facing poor children.  A long-term, planned approach to capacity building would be adopted, that reinforced and strengthened local competencies and organizations so that communities could continue their efforts to build bright futures for their children long after CCF had phased out of their communities.  This approach would include clearly articulated and time-bound entry and exit conditions, and specific milestones to gauge progress towards exit.

This was another big and challenging change.  CCF would continue to work with parents’ associations at community level, as it had been doing, because this was a real strength of the agency.  However, these associations tended to lack capacity, were left to fend for themselves, and did not interact with other stakeholders and “duty-bearers” around them.

All of this would change with Bright Futures.  Parents’ associations would now be “federated” to district level, and the Parents’ Federations would be the primary bodies that CCF worked with and for.  These Federations, being located at the “district” level, would interact with local government service providers (“duty bearers”), serving as interest groups on behalf of poor and excluded people.  And the Parents’ Federations would, normally, not be seen as program implementors.  Rather, they would – at least in the first instance – locate local partners that could implement the kinds of projects that were identified in the ASP.

Here we had a challenge, as we moved the existing Parents’ Associations into very different roles, where they no longer controlled funds as they had previously.  There were many vested interests involved here, and we anticipated opposition from people who had learned to extract benefits informally, especially given that in the previous model CCF’s staff had been very hands-off and remote from program implementation.  And the very idea of “federating” and influencing local duty-bearers was completely new to CCF.

Show Impact: Bright Futures programs demonstrate the impact of our work in ways that matter to us and the children and communities we work with.

To accomplish this, using CCF’s poverty framework of Deprivation, Exclusion, and Vulnerability, the National Office would clearly articulate the organization’s niche, and demonstrate its particular contribution.   The outputs of each project would be rigorously monitored to ensure effective implementation, and programs would likewise be carefully monitored to ensure relevance to enrolled children.

Before Bright Futures, CCF’s National Offices had very little influence on programming.  If a local Parents’ Association was not breaking any rules, then funding went directly from CCF’s headquarters in Richmond, Virginia to the Association, without intervention from the National Office.  Only when a serious, usually finance- or audit-related, issue was identified could the National Office intervene, and then they could only halt fund transmissions and await remedial action from Richmond.

Now, the National Office and local Area team would be monitoring project implementation on a regular basis, using techniques that ensured that the voices of local children were central to the process of monitoring and evaluation.  We would have to develop tools for this.

Recognize Each Child’s Gift: Bright Futures programs recognize and value each particular child as a unique and precious individual.

To accomplish this, programs would be designed to facilitate the development of each child in holistic ways, taking into account the different phases of development through which each child passes.  The voices of children would be heard and would shape the direction of programs.  CCF would promote children and youth as leaders in their own development, and in the development of their communities and societies.  This would now be central to program implementation.

While the local Parents’ Associations would be retained, and federated to district level, two new forms of Association and Federation would be introduced: of children, and of youth.  These new Associations and Federations would be given prominent roles in program design and project implementation, as appropriate to their age.

*

These were all big, fundamentally-disruptive changes, involving seismic shifts in every aspect of CCF’s program work.  I felt that we had incorporated much of the learning and reflection that I had done, beginning in my Peace Corps days and all the way through my 15 years with Plan – this was the best way to make a real, lasting difference!

Once Daniel and Michelle were happy with the way that we were articulating Bright Futures, our next step was to get senior-management and board approval.

I was very pleased that, in the end, CCF’s leaders were very supportive of what Daniel was proposing.  But, in a note of caution given the magnitude of the changes we were proposing, we were asked to pilot test the approach before rolling it out.

This cautious approach made sense to me, and I was delighted that Daniel asked me to continue as an outside consultant, to oversee and support the pilot National Offices, documenting their experience and our learning as the Bright Futures approach was tested.

*

We then began to consider where we should pilot test.  First, we asked for volunteers across CCF’s National Offices and then, after creating a short list of viable options, we reviewed the status of each of the National Offices remaining on the list.  We quickly came to the conclusion that we would select one National Office in each of the continents where the majority of CCF’s work took place:

  • Carlos - 1.jpg

    Carlos Montúfar

    In the Americas, we chose Ecuador.  The office there was well-run, stable, and was regarded as a model in many ways.  The National Director (Carlos Montúfar) was a strong leader, and he and his team were enthusiastic about being Bright Futures “pilots”;

  • Screen Shot 2017-08-01 at 2.18.44 PM.png

    James Ameda

    In Africa, we chose Uganda.  Here things were a bit different than in Ecuador: the Uganda office was considered by many in CCF as needing a bit of a shakeup.  James Ameda was a senior National Director and was supportive of the pilot, but there were some tensions in his team, and performance across CCF/Uganda was weak in some areas;

  • Screen Shot 2017-08-01 at 2.18.35 PM.png

    Nini Hamili

    For Asia, we chose the Philippines office.  The office in Manila was well-run, with high morale and strong leadership in the form of Nini Hamili, a charismatic and long-tenured National Director.  Nini was a very strong leader, who had a sideline as a mediator in violent Mindanao – I came to see how courageous Nini was…


*

Soon I would begin to visit the three pilot offices regularly, training them on the methods and systems that were being developed for Bright Futures, accompanying them as they learned and adapted, and documenting our experience.

It was a great privilege working with Carlos, James, and Nini and their teams – they had taken on a huge challenge: not only did Bright Futures represent a set of fundamental shifts in what they were accustomed to doing, but they were asked to continue to manage their programs the old way in the areas of their country where Bright Futures wasn’t being introduced.

And it was equally impressive working with Daniel and Michelle at CCF’s Richmond headquarters, along with staff like Victoria Adams, Mike Raikovitz, and many others, and fellow consultants Jon Kurtz and Andrew Couldridge.

Next time, I will go into much more detail on the pilot testing of Bright Futures, including how we designed and implemented perhaps the most fundamental program-related system, Area Strategic Planning.

*

Here are links to other blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration.

 

Mt Pierce (16) – Four Years At Plan’s International Headquarters

March, 2017

In early May, 1997, Jean and I left the UK and flew to Boston, on our way to spend a year on sabbatical in New Hampshire.  I had spent four years at Plan’s International Headquarters (“IH”) as Program Director, having planned to stay for only three; as I mentioned in an earlier blog, I agreed to stay a fourth year to lead the restructuring of Plan’s field structure, and to support the rollout of the new structure.  Then it was time to move on.

The last four entries in this series have described the major initiatives that we undertook while I worked at IH (defining a new program approach, goals and principles; deciding where to expand and where to shrink Plan’s program work; and restructuring how we worked at country level), and included, most recently, a “guest blog” from Plan’s International Executive Director during those years, Max van der Schalk.

It was an honour to work at IH, to contribute to Plan’s work at that level.  I look back on that time with some pride in successes, and also with a clear realisation of areas where we fell short.

So, this time, I want to share my own reflections on those four years at IH.  Joys, sorrows, successes, and failures, and lots of lessons learned.

*

I began a new journey in May of 2016, tracing two long arcs in my life:

  • Climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers.  I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Working in international development during the MDG era: what was it like in the sector as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980’s through to the conclusion of the Millennium Development Goals in 2015.

This is number 16, so covering all 48 of those mountains might take me a couple more years…

*

Last time I described how Raúl and Kelly, friends and colleagues from Australia, and I climbed Mt Eisenhower on 20 August 2016.  From the summit of Mt Eisenhower we retraced our steps back down the Crawford Path and then reached the top of Mt Pierce (4312ft, 1314m), just after 3pm.

Slide15

IMG_6783

This Cairn Marks the Summit Of Mt Pierce

IMG_6782

IMG_6789

IMG_6788

 

Here are my hiking companions on the way down the Mizpah Cutoff, close to the point where we would rejoin the Crawford Path:

 

We had planned on climbing three 4000-footers that day – continuing south from Mt Pierce along Webster Cliff Trail, to Mt Jackson, and then dropping from there back down to Saco Lake where we had left the car.  But by the time we reached Mizpah Spring Hut we were very knackered, so decided to take the Mizpah Cutoff over to rejoin Crawford Path, and then hike back down to the parking area that way.  Retracing our steps.

So we didn’t get to the top of Mt Jackson, which awaits ascent on another day – but we did scale Mt Eisenhower and Mt Pierce.

It was a strenuous hike that day, but with beautiful views and no insect problems.  Glorious views from the Presidential Range, mainly looking south.

*

Looking back on four years at Plan’s International Headquarters (“IH”), what stands out?  Let me share some thoughts on what went well and on what went badly.

What went well

  1. We made good choices about what to change;
  2. The way we went about making those changes was, mostly (but not always), smart;
  3. We were able to involve some of Plan’s future stars in what we did, giving them exposure and experience at the highest organisational levels, thus helping to build a new generation of Plan leaders;
  4. I’m glad I set a goal of leaving IH in three years, even though it took me four.

Let me reflect briefly about each of these positive aspects of my time at Plan’s head office.

First, in addition to normal, daily tasks and senior-management duties, I decided to focus on three major change projects, all aimed at creating unity of purpose across what was, I felt, a quickly-atomising organisation.

I had outlined these priorities to Max in our first interactions, before I even went to IH. Described in three earlier blog posts in this series, these projects were focused on: overhauling Plan’s program approach; deciding, in accordance with set strategy, where to grow and where to phase out our work; and finishing Plan’s restructuring by reorganizing the organization’s field structure.

Looking back, these were very good choices.  Before moving to IH I had served as Plan’s Regional Director for South America, and had appreciated wide latitude to run operations in that region as I saw fit.  As Plan finished regionalizing, with six Regional Offices in place by the time I was brought to IH, and as each of the six Regional Directors began to “appreciate” that wide latitude, Plan was in real danger of atomizing, becoming six separate kingdoms (all six were, initially, men!)

So I selected those three major change projects carefully, seeking to build unity of purpose, to bring the organization together around shared language, culture, and purpose.  This would, I hoped, balance the centrifugal forces inherent in regionalization and decentralization with necessary, binding, centripetal forces that would hold Plan together.  Building unity of purpose around a common program approach, a common structure (with local variations in some particular functions), and a shared understanding of where we would work.

Plan should have taken these change efforts much farther – for example, building shared staff-development tools around the core, common positions at Country Offices, and finishing a monitoring and evaluation system centered on the program goals and principles that we developed.  More on that below.  But, in four years, I think we accomplished a lot and, generally speaking, we were able to notably increase unity of purpose across Plan.

Second, as we developed those changes, we were (mostly) pretty smart about it.  Plan’s new program goals and principles evolved from a wide organizational conversation, which began with a workshop that involved people from across the agency.  Development of the Country Structure began with a “skunk works” that involved a very impressive set of people, chosen both because of their expertise and experience, as well as their credibility.  In both cases, we took initial prototypes across the organization, through senior management and the board, and the results worked well… and lasted.

As I’ve described earlier, the preparation of the organizational growth plan, on the other hand, was primarily handled by me, myself, without anything like the kind of participation, contribution, and ownership that characterized the other two projects.  Yes, we consulted, but it wasn’t enough.  Partly as a result, the growth plan was less successful in bringing Plan together than were the other two projects.

So the way we went about addressing unity of purpose in Plan was effective, mostly.  The model of advancing change in an international NGO by convening a focused reflection, including key staff, and honestly consulting the initial prototype across all stakeholder groups, seems appropriate.  (See below for some reflections on implementation, however.)

Third, I look back on the people that we involved in those projects, and I’m proud that we helped bring Plan’s next generation of leadership into being.  Just to give a few examples, participants and leaders in those key efforts included people like Donal Keane, who would become my manager when I went to Viet Nam as Plan’s Country Director; Subhadra Belbase, who would soon become Regional Director in Eastern and Southern Africa; Jim Emerson, who helped me create the planning framework for Country Offices, and who would later become Finance Director and Deputy IED at IH; Mohan Thazhathu, who would become RD for Central America and the Caribbean, and later a CEO in other INGOs; and many others.  To a great extent, this was purposeful: I wanted to involve the right people, and I wanted their experience, and the associated high-profile visibility, to help move these amazing people onward and upward in Plan.

Finally, I’m glad I set a goal of leaving IH in three years, even though it took me four.  My experience working with many INGO headquarters is that people stay too long: head offices are exciting places to work and to contribute; people who join our social-justice organizations (mostly) have strong desires to make the world a better, fairer, more-just place, and a lot can be accomplished from the center.  Plus, there are great opportunities for power and prestige, not to mention ego-fulfillment.

This reality can be entrancing, and can lead to people staying for too long.  I wanted to be the kind of person who didn’t overstay my time, and I wanted Plan to be the kind of organization where the most important place to work was the field, not International Headquarters; in fact, my predecessor as Program Director, Jim Byrne, returned to the field from IH, as Country Director for Bolivia and then Ghana.  I was determined to follow that great example, and did so.

Plus, I was pretty burned out after four years, partly because of the things that went badly during those four years…

What went badly

  1. I was much too gentle with Plan’s Regional Directors;
  2. After designing organizational changes as described above, with lots of consultation and co-creation, we should have been much more forceful when it came to implementing the resulting decisions;
  3. I wasn’t smart enough in relating to Plan’s Board;
  4. Again related to the Board, we didn’t tackle basic governance problems, especially the imbalance due to the huge success of Plan’s Dutch National Organisation in those days;
  5. Personally, I was much too focused on making the three major changes that I described above, and didn’t spend enough time attending to the wider, political reality inside the agency.

First, I should have been much tougher with Plan’s Regional Directors during my time as Program Director.  In this, I agree with much of Max van der Schalk’s “guest blog,” published earlier in this series, when he says that he “learned from experience to mistrust most of the RD’s. I wasn’t always sure of their honesty and I also doubted that the whole team felt responsible for the effectiveness of the organization. Quite a few RD’s appeared to me to take advantage of their position and to think mainly about their own achievement.”

I completely understand what he’s referring to.  When Max arrived as Plan’s IED, he organised senior management to include the Regional Directors.  This was a change – previously, Plan’s senior management had all been IH-based.  Thus, in principle at least, all major operational decisions, and proposals to be made to Plan’s board of directors, would go through a staff team that included the field managers at Regional level.

From my perspective, this was very smart.  It was a great way to balance headquarters priorities with the realities of field implementation.  But, sadly, Plan developed a bad case of what I called the “Heathrow Syndrome” in those years – the global agreements that we made when Senior Management gathered in Woking, outside London, seemed to evaporate (at least for our six Regional Directors) when they got into the taxi to go to the airport.  And then, by the time they boarded their flights home, their priorities seemed to have already shifted to their Regions, and thoughts of the wider organisation seemed to have disappeared.

In fact, a couple of the Regional Directors of the time should have been dismissed for behavior that was even worse than the “Heathrow Syndrome”, and I should have done more to encourage that.  Even though they didn’t report directly to me, I should have been much more willing to advocate changes to Max, and much less gentle about it.  In the future, I would be more willing to take action in similar situations.

After leaving IH I came to realise that part of the problem was related to the emotional connection that NGO staff – at least the good ones – make with their work.  Our people, at their best, associate their own values and self-image with the aims of our organisations: we work for justice, human rights, to overcome oppression and deprivation, because we hold those values very deeply.

This emotional connection is a strong motivational force and, if managed well, can produce levels of commitment and passion that private-sector organisations rarely achieve.  But it often also means that NGO people overly personalise their work, take things too personally, and resist change. Perhaps part of the reason that several of Plan’s Regional Directors in those days resisted thinking globally and acting locally was that their personal ambitions – for good and for bad – were advanced more easily by thinking locally and acting globally.

Second, and related to my first point, after designing organizational changes as described above, with lots of consultation and co-creation, we should have been much more forceful when it came to implementing the resulting decisions.  For example:

  • there should have been no exceptions for putting in place the agreed country structure, because a suitable level of flexibility was already included;
  • we had agreed to develop training packages for the four core, common positions that would be in place at all Country Offices, but we didn’t get that done;
  • we should have mandated that all Country Strategic Plans be structured around the new Domains and Principles that comprised Plan’s Program Approach;
  • an effort existed to design and implement a “Corporate Planning, Monitoring, and Evaluation” system, which didn’t really get off the ground until Catherine Webster took over the project;
  • finally, I should have been much more insistent that the agreed growth plan be followed, requiring plans to close operations in the countries where our strategy mandated phase-out.

Generally speaking, my conclusion here is that we were right to design changes in a very open, participatory way, and to consult (and adjust) with all key stakeholders before finalising decisions.  That was good.  But once decisions were made, we should have been much stronger, much tougher, in carrying out those agreements.  Over time, that approach might have reduced the toxic “Heathrow Syndrome.”

Third, I should have developed a much stronger relationship with Plan’s board of directors than I did.  Again, in his “guest blog,” Max notes that he is “… less than happy about my relationship with the Board and I missed a chance there…”  As Program Director, I naturally had less direct relation with Plan’s Board than Max did, but I could have usefully developed more of a connection.  That might have helped me achieve my own goals, advance the organization, and also helped Max (though he might not have agreed with that, or even accepted it!)

For example, one Board member was named to work with us on the development of Plan’s program approach; Ian Buist had worked in the UK government’s overseas aid efforts across a long career, and his contributions to what became Plan’s “Domains” and “Principles” were valuable.  In retrospect, I would have been more effective, more successful, and more helpful to Max if I had developed similar relationships with other program-minded board members.

But I wanted to focus on program, and felt that working with the Board was not my role; Max would involve me when it was necessary, I thought.  Of course, I knew Plan much better than Max did, having at that point worked at local, regional, and global levels for nearly ten years, so my reluctance to put more energy into working with Plan’s board was short-sighted on my part.

Fourth, and perhaps most fundamental, comes governance.  When organisational governance doesn’t function smoothly, watch out!  And, in those days, Plan’s governance, if not outright broken, was not working very well at all, for one main reason.

When I was at IH, Plan’s funds came from nine “National Organisations” in nine developed countries (Australia, Belgium, Canada, France, Germany, Japan, the Netherlands, the UK, and the US).  The way that Plan’s corporate bylaws were designed meant that the Dutch organisation was allocated four seats, four votes, on the 25-person board, even though over 50% of Plan’s funding came from the Netherlands.  (In comparison, the Canadian and US National Offices, each bringing in around 10% of Plan’s funding, had three seats, three votes apiece.)

This lack of balance – over half of Plan’s funding coming from the Netherlands, with the Dutch organisation having just 16% of the votes on Plan’s board – distorted the agency’s behavior in negative ways, ways that I could see in my daily work.

Unsurprisingly, and most damagingly, an informal power structure evolved to compensate for Plan’s unbalanced governance.  This could be seen in action in several ways.  For example, it felt to me, as I observed board meetings, that Dutch board members had an effective veto over any major decision: if a Dutch board member spoke strongly against, or in favour of, a proposition at a meeting, the vote would always go that way, despite the Dutch having only 4 of 25 votes.

There’s nothing inherently bad, or wrong, or evil about what was happening; it was completely logical that the interests of the biggest financial stakeholder would become paramount.  Don’t kill the goose that lays the golden egg!  But the problem was, as I saw it, Plan’s formal governance structure wasn’t able to handle the reality of those days, so informal mechanisms evolved, and those informal mechanisms were not always transparent or effective.

For example, I vividly remember a lunch meeting which included Max, me, and the National Director for the Netherlands.  The Dutch National Director was, without a doubt, a genius fundraiser, and had built Plan Netherlands into an iconic force in Holland, known and respected by virtually everybody in the country from the royal family on down.

His undoubted accomplishments were accompanied by similar levels of ego and assertiveness.

I don’t recall the exact issue that we were discussing that day over lunch, but I do remember our Dutch colleague expressing his strong disagreement with the direction that Max and I were planning to take.  Those kinds of disagreements are common in any human endeavour, of course.  But he took it one step further: in so many words, he made it very clear that, if we proceeded with the course of action we were planning, he would have Max dismissed.

In Plan’s formal governance setup, the Dutch National Director was not a Plan board member, and had no formal influence on Max’s job security.  But the informal governance structures which had evolved, to recognise the importance of the Dutch Office’s success to the overall organisation, meant that his threat was completely credible.

Another example of the dysfunctional consequences of Plan’s imbalanced governance came soon after I (and Max) left IH.  Max’s successor fired one of Plan’s Regional Directors, who was Dutch.  From my perspective, this was probably well within the new IED’s authority, but from what I heard (I wasn’t in the room!) the actual dismissal was not handled very astutely.  The Regional Director then threatened legal action to challenge his dismissal and, as I understand it, had an assurance of financial support from the Netherlands office in this action – essentially, one part of the agency would be suing the other!  This led to several years of estrangement (and worse) between Plan and the Dutch Office, our biggest source of funds!

Apparently, the imbalance in governance, and resulting informal power structures, extended to the Dutch Office having the ability to veto personnel-related decisions, at least when a Dutch Regional Director was involved!

These examples illustrate how our operational management was influenced by the realities as seen from the point of view of our biggest revenue source.  Nothing wrong with that, in theory – in fact, it makes a lot of sense.  But in the absence of a formal governance structure that reflected organisational realities, informal mechanisms evolved to reflect the needs of Plan’s biggest funder: heated lunch discussions, and a lawsuit against Plan funded by one of its own National Organisations.  These informal mechanisms drained our energy, stressed us all, and became major distractions from what we were supposed to be focused on: the effective and efficient implementation of our mission to help children living in poverty have better lives.

Now, the best solution to re-balancing Plan’s governance would have been for other National Organisations to grow – for the Australian or Canadian or German or US offices to increase their fundraising closer to what our Dutch colleagues were achieving.  Then Plan’s existing governance structure would have functioned well.  Alternatively, perhaps, at least in the short term, we could have increased the votes allocated to the Dutch organization.  In these ways, the imbalance described above would have been corrected without informal mechanisms.

What actually happened, sadly, was that the Dutch organisation ended up shrinking dramatically, as the result of a mishandled public-relations crisis.  In fact, I think that our management of that crisis actually illustrated the basic problem: Plan’s Dutch Office refused to let us address false accusations coming from a Dutch supporter as we should have done, and the problem just festered, got worse and worse.  But the informal power of the Dutch Office, caused in part by the governance imbalance I’ve described, was such that we at Plan’s International Headquarters were not able to go against the preferences of the Dutch Office to take the actions we felt would have defused the crisis.  (Namely, full, frank, and fast disclosure of the facts of the particular case.)  In this case, I’m pretty sure that we were right and the Dutch Office was wrong… and, as a direct result, Plan’s fundraising in the Netherlands dropped by half.

My sense is that these kinds of governance dynamics are common in federated International NGOs (ChildFund, Save the Children, Oxfam, World Vision, etc.) though there are differences in the particularities of each grouping, of course.  The solution, as far as I can see it, is to periodically re-examine governance and make sure that structures fit the reality of the agency.  (Ironically, Plan had attempted to review and adjust its governance before I arrived at IH.  Glorianne Stromberg, who readers of this blog series have already met, was Board Secretary in those days, during Alberto Neri’s time; she had proposed a far-reaching update of Plan’s governance.  Probably Glorianne’s proposals would have helped reduce the imbalance I’ve described, and would also have addressed Max’s feeling that the Board was too big…)

Finally, I was much too focused on my program changes, my three projects, and was not “political” enough.  In a sense, this failure on my part relates to all of the above accomplishments and setbacks – if I had been more astute “politically” I could have helped Max correct the behaviour of several Regional Directors, and connected more effectively with Plan’s board of directors.

But I just wasn’t interested in spending my limited time and energy on those things.  I was focused, passionate, and effective on program matters (goals and principles, structure, and growth).  I felt, and still feel, that behaving “politically” would be inconsistent with the values and aspirations of the NGO sector.  I wanted to enact those values – honesty, transparency, empathy, compassion – and I didn’t see how I could do that while also being “political.”

Today I think I see that it is indeed possible to be focused and true to the moral and ethical values of our sector while also being “political.”  It’s not about learning from Machiavelli; rather, it’s mostly about being able to handle conflict competently.  Conflict is inherent in the human experience, certainly including at senior management levels in an INGO like Plan!  Managing conflict productively, being able to confront conflict situations with confidence and panache, is a skill that I would deepen later, some years after my time at Plan’s International Headquarters.

*

Still, when I was going through my papers from that time at IH, I came across two documents that I want to share.  Firstly, from a board member, Ian Buist.  I had worked with Ian in the development of the Program Directions and Growth Plan, and had incorporated much of his wise advice into the processes and results.  When he heard that I was moving from IH, he wrote me a very nice note, which I am taking the liberty to copy here:

Ian Buist to Mark - June 1997.jpeg

 

Another person whose opinion I valued enormously was Jim Byrne.  He had been my predecessor as Plan’s Program Director, in that role when I had joined the agency in Tuluá.  I admired and respected Jim for many reasons, one of which was that he had chosen to return to the field from IH.  I planned to emulate this, as with many other things about him.

Jim also wrote me a very nice message of “fond farewell”:

Jim Byrne to Mark - April 1997.jpeg

 

Many thanks to Ian for the partnership and the thoughts, and to Jim for the mentorship, friendship, and inspiration.

*

Those four years at IH were great.  Weighing up all the successes and failures, large and small, looking back there’s no doubt in my mind that Plan was stronger and more unified when Jean and I left the UK, in May, 1997, than it had been when I arrived.

But it was time to move on, and it would be for others to take up the challenges and joys of running that organization.

*

In future blogs in this series I’ll describe my tenure as Country Director for Plan in Viet Nam, as consultant at CCF, as Executive Director at the UU Service Committee, and as International Program Director at ChildFund Australia.  As I approached my work in those organisations, I tried to apply what I learned from those four years at Plan’s International Headquarters, from the successes and failures described above.  Stay tuned!

Next time I’ll begin to reflect on four years living and working in Viet Nam, as Plan’s Country Director in that very special country.

*

Here are links to other blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration.