South Twin (41) – Disaster Risk Reduction

September, 2018

During my six years at ChildFund Australia, the program approach that we developed early in my tenure made reducing vulnerability one of our biggest priorities.  This was new territory for us, with lots of learning and testing: what did reducing child vulnerability mean for ChildFund Australia?  What kinds of vulnerability would we address?  And where?

In this article I will focus on one aspect of vulnerability that we worked on: “disaster risk reduction” (DRR).  And, in particular, I want to highlight our participation in the United Nations World Conference on Disaster Risk Reduction, which took place in March, 2015, in Sendai, Japan.

*

I’ve been writing a series of blog posts about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall.  And, each time, I’ve also been reflecting a bit on the journey since I joined Peace Corps, 34 years ago: on development, social justice, conflict, experiences along the way, etc.

But first…

*

Last time I described how our granddaughter V and I had climbed North Twin Mountain on 2 September 2017.  Since it was V’s first real mountain climb, we had agreed that we would decide about continuing to South Twin when we got to the top of North Twin.

We had arrived at the outlook near the top of North Twin at about 1:30pm and had lunch, getting to the top just after 2pm.  V was enthusiastic about continuing, and she was certainly keeping up with me on the climb – no problems at all – so we decided to continue on towards the top of South Twin (4902ft, 1494m) – the 8th-highest 4000-footer, and my 41st on this round.  It would be V’s second 4000-footer that day!


We took the North Twin Spur, leaving North Twin a bit after 2pm.  We dropped down into a saddle between North and South Twin, and walked through pleasant forest for 1.3 miles.


From here the top of South Twin was visible in the near distance, with a few people at the summit.


We arrived at the top of South Twin at 2:45pm, and spent some time there taking pictures.  It was clear and spectacular, with not many people around.  Probably the very best views of the White Mountains I’ve ever seen – towards the northeast, the Presidentials; to the south, Mt Bond and West Bond (Bondcliff was obscured by West Bond from this viewpoint) and Waterville Valley; towards the west, Franconia Ridge and Garfield.

 


Summit Of South Twin – Franconia Ridge In The Background

 

 

Here we turned around, leaving South Twin at around 3:15pm, heading back towards the trail-head.  In this section V briefly developed a painful cramp in her knee, which she was able to shake off.  So we kept going…


We arrived back at the top of North Twin about an hour after leaving South Twin.


And then we began the long walk back down the North Twin Trail, dropping down fairly steeply at first, then more gradually as we arrived at Little River.  I filmed V crossing the river at about 6pm.

 

I had estimated that we’d be back at the trail-head, where the car was parked, by 6:30pm. It turned out to be 6:51pm by the time we got there – a long, 8-hour hike, around 11 miles total, but a spectacular day.


South Twin was number 41 for me, and V’s second 4000-footer in one day!  She did a great job on her first 4000-footers.

Now I had only seven more to go.

*

Very early in my time at ChildFund Australia, we developed a program approach founded upon a comprehensive “theory of change.”  I’ve written about that earlier in this series.

The development of that program approach was done through a great process of reflection and collaboration.  In the end, our experience, learning, and reflection led us to understand that people are poor because:

  1. they are deprived of assets (human, capital, natural, and social);
  2. they are excluded from their societies, and are invisible (voice and agency); and
  3. they are subject to power differentials in their families, communities, societies, and across nations.

And we understood that (4) children and youth are particularly vulnerable to risks in their environment, which can result in dramatic increases in poverty; they therefore require protection from physical and psycho-social threats, sexual abuse, natural and human-caused emergencies, slow-onset disasters, civil conflict, and so on.

Because we understood that these are the four causes of child poverty, we set ourselves the collective challenge of improving children’s futures by:

  • building human, capital, natural, and social assets around the child, including the caregiver;
  • building the voice and agency of poor people and poor children;
  • building the power of poor people and poor children; and
  • working to ensure that children and youth are protected from risks in their environments.

*

A few weeks ago I wrote about the third domain of our work, building the power of poor people and poor children.  That was a very new area of work for us.

Risk reduction was also new, though we had some experience with child protection.

We had designed outcome indicators which we would use, through our Development Effectiveness Framework, to measure the impact of all our work; they also gave a sense of our priorities.  One of those outcome indicators corresponded to risk reduction:

Indicator 11: % of communities with a disaster preparedness plan based on a survey of risks, including those related to adaptation to anticipated climate change, relevant to local conditions, known to the community, and consistent with national standards.
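As a concrete illustration of how such an indicator boils down to data, here is a hypothetical sketch – my own invention, not ChildFund’s actual reporting system; the community records and field names are made up – of checking each community’s plan against every criterion in Indicator 11 and reporting the share that pass:

```python
# Hypothetical data: one record per community, with a flag per Indicator 11 criterion.
communities = [
    {"name": "A", "has_plan": True,  "risk_survey": True,  "climate_adaptation": True,
     "known_to_community": True,  "meets_national_standards": True},
    {"name": "B", "has_plan": True,  "risk_survey": True,  "climate_adaptation": False,
     "known_to_community": True,  "meets_national_standards": True},
    {"name": "C", "has_plan": False, "risk_survey": False, "climate_adaptation": False,
     "known_to_community": False, "meets_national_standards": False},
]

def meets_indicator_11(c):
    """A community counts only if its preparedness plan satisfies every criterion."""
    return all([c["has_plan"], c["risk_survey"], c["climate_adaptation"],
                c["known_to_community"], c["meets_national_standards"]])

# The indicator itself: % of communities whose plan meets all criteria.
share = 100 * sum(meets_indicator_11(c) for c in communities) / len(communities)
print(f"Indicator 11: {share:.0f}% of communities")  # → "Indicator 11: 33% of communities"
```

The all-criteria-must-pass rule is what makes the indicator demanding: a plan that ignores climate-change adaptation, or that the community doesn’t know about, doesn’t count.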

I liked this outcome indicator.  It would enable us to reflect the results of our DRR work in a broad sense.  It showed connectivity with efforts related to climate change, and linked nicely with the work of others at country level.  Along with other areas of child protection, disaster risk reduction would become a priority for us.

Here’s an illustration of why this made sense: this graph shows the dramatic increase in the number of people affected by natural disasters from 1900 through to 2011.

Number Of People Affected By Natural Disasters, 1900–2011

No organization working to create better futures for children – or for people generally – can afford to ignore the fact that vulnerability is increasing quickly and dramatically.

*

And it was becoming very apparent that the Australian government was quite keen on both disaster response and tackling climate change.  AusAID had already set up a pre-approval mechanism for emergency-response work, through which a designated group of NGOs could respond very quickly to emergencies; ChildFund Australia wasn’t in that group.

But it all meant that, if we developed capacity, and a strong record of working in DRR and emergency response, there might be opportunities for collaboration with the Australian government, including the possibility of funding …

*

Our ChildFund Alliance partner in the US, ChildFund International, had been doing some good emergency-response work back when I was doing my “Bright Futures”-related consulting work, around 2003, but the leadership that came on board later had (mistakenly, in my view) wound up that effort soon after.

(For me, the decision to exit emergency-response work was a mistake for several reasons.  First, they were already doing good work in that space, which was helping children in very difficult situations.  Second, it was an area with great potential for fundraising, because of the urgency involved as well as the priority being put on this work by donors, particularly government bi-lateral donors.  Third, because raising funds in this space was less costly than other revenue sources, overheads could be used to support program development in other areas.  Finally, as general child-poverty levels dropped and our climate changed, child poverty was becoming much more related to vulnerability, as we in Australia had determined in the design of our program approach.

ChildFund International – the US member of the ChildFund Alliance – would later get back into the emergency-response business, and we would begin to structure a formal collaboration with them, and with our Canadian partners, in the humanitarian space.  But we all lost time – for example, perhaps if ChildFund International had stayed engaged in emergency response, we in Australia might have qualified for AusAID’s pre-approval pool …)

And, likewise, there was little DRR-related work going on across the Alliance, and certainly we at ChildFund Australia had no significant track record in that area.  So if we were to build up our expertise, we needed to start by bringing it in from outside.

Happily, at around that time Nigel Spence (the ChildFund Australia CEO) and I were in a meeting in Canberra, talking about these issues.  A very senior AusAID leader told us that if we sent her an ambitious plan to help communities in southeast Asia prepare for disasters, at scale, it would be funded.  Especially if it involved a consortium.

So I went back to Sydney and got to work.  In short order I recruited two other preeminent Australian INGOs (Plan International Australia, and Save the Children Australia) and wrote up an extremely ambitious proposal: we would reduce climate- and disaster-related risks in 1000 communities in five countries across Asia, reaching approximately 362,500 children and adults directly, and over 1.5m people indirectly.  Among the expected outcomes of the project, we would also seek to empower children and youth:

  • Children and youth will be recognised within communities as effective agents of positive change;
  • Children and youth will have increased knowledge and understanding, and thus capacity, to anticipate, plan and react appropriately to short-term risks and longer-term threats and trends.

These aims are very consistent with our presentation at the United Nations World Conference on Disaster Risk Reduction  that took place in Japan in March, 2015.  More on that conference below!

*

The response from AusAID was quite positive.  But:

Mistake #1: I didn’t get anything in writing!

However, given that we had gotten such a strong green light from Canberra, Nigel and I agreed that we’d go ahead and recruit a DRR expert to help us finalize the proposal and begin to prepare for the work.

*

We had a good response to our recruitment outreach and, in the end, I was lucky to recruit Sanwar Ali from Oxfam Australia to be our first “Senior Advisor for ER and DRR.”  Sanwar had long experience in emergency response across Asia and parts of Africa, and was also deeply experienced with DRR.  At the same time, I felt the Sydney International Program Team could benefit from Sanwar’s background – he would round out the team on several dimensions...


Sanwar Ali, ChildFund Australia’s Senior Advisor for Emergency Response and Disaster Risk Reduction

 

Soon after Sanwar joined, AusAID let us know that their “green light” was actually just encouragement to include DRR work in our normal project portfolio, so there would be no ambitious funding for scaling-up across Asia!

That was very frustrating.

No matter.  Our program approach had committed us to working to help protect children from disasters, and Sanwar would be key to that program development effort.  It was just unfortunate that the funding for his position evaporated!

*

Soon Sanwar led the development of policies for Emergency Response and Disaster Risk Reduction, which we would incorporate into the ChildFund Australia Program Handbook.

The ER policy was an expanded update of our previous policy, but the DRR policy was new.  Like all our program policies, this one was rather brief.  Its introduction and policy statement were succinct:

Introduction

The frequency with which disasters are occurring is increasing dramatically, in part because of human-induced climate change. This trend represents a threat to children, youth, and caregivers, and has the potential to undermine progress made in improving wellbeing and reducing poverty.

At the same time, however, thanks to efforts of local communities, national governments, the international community, and INGOs such as ChildFund, the human impact of these disasters has been reduced over time.

Reducing risks for children, youth, and caregivers is central to ChildFund’s program approach, because communities which are resilient to risks are best positioned to provide security and ensure continued wellbeing of vulnerable children.

This policy provides an organisational framework for action related to disaster risk reduction.

Policy Statement

ChildFund Australia will work to ensure that disaster risk reduction (DRR) plans, known as Community Action Plans (CAPs), are in place in all communities where we work. These CAPs will be developed in a participatory manner, consistent with the Hyogo Framework for Action, and according to relevant guidelines in each country.

DRR efforts will be mainstreamed in our development, humanitarian and advocacy activities whenever appropriate.

The rest of the policy document outlined “Key Actions” required by ChildFund Australia staff at various levels and locations, and outlined how work in this area was connected to our organizational Outcome Indicators.

*

Sanwar and I worked together on two ChildFund Alliance-wide projects.  I had proposed that the operational Members of the Alliance (which, initially, meant Australia, Canada, and the US, with Japan and Korea observing) work together in Emergency Response and DRR, partly because we would be able to show global reach that way.  So we planned to develop a set of common policies and procedures through which we would respond to humanitarian disasters jointly.

I want to describe the other project that Sanwar and I worked on in a bit more detail: I led the ChildFund Alliance delegation to the United Nations World Conference on Disaster Risk Reduction that took place in Japan in March, 2015; a delegation from ChildFund had a big presence at that conference.  And, the week before the UN conference, I visited areas of Fukushima Prefecture that had been affected by the earthquake, tsunami, and nuclear disaster that had devastated that area exactly four years before.  My visit was part of the JCC2015 (Japan CSO Coalition for 2015 WCDRR) conference, which concluded the day after the field visit with an important program of sessions.

I will bring in some content from blogs I published here in 2015, just after my trip to Japan…

*

The visit to Fukushima was unforgettable. The impact of the horrific events of four years ago was still very apparent, as was the strong and continuing resilience of the local people, even those who (at that point) remained in “temporary” camps.

A good summary of the events of 2011, along with some lessons we should learn, is contained in the publication “Ten Lessons from Fukushima.” In brief, a massive (magnitude 9) earthquake, at 2:46pm on March 11, 2011, caused extensive damage across northern Japan, and triggered an enormous tsunami.  This tsunami struck coastal zones of northern Japan an hour after the earthquake, destroying vast areas and killing many. The Fukushima “Dai-Ichi” (number one) nuclear plant, located on the coast, was severely damaged by the tsunami. The next day at 3:36pm, core meltdown and a massive explosion destroyed reactor unit 1. Other reactors subsequently failed.

It has been estimated that the equivalent of 168 Hiroshima bombs’ worth of radiation was released when Fukushima Dai-Ichi reactor unit 1 exploded.

Evacuation orders were slow to come, partly due to the loss of communications facilities, but also due to startling management and leadership errors.   Eventually, after suffering serious exposure to radiation, some 300,000 people were evacuated from areas inside a 30km radius around the reactor complex.

This map shows the 30km evacuation zone, and the radiation plume.

The 30km Evacuation Zone And The Radiation Plume

We would visit areas well within the red area during our trip.

Radiation drifted with the wind, falling onto land and people and animals, leaving extremely high levels of iodine, cesium, and other radioactive elements. Radioactivity fell across inhabited, farmed, and forested areas according to the wind direction at the time, and radioactive water was released into the sea – even four years later, when I visited, something like 600 tons of radioactive water were being released into the ocean each and every day. “Safe” levels of radiation were repeatedly raised.

Investigations found that the nuclear disaster was preventable; appropriate safety mechanisms – existing at the time – were not incorporated into the Fukushima Dai-Ichi reactors when they were built.

Another good document covering the disaster and its aftermath was created by the Citizens’ Commission on Nuclear Energy.

I had not realised that vast areas of Fukushima Prefecture were still closed due to extremely high radiation levels, nor that the Fukushima Dai-Ichi reactor complex remained dangerously unstable – events could have spun out of control again at any time. Wreckage littered a vast area, and radiation in many of the places we visited was startlingly higher than what is considered to be safe, if “safe” levels even exist.  Radiation contamination seriously impeded recovery efforts, as workers could not stay in the area for very long. (I was struck by the contrast with Hurricane Katrina where, even with the bungled response, cleanup was far more advanced four years after the storm than what we saw in Fukushima… The difference?  The radiation.)

We visited Iitate village, a place where some of the highest levels of radiation were found just after the meltdown – it’s right near the center of the darkest plume in the map above. We visited Namie town, where we met with local officials who were doing their best to deal with the situation, spending their days in highly contaminated areas.  We visited tsunami-affected areas of Namie, where we could see vast areas of wreckage, damaged housing, and vehicles crushed by the power of the waves.


And we drove to within 4km of the Fukushima Dai-Ichi plant.  We finished the day by visiting a group of evacuees from Namie town, at that point still in “temporary” housing in Kohri town. Their courage and resilience were powerful and inspiring.

Radiation levels on our bus were high. This reading, taken during our lunch break, was 0.82 microsieverts per hour, which is considered “safe” for short-term visits only.
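To give a rough sense of why that reading only passes as “safe” for short visits, here is some back-of-the-envelope arithmetic of my own (not from any official assessment): annualizing the 0.82 µSv/h spot reading, under the unrealistic assumption of continuous exposure at that rate, and comparing it with the roughly 1 mSv/year additional-dose figure commonly cited as a guideline for the general public.

```python
# Back-of-the-envelope only: annualize a spot dose-rate reading.
# Assumes continuous exposure at the measured rate, which overstates the dose
# for a short visit but shows why the level matters for anyone living there.
reading_usv_per_hour = 0.82            # the reading taken on our bus, in microsieverts/hour
hours_per_year = 24 * 365
annual_dose_usv = reading_usv_per_hour * hours_per_year
annual_dose_msv = annual_dose_usv / 1000          # microsieverts -> millisieverts

public_guideline_msv = 1.0             # ~1 mSv/year additional dose, a commonly cited public guideline
print(f"{annual_dose_msv:.1f} mSv/year, about "
      f"{annual_dose_msv / public_guideline_msv:.0f}x the public guideline")
```

Even this crude calculation lands several times above the commonly cited public guideline, which is why workers and visitors could only stay briefly.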


Throughout the area, thousands of one-ton bags of soil, wood, debris, etc., were piling up. Topsoil was being removed from farms to a depth of 20cm; areas 20m around houses were similarly being cleaned. I was told that these bags – nearly a million of them by then – would be taken to gigantic incinerators for processing.  Nineteen hugely expensive incinerators were being built, each with a short lifetime of just two years.  Meanwhile, the residual ash was being stored, “temporarily”, at the Fukushima plant.


Radiation “hotspots” remained, however.

The catastrophe took place four years before our visit.  The situation when we were there is described in the “Ten Lessons from Fukushima” publication noted above, and in an article in The Economist from around that time. There were still over 120,000 “nuclear refugees” who could not return to their communities, their homes, the areas we visited.  They probably never will, despite the financial incentives being offered, because levels of contamination were still so high.

We met with a group of displaced people from Namie town, then living in Kohri town in “temporary” housing far from the homes to which they will never return. They lived virtually on top of each other, able to hear the softest sounds from their neighbours (such as snoring!) – quite a change from Namie, where their ancestral homes and farms were.


We were lucky to be presented with a narrative, using a hand-made storyboard, of their experience in the days after the tragedy.  It was a powerful and moving description of their suffering; and in the telling of the story, of their resilience and spirit.


Discussing Fukushima was sensitive in Japan. I found it interesting, the next week at the WCDRR conference in Sendai, that the many references to the Fukushima disaster referred to it as the “Great East Japan Earthquake” or the “Great East Japan Earthquake and Tsunami” disaster – no reference to the horrific and ongoing events at the nuclear power plant, which were somehow either not relevant or too sensitive to mention.

Some final reflections:

  • Several times during the day in Fukushima, we heard local people use a striking visual metaphor: “nuclear power is like a house without a toilet.” Until there is a safe and permanent place for nuclear waste, we should not be using this source of power;
  • In our increasingly-unstable world, the use of a technology with such potential for devastation and tragedy – Fukushima, Chernobyl – seems foolish. The precautionary principle should be followed;
  • Let’s try to remember that the impact of these mega-disasters still persists.  Four years after the disaster, there were still 120,000 people displaced, even in a country as rich as Japan.  And cleanup and stabilization at Chernobyl, more than 30 years later, is still far from finished.

Ultimately, I took a sense of inspiration from the stories of the people affected, of those who responded and who are responding still, and those who are living lives of advocacy to ensure that the injustices that took place in Fukushima are corrected, and not repeated.

Reflecting on what we saw, I concluded that those (government, TEPCO, etc.) who were saying that the area had recovered, or was on the road to recovery, were either not seeing what we saw or not telling the truth.

*

I want to share links to two articles from The Guardian, describing the situation in 2017, two years after my visit to Fukushima: this article describes how radiation levels inside the collapsed reactor had risen to “extraordinary” levels, unexpectedly; and this article provides a lengthy update of what are described as “faltering” cleanup efforts.  They make for depressing reading.

*

On a more positive note, and with the importance of DRR clearly in my mind after the visit to Fukushima, I led the ChildFund delegation to the United Nations World Conference on Disaster Risk Reduction in Sendai.

I want to share content from a summary article I published in DevEx soon after the conference ended:

After several sleepless nights of negotiations, representatives from 187 governments agreed the Sendai Framework for Disaster Risk Reduction 2015-2030. As one of more than 6,500 participants in the Sendai conference, I can attest to the exhausted sense of relief that many of us felt when the framework was finally announced.

The framework that emerged contains seven global targets — nonbinding and with funding left unspecified — focused on reducing disaster risk and loss of lives and livelihoods from disasters.

Sendai was a monumental effort, involving government representatives, senior-most leaders of the United Nations and several of its specialized agencies, staff from hundreds of civil society organizations and private sector businesses, and the media, with venues scattered across the city. The urgent need for international action to reduce disaster risks was thrown into stark relief when Cyclone Pam — one of the most intense storms ever to occur in the Pacific — tore across Vanuatu just as the conference began.

Why does what happened in Sendai matter? Because hazards — both man-made and natural — are growing, as our climate changes and growing inequality contributes to a sense of injustice in many populations. Doing nothing to prepare for these increased risks is not a viable option for our future.

Also, Sendai matters because the conference was the first of four crucial U.N. gatherings this year. What happened in Sendai will influence the Third International Conference on Financing for Development, coming in Ethiopia in July; the post-2015 sustainable development goals that will be discussed at the global U.N. summit in September; and the U.N. Climate Change Conference in Paris in December.

ChildFund’s delegation, part of the “Children in a Changing Climate” coalition, had several objectives in Sendai. In particular, we worked to make the case that children and young people should be seen as agents of change in any new DRR framework.

Children and young people are normally seen as helpless, passive victims of disasters. During and after emergencies, the mainstream media, even many organizations in our own international NGO sector, portray children and young people as needing protection and rescue. Of course, children and young people do need protection. When disasters strike they need rescue and care. But what such images fail to show is that children also have the capacity — and the right — to participate, not only in preparing for disasters but in the recovery process.

Since the last U.N. agreement on DRR, in 2005, we have learned that children and young people must be actively engaged so that they understand the risks of disasters in their communities and can play a role in reducing those risks. Children’s participation in matters that concern them is their right — enshrined in the 1989 Convention on the Rights of the Child — and strengthens their resilience and self-esteem.

And, crucially, we know that young people’s participation in DRR activities leads to better preparation within families and in communities.

My presentation at Sendai included examples of how youth brigades in the Visayas region in the Philippines helped in the Typhoon Haiyan response.

 

The example we used to make our case came from ChildFund in the Philippines.  In 2011, with support from local government and the Australian government, ChildFund worked with several youth groups to help them prepare for disasters, and to help them help their communities prepare. We engaged young people in Iloilo and Zamboanga del Norte provinces to identify hazards, to develop local DRR and disaster risk management plans, to train children and young people in disaster risk management, and to raise awareness of DRR in eight communities.

Little did we know that, just 18 months after the project concluded, this effort would really pay off.  Many of us remember vividly the images of Typhoon Haiyan barreling across the Philippines in November 2013, just north of where our project was carried out.  As local and national government in the Philippines began to respond to the typhoon, with massive support from the international community, we could see that the efforts of children and young people we had worked with were proving to be important elements in managing the impact of the storm.

Advocacy by children and young people during the project had led the local government to invest more in preparedness and mitigation, which was crucial as the storm hit. Young people trained in the project trained other groups of parents and youths, building the capacity of people who were affected by, and responded to, Haiyan. Local government units mobilized disaster risk reduction committees, including youth members, who were involved in evacuating families living in high-risk areas. Youth volunteers helped prepare emergency supplies and facilitated sessions for children in child-centered spaces that were set up after the typhoon passed.

This experience led ChildFund to strongly support elements of the Sendai Framework that recognize the importance of the meaningful participation of children and youth in DRR activities. We are happy to see the text calling for governments to engage with children and youth and to promote their leadership, and recognizing children and young people as agents of change who should be given the space and modalities to contribute to disaster risk reduction.

But two major weaknesses can be seen in the Sendai Framework: Its targets are not binding and are not quantified; and no global commitments to funding DRR actions were made. Many observers feel that governments were keen to establish (or not establish!) precedents at Sendai that would bind them (or not bind them!) in the high-stakes conferences to come. These weaknesses are serious, and greatly undercut the urgency of our task and likely effectiveness of our response.

Still, on balance, the Sendai Framework is good for children and youth, certainly better than failure to agree would have been. Let’s hope for even stronger action in Addis Ababa, New York and Paris, with binding targets and clear financial commitments.

Then our children, and grandchildren, will look back at Sendai as a milestone in building a better, fairer and safer world.

*

After the success of Sendai, Felipe Cala (then working with the ChildFund Alliance secretariat) and I took a couple of days to visit Kyoto, a marvelous place full of culture and beautiful scenery.  And we enjoyed the modern “bullet” trains, which made our countries look so backward.

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owls’ Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change;
  35. Bondcliff (35) – ChildFund Australia’s Development Effectiveness System;
  36. West Bond (36) – “Case Studies” in ChildFund Australia’s Development Effectiveness System;
  37. Mt Bond (37) – Impact Assessment in ChildFund Australia’s Development Effectiveness System;
  38. Mt Waumbek (38) – “Building the Power of Poor People and Poor Children…”
  39. Mt Cabot (39) – ChildFund Australia’s Teams In Cambodia, Laos, Myanmar, Papua New Guinea, and Viet Nam;
  40. North Twin (40) – Value for Money.

 

 

North Twin (40) – Value for Money

September, 2018

During my years with ChildFund Australia, the overseas-development sector, and organizations like ours, were booming.  The subject of this brief article is one issue that became a focus of attention during those years: Value For Money.  What is it?  Is it just a “bumper sticker”?  If not, how can we measure it?  How can we ensure that our organizations deliver it?

*

I’ve been writing a series of blog posts about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall.  And, each time, I’ve also been reflecting a bit on the journey since I joined Peace Corps, 34 years ago: on development, social justice, conflict, experiences along the way, etc.

I’ve recently been writing about the six years I was honored to serve as International Program Director at ChildFund Australia.  In an earlier post in this series, I introduced, and thanked, the team I worked with in Sydney, the “International Program Team.”  And last time I took time to thank the great teams that I worked with in Cambodia, Laos, Myanmar, Papua New Guinea, and Viet Nam.

Before digging into what “Value For Money” meant for us…

*

I climbed North Twin (4761ft, 1451m) on 2 September, 2017, with our granddaughter V.  This would be number 40 of the 48 4000-footers that I hoped to climb, and it would be V’s first hike of this length and her first real mountain-top, so she seemed a little bit curious about how it would go … but, as always, enthusiastic about giving it a try!  Just in case, our plan was to get to the top of North Twin and then decide whether we wanted to continue to South Twin.

It was a perfect, dry, cool, cloud-free day for a hike:

Screen Shot 2017-09-05 at 4.01.23 PM.png

 

(I have also highlighted ascents of six other 4000-footers on the map, all of which I had climbed earlier in this series: Lincoln, Lafayette, Garfield, Galehead, West Bond, and Bond.)

*

We left Durham fairly early that morning, at around 7:45am.  I wanted to leave early because it seemed like the hike might be a long one – normally that would mean camping up in the White Mountains the night before, to get an early start, and 7:45am wasn’t really early enough.  But off we went: west on Rt 4 to Concord, and then north on I-93 to Lincoln, where we picked up some sandwiches for lunch.

It was nearly 10:45am when we arrived at the very crowded North Twin trail-head: this was Labor Day weekend, and the parking area on Haystack Road had overflowed.

IMG_2360.jpg

IMG_2365.jpg

 

The hike up to the top of North Twin was straightforward: at first, up a nearly-flat old railway grade along the Little River, gradually getting a bit steeper, and then crossing the stream once (at 11:41am):

IMG_2370.jpg

IMG_2372.jpg

 

The trail got gradually steeper as we neared the top of North Twin:

IMG_2376.jpg

Steeper

 

We saw very few people on the trail, which was somewhat surprising, especially given the overflow of cars down at the trail-head.  As we began to get above tree-line, the views became spectacular, perhaps the clearest and sharpest views I’ve had on all of these climbs:

IMG_2384.jpg

IMG_2386.jpg

 

We arrived at a ledge outlook very near the top of North Twin at around 1:30pm, and had lunch there:

IMG_2392.jpg

Mt Washington And The Presidential Range

WP_20170902_019.jpg

 

There was a small group of people here, and it got a bit crowded with hikers, mostly coming down from South Twin.  We outlasted them, and had lunch pretty much to ourselves.

This video of the view from that ledge outlook illustrates what a spectacular place it was, what a perfect day we had:

 

IMG_2403.jpg

IMG_2400

 

We finished lunch and left that ledge at about 2pm, and arrived at the true top of North Twin a few minutes later.

IMG_2405.jpg

IMG_2409.jpg

 

We had reached my 40th of the 48 4000-footers!

From here, the view was amazing.  To the west and south we could see five 4000-footers (Galehead, Mt Flume, Mt Liberty, Mt Lafayette, and Mt Garfield), and Galehead Hut below us:

Screen Shot 2017-09-05 at 6.37.46 PM.png

 

V did very well getting up to the top of North Twin, and she was keen to continue.  So there was no question in our minds – we would now continue on North Twin Spur towards the summit of South Twin, and then retrace our steps to Haystack Road.

Onward!  More on that next time…

IMG_2411.jpg

 

*

I served as International Program Director for ChildFund Australia for over six years, from mid-2009 until October of 2015.  Those were exciting and rewarding years for Jean and me, living in Sydney; and they were great years for ChildFund Australia.  In fact, generally speaking, the whole overseas-development sector prospered during those years, because of great support from the Australian public and, in particular, from the Government.

The Rudd Government had been elected in 2007, and one of their stated commitments was to raise the overseas-aid budget up to commitments made by previous governments, with a target of 0.7% of GNP.

As can be seen in this graph, Kevin Rudd delivered a dramatic increase:

 

In constant (2018) dollar terms, Australia’s ODA budget grew from A$3,841m in 2008 to A$5,479m in 2012, an increase of nearly 43%.  (To be fair, as can be seen, this increase was actually a continuation – an acceleration – of growth initiated by the Howard government from around 2001.)  After 2012, the aid budget stayed fairly constant until 2015, when the Abbott government made dramatic cuts, even closing AusAID, the government agency responsible for managing the program.  By then I was nearing the end of my time with ChildFund.
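
The growth figure quoted here is easy to check.  A quick bit of arithmetic, using the constant-dollar figures from the paragraph above:

```python
# Check of the ODA growth quoted above (constant 2018 A$, millions).
oda_2008 = 3841  # Australia's ODA budget in 2008
oda_2012 = 5479  # Australia's ODA budget in 2012

growth = (oda_2012 - oda_2008) / oda_2008
print(f"Growth 2008-2012: {growth:.1%}")  # ~42.6%, i.e. "nearly 43%"
```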

The big ODA increases after 2008 meant that we could do more, reach more people, have more impact.  Our programs grew in scale and sophistication – many of the innovations that I’ve described in this series of articles (for example, here and here) were made possible, at least in part, by generous funding from the Australia government.

But it turned out that this growth in official development assistance wasn’t politically sustainable.  As other areas of the government budget were tightened, political pressure grew to rein in ODA spending.  The Rudd and Gillard governments addressed this pressure in several ways, one of which was to emphasize “value for money.”  Agencies such as ChildFund began to be asked to demonstrate that they were delivering good value for the taxpayer’s dollar.  (The Abbott government didn’t resist the pressure at all, which is another story.)

Fair enough: nobody can be against delivering value for money.  But it was never clear what, exactly, “value for money” really was.  In fact, one quite-senior AusAID official once referred to it in a meeting that I attended as a “bumper sticker”!  Despite this, all INGOs in Australia that received government funding came under pressure to demonstrate their approach…

I’ve written about this topic in an earlier article.  Here I want to extend that discussion and update it with later work we did in ChildFund Australia to respond to the (correct, but vague) pressure we began to receive from AusAID staff.

I began to think about the concept, and started circulating drafts to our staff in Sydney and overseas.  Here are some of the results of that process of reflection.

*

All reputable organizations working to overcome poverty seek to ensure that they provide “value for money.”  Because our work is of the highest importance to people living in poverty, we must make the best use of all the resources we have.  And, at the same time, because we are entrusted with valuable resources, we must be careful stewards of this trust.

But it is challenging to articulate a definition of “value for money” for work in the development sector.  Some large agencies have taken an econometric approach, using concepts of social return on investment and cost-benefit analysis.  These tools represent a rigorous approach to assessing “value for money,” but they are too complex and too costly for most development agencies to implement.  Other agencies use randomised controlled trial methods, adapted in part from the pharmaceutical industry, in which an intervention is tested and compared with a carefully-selected control population where the intervention doesn’t take place.  While such methods are increasingly accepted in our sector, for generalist INGOs like ChildFund – which don’t have the funds to hire the specialised staff or undertake the extensive reviews required – these methods are not yet fit for purpose.
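
To give a sense of what the econometric approach involves, here is a toy sketch of a social-return-on-investment calculation.  All the figures, the project, and the 5% discount rate are invented purely for illustration – they are not drawn from ChildFund’s work or from any real SROI study:

```python
# Illustrative sketch of an SROI (social return on investment) ratio.
# All numbers below are hypothetical, chosen only to show the mechanics.

def present_value(annual_benefit: float, years: int, discount_rate: float) -> float:
    """Discount a constant annual benefit stream back to today's dollars."""
    return sum(annual_benefit / (1 + discount_rate) ** t
               for t in range(1, years + 1))

def sroi_ratio(benefits_pv: float, investment: float) -> float:
    """Social value created per dollar invested; > 1 means value exceeds cost."""
    return benefits_pv / investment

# Hypothetical project: $100,000 invested, generating an estimated
# $30,000/year of social value for 5 years, discounted at 5%.
pv = present_value(30_000, years=5, discount_rate=0.05)
print(f"SROI ratio: {sroi_ratio(pv, 100_000):.2f}")
```

The hard (and costly) part in practice is not this arithmetic, but credibly estimating the annual social value and choosing a defensible discount rate – which is exactly why these methods were out of reach for most agencies.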

(I’ve written extensively elsewhere about how we at ChildFund Australia approached the measurement and improvement of the effectiveness of our work: here, and here.)

And yet, the notion of “value for money” was important to us.  So how would we approach it?

*

A Definition of “Value For Money”

The first step we took was to clarify our definition of “value for money,” and to indicate the mechanisms through which we could ensure that we achieve good value for the resources we manage.

After extensive research and reflection, and many drafts, we settled on this simple definition: For ChildFund Australia, “value for money” had three elements:

  • Firstly, we use resources effectively;
  • Secondly, we use resources efficiently;
  • Thirdly, we are accountable about our use of resources to our stakeholders and ourselves.

Using resources effectively, efficiently, and accountably – that was how ChildFund Australia intended to ensure “value for money.”  But for this definition to be operational, we needed to define what those terms meant!

*

For ChildFund Australia, when we worked on this issue, we decided that “effectiveness” meant working on the causes of child poverty, according to our understanding of child poverty.  And it meant having a systematic approach to achieving development effectiveness, embedded in our programmatic work processes.

In terms of causes, we had learned that children are poor because they lack assets such as health, education, and income.  Assets such as clean air and water and access to productive land.  Assets such as the bonds of trust and solidarity in their communities and across cultures.  They experience poverty as being excluded from having voice and agency in processes that affect them.  They are poor also because they, their families, and their communities (and even nations) are relatively less powerful than other children, families, communities, and nations.  And they are poor because they face increasing risks – from other people, from civil conflict, from climate change, and so forth.

So our programs were designed to build human, financial, social assets; stimulate opportunities for people living in poverty to express their opinions and exercise their personal agency; enhance the power of poor people to take collective action in the interests of their children; and strengthen protective networks around children.

But to be effective, we also needed to establish and maintain systems and procedures that keep us focused on these causes of child poverty.  Our “Development Effectiveness Framework” (DEF) provided that operational focus, making sure that all our programmatic efforts were aligned towards a defined purpose that was clearly embedded in each particular context in which we worked.

The DEF also supported a learning, adaptive approach, because the work we did was complex and only rarely could external models be put into place in the range of contexts where we work without extensive adaptation. This means that a tolerance for the risk that comes with innovation was also required to ensure effectiveness.

For us, that was effectiveness in a nutshell – understanding and addressing the causes of the phenomenon we sought to change, striving to understand the mechanisms through which those causes act, and taking deliberate action aligned to achieve our purpose.

Using resources efficiently meant that we put in place appropriate systems and procedures to ensure that we allocated our human and financial resources explicitly and clearly, for the purposes that were agreed, and according to good business practices.  Not being wasteful.

So we had budgets which were reviewed and approved; our expenditures and activities were authorized and controlled and monitored according to agreed protocols and standards.  We supported and trained our staff so they had the tools and competencies they needed.  We reviewed the use of these resources frequently with an eye towards ensuring that our costs were in line with good practice. And we had clear procurement and tendering procedures, and robust policies and procedures (including independent audits) to deter fraud.

These systems and procedures were set out clearly in our finance and HR documentation.  All our team members were trained in their use as appropriate to their functions, and our management teams in Sydney and in our Country Offices rigorously followed up operations to ensure that these guidelines were followed and that they in fact resulted in “efficient” use of resources.

In addition, we carefully managed the use of foreign staff in our programs, because we firmly believed that local people had the knowledge, skills, and capacities that were needed.  Our local staff were central to our program approach, which relied on long-term, positive relationships with communities and local partners.  And external resources were always somewhat more expensive and should therefore be used judiciously.

Finally, we couldn’t deliver value for money unless our stakeholders knew what we were doing and were able to influence us.  So we strove to be accountable – transparent and responsive – by developing our programs together with local communities and partners; by reporting periodically and fully, to a wide range of publics, about what we did and accomplished with funds; and by responding to concerns, questions, and suggestions from our stakeholders and the public.

We had a range of processes and procedures to enhance our accountability, transparency, and responsiveness, but this was not a destination – it was a journey, through which we sought to continually be more accountable.

Operationalizing the approach

That all sounded good, and correct; the next step was to put these measures in place, operationally, in the different places we worked.

In terms of effectiveness, ChildFund’s “Development Effectiveness Framework” (the “DEF”) was contained in Chapter 3 of our Program Handbook, and was mandatory for all ChildFund Australia offices. The DEF established how ChildFund’s Vision, Mission, Program Approach, and program policies were implemented in each particular country context.

The DEF contained procedures, formats, and guidelines for:

  • designing and improving holistic, evidence-based programs;
  • preparing, assessing, approving, monitoring, and evaluating projects that contribute to the goals of each program;
  • learning from project implementation;
  • contributing to community planning of projects;
  • assessing the impact of our work on the causes of child poverty.

To make sure that our operations were efficient, we had policies, procedures, resources, and systems in place, from the collection of funds through to the delivery of quality programs pursuant to our Mission.  There were financial systems to control funds, administration systems to ensure appropriate use of funds in procurement and day-to-day administration, and people and organisational systems to support the people who worked for us.

We were committed to minimising the risk of funds being misappropriated, wasted, or used to fund terrorism, and had policies covering fraud, procurement, and counter-terrorism.  Our staff, and the partners to whom we entrusted funds, were regularly trained on the importance of complying with these policies and how to apply them.  Our financial reports were audited by an international audit firm annually, and we conducted internal audits in the field on a regular basis.  The learning gained from these exercises was used to improve our financial, administration, and human-resources systems.

These systems and policies were documented in the Sydney Finance Manual, HR Manual, and policies and procedures maintained centrally and mandatory for all ChildFund Australia offices, including policies on Fraud Awareness and Prevention, and Procurement.

In addition, Country Offices had their own local procedures, consistent with central, organisation-wide policies and procedures that, together, ensured that our operations were efficient.

Finally, in terms of accountability, our DEF mandated several moments in the project cycle where key stakeholders (children, youth, caregivers, local partners, local government) were informed and were given authentic opportunities to influence decisions, and to help reflect on our performance.

Consistent with legal requirements, accreditation with the Australia government, and the code of conduct that was agreed by nearly all Australian INGOs (the ACFID Code), ChildFund Australia put in place a range of communication systems to inform our stakeholders (such as the reporting of financial and programmatic results) and to enable them to provide comments about our work, including complaints.

We instituted regular monitoring and evaluation processes, yearly financial audits and Annual Reports, yearly reporting to sponsors, and annual Country Office Reports – all of which were available publicly on the ChildFund Australia website.  A range of programmatic results were also published on our website, in the “Development Practitioners” section.

*

In summary, during those years we took up the challenge of ensuring “value for money” by creating and implementing a Development Effectiveness Framework that was based on our understanding of the causes of child poverty, and which gave us the tools to measure and improve the impact of our work.  We created and followed a set of good business practices to ensure that we worked efficiently.  And we took measures to communicate the results of our work, and reported on our financial results, to be accountable to donors and community partners.

“Value for money,” in those days, was a vague concept, which nevertheless was important to us and to the whole sector.  Our approach to defining and delivering “value for money” was relatively straightforward, befitting the nature of our agency, but at the same time it was internally consistent and complete.  Other than the two agencies that I know of that tried to implement “randomized control trials” in a few test projects, I am not aware of any other Australian INGO that had as comprehensive and complete an approach to this issue as we did at ChildFund Australia.

I am proud of what we achieved, how we took up the challenge to ensure that we were providing “value for money.”

*

Mt Cabot (39) – ChildFund Australia’s Teams in Cambodia, Laos, Myanmar, Papua New Guinea, and Viet Nam

August, 2018

During my years with ChildFund Australia, I was privileged to work with great people in six countries.  In an earlier article, I wrote about the terrific Sydney-based International Program Team: Caroline, Jackie, John, Mai, Manasi, Maria, Ouen, Sanwar, Sarah, Richard, Terina … This time, I want to thank the teams in Cambodia, Laos, Myanmar, Papua New Guinea, and Viet Nam that did such great work to help children and their families overcome poverty.

*

I’ve been writing a series of blog posts about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall.  And, each time, I’ve also been reflecting a bit on the journey since I joined Peace Corps, 34 years ago: on development, social justice, conflict, experiences along the way, etc.

Last time I described how we built collective action for child rights into our program approach, beginning with an exciting pilot project in Cambodia.  It was our way of building a rights-based approach into our development work, and I think there were many valuable lessons we learned through that project.

Earlier, I described ChildFund’s Sydney-based International Program Team.  Now, in this blog article, I want to introduce our teams in Southeast Asia and the Pacific: the people I had the pleasure of working with, overseas, during those years in Cambodia, Laos, Myanmar, Papua New Guinea, and Viet Nam.  Great people, doing great work…

But first…

*

In the previous article in this series, I described climbing Mt Waumbek on 28 August.  My plan was to get to the top of Mt Waumbek, stay the night at Moose Brook Campground, and attempt to climb Mt Cabot the next day.  These are the two northernmost 4000-footers in New Hampshire, the farthest from Durham, where we live, so it made sense to climb them both in one two-day trip.

I climbed Mt Cabot (4170ft, 1271m) on 29 August, 2017.  Cabot’s a much longer hike than Waumbek, especially given that (instead of going up-and-back) I hiked a loop up Bunnell Notch Trail, across Kilkenny Ridge Trail (including a visit to “The Horn” at lunchtime), and finally down Unknown Pond Trail.

Screen Shot 2017-08-30 at 2.43.27 PM.png

I Had Climbed Mt Waumbek The Day Before

 

Unusually, this day was characterized by a certain level of worry and anxiety: the trail-head is at the Berlin Fish Hatchery, which closes its gates at 4pm.  So, as I arrived at “The Horn” for lunch, at around 12:15pm, I began to worry about reaching my car in time!  Would I be able to reach the top of Mt Cabot and get back?

*

I had camped at Moose Brook Campground the night before, so was able to get going pretty early, arriving at the York Pond trail-head at 8:41am.

IMG_2225

IMG_2226

 

There were two cars at the trail-head as I arrived, so (once again) it looked like I would have a quiet hike!  After a lengthy stretch walking through hip-high shrubs and ferns, I emerged into typical White-Mountains low forest, along the side of a stream:

 

IMG_2232.jpg

IMG_2233.jpg

 

I reached Kilkenny Ridge Trail at just after 10am, an hour and a quarter after starting.

IMG_2235

IMG_2238

IMG_2240

 

Along this section of Kilkenny Ridge Trail I encountered two hikers, both coming the other way.  They were doing the “Coos Trail,” which I had never heard of; we had a short chat about the 48 4000-footers (they had done them all), and we exchanged thoughts about Owl’s Head and the looooong Lincoln Woods Trail.

At around 11am I reached Cabot Cabin, which is NOT at the top of Mt Cabot:

IMG_2248.jpg

IMG_2244.jpg

IMG_2246.jpg

 

There’s a small cairn just beyond Cabot Cabin, with views towards the north:

IMG_2250.jpg

 

Readers may have noticed that I don’t mention much wildlife when I describe these climbs.  That’s because there just isn’t much, or perhaps the animals that are in the area have learned to avoid humans.  But I did see a beautiful bird, probably a grouse of some kind, just after Cabot Cabin:

IMG_2252.jpg

 

The walk along Kilkenny Ridge Trail to the top of Mt Cabot is very pleasant – no real views, but a fine ridge walk.  I got to the top of Cabot at about 11:20am – number 39!:

 

 

 

From the top of Mt Cabot I dropped down towards “The Bulge,” continuing along Kilkenny Ridge Trail:

 

 

Between “The Bulge” and the spur trail up to “The Horn,” I began to be concerned about the time.  The trail-head, where I had parked, was inside the Berlin Fish Hatchery, which closed (and the road was gated, with the exit blocked) at 4pm.  It was nearly noon – I had begun the hike just before 9am – and when I consulted the map I realized that I wasn’t yet halfway done!

But I was determined to walk up the spur to “The Horn,” as the view from there was supposed to be excellent.  I arrived at the junction with the path up to “The Horn” just at noon:

IMG_2270.jpg

 

As promised, the views from the top were great, though the clouds that were building took off a bit of the luster:

IMG_2274.jpg

 

I had a quick lunch there at The Horn, and decided to walk as quickly as possible to Unknown Pond: I figured that, if I arrived there before 1:15pm, all would be well and it would be easy to reach my car in good time.  If I got to Unknown Pond much later than 1:30pm, it might be close!  And even though I had my tent (down in my car), and would be fine, I didn’t want to spend the night at the trail-head, partly because there was no cellphone reception there and I thought Jean might be worried if she didn’t hear from me!

So I walked very quickly from “The Horn” to Unknown Pond, reaching it at 1:14pm, just ahead of the deadline I had set for myself!  The walk was pleasant, but I moved so fast that I didn’t see much of it!

Unknown Pond is quite pretty, with a campsite nearby – perhaps worth a visit sometime when the sun is out!

IMG_2280

IMG_2295.jpg

IMG_2294

 

Here I reached the junction with Unknown Pond Trail, which I would follow 3.3 miles down to the trail-head at York Pond Road, where I had left the car:

IMG_2285.jpg

 

Much of the rest of the day would be spent walking down alongside the stream that flows out of Unknown Pond, hiking at full velocity – I really didn’t want to be stuck behind the Hatchery gate!

 

There were many small meadows along the stream, with nice wildflowers, and as I dropped down in elevation, there were even some indications of the coming autumn:

IMG_2296

IMG_2297

IMG_2300

IMG_2302.jpg

Autumn Is Coming!

 

So I flew down Unknown Pond Trail, not sure whether I’d make it in time, because I couldn’t tell exactly where I was along the trail!

I need not have worried, because I arrived at my car at 2:40pm – over an hour to spare!

IMG_2307.jpg

 

Mt Cabot, at least when hiked in a loop as I did, is a pleasant ascent – not technically challenging, but a long day.  I had climbed number 39 of the 4000-footers – 9 to go!

*

ChildFund Australia is part of a global group called, collectively, the ChildFund Alliance.  It’s a fairly loose grouping, in which each Member operates quite autonomously, and the few common policies that did exist across the ChildFund Alliance were not strictly enforced.  I will write more about some of the unusual behaviors of INGO groupings such as the ChildFund Alliance and Plan International in an upcoming article in this series.

For now, I just need to mention that, in the ChildFund Alliance, some Members both raise funds in their home markets and implement programs in developing countries, while others essentially only raise funds, supporting the work of other, more operational Members.  ChildFund Australia is in the former category – raising funds in Australia and also operating programs as “Lead Member” in Cambodia, Laos, Myanmar, Papua New Guinea, and Viet Nam, with other ChildFund Alliance members supporting our work financially.

During my six years with ChildFund Australia, I worked directly with those five Country Office teams, and in this article I want to introduce those teams.

*

ChildFund Australia’s first operational Country Office was established in 1985, in Port Moresby, Papua New Guinea.  Working in PNG is very challenging; it’s a context of great poverty, incredible cultural and ecological diversity, huge inequality, astonishingly high costs, and shocking levels of violence.  My admiration goes to the three ChildFund Country Directors that I worked with in PNG, and their staff, who did such great work in the most challenging environment I ever worked in: I can compare it only to Tuluá, Colombia in terms of complexity and challenge.  And PNG is more complex and challenging.

Warwick “Smokey” Dawson had been CD in PNG for a few years when I arrived.  A lanky, phlegmatic Australian with lots of experience in PNG, Smokey and his wife Jeannie had established a high degree of discipline for local staff, and were doing some good work.

Here Smokey is to my left, with Terina Stibbard (our Sydney-based International Program Coordinator, who worked with PNG) on the far right, rear:

IMG_1870.jpg

Photo Taken At A ChildFund PNG Staff Retreat, 2010

 

But when I arrived in Sydney in 2009, our program operations in PNG were at a fairly small scale, and were in fact drifting slowly downward.  We were underspending our budget, and struggling to meet the basic requirements of our fundraising (mostly sponsorship).  Operational costs were very high.  As a result, our program ratios (a proxy for efficiency) were too low, and there were some strong and persistent opinions in Sydney that we should close our work there.  But, at the same time, PNG was a place with extreme child poverty, so I definitely wanted to try to address whatever was holding us back before giving up.

But certainly I was in no position to make changes myself, both because it wasn’t my role to directly manage things overseas, and also because at that point I was mostly learning about the (major!) challenges of working in the Pacific.  It’s a very complex and challenging environment, and I was on a very steep learning curve!  So hats off to Smokey and the team.

But we were very lucky to have Terina Stibbard as International Program Coordinator for PNG, based in Sydney – her passion and drive would be invaluable in turning around our operations in Port Moresby.

So when Smokey Dawson left, returning to Australia, I thought that a good next step might be to hire a PNG national as Country Director.  Given the complexities of working in such a unique culture, surely somebody from PNG would be better able to navigate this world, this parallel universe?  And, given her long experience overseas, I thought that Terina would be the perfect “critical friend” for a Papua New Guinean Country Director…

So Terina and I interviewed a range of candidates, both expatriates and PNG nationals.  In the end, we were lucky to find Andrew Ikupu, a PNG national with a PhD from Adelaide.  We felt that Andrew would be able to manage across the wide differences between these two cultures – PNG and Australia.

Here are a couple of images of Andrew:

IMG_2645 copy.jpg

IMG_1172 copy.jpg

 

I liked Andrew a lot – he was very charismatic and smart.  Andrew was deeply immersed in his culture, and his long experience in Australia meant that he was also well able to bridge to other points of view.  He knew his country very well, and brought a unique combination of competencies to the role.  I learned a lot from him during the years we worked together.

On the other hand, I think that it was very challenging for Andrew to serve as our Country Director.  Firstly, having a decent salary and steady income in such a desperately poor, deeply tribal place meant that, as a PNG national, Andrew faced constant pressure to help people in his home area (his “Wantoks,” as they are called in PNG).  This pressure was financial (if a Wantok needed financial support, Andrew came under pressure to help), and also logistical (for example, if somebody in his home area was sick and needed to be evacuated, there were no formal alternatives; the only recourse people felt they had was to ask him to send a ChildFund vehicle, which he couldn’t do).  I think he resisted these pressures, at least mostly, but I also think it was a big challenge.  I’m not sure I did Andrew a favor by hiring him…

At the same time, Andrew was in poor health, having developed diabetes just before joining ChildFund.  This illness is endemic in the Pacific, a result (I think) of people who had been living very traditional lives, with very traditional diets, abruptly transitioning to unhealthy Western food.  The dramatic dietary shift to processed products, sugar, alcohol, etc., seemed to have a big, negative impact on people in PNG and across the Pacific, and certainly on Andrew.  The long-term consequences for the people of the Pacific are likely to be quite negative.

So I think we made a good choice, but at the same time serving as ChildFund’s CD in PNG was very stressful for Andrew Ikupu.  We were very lucky that, just as Smokey was leaving, we also recruited another gifted senior manager: Manish Joshi joined as Program Manager for PNG.  Andrew and Manish worked together for some time and, when Andrew stepped down*, Manish was appointed Country Director.

Here are Manish and Andrew during the time that they worked together:

IMG_1807.jpg

 

Manish came to ChildFund from India; he had recently worked in PNG as a United Nations Volunteer, supporting local government in Madang, so he was quite familiar with the country and the culture.  Manish did (and, as I write this article, still does) a truly outstanding job as Country Director, managing to address operational issues with a steady hand, and dramatically expanding the scale and scope of our program.

Early in Manish’s tenure, we made a very significant decision: we would stop raising funds through child sponsorship in PNG.  The associated costs of running sponsorship systems were way too high, and the complex and detailed nature of managing those systems in the chaotic context of PNG was too big a challenge for our (and any) team.  The whole system never seemed to work there.  But the flip side of this decision was that we would have to fund our programs exclusively through grants, which have their own serious complexities.  A very major challenge.

Manish and his team, with Terina’s support, succeeded brilliantly in this transition, exceeding all expectations.  One key gain along the way was the hiring of a very strong Program Manager, Aydel Salvadora, to replace Manish as he moved into the Country Director role.  Aydel served as a very competent “pair of hands,” freeing Manish to focus on the rest of our operation.

Amazingly, program activities increased steadily despite the closure of child sponsorship, and operations in general became more stable and functional.  Our program ratio improved rapidly, and the operation moved quickly towards financial sustainability.  Even more importantly, I think we were making an increasing impact on child poverty.

Few INGOs can claim as much success in this challenging environment.  Of course our local staff deserves much credit, and Terina Stibbard played a fundamental role across the tenures of Smokey, Andrew, and Manish, keeping our support from Sydney at very high levels, and investing her heart and soul into the challenge.

But the most important factor in our success was the appointment of Manish Joshi as Country Director.  During the years that we worked together, I spoke with Manish most weeks by Skype, and almost every week there would be a situation or two – inside ChildFund or (most of the time) outside in the PNG environment – that was somehow catastrophic, in a way that made Manish shake his head and worry.  We would talk about whatever it was and work through what we could do to minimize the impact on our world; Manish would chuckle about what an amazing place PNG was to work in, and then he’d get onto the issue in a relaxed but determined way.

Soon the voices in Sydney that had been insistently calling for the closing of our program in PNG became quiet.  Which made me feel very proud of our teams.

*

This photo was taken during my final visit to the Port Moresby Country Office, in late 2015.  Manish Joshi is fourth from the left, and Joe Pasen (our Development Effectiveness Manager) is to my left.  Other key staff in this photo include Aydel Salvadora (who followed Manish into the Program Manager role), and program leaders Olive Oa and Sharon Pondros:

IMG_5060.jpg

 

Nigel Spence, our CEO, is also in there (sixth from the right, in the back.)

And here is a photo from an early visit to PNG, with me on the left and Joe Pasen in the foreground:

IMG_1592.jpg

 

I’ve written about the villager Hillary, and shared a “Case Study” about his garden project, in an earlier blog article in this series.  Here Manish and I are visiting Hillary’s garden, with a ChildFund PNG staff member:

IMG_2591.jpg

IMG_1793

With Manish Joshi

IMG_4991 copy.jpg

Paul Brown (CEO Of ChildFund NZ), Manish Joshi, and Nigel Spence (CEO of ChildFund Australia)

 

Apologies to other staff members who I haven’t named – huge thanks to all of you!

It was a great pleasure working with Manish, who built ChildFund PNG into an important, high-performing organization in one of the world’s most challenging places.  Thank you Manish!  And warm appreciation goes to Terina Stibbard, who brought her formidable passion and energy to building ChildFund PNG.

*

ChildFund Australia’s second Country Office was established in 1997, in Viet Nam.   By coincidence, I became Plan’s Country Director in Hanoi at about the same time, and I can remember meeting ChildFund’s second and third Country Directors there.

I did, however, later work with the first ChildFund Viet Nam CD, during my time consulting with CCF – by then Daniel Wordsworth had moved to become Program Development Director in Richmond, Virginia, working with my old colleague Michelle Poulton.  I’m still not sure why I never met Daniel when we were both in Hanoi; perhaps it was his predilection for nocturnal working hours…

I’ve written more about Daniel in several earlier articles in this series…

*

By the time I joined ChildFund Australia, Peter Walton was finishing seven years as Country Director in Viet Nam.  He had done a great job there, and was ready to move on.

Here are three images of Peter (and others) from my first visit to ChildFund Viet Nam:

IMG_1656.jpg

Nigel Spence Is On The Far Left; Peter Walton And I Are In The Back

IMG_1637.jpg

Peter Walton (Back Left), Nigel Spence (Third From Left), Nguyen Ba Lieu (Third From Right)

IMG_1640.jpg

Peter Walton (Left), Nguyen Ba Lieu (Second From Right), Me (Right)

 

Peter’s departure was a challenge, partly because it came very soon after I joined ChildFund (I certainly didn’t want him to leave, at least not so soon after I arrived!)

But, in a sense it was good timing.  At the time, the Viet Nam operation was viewed by staff in Sydney as the model that our offices in PNG and Cambodia should emulate.  And it was also viewed by staff in Viet Nam as the model!  So the establishment of a program team in Sydney, with a mandate to lead program thinking, would be tricky for our team in Hanoi to handle…

So, although I would have preferred to keep Peter longer, his transition was an opportunity to assert the proper role of the Sydney International Program Team and the International Program Director.

*

Hiring Peter’s successor was my first major overseas recruitment in my role at ChildFund.  So I was very lucky to have already met Deborah Leaver in Sydney.

At that time, Deb was Program Manager at ActionAid Australia, one of the few other INGOs based in Sydney.  (Most are based in Melbourne or Canberra.)  When I was reaching out to meet colleagues in my early months, Deb was one of the most welcoming, inviting me to visit her office near Sydney University and spending over an hour with me.  I was impressed, during that visit, with Deb’s obvious drive, energy, experience and intelligence.  So I was very happy when, a few weeks later, she asked if she could put her name forward for the Viet Nam Country Director position.

My response was: “of course!”

Here is a photo from our first visit to Hanoi, where we traveled together so I could introduce her as our new Country Director:

IMG_2134.jpg

Deb Leaver, In The Center

 

Here is the Country Management Team in place when Deb arrived in Hanoi:

IMG_2185.jpg

Deb Leaver And The ChildFund Viet Nam Management Team

IMG_3398 (2).jpg

Another Image Of The Viet Nam Country Management Team, Plus Me

 

ChildFund Viet Nam expanded along several important dimensions during Deb’s tenure.  We grew into a new province, Cao Bang, on the northern border with China.  And our local website and communications work really moved forward.  Our visibility in the development community was greatly enhanced; we became one of the “go-to” agencies.

*

Several images of Nguyen Ba Lieu are included in the photos that I’ve shared above; he is on the far left in the image just above.  Lieu was our Program Manager in Hanoi, and was one of our very first employees, back in 1999 or so.  In fact, I can remember meeting Lieu when I was with Plan, ten years before I joined ChildFund!

Lieu was a vital part of our team in Viet Nam**, with a strong gift for working with local government partners (a complicated, and essential, aspect of working there.)  And he had a very agile and active mind, regularly creating interesting frameworks and concepts that were meant to guide our thinking and our work – not only for ChildFund Viet Nam.  This meant that he perhaps had to adjust the most when I arrived on the scene and the International Program Team came into being, with our mandate for leading program thinking; but he handled the change with his innate grace and humility.

*

Deb Leaver thrived in the CD role, and ended up staying for seven years before moving to Laos with another organization.  During that time she continued to build our program and enhanced the stature of ChildFund in the Vietnam development community.  And she started a family there in Hanoi.

LY9A6076.jpg

With Deb Leaver

IMG_3034.jpg

 

As I’ve hinted already in this article, one challenge Deb faced was that her arrival coincided with the establishment of the Sydney International Team, and my own new position as International Program Director.  During Peter Walton’s time, as I mentioned, ChildFund Viet Nam was seen as the leading Country Office, in effect leading program thinking for the overall agency.  My arrival meant that things would change – the Viet Nam team would now contribute to program thinking, of course, but would no longer act autonomously, no longer lead things.  And now there would be more space for other Country Offices, in PNG and Cambodia (and, later, in Laos and Myanmar) to contribute.

This was a tricky transition, and Deb worked hard to integrate ChildFund Viet Nam into the program-development efforts of the wider organization, under my leadership, while also maintaining the sense of agency and pride that had been built up during earlier years when the Viet Nam office essentially served as the program-development entity for ChildFund Australia.

I enjoyed working with the ChildFund Viet Nam team a great deal.  And it was great working with Deb: she was hardworking and very smart, with a wicked sense of humor.  It is a testament to her work with her senior management team that her successor came from inside ChildFund Viet Nam: Nguyen Bich Lien, who had overseen administrative aspects of our operation, became our Country Director when Deb moved to Laos after seven years in Hanoi.

Huge thanks to the whole ChildFund Viet Nam team, and to Deb Leaver.  It was great working with all of you!

*

ChildFund Australia’s third Country Office was in Cambodia, under the leadership of Carol Mortenson.  In 2009 this office, our newest program, was about a year old, working in Svay Rieng province, in the far east of the country on the border with Viet Nam.

As I mentioned last time, given the nature of Cambodian governance, Carol made the astute decision early in her tenure to, essentially, work through local government to implement projects.  As a result, ChildFund faced relatively few difficulties operating in Cambodia.  Other agencies, whose mandates were more explicitly focused on human rights advocacy or democratization, faced a much more challenging operating environment.

 

One key hire that Carol made early on was to recruit the gifted, inspirational Sophiep Chat as her Program Manager.  Sophiep brought a unique set of skills to his role, and was also of great help to our program-development efforts beyond Cambodia.  I always enjoyed working with Sophiep, one of the most talented NGO leaders I’ve worked with, and I learned a great deal from him – his contribution to our work in Cambodia, and to program development across the wider organization, was fundamental.

Later, two other gifted Cambodians joined Carol’s team:

  • Solin Chan became ChildFund Cambodia’s Development Effectiveness and Learning Manager, playing a key role in creating and implementing the overall ChildFund Australia Development Effectiveness Framework (DEF).  As with Sophiep, I learned a huge amount from Solin, a very smart and funny professional who worked closely with Richard Geeves in moving the DEF forward; he moved to work with UNICEF late in my time with ChildFund;
  • Oum Vongnarith, our Finance and Administration Manager, was another key member of our team.  Hardworking and determined, Oum made sure that operations were efficient and effective, and he never let us down.  I greatly enjoyed his sense of humor, and his dedication to our work was unrivaled.
IMG_0800.jpg

Solin, Carol, and Oum

 

During my time at ChildFund Australia, the Cambodia team grew into a second province (Kratie, north of Phnom Penh, on the Mekong) and slowly began to diversify our operational partnerships beyond local government.

*

Carol Mortenson left ChildFund Cambodia after a productive seven years – the same length of time that Peter Walton had spent with ChildFund in Viet Nam.  That seemed to me a good, long tenure, and a reasonable point at which to consider a leadership change.

After an extensive recruitment, we were lucky to bring Prashant Verma into ChildFund as Country Director.

LY9A6085.jpg

With Prashant Verma

 

Prashant came from Plan International in Cambodia, and brought with him probably the highest energy of any Country Director I’ve worked with.  His drive and commitment to our work were amazing to see, and on assuming his position he immediately began to identify areas where we could improve the effectiveness of our work.  We were fortunate to bring Prashant on board and, although I only worked with him for a short time, I learned a lot from him.  Prashant is one of the most innovative Country Directors I ever worked with, and was a perfect successor to Carol.

My deep thanks to both Carol and Prashant!

*

The fourth of our Country Offices to open was in Laos.  In his last few months with ChildFund, Peter Walton supervised initial research into our possible expansion into Laos, hiring a team of two consultants to carry out the work.  The outstanding work of those consultants, Chris Mastaglio and Keo Souvannaphoum, positioned us astutely to obtain a license to operate in the country.  They were then selected as (respectively) Country Director and Program Manager.

We were very lucky to bring Chris and Keo onboard as our first staff in Laos.  They knew the country and its development context very well: Chris had been working in Laos with other INGOs for some time, and Keo was a Lao citizen with an advanced degree from Duke University.

LY9A6075.jpg

With Chris Mastaglio

Personally, I found working with Chris and Keo to be a constant source of inspiration.  At first, Laos seemed a relaxed and uncomplicated place to work; but once we got going, the real situation revealed itself to be far more complex and challenging than we had initially perceived.  Chris and Keo knew this from the beginning, and positioned ChildFund in a very interesting space where we could make a lot of positive difference in the lives of children in Nonghet District while also subtly encouraging change in the deeper causes of poverty in the country.  This balancing act was very difficult and, in fact, most INGOs that tried to work in both areas were not successful.  It is a real testament to Chris’s and Keo’s hard work and keen insights that we were able to stay engaged, successfully and sustainably (though not without some very nerve-wracking moments), in both domains.

IMG_3309.jpg

Vientiane Staff In 2011

As a result, ChildFund became a leader in the INGO community, and our work in Nonghet flourished, making a big difference in the lives of some people who were facing great poverty.

*

Our first working area was Nonghet District of Xieng Khouang Province, in the northeast of the country, right on the border with Viet Nam.  It was a very good choice: the district is quite remote, and its population is mainly from the Hmong ethnic group, which had been somewhat excluded from Laos’s development process.

These images were taken in January of 2011; it was very cold in the winter!

IMG_3126.jpg

IMG_3074.jpg

Staff Dinner

*

Chris is a veteran rugby player from Newcastle, and before joining ChildFund he had been coaching the Lao women’s rugby team alongside his INGO work.  After a few years with ChildFund, Chris came up with a very innovative and fascinating project, supporting the development of female youth through rugby.  As the sport was new to Laos, it wasn’t “gendered,” so it could be used as a tool to develop leadership skills, conflict management, teamwork, psycho-social development, etc.

The project, later called “Pass It Back,” was very successful and later expanded into other countries in Asia.  Today Chris directs the Pass It Back program, which is now a partnership with World Rugby, Asia Rugby Federation, and Women Win.  When Chris moved over to concentrate (more than) full-time on Pass It Back, we recruited his successor, and I was delighted that Keo was the successful candidate!  So Keo became our second Country Director for Laos.

LY9A6095.jpg

With Keo

 

It was a huge pleasure and honor working with these two gifted professionals and their teams in Vientiane and Nonghet.  Hats off to both Chris and Keo, and their teams, truly.

*

The final Country Office to be established in my tenure with ChildFund was in Myanmar, initially headed up by the very gifted and smart Burmese Country Manager Win May Htway, and then by her successor Nini Htwe.  (After I left ChildFund, and Nini departed, Win May returned as Country Director.)

IMG_2385 - Version 2.jpg

With Win May

IMG_1105.jpg

ChildFund Myanmar Team In April 2013 (Maria Attard On The Left)

IMG_1256.jpg

Win May Inspecting Our First Country Office

 

This is an image of Win May (on the right) with Oum Vongnarith, our Finance Manager in ChildFund Cambodia.  Oum provided outstanding support to the Myanmar operation from Phnom Penh, with frequent visits to Yangon and Mandalay:

IMG_2387 - Version 2.jpg

IMG_0924.jpg

Our Myanmar Team in January, 2015

 

LY9A6090

With Nini

*

From the start, we hoped to work in a different way in Myanmar: Papua New Guinea is sui generis.  In Cambodia, Laos, and Viet Nam, the nature of the national political reality meant that we worked mostly, or exclusively, with and through local and regional governments.  This worked, but had obvious drawbacks.

Ideally, though, organizations like ChildFund Australia work with and through local civil society – in principle this is more efficient, builds the capability of local society over the long term, and is embedded within the actual reality of the country.  But working with local organizations requires a very different skill set from the one we in ChildFund had developed in our other Country Offices.  So we would have to learn by doing, and much would depend on our own local leadership – first and foremost, our Country Director.

So we were lucky to have recruited such experienced and dedicated local Burmese Country Directors.

IMG_2388.jpg

IMG_2390.jpg

Our Staff And Partners, Plus Maria Attard (Front Row), and Nigel Spence and Me (Back Row)

 

 

We approached this carefully, researching the development context in Myanmar and then preparing a proposed approach.  We prepared a document summarizing how we hoped to work in Myanmar and, since it implied such a large change in our model, once Nigel Spence was in agreement, we took it to our board of directors for discussion and approval.

When we had refined our proposed approach, and had approval from our board, we got going.  Maria Attard was the International Program Coordinator for Myanmar (and Viet Nam), and she managed the process of selecting our initial staff and first partners.  This went well: Win May was amongst our first employees, a brilliant choice of a courageous and passionate leader.  We selected our first partner organizations knowing full well that things would likely go very well with some, and not so well with others.

Based in and around Yangon, Mandalay, and Shwebo (north of Mandalay), our first partners included organizations focused on street children, early-childhood development, primary education, youth, etc.

IMG_1180 - Version 2

Early Childhood Development Project

*

We saw this initial period of work in Myanmar as “Phase 1” of a possible shift to working through local civil society instead of implementing ourselves.  We weren’t thinking that ChildFund Australia would make that shift everywhere, but that it would be an important tool in our repertoire.

Many INGOs had already made this shift, years before, so in a sense it was overdue for us.  On the other hand, most of the other major INGOs in Myanmar had begun to work in the country after Cyclone Nargis and, since the nature of the Myanmar government at that point was so oppressive, they had been forced to work directly, implementing projects with their own staff.  Even the INGOs that trumpeted their commitment to working in “partnership” with local civil society were not doing so, at least in Myanmar.  Because, in the interim, governance had changed (and improved) dramatically there, our relatively late arrival gave us the opportunity to try to work in the right way.

“Phase 1” was meant to be a learning period, in which we worked with a larger number of partners, across a wider range of sectors and geographies, enabling us to learn and refine our approach and, in “Phase 2,” focus our programs more tightly on a narrower set of sectors and fewer partners.

This approach served us well, though we had both successes and setbacks.  One partner went a bit off the rails and stopped cooperating.  Some project work wasn’t consistent with what ChildFund sought to achieve.  But most partners and projects went well.

But probably our biggest challenge was that our organizational systems and procedures were designed for direct implementation of projects.  At a deeper level, our culture and mental models all assumed operations delivered through our own staff.  Organizational ego is a problem in our development sector in general, but gets in the way even more significantly when working through partners.  And when our Myanmar staff began to interact with their peers in other ChildFund Australia countries, some sharing of experience actually ended up being quite unhelpful, because even within the same organization the operating models weren’t at all comparable.  These clashes of systems and assumptions caused ongoing irritants and glitches – normal as this kind of fundamental shift progresses.

Many thanks to Win May and Nini and their teams in Myanmar, and to Maria Attard, for their hard work and courage in moving forward with such a different model.  It was very hard work, and they blazed a trail for ChildFund Australia.  Thank you!

*

Myanmar was changing very quickly during the years that I was traveling there with ChildFund Australia.  The last military government seemed to be trying to reform, at least in some areas, and you could feel things loosening up.  Then free elections were held and the National League for Democracy came into power.  Optimism was in the air.  (Sadly, the giddy optimism of those early days now seems much tempered by ethnic conflict and a range of other setbacks…)

It was an interesting and positive time for the country, and visiting was fascinating.  One other reason that I enjoyed visiting Myanmar was that my own meditation practice is Burmese in origin, so I was able to connect with that part of the country’s tradition several times a year.

Here are a few images of the country taken during my visits:

 

IMG_4073-2

Shwedagon At Night

 

 

IMG_4052

Shwedagon

IMG_4059-2

Shwedagon

IMG_4049

Shwedagon

 

IMG_4086

Shwedagon

IMG_1277.jpg

 

 

IMG_4066

One Of My Favorite Places To Eat In Myanmar: “Feel” Restaurant

Spices

*

I would get together with our Country Directors, face-to-face, at least once each year.  Here is an image from an early get-together, with (from the left) Carol, Andrew, Chris, Deb, and me:

IMG_2645 copy 2.jpg

 

And here is an image of the ChildFund Australia Country Directors in place as I departed, in 2015: from the left, Keo (Laos), Nini (Myanmar), Manish (PNG), Deb (Viet Nam), Prashant (Cambodia), Chris (Laos, then Pass It Back), and me:

LY9A6066.jpg

Keo, Nini, Manish, Deb, Prashant, Chris, Me

*

My heartfelt appreciation goes to our teams in Cambodia, Laos, Myanmar, Papua New Guinea, and Viet Nam.  It was an honor and privilege working with all of you.  Thanks for your incredible hard work and commitment!

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owls’ Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change;
  35. Bondcliff (35) – ChildFund Australia’s Development Effectiveness System;
  36. West Bond (36) – “Case Studies” in ChildFund Australia’s Development Effectiveness System;
  37. Mt Bond (37) – Impact Assessment in ChildFund Australia’s Development Effectiveness System;
  38. Mt Waumbek (38) – “Building the Power of Poor People and Poor Children…”

 

*- Sadly, Andrew Ikupu died a few years after I left Australia.

**- Also very sadly, Lieu died a year or so before I left Australia.

 

West Bond (37) – Impact Assessment in ChildFund Australia’s Development Effectiveness Framework

June, 2018

International NGOs do their best to demonstrate the impact of their work, to be accountable, to learn and improve.  But it’s very challenging and complicated to measure change in social-justice work, and even harder to prove attribution.  At least, to do these things in affordable and participatory ways…

Twice at Plan International, earlier in my career, I had worked to develop and implement systems that would demonstrate impact – and both times we had failed.

In this article I want to describe how, in ChildFund Australia, we succeeded, and were able to build and implement a robust and participatory system for measuring and attributing impact in our work.

Call it the Holy Grail!

*

I’ve been writing a series of blog posts about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall.  And, each time, I’ve also been reflecting a bit on the journey since I joined Peace Corps, 33 years ago: on development, social justice, conflict, experiences along the way, etc.

So far, I’ve described climbing 36 of the 48 peaks, and covered my journey from Peace Corps in Ecuador (1984-86) through to my arrival in Sydney in 2009, where I joined ChildFund Australia as the first “International Program Director.”  This is my 37th post in the series.

In recent posts in this series I’ve been describing aspects of the ChildFund Australia “Development Effectiveness Framework” (“DEF”), the system that would help us make sure we were doing what we said we were going to do and, crucially, verify that we were making a difference in the lives of children and young people living in poverty.  So we could learn and improve our work…

There are three particular components of the overall DEF that I am detailing in more depth, because I think they were especially interesting and innovative.  In my previous blog I described how we used Case Studies to complement the more quantitative aspects of the system.  These Case Studies were qualitative narratives of the lived experience of people experiencing change related to ChildFund’s work, which we used to gain human insights, and to reconnect ourselves to the passions that brought us to the social-justice sector in the first place.

This time, I want to go into more depth on two final, interrelated components of the ChildFund Australia DEF: Outcome Indicator Surveys and Statements of Impact.  Together, these two components of the DEF enabled us to understand the impact that ChildFund Australia was making, consistent with our Theory of Change and organizational vision and mission.  Important stuff!

But first…

*

Last time I described climbing to the top of Mt Bond on 10 August 2017, after having gotten to the top of Bondcliff.  After Mt Bond, I continued on to West Bond (4540ft, 1384m), the last of three 4000-footers I would scale that day.  (But, since this was an up-and-back trip, I would climb Mt Bond and Bondcliff twice!  It would be a very long day.)

As I described last time, I had left the trail-head at Lincoln Woods Visitor Center just after 6:30am – an early start made possible by staying the night before at Hancock Campsite on the Kancamagus road, just outside of Lincoln, New Hampshire.  I had reached the top of Bondcliff at about 10:30am, and the summit of Mt Bond at about 11:30am.

Now I would continue to the top of West Bond, and then retrace my steps to Lincoln Woods:

Bond Map - 6c.jpeg

 

So, picking up the story from the top of Mt Bond, the Bondcliff Trail drops down fairly quickly, entering high-altitude forest, mostly pine and ferns.

IMG_1952.jpg

 

After 20 minutes I reached the junction with the spur trail that would take me to the top of West Bond.  I took a left turn here.  The spur trail continues through forest for some distance:

IMG_1955.jpg

IMG_1958.jpg

 

I reached the top of West Bond at 12:30pm and had lunch there.  The views were remarkable, and I was fortunate to be by myself, so I took my time at the summit.

IMG_1965 (1).jpg

Bondcliff From West Bond

IMG_1972.jpg

At The Summit Of West Bond.  Franconia Ridge And Mt Garfield In The Background.  A Bit Tired!

IMG_1984.jpg

Mt Bond, On The Left, And Bondcliff On The Right

 

Here are two spectacular videos from the top of West Bond.  The first simply shows Bondcliff, with the southern White Mountains in the background:

 

And this second video is more of a full panorama, looking across to Owl’s Head, Franconia Ridge, Garfield, the Twins, Zealand, and back:

 

Isn’t that spectacular?!

After eating lunch at the top of West Bond, I left at a bit before 1pm, and began to retrace my steps towards Lincoln Woods.  To get there, I had to re-climb Mt Bond and Bondcliff.

I reached the top of Mt Bond, for the second time, at 1:20pm.  The view down towards Bondcliff was great!:

IMG_1996.jpg

Bondcliff From The Top Of Mt Bond, Now Descending…

 

Here is a view from near the saddle between Mt Bond and Bondcliff, looking up at the latter:

IMG_2005.jpg

Looking Up At Bondcliff

 

As I passed over Bondcliff, at 2:15pm, I was slowing down, and my feet were starting to be quite sore.  I was beginning to dread the descent down Bondcliff, Wilderness, and Lincoln Woods Trails… it would be a long slog.

Here’s a view from there back up towards Mt Bond:

IMG_2007.jpg

A Glorious White Mountain Day – Mt Bond And West Bond, From Bondcliff

 

But there were still 8 or 9 miles to go!  And since I had declined the kind offer I had received to ferry my car up to Zealand trail-head, which would have saved me 3 miles, I had no other option but to walk back to Lincoln Woods.

It was nearly 5pm by the time I reached the junction of the Wilderness Trail and the Lincoln Woods Trail.  By that time, I was truly exhausted, and my feet were in great pain, but (as I said) I had no option but to continue to the car: no tent or sleeping bag, and no phone service here.

The Lincoln Woods Trail, as I’ve described in more detail elsewhere, is long and flat and wide, following the remnants of an old forest railway:

IMG_2024

IMG_2025

Sleepers From The Old Forestry Railway

 

Scratches from walking poles?

IMG_2026 (1).jpg

 

It was around 5:30 when I got to the intersection with the Franconia Brook Trail, which is the path towards Owl’s Head.

IMG_2028.jpg

IMG_2034.jpg

 

It was a very long slog down Lincoln Woods Trail – put one foot in front of the other, and repeat!  And repeat and repeat and repeat and repeat …

Finally, at 6:40pm, I reached the Lincoln Woods Visitor Center, where I had parked my car at 6:30am that morning – having climbed three 4000-footers, walked 22 miles, and battered my feet, all in just over 12 hours.

Looking back, I had accomplished a great deal, and the views from the tops of three of New Hampshire’s highest and most beautiful peaks were amazing.  But, at the time, I had little feeling of accomplishment!

IMG_2038 (1).jpg

Knackered!

 

*

Here is the diagram I’ve been using to describe the ChildFund Australia DEF:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework

 

In this article I want to describe two components of the DEF: #2, the Outcome Indicator Surveys; and #12, how we produced “Statements of Impact.”  Together, these two components enabled us to measure the impact of our work.

First, some terminology: as presented in an earlier blog article in this series, we had adopted fairly standard definitions of some related terms, consistent with the logical framework approach used in most mature INGOs:

Screen Shot 2018-05-28 at 2.16.30 PM

 

According to this way of defining things:

  • A Project is a set of Inputs (time, money, technology) producing a consistent set of Outputs (countable things delivered in a community);
  • A Program is a set of Projects producing a consistent set of Outcomes (measurable changes in human conditions related to the organization’s Theory of Change);
  • Impact is a set of Programs producing a consistent set of changes to Outcome Indicators as set forth in the organization’s Strategic Plan.

But that definition of “Impact,” though clear and correct, wasn’t nuanced enough for us to design a system to measure it.  More specifically, before figuring out how to measure “Impact,” we needed to grapple with two fundamental questions:

  • How “scientific” did we want to be in measuring impact?  In other words, were we going to build the infrastructure needed to run randomized control group trials, or would we simply measure change in our Outcome Indicators?  Or somewhere in between?;
  • How would we gather data about change in the communities where we worked?  A census, surveying everybody in a community, which would be relatively costly?  If not, what method for sampling would we use that would enable us to claim that our results were accurate (enough)?

*

The question “how ‘scientific’ did we want to be” when we assessed our impact was a fascinating one, getting right to the heart of the purpose of the DEF.  The “gold standard” at that time, in technical INGOs and academic institutions, was to devise “randomized control group” trials, in which you would: implement your intervention in some places, with some populations; identify ahead of time a comparable population that would serve as a “control group” where you would not implement that intervention; and then compare the two groups after the intervention had concluded.

For ChildFund Australia, we needed to decide whether we would invest in the capability to run randomized control group trials.  It seemed complex and expensive but, on the other hand, it would have the virtue of putting us at the forefront of the sector and, therefore, appealing to technical donors.

When we looked at other comparable INGOs, in Australia and beyond, there were a couple that had gone that direction.  When I spoke with my peers in some of those organizations, they were generally quite cautious about the randomized control trial (“RCT”) approach: though appealing in principle, in practice it was complex, requiring sophisticated technical staff to design and oversee the measurements, and to interpret results.  So RCTs were very expensive.  Because of the cost, people with practical experience in the matter recommended using RCTs, if at all, only for particular interventions that were either expensive or were of special interest for other reasons.

For ChildFund Australia, this didn’t seem suitable, mainly because we were designing a comprehensive system that we hoped would allow us to improve the effectiveness of our development practice, while also involving our local partners, authorities, and people in communities where we worked.  Incorporating RCTs into such a comprehensive system would be very expensive, and would not involve local people in any meaningful way.

The other option we considered, and ultimately adopted, hinged upon an operational definition of “Impact.”  Building on the general definition shown above (“Impact is a set of Programs producing a consistent set of changes to Outcome Indicators as set forth in the organization’s Strategic Plan”), operationally we decided that:

Screen Shot 2018-06-18 at 3.06.57 PM.png

 

In other words, we felt that ChildFund could claim that we had made a significant impact in the lives of children in a particular area if, and only if:

  1. There had been a significant, measured, positive change in a ChildFund Australia Outcome Indicator; and
  2. Local people (community members, organizations, and government staff) determined in a rigorous manner that ChildFund had contributed to a significant degree to that positive change.

Put more simply:

  • If there was no positive change in a ChildFund Australia Outcome Indicator over three years (see below for a discussion of why we chose three years), we would not be able to claim impact;
  • If there was a positive change in a ChildFund Australia Outcome Indicator over three years, and local people determined that we had contributed to that positive change, we would be able to claim impact.

(Of course, sometimes there might be a negative change in a ChildFund Australia Outcome Indicator – one that would have been worse if we hadn’t been working in the community.  We were able to handle that situation in practice, in community workshops.)
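The two-condition rule above can be sketched in code.  This is purely illustrative – the DEF was a management process, not software – and the function name, attribution levels, and the threshold of “some” are my own assumptions:

```python
# Illustrative sketch of the DEF's operational impact rule: claim impact only
# when a measured positive change in an Outcome Indicator coincides with a
# community-attributed contribution.  All names here are hypothetical.

def can_claim_impact(baseline: float, resurvey: float, attribution: str) -> bool:
    """Condition 1: the indicator improved between baseline and re-survey.
    Condition 2: local people attributed at least some of the change to us."""
    positive_change = resurvey > baseline
    contributed = attribution in {"some", "a lot", "completely"}
    return positive_change and contributed

# A measured improvement plus community-attributed contribution counts as impact:
assert can_claim_impact(40.0, 55.0, "a lot") is True
# Improvement without attribution, or attribution without improvement, does not:
assert can_claim_impact(40.0, 55.0, "none") is False
assert can_claim_impact(55.0, 40.0, "a lot") is False
```

Both conditions have to hold at once, which is what made the claim defensible.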

I felt that, if we approached measuring impact in this way, it would be “good enough” for us – perhaps not as academically robust as using RCT methods, but (if we did it right) certainly good enough for us to work with local people to make informed decisions, together, about improving the effectiveness of our work, and to make public claims about the impact of our work.

So that’s what we did!

*

As a reminder, soon after I had arrived in Sydney we had agreed a “Theory of Change” which enabled us to design a set of organization-wide Outcome Indicators.  These indicators, designed to measure the status of children related to our Theory of Change, were described in a previous article, and are listed here:

Screen Shot 2018-05-28 at 3.16.59 PMScreen Shot 2018-05-28 at 3.17.10 PM

 

These Outcome Indicators had been designed technically, and were therefore robust.  And they had been derived from the ChildFund Australia Vision, Mission, and Program Approach, so they measured changes that would be organically related to the claims we were making in the world.

So we needed to set up a system to measure these Outcome Indicators; this would become component #2 in the DEF (see Figure 1, above).  And we had to design a way for local partners, authorities, and (most importantly) people from the communities where we worked to assess changes to these Outcome Indicators and reach informed conclusions about who was responsible for causing the changes.

First, let me outline how we measured the ChildFund Australia Outcome Indicators.

*

Outcome Indicator Surveys (Component #2 in Figure 1, Above)

Because impact comes rather slowly, an initial, baseline survey was carried out in each location and then, three years later, another measurement was carried out.  A three-year gap was somewhat arbitrary: one year was too short, but five years seemed a bit long.  So we settled on three years!

Even though we had decided not to attempt to measure impact using complex randomized control trials, these survey exercises were still quite complicated, and we wanted the measurements to be reliable.  This was why we ended up hiring a “Development Effectiveness and Learning” (“DEL”) Manager in each Country Office – to support the overall implementation of the DEF and, in particular, to manage the Outcome Indicator Surveys.  The surveys were expensive and tricky to carry out, so, to contain costs, we usually hired students from local universities to do the actual surveying.

Then we needed to decide what kind of survey to carry out.  Given the number of people in the communities where we worked, we quickly determined that a “census,” that is, interviewing everybody, was not feasible.

So I contacted a colleague at the US Member of the ChildFund Alliance, who was an expert in this kind of statistical methodology.  She strongly advised me to use the survey method that they (the US ChildFund) were using, called “Lot Quality Assurance Sampling.”  LQAS seemed to be less expensive than other survey methodologies, and it was highly recommended by our expert colleague.
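For readers unfamiliar with LQAS, its appeal is that each survey area (a “lot”) needs only a small fixed sample – commonly 19 respondents – which is then classified against a coverage benchmark using a pre-tabulated decision rule.  The sketch below is a generic illustration of that classification step, not ChildFund’s actual tooling; the n = 19 sample and the decision rule of 13 are typical published values for an ~80% benchmark, used here as assumptions:

```python
# Illustrative LQAS lot-classification step.  In LQAS, a lot "reaches" its
# coverage benchmark if at least d of the n sampled respondents have the
# desired outcome, where d comes from standard LQAS decision-rule tables.

from math import comb

def misclassification_risk(n: int, d: int, true_coverage: float) -> float:
    """P(fewer than d successes in n draws): the chance that a lot whose true
    coverage equals `true_coverage` is classified as below the benchmark."""
    return sum(comb(n, k) * true_coverage**k * (1 - true_coverage)**(n - k)
               for k in range(d))

def reaches_benchmark(successes: int, decision_rule: int) -> bool:
    """Classify one lot against its benchmark."""
    return successes >= decision_rule

# With the common n = 19 sample and a decision rule of 13:
assert reaches_benchmark(successes=15, decision_rule=13) is True
assert reaches_benchmark(successes=10, decision_rule=13) is False
```

The binomial risk calculation shows why such small samples work: the decision rule is chosen so that lots well above or well below the benchmark are rarely misclassified, even at n = 19.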

(In many cases during this period, we relied on technical recommendations from ChildFund US.  They were much bigger than the Australian Member, with excellent technical staff, so this seemed logical and smart.  But, as with Plan International during my time there, the US ChildFund Member had very high staff turnover, which led to many changes in approach.  In practice this meant that, although ChildFund Australia had adopted several of the Outcome Indicators that ChildFund US was using, in the interests of commonality, and had begun to use LQAS for the same reason, the US Member was soon changing its Indicators and abandoning LQAS, because new staff felt it wasn’t the right approach.  This led to the US Member expressing some disagreement with how we, in Australia, were measuring Impact – even though we were following their (previous) recommendations!  Sigh.)

Our next step was to carry out baseline LQAS surveys in each field location.  It took time to accomplish this, as even the relatively simple LQAS approach was a more complex exercise than we were used to.  Surveys were supervised by the DEL Managers and usually carried out by students from local universities.  Finally, the DEL Managers prepared baseline reports summarizing the status of each of the ChildFund Australia Outcome Indicators.

Then we waited three years and repeated the same survey in each location.

(In an earlier article I described how Plan International, where I had worked for 15 years, had failed twice to implement a DEF-like system, at great expense.  One of the several mistakes that Plan had made was that they never held their system constant enough to be comparable over time.  In other words, in the intervening years after measuring a baseline, they tinkered with [“improved”] the system so much that the second measurement couldn’t be compared to the first one!  So it was all for naught, useless.  I was determined to avoid this mistake, so I was very reluctant to change our Outcome Indicators after they were set, in 2010; we did add a few Indicators as we deepened our understanding of our Theory of Change, but that didn’t get in the way of re-surveying the Indicators that we had started with, which didn’t change.)

Once the second LQAS survey was done, three years after the baseline, the DEL Manager would analyze differences and prepare a report, along with a translation of the report that could be shared with local communities, partners, and government staff.  The DEL Manager, at this point, did not attempt to attribute changes to any particular development actor (local government, other NGOs, the community themselves, ChildFund, etc.), but did share the results with the communities for validation.

That determination of impact, rather, was the job of the final DEF component I want to describe.

*

Statements of Impact (Component #12 in Figure 1, Above)

The most exciting part of this process was how we used the changes measured over three years in the Outcome Indicators to assess Impact (defined, as described above, as change plus attribution).

The heart of this process was a several-day-long workshop at which local people would review and discuss changes in the Outcome Indicators, and attribute the changes to different actors in the area.  In other words, if a particular indicator (say, the percentage of boys and girls between 12 and 16 years of age who had completed primary school) had changed significantly, people at the workshop would discuss why the change had occurred – had the local education department done something to cause the change?  Had ChildFund had an impact?  Other NGOs?  The local community members themselves?

Finally, people in the workshop would decide the level of ChildFund’s contribution to the change (“attribution”) on a five-point scale: none, a little, some, a lot, or completely.   This assessment, made by local people in an informed and considered way, would then serve as the basic content for a “Statement of Impact” that would be finalized by the DEL Manager together with his or her senior colleagues in-country, Sydney-based IPT staff and, finally, me.
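The five-point attribution vote can be illustrated with a simple tally.  This is a hypothetical sketch – the workshops used paper ballots and tabulation sheets – and the tie-breaking rule shown (favour the more conservative level) is my own assumption:

```python
# Illustrative tally of workshop attribution ballots.  Each participant votes
# one of five levels; the agreed attribution is the level with the most votes.

from collections import Counter

LEVELS = ["none", "a little", "some", "a lot", "completely"]

def agreed_attribution(ballots: list[str]) -> str:
    """Return the modal attribution level; ties go to the lower level."""
    counts = Counter(ballots)
    # Sort by (votes descending, scale position ascending) so that a tie
    # resolves to the more conservative claim.
    return min(counts, key=lambda lvl: (-counts[lvl], LEVELS.index(lvl)))

votes = ["some", "a lot", "a lot", "some", "a lot", "none"]
assert agreed_attribution(votes) == "a lot"
```

Breaking ties conservatively keeps any public impact claim on the cautious side of what the community actually said.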

*

We carried out the very first of these “Impact” workshops in Svay Rieng, Cambodia, in February 2014.  Because this was the first of these important workshops, DEL Managers from Laos and Viet Nam attended, to learn, along with three of us from Sydney.

Here are some images of the ChildFund team as we gathered and prepared for the workshop in Svay Rieng:

IMG_2151

IMG_2169

IMG_2202

 

Here are images of the workshop.  First, I’m opening the session:

IMG_8605

 

Lots of group discussion:

IMG_8758

 

The DEL Manager in Cambodia, Chan Solin, prepared a summary booklet for each participant in the workshop.  These booklets were a challenge to prepare, because they would be used by local government, partners, and community members; but Solin did an outstanding job.  (He also prepared the overall workshop, with Richard Geeves, and managed proceedings very capably.)  The booklet presented the results of the re-survey of the Outcome Indicators as compared with the baseline:

IMG_8817

IMG_8795

 

Here participants are discussing results, and attribution to different organizations that had worked in Svay Rieng District over the three years:

IMG_9612

 

Subgroups would then present their discussions and recommendations for attribution.  Note the headphones – since this was our first Impact Workshop, and ChildFund staff were attending from Laos, Viet Nam, and Australia in addition to Cambodia, we provided simultaneous translation:

IMG_9694

 

Here changes in several Outcome Indicators over the three years (in blue and red) can be seen.  The speaker is describing subgroup deliberations on attribution of impact to the plenary group:

IMG_9703

IMG_9719

IMG_9699

IMG_9701

IMG_9747

IMG_9728

IMG_9763

 

Finally, a vote was taken to agree the attribution of positive changes to Outcome Indicators.  Participants voted according to their sense of ChildFund’s contribution to the change: none, a little, some, a lot, or completely.  Here is a ballot and a tabulation sheet:

IMG_9790

 

Finally, here is an image of the participants in that first Statement of Impact workshop: local community members, government staff, and ChildFund staff (from the local area, the Country Office, Sydney, and neighboring Viet Nam):

IMG_2299

 

*

Once the community workshops were finished, our local Senior Management would review the findings and propose adjustments to our work.  Then the DEL Managers would prepare a final report, which we described as “Statements of Impact.”

Generally speaking, these reports would include:

  • An introduction from the Country Director;
  • A description of the location where the Statement of Impact was produced, and a summary of work that ChildFund had done there;
  • An outline of how the report was produced, noting the three-year gap between baseline and repeat survey;
  • Findings agreed by the community regarding changes to each Outcome Indicator along with any attribution of positive change to ChildFund Australia;
  • Concluding comments and a plan of action for improvement, agreed by the local Country Office team and myself.

Examples of these reports are shared below.

*

This process took some time to get going, because of the three-year delay to allow for re-surveying, but once it commenced it was very exciting.  Seeing the “Statement of Impact” reports come through to Sydney, in draft, from different program countries, was incredible.  They showed, conclusively, that ChildFund was really making a difference in the lives of children, in ways that were consistent with our Theory of Change.

Importantly, they were credible, at least to me, because they showed some areas where we were not making a difference, either because we had chosen not to work in a particular domain (to focus on higher priorities) or because we needed to improve our work.

*

I’m able to share four ChildFund Australia Statements of Impact, downloaded recently from the organization’s website.  These were produced as described in this blog article:

*

Here are a few of the findings from that first “Statement of Impact” in Svay Chrum:

  • ChildFund made a major contribution to the increase in primary-school completion in the district:

Screen Shot 2018-06-27 at 8.49.40 AM.png

 

  • Although the understanding of diarrhea management had improved dramatically, it was concluded that ChildFund had not contributed to this, because we hadn’t implemented any related projects.  “Many development actors contributed to the change”:

Screen Shot 2018-06-27 at 8.52.47 AM.png

 

  • ChildFund had a major responsibility for the improvement in access to hygienic toilets in the district:

Screen Shot 2018-06-27 at 8.49.54 AM.png

 

  • ChildFund made a significant contribution to the increase in access to improved, affordable water in the district:

Screen Shot 2018-06-27 at 8.54.41 AM.png

 

  • ChildFund had made a major contribution to large increases in the percentage of children and youth who reported having opportunities to voice their opinions:

Screen Shot 2018-06-27 at 8.56.08 AM.png

  • Although the percentage of women of child-bearing age in the district who knew how to prevent HIV infection had increased, it was determined that ChildFund had made only a minor contribution to this improvement.  And the group made recommendations regarding youth knowledge, which had actually declined:

Screen Shot 2018-06-27 at 8.57.47 AM.png

 

To me, this is fantastic stuff, especially given that the results emerged from deep and informed consultations with the community, local partners, and local authorities.  Really, this was the Holy Grail – accountability, and lots of opportunity for learning.  The results were credible to me, because they seemed to reflect the reality of what ChildFund had worked on, and pointed out areas where we needed to improve; the report wasn’t all positive!

*

For me, the way that the Outcome Indicator Surveys and Statements of Impact worked was a big step forward, and a major accomplishment.  ChildFund Australia now had a robust and participatory way of assessing impact so that we could take steps to confidently improve our work.  With these last two components of the DEF coming online, we had managed to put in place a comprehensive development-effectiveness system, the kind of system that we had not been able to implement in Plan.

As I shared the DEF – its design, the documents and reports it produced – with our teams, partners, the Australian government, and donors, I began to get lots of positive feedback.   At least for its time, in Australia, the ChildFund Australia DEF was the most comprehensive, robust, participatory, and useful system of its kind that anyone had put into place.  Not the most scientific, perhaps, but something much better: usable, useful, and empowering.

*

My congratulations and thanks to the people who played central roles in creating, implementing, and supporting the DEF:

  • In Sydney: Richard Geeves and Rouena Getigan;
  • And the DEL Managers in our Country Offices: Chan Solin (Cambodia), Joe Pasen (PNG), Marieke Charlet (Laos), and Luu Ngoc Thuy and Bui Van Dung (Viet Nam).

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owls’ Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change;
  35. Bondcliff (35) – ChildFund Australia’s Development Effectiveness System;
  36. Mt Bond (36) – “Case Studies” in ChildFund Australia’s Development Effectiveness System.

 

 

Mt Bond (36) – “Case Studies” In ChildFund Australia’s Development Effectiveness Framework

June, 2018

I’ve been writing a series of blog posts about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall.  And, each time, I’ve also been reflecting a bit on the journey since I joined Peace Corps, 33 years ago: on development, social justice, conflict, experiences along the way, etc.

So far, I’ve described climbing 35 of the 48 peaks, and covered my journey from Peace Corps in Ecuador (1984-86) through to my arrival in Sydney in 2009, where I joined ChildFund Australia as the first “International Program Director.”

Last time I described the ChildFund Australia “Development Effectiveness Framework,” the system that would help us make sure we were doing what we said we were going to do and, crucially, verify that we were making a difference in the lives of children and young people living in poverty.  So we could learn and improve our work…

This time, I want to go into more depth on one component of the DEF, the “Case Studies” that described the lived experience of people that we worked with.  Next time, I’ll describe how we measured the impact of our work.

But first…

*

On 10 August, 2017, I climbed three 4000-footers in one very long day: Bondcliff (4265ft, 1300m), Mt Bond (4698ft, 1432m), and West Bond (4540ft, 1384m).  This was a tough day, covering 22 miles and climbing three very big mountains.  At the end of the hike, I felt like I was going to lose the toenails on both big toes (which, in fact, I did!) … it was a bit much!

Last time I wrote about climbing to the top of Bondcliff, the first summit of that day.  This time, I will describe the brief walk from there to the top of Mt Bond, the tallest of the three Bonds.  And next time I’ll finish describing that day, with the ascent of West Bond and the return to the trail-head at Lincoln Woods.

*

As I described last time, I arrived at the top of Bondcliff at about 10:30am, having left the trail-head at Lincoln Woods Visitor Center just after 6:30am.  I was able to get an early start because I had stayed the night before at Hancock Campsite on the Kancamagus road, just outside of Lincoln, New Hampshire.

It was a bright and mostly-sunny day, with just a few clouds and some haze.  The path between Bondcliff and Mt Bond is quite short – really just dropping down to a saddle, and then back up again, only 1.2 miles:

Bond Map - 6b

 

It took me about an hour to cover that distance and reach the top of Mt Bond from Bondcliff at 11:30am.  The path was rocky as it descended from Bondcliff, in the alpine zone, with many large boulders as I began to go back up towards Mt Bond – some scrambling required.

This photo was taken at the saddle between Bondcliff and Mt Bond: on the left is Bondcliff, on the right is West Bond, and in the middle, in the distance, is Franconia Ridge; Mt Bond is behind me.  A glorious view on an amazing day for climbing:

IMG_1929.jpg

From the Left: Bondcliff, Franconia Ridge, West Bond

 

It got even steeper climbing up from the saddle to the summit, passing through some small pine shrubs, until just before the top.

The views were spectacular at the summit of Mt Bond, despite the sky being slightly hazy – I could see the four 4000-footers of the Franconia Ridge to the west and Owl’s Head in the foreground, the Presidential Range to the east, and several other 4000-footers to the south and south-west:

IMG_1948 (1)

Looking To The West From The Summit Of Mt Bond

 

And I had a nice view back down the short path from the top of Bondcliff:

IMG_1943 (1)

 

There were a few people at the top, and I had a brief conversation with a couple who were walking from the Zealand trailhead across the same three mountains I was climbing, finishing at Lincoln Woods.  This one-way version of my out-and-back route was possible because they had left a car at Lincoln Woods before driving to the Zealand trailhead in a second vehicle; they would then ferry themselves back to Zealand from Lincoln Woods.

Kindly, they offered to pick up my car down at Lincoln Woods and drive it to Zealand, which would have saved me three miles.  I should have accepted, because finishing what became 22 miles, and three 4000-foot peaks, would end up hobbling me for a while, and causing two toenails to come off!  But I didn’t have a clear sense of how the day would go, so I declined their offer, with sincere thanks…

Getting to the top of Mt Bond was my 36th 4000-footer – just 12 more to go!

I didn’t stay too long at the top of Mt Bond on the way up, continuing towards West Bond… stay tuned for that next time!

*

Jean and I had moved to Sydney in July of 2009, where I would take up the newly-created position of International Program Director for ChildFund Australia.  It was an exciting opportunity for me to work in a part of the world I knew and loved (Southeast Asia: Cambodia, Laos, Myanmar and Viet Nam) and in a challenging new country (Papua New Guinea).  It was a great chance to work with some really amazing people – in Sydney and in our Country Offices… to use what I had learned to help build and lead effective teams.  Living in Sydney would not be a hardship post, either!  Finally, it was a priceless chance for me to put together a program approach that incorporated everything I had learned to that point, over 25 years working in poverty reduction and social justice.

In the previous article in this series, I described how we developed a “Development Effectiveness Framework” (“DEF”) for ChildFund Australia, and I went through most of the components of the DEF in great detail.

My ambition for the DEF was to bring together our work into one comprehensive system – building on our Theory of Change and organizational Vision and Mission, creating a consistent set of tools and processes for program design and assessment, and making sure to close the loop with defined opportunities for learning, reflection, and improvement.

Here is the graphic that we used to describe the system:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework (2014)

 

As I said last time, I felt that three components of the DEF were particularly innovative, and worth exploring in more detail in separate blog articles:

  • I will describe components #2 (“Outcome Indicator Surveys”) and #12 (“Statements of Impact”) in my next article.  Together, these components of the DEF were meant to enable us to measure the impact of our work in a robust, participatory way, so that we could learn and improve;
  • this time, I want to explore component #3 of the DEF: “Case Studies.”

*

It might seem strange to say it this way, but the “Case Studies” were probably my favorite of all the components of the DEF!  I loved them because they offered direct, personal accounts of the impact of projects and programs from children, youth, men and women in the communities where ChildFund worked, and from the staff and officials of local agencies and government offices with whom ChildFund partnered.  We didn’t claim that the Case Studies were random or representative samples; rather, their value was simply as stories of human experience, offering insights that would not have been readily gained from quantitative data.

Why was this important?  Why did it appeal to me so much?

*

Over my years working with international NGOs, I had become uneasy with the trend towards exclusive reliance on linear logic and quantitative measurement in our international development sector.  This is perhaps a little ironic, since I had joined the NGO world having been educated as an engineer, schooled in applying scientific logic and numerical analysis to practical problems in the world.

Linear logic is important, because it introduces rigor in our thinking, something that had been weak or lacking when I joined the sector in the mid-1980s.  And quantitative measurement, likewise, forced us to face evidence of what we had or had not achieved. So both of these trends were positive…

But I had come to appreciate that human development was far more complex than building a water system (for example), much more complicated than we could fully capture in linear models.  Yes, a logical, data-driven approach was helpful in many ways, perhaps nearly all of the time, but it didn’t seem to fit every situation in communities that I came to know in Latin America, Africa, and Asia.  In fact, I began to see that an over-emphasis on linear approaches to human development was blinding us to ways that more qualitative, non-linear thinking could help; we seemed to be dismissing the qualitative, narrative insights that should also have been at the heart of our reflections.  No reason not to include both quantitative and qualitative measures.  But we weren’t.

My career in international development began at a time when private-sector business culture started to influence our organizations in a big way: in the wake of the Ethiopian famine of the mid-1980s, INGOs were booming and, as a result, were professionalizing, bringing in people from the business world to help “professionalize” our work.

I’ve written elsewhere about the positive and negative effects that business culture had on NGOs: on the positive side, we benefited from systems and approaches that improved the internal management of our agencies, such as clear delegations of authority, financial planning and audit, etc.  Overall, it was a very good, and very necessary, evolution.

But there were some negatives.  In particular, the influx of private-sector culture into our organizations meant that:

  • We began increasingly to view the world as a linear, logical place;
  • We came to embrace the belief that bigger is always better;
  • “Accountability” to donors became so fundamental that sometimes it seemed to be our highest priority;
  • Our understanding of human nature, of human poverty, evolved towards the purely material, things that we could measure quantitatively.

I will attach a copy of the article I wrote on this topic here:  mcpeak-trojan-horse.

In effect, this cultural shift had the effect of emphasizing linear logic and quantitative measures to such a degree, with such force, that narrative, qualitative approaches were sidelined as, somehow, not business-like enough.

As I thought about the overall design of the DEF, I wanted to make 100% sure that we were able to measure the quantitative side of our work, the concrete outputs that we produced and the measurable impact that we achieved (more on that next time): the great majority of our work was amenable to that form of measurement, and being accountable for delivering the outputs (projects, funding) that we had promised was hugely important.

But I was equally determined that we would include qualitative elements that would enable us to capture the lived experience of people facing poverty.  In other words, because poverty is experienced holistically by people, including children, in ways that can be captured both quantitatively and qualitatively, we needed to incorporate both kinds of measurement if we were to be truly effective.

The DEF “Case Studies” were one of the ways that we accomplished this goal.  It made me proud that we were successful in this regard.

*

There was another reason that I felt that the DEF Case Studies were so valuable, perhaps just as important as the way that they enabled us to measure poverty more holistically.  Observing our organizations, and seeing my own response to how we were evolving, I clearly saw that the influence of private-sector, business culture was having positive and negative effects.

One of the most negative impacts I saw was an increasing alienation of our people from the basic motivations that had led them to join the NGO sector, a decline in the passion for social justice that had characterized us.  Not to exaggerate, but it seemed that we were losing our human connection with the hope and courage and justice that, when we were successful, we helped make possible for individual women and men, girls and boys.  The difference we were making in the lives of individual human beings was becoming obscured behind the statistics we used, behind the mechanical approaches we were taking to our work.

Therefore, I was determined to use the DEF Case Studies as tools for reconnecting us, ChildFund Australia staff and board, to the reason that we joined in the first place.  All of us.

*

So, what were the DEF Case Studies, and how were they produced and used?

In practice, Development Effectiveness and Learning Managers in ChildFund’s program countries worked with other program staff and partners to write up Case Studies that depicted the lived experience of people involved in activities supported by ChildFund.  The Case Studies were presented as narratives, with photos, which sought to capture the experiences, opinions and ideas of the people concerned, in their own words, without commentary.  They were not edited to fit a success-story format.  As time went by, our Country teams started to add a summary of their reflections to the Case Studies, describing their own responses to the stories told there.

Initially we found that field staff had a hard time grasping the idea, because they were so used to reporting their work in the dry, linear, quantitative ways that we had become used to.  Perhaps program staff felt that narrative reports were the territory of our Communications teams, meant for public-relations purposes, describing our successes in a way that could attract support for our work.  Nothing wrong with that, they seemed to feel, but not a program thing!

Staff seemed at a loss, unable to get going.  So we prepared a very structured template for the Case Studies, specifying length and tone and approach in detail.  This was a mistake, because we really wanted to encourage creativity while keeping the documents brief; emphasizing the “voice” of people in communities rather than our own views; covering failures as much as successes.  The template tended to lock our program staff into a structured view of our work, so once staff became more comfortable with the approach and we had gained some experience using these Case Studies, we abandoned the rigid template and encouraged innovation.

*

So these Case Studies were a primary source of qualitative information on the successes and failures of ChildFund Australia’s work, offering insights from children, youth and adults from communities where we worked and the staff of local agencies and government offices with whom ChildFund Australia partnered.

In-country staff reviewed the Case Studies, accepting or contesting the opinions of informants about ChildFund Australia’s projects.  These debates often led to adjustments to existing projects, but they also triggered new thinking – at the project-activity level, at the program level, and sometimes even in our overall program approach.

Case Studies were forwarded to Sydney, where they were reviewed by the DEF Manager; some were selected for a similar process of review by International Program staff, members of the Program Review Committee and, on occasion, by the ChildFund Australia Board.

The resulting documents were stored in a simple cloud-based archive, accessible by password to anyone within the organization.  Some Case Studies were also included on ChildFund Australia’s website; we encouraged staff from our Communications team in Sydney to review the Case Studies and, if suitable, to re-purpose them for public purposes.  Of course, we were careful to obtain informed consent from people included in the documents.
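To make the shape of that archive concrete, here is a rough sketch only – all class names, fields, and entries below are hypothetical, invented for illustration, not ChildFund’s actual system – of how a bank of Case Studies searchable by country and theme might be organized:

```python
from dataclasses import dataclass, field


@dataclass
class CaseStudy:
    """One qualitative Case Study record (fields are illustrative only)."""
    title: str
    country: str
    themes: set = field(default_factory=set)


class CaseStudyBank:
    """A minimal in-memory index, searchable by country and/or theme."""

    def __init__(self):
        self._studies = []

    def add(self, study):
        self._studies.append(study)

    def search(self, country=None, theme=None):
        # Apply whichever filters were supplied; omit both to list everything.
        hits = self._studies
        if country is not None:
            hits = [s for s in hits if s.country == country]
        if theme is not None:
            hits = [s for s in hits if theme in s.themes]
        return hits


# Hypothetical entries, loosely echoing studies mentioned in this article:
bank = CaseStudyBank()
bank.add(CaseStudy("Vegetable farming", "Papua New Guinea",
                   {"livelihoods", "nutrition"}))
bank.add(CaseStudy("Ethnic girls' education", "Laos", {"education", "gender"}))
bank.add(CaseStudy("Sponsorship and programs", "Cambodia", {"sponsorship"}))

lao_education = bank.search(country="Laos", theme="education")
```

At the scale involved – dozens of documents, not thousands – a simple linear scan like this is entirely adequate; the value of such an index lies in the consistent country and theme tagging, not in sophisticated search machinery.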

*

Through Case Studies, as noted above, local informants were able to pass critical judgement on the appropriateness of ChildFund’s strategies, show us how community members perceived our aims and purposes (not necessarily as we intended), and alert us to unexpected consequences (both positive and negative) of what we did.

For example, one of the first Case Studies written up in Papua New Guinea revealed that home garden vegetable cultivation not only resulted in increased family income for the villager concerned (and positive impact on children in terms of nutrition and education), it also enhanced his social standing through increasing his capacity to contribute to traditional cultural events.

Here are three images from that Case Study:

Screen Shot 2018-06-09 at 3.07.54 PM

Screen Shot 2018-06-09 at 3.07.27 PM

Screen Shot 2018-06-09 at 3.07.41 PM

 

And here is a copy of the Case Study itself:  PNG Case Study #1 Hillary Vegetable farming RG edit 260111.  Later I was able to visit Hillary at his farm!

Another Case Study came from the ChildFund Connect project, an exciting effort led by my former colleagues Raúl Caceres and Kelly Royds, who relocated from Sydney to Boston in 2016.  I climbed Mt Moriah with them in July, 2017, and also Mt Pierce and Mt Eisenhower in August of 2016.  ChildFund Connect was an innovative project that linked children across Laos, Viet Nam, Australia and Sri Lanka, providing a direct channel for them to build understanding of each other’s differing realities.  This Case Study on their project came from Laos: LAO Case Study #3 Connect DRAFT 2012.

In a future article in this series, I plan on describing work we carried out building the power (collective action) of people living in poverty.  It can be a sensitive topic, particularly in areas of Southeast Asia without traditions of citizen engagement.  Here is a Case Study from Viet Nam describing how ChildFund helped local citizens connect productively with authorities to resolve issues related to access to potable water: VTM Case Study #21 Policy and exclusion (watsan)-FINAL.

*

Dozens of Case Studies were produced, illustrating a wide range of experiences with the development processes supported by ChildFund in all of the countries where we managed program implementation.  Reflections from many of these documents helped us improve our development practice, and at the same time helped us stay in touch with the deeper purpose of our having chosen to work to promote social justice, accompanying people living in poverty as they built better futures.

*

A few of the DEF Case Studies focused, to some extent, on ChildFund Australia itself.  For example, here is the story of three generations of Hmong women in Nonghet District in Xieng Khouang Province in Laos.  It describes how access to education has evolved across those generations:  LAO Case Study #5 Ethnic Girls DRAFT 2012.  It’s a powerful description of change and progress, notable also because one of the women featured in the Case Study was a ChildFund employee, along with her mother and daughter!

Two other influential Case Studies came from Cambodia, both of which touched on how ChildFund was attempting to reconcile our child-sponsorship mechanisms with our programmatic commitments.  I’ve written separately, some time ago, about the advantages of child sponsorship when it is managed well (as we did in Plan and especially in ChildFund Australia), and these two Case Studies evocatively illustrated the challenge, and the ways that staff in Cambodia were making it all work well.

One Case Study describes some of the tensions implicit in the relationship between child sponsorship and programming, and the ways that we were making progress in reconciling these differing priorities: CAM Case Study 6 Sponsorship DRAFT 2012.  This Case Study was very influential, with our staff in Cambodia and beyond, with program staff in Sydney, and with our board.  It powerfully communicated a reality that our staff, and families in communities, were facing.

A second Case Study discussed how sponsorship and programs were successfully integrated in the field in Cambodia: CAM Case Study #10 Program-SR Integration Final.

*

As I mentioned last time, given the importance of the system, relying on our feeling that the DEF was a great success wasn’t good enough.  So we commissioned two independent, external expert reviews of the DEF.

The first review (attached here: External DEF Review – November 2012), which was concluded in November of 2012, took place before we had fully implemented the system.  In particular, since Outcome Indicator Surveys and Statements of Impact (to be covered in my next blog article) were implemented only after three years (and every three years thereafter), we had not yet reached that stage.  But we certainly were quite advanced in the implementation of most of the DEF, so it was a good time to reflect on how it was going.

I included an overview of the conclusions reached by both reviewers last time.  Here I want to quote from the first evaluation, with particular reference to the DEF Case Studies:

One of the primary benefits of the DEF is that it equips ChildFund Australia with an increased quantity and quality of evidence-based information for communications with key stakeholders including the Board and a public audience. In particular, there is consolidated output data that can be easily accessed by the communications team; there is now a bank of high quality Case Studies that can be drawn on for communication and reflection; and there are now dedicated resources in-country who have been trained and are required to generate information that has potential for communications purposes. The increase in quantity and quality of information equips ChildFund Australia to communicate with a wide range of stakeholders.

One of the strengths of the DEF recognized by in-country staff particularly is that the DEF provides a basis for stakeholders to share their perspectives. Stakeholders are involved in identifying benefits and their perspectives are heard through Case Studies. This has already provided a rich source of information that has prompted reflection by in-country teams, the Sydney based programs team and the ChildFund Australia Board.

This focus on building tools, systems and the overall capacity of the organization places ChildFund Australia in a strong position to tackle a second phase of the DEF which looks at how the organization will use performance information for learning and development. It has already started on this journey, with various parts of the organization using Case Studies for reflection. ChildFund Australia has already undertaken an exercise of coding the bank of Case Studies to assist further analysis and learning. There is lots of scope for next steps with this bank of Case Studies, including thematic reflections. Again, the benefits of this aspect have not been realised yet as the first stages of the DEF roll-out have been focused on data collection and embedding the system in CF practices.

In most Country Offices, Case Studies have provided a new formal opportunity for country program staff to reflect on their work and this has been used as a really constructive process. The Laos Country Office is currently in the process of translating Case Studies so that they can be used to prompt discussion and learning at the country level. In PNG, the team is also interested in using the Case Studies as a communication tool with local communities to demonstrate some of the achievements of ChildFund Australia programs.

In some cases, program staff have found Case Studies confronting when they have highlighted program challenges or weaknesses. The culture of critical reflection may take time to embed in some country offices and may be facilitated by cross-country reflection opportunities. Currently, however, Country Office staff do not know how to access Case Studies from other country programs. ChildFund Australia is exploring how the ‘bank’ of DEF Case Studies would be most accessible and useful to country office personnel.

One of the uses of Case Studies has been as a prompt for discussion and reflection by the programs team in Sydney and by the Board. Case Studies have been seen as a really useful way to provide an insight into a program, practice and ChildFund Australia achievements.

At an organizational level, an indexing and cross-referencing system has been implemented which enables Case Studies to be searched by country and by theme. The system is yet to be introduced to MEL and Program users, but has potential to be a very useful bank of qualitative data for reflection and learning. It also provides a bank of data from which to undertake thematic reflections across and between countries. One idea for consideration is that ChildFund draw on groups of Case Studies to develop practice notes.

In general Case Studies are considered to be the most ‘successful’ part of the DEF by those involved in collecting information.

The second reviewer concentrated on other components, mainly aspects I will describe in more detail in my next article, not so much the Case Studies…

*

So the Case Studies were a very important element in the overall DEF.  I tried very hard to incorporate brief reflections on selected Case Studies at every formal meeting of the International Program Team, of ChildFund Australia’s Program Review Committee, and (less frequently) at meetings of our Board of Directors.  More often than not, time pressures on the agendas of these meetings led to us dropping the Case Studies from discussion, but often enough we did spend time (usually at the beginning of the meetings) reflecting on what we saw in them.

At the beginning, when we first began to use the Case Studies, our discussion tended to be mechanical: pointing out errors in the use of English, or questioning how valid the observations might be, challenging the statistical reliability of the conclusions.  But, over time, I noticed that our teams began to use the Case Studies as they were designed: to gain insight into the lived experience of particular human beings, and to reconnect with the realities of people’s struggle for better lives for themselves and their children.

This was a great success, and really worked as I had hoped.  The Case Studies complemented the more rigorous, quantitative components of the DEF, helping the system be holistic, enabling us to see more deeply into the effect that our work was having while also enhancing our accountability.

*

Next time, I will describe getting to the top of West Bond, and all the way down the 11 miles from there to the Lincoln Woods parking lot, where I staggered back to my car with such damage to my feet that I would soon lose the toenails on both my big toes!  And I will share details of the final two components of the DEF that I want to highlight: the Outcome Indicator Surveys and Statements of Impact, which were probably the culmination of the whole system.

So, stay tuned!

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owl’s Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change;
  35. Bondcliff (35) – ChildFund Australia’s Development Effectiveness System.

 

 

Bondcliff (35) – ChildFund Australia’s Development Effectiveness Framework

June, 2018

I began a new journey just over two years ago, in May, 2016, tracing two long arcs in my life:

  • During those two years, I’ve been climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers. I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Alongside descriptions of those climbs, I’ve been sharing what it was like working in international development during the MDG era: as it boomed and evolved, from the response to the Ethiopian crisis in the mid-1980s through to the conclusion of the Millennium Development Goals in 2015.

In each article, I am writing about climbing each of those mountains and, each time, I reflect a bit on the journey since I began to work in social justice, nearly 34 years ago: on development, human rights, conflict, experiences along the way, etc.

So, when I wrap things up in this series, there should be 48 articles…

*

In 2009 Jean and I moved to Sydney, where I took up the newly-created role of International Program Director for ChildFund Australia.  On my way to Sydney, I was thinking a lot about how to build a great program, and how I would approach building a strong team – my intention was to lead and manage with clarity, trust, and inspiration.  A few weeks ago, I wrote describing the role, staffing, and structural iterations of ChildFund’s International Program Team and, last time, I outlined the foundational program approach we put in place – a Theory of Change and Outcome and Output Indicators.

Once the program approach was in place, as a strong foundation, we moved forward to build a structured approach to development effectiveness.  I am very proud of what we achieved: the resulting ChildFund Australia “Development Effectiveness Framework” (“DEF”) was, I think, state-of-the-art for international NGOs at the time.  Certainly few (if any) other INGOs in Australia had such a comprehensive, practical, useful system for ensuring the accountability and improvement of their work.

Since the DEF was so significant, I’m going to write three articles about it:

  1. In this article I will describe the DEF – its components, some examples of products generated by the DEF, and how each part of the system worked with the other parts.  I will also share results of external evaluations that we commissioned on the DEF itself;
  2. Next time, I will highlight one particular component of the DEF, the qualitative “Case Studies” of the lived experience of human change.  I was especially excited to see these Case Studies when they started arriving in Sydney from the field, so I want to take a deep dive into what these important documents looked like, and how we attempted to use them;
  3. Finally, I will describe the last two DEF components that came online (Outcome Indicator Surveys and Statements of Impact), the culmination of the system, where we assessed the impact of our work.

So there will be, in total, three articles focused on the DEF.  This is fitting, because I climbed three mountains on one day in August of 2017…

*

On 10 August, 2017, I climbed three 4000-footers in one day: Bondcliff (4265ft, 1300m), Mt Bond (4698ft, 1432m), and West Bond (4540ft, 1384m).  It was a very long, very tough day, covering 22 miles.  At the end of the hike, I felt like I was going to lose the toenails on both big toes… and, in fact, that’s what happened.  As a result, for the rest of the season I would be unable to hike in boots and had to use hiking shoes instead!

Knowing that the day would be challenging, I drove up from Durham the afternoon before and camped, so I could get the earliest start possible the next morning.  I got a spot at Hancock Campground, right near the trailhead where I would start the climb:

IMG_1871.jpg

 

The East Branch of the Pemigewasset River runs alongside this campground.  I spent a pleasant late afternoon there reading a book by John Paul Lederach and, when it was dark, I crawled into my sleeping bag and got a good night’s sleep.

IMG_1868

IMG_1869

 

Here is a map of the long ascent that awaited me the next morning, getting to the top of Bondcliff:

Bond Map - 3.jpg

 

After Bondcliff, the plan was that I would continue on to climb Mt Bond and West Bond, and to then return to Lincoln Woods… more on that in the next two articles in this series.  In this one I will describe climbing the first 4000-footer of that day, Bondcliff.

I got an early start on 10 August, packing up my tent-site and arriving at the trailhead at Lincoln Woods at about 6:30am:

IMG_1873.jpg

 

It was just two weeks earlier that I had parked here to climb Owl’s Head, which I had enjoyed a lot.  This time, I would begin the same way – walking up the old, abandoned forestry railway for about 2.6 miles on Lincoln Woods Trail, to where I had turned left up the Franconia Brook Trail towards Owl’s Head.  I arrived at that junction at about 7:30am:

IMG_1883.jpg

IMG_1891.jpg

 

 

This time I would continue straight at that intersection, onto the Wilderness Trail, which winds through forest for a short distance before opening out again along another old logging railway, complete with abandoned hardware along the way, discarded over 130 years ago:

IMG_1893.jpg

 

At the former (and now abandoned) Camp 16 (around 4.4 miles from the parking lot at Lincoln Woods), I took a sharp left and joined a more normal trail – no more old railway.  I began to ascend moderately, going up alongside Black Brook: now I was on the Bondcliff Trail.

 

I crossed Black Brook twice on the way up after leaving the Wilderness Trail, and then crossed two dry beds of rock, which were either rock slides or upper reaches of Black Brook that were dry that day.

IMG_1898.jpg

 

It’s a long climb up Black Brook; after the second dry crossing, Bondcliff Trail takes a sharp left turn and continues ascending steadily.  Just before reaching the alpine area, and the summit of Bondcliff, there is a short steep section, where I had to scramble up some bigger boulders.  Slow going…

But then came the reward: spectacular views to the west, across Owl’s Head to Franconia Ridge, up to Mt Garfield, and over to West Bond and Mt Bond.  Here Mt Lincoln and Mt Lafayette are on the left, above Owl’s Head, with Mt Garfield to the right:

IMG_1905

Lincoln and Lafayette In The Distance On The Left, Mt Garfield In The Distance On The Right

 

Here is a view looking to the southwest from the top of Bondcliff:

IMG_1907

From The Summit Of Bondcliff

IMG_1920

From The Summit Of Bondcliff

 

And this is the view towards Mt Bond, looking up from the top of Bondcliff:

IMG_1925

West Bond Is On The Left, And Mt Bond On The Right

 

I got to the top of Bondcliff at about 10:30am, just about four hours from the start of the hike.  Feeling good … at this point!  Here is a spectacular view back down towards Bondcliff, taken later in the day, from the top of West Bond:

IMG_1964.jpg

 

I would soon continue the climb, with a short hop from Bondcliff up to the top of Mt Bond.  Stay tuned!

*

Last time I wrote about how we built the foundations for ChildFund Australia’s new program approach: a comprehensive and robust “Theory of Change” that described, at a high level, what we were going to accomplish and why; a small number of reliable, measurable, and meaningful “Outcome Indicators” that would enable us to demonstrate the impact of our work; and a set of “Output Indicators” that would allow us to track our activities in a consistent and comparable manner across all our programs: in Cambodia, Laos, Papua New Guinea, and Viet Nam.  (Myanmar was a slightly different story, as I will explain later…)

Next, on that foundation, we needed a way of thinking holistically about the effectiveness of our development work: a framework for planning our work in each location, each year; for tracking whether we were doing what we had planned; for understanding how well we were performing; and for improving the quality and impact of our work.  And we needed to do all of this in partnership with local communities, organizations, and governments.

This meant being able to answer five basic questions:

  1. In light of our organizational Theory of Change, what are we going to do in each location, each year?
  2. How will we know that we are doing what we planned to do?
  3. How will we know that our work makes a difference and gets results consistent with our Theory of Change?
  4. How will we learn from our experience, to improve the way we work?
  5. How can community members and local partners directly participate in the planning, implementation, and evaluation of the development projects that ChildFund Australia supports?

Looking back, I feel that what we built and implemented to answer those questions – the ChildFund Australia “Development Effectiveness Framework” (“DEF”) – was our agency’s most important system.  Because what could be more important than the answers to those five questions?

*

I mentioned last time that twice, during my career with Plan International, we had tried to produce such a system, and failed (at great expense).  We had fallen into several traps that I was determined to avoid repeating this time, in ChildFund Australia, as we developed and implemented the DEF:

  • We would build a practical system that our teams could use with the informed participation of local partners and staff – one that was “good enough” for its purpose – instead of a system that had to be managed by experts, as we had done in Plan;
  • We would include both quantitative and qualitative information, serving the needs of head and heart, instead of building a wholly-quantitative system for scientific or academic purposes, as we had done in Plan;
  • We would not let “the best be the enemy of the good”: I would make sure that we moved rapidly to prototype, implement, and improve the system instead of tinkering endlessly, as we had done in Plan.

I go into more detail about the reasons for Plan’s lack of success in that earlier article.

*

Here is a graphic that Caroline Pinney helped me create, which I used very frequently to explain how the DEF was designed, functioned, and performed:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework (2014)

 

In this article, I will describe each component of the DEF, outlining how the components relate to each other and to the five questions outlined above.

However, I’m going to reserve discussion of three of those components for my next two articles:

  • Next time, I will cover #3 in Figure 1, the “Case Studies” that we produced.  These documents helped us broaden our focus from the purely quantitative to include the lived experience of people touched by the programs supported by ChildFund Australia.  At the same time, the Case Studies served as valuable tools for our staff, management, and board to retain a human connection to the spirit that motivated us to dedicate our careers to social justice;
  • And, after that, I will devote an article to our “Outcome Indicator Surveys” (#2 in Figure 1, above) and Statements of Impact (#12 in Figure 1).  The approach we took to demonstrating impact was innovative, highly participatory, and successful, so I want to go into a bit of depth describing the two DEF components involved.

Note: I prepared most of what follows.  But I have included and adapted some descriptive material produced by the two DEF Managers who worked in the International Program Team: Richard Geeves and Rouena Getigan.  Many thanks to them!

*

Starting Points

The DEF was based on two fundamental statements of organizational identity.  As such, it was built to focus us on, and enable us to be accountable for, what we were telling the world we were:

  1. On the bottom left of the DEF schematic (Figure 1, above) we reference the basic documents describing ChildFund’s identity: our Vision, Mission, Strategic Plan, Program Approach, and Policies – all agreed and approved by our CEO (Nigel Spence) and Board of Directors.  The idea was that the logic underlying our approach to Development Effectiveness would therefore be grounded in our basic purpose as an organization, overall.  I was determined that the DEF would serve to bring us together around that purpose, because I had seen Plan tend to atomize, with each field location working towards rather different aims.  Sadly, Plan’s diversity seemed to be far greater than required if it were simply responding to the different conditions we worked in.  For example, two Field Offices within 20 km of each other in the same country might have very different programs.  This excessive diversity seemed to relate more to the personal preferences of Field Office leadership than to any difference in the conditions of child poverty or the local context.  The DEF would help ChildFund Australia cohere, because our starting point was our organizational identity;
  2. But each field location did need a degree of flexibility to respond to its own reality, within ChildFund’s global identity, so at the bottom of the diagram we placed the Country Strategy Paper (“CSP”), quite centrally.  This meant that, in addition to building on ChildFund Australia’s overall purpose and identity globally, we would also build our approach to Development Effectiveness on how we chose to advance that basic purpose in each particular country where we worked, with that country’s particular characteristics.

Country Strategy Paper

The purpose and outline of the CSP was included in the ChildFund Australia Program Handbook:

To clarify, define, communicate and share the role, purpose and structure of ChildFund in-country – our approach, operations and focus. The CSP aims to build a unity of purpose and contribute to the effectiveness of our organisation.

When we develop the CSP we are making choices, about how we will work and what we will focus on as an organisation. We will be accountable for the commitments we make in the CSP – to communities, partners, donors and to ourselves.

While each CSP will be different and reflect the work and priorities of the country program, each CSP will use the same format and will be consistent with ChildFund Australia’s recent program development work.

During the development of the CSP it is important that we reflect on the purpose of the document. It should be a useful and practical resource that can inform our development work. It should be equally relevant to both our internal and external stakeholders. The CSP should be clear, concise and accessible while maintaining a strategic perspective. It should reflect clear thinking and communicate our work and our mission. It should reflect the voice of children.  Our annual work plans and budgets will be drawn from the CSP and we will use it to reflect on and review our performance over the three year period.

Implementation of the DEF flowed from each country’s CSP.

More details are found in Chapter 5 of the Program Handbook, available here: Program Handbook – 3.3 DRAFT.  Two examples of actual ChildFund Australia Country Strategy Papers from my time with the organization are attached here:

For me, these are clear, concise documents that demonstrate coherence with ChildFund’s overall purpose along with choices driven by the situation in each country.

*

Beginning from the Country Strategy Paper, the DEF branches in two inter-related (in fact, nested) streams, covering programs (on the left side) and projects (on the right side).  Of course, projects form part of programs, consistent with our program framework:

Screen Shot 2018-05-28 at 2.16.30 PM

Figure 2: ChildFund Australia Program Framework

 

But it was difficult to depict this embedding on the two dimensions of a graphic!  So Figure 1 showed programs on one side and projects on the other.

Taking the “program” (left) side first:

Program Description

Turning to the left side of Figure 1: each Country Office defined a handful of “Program Descriptions” (noted as #1 in Figure 1) – derived from, and summarized in, the CSP – with some countries having three and others ending up with five.  Each one described how a particular set of projects would create impact, together, as measured using ChildFund Australia’s Outcome Indicators – in other words, a “Theory of Change,” detailing how the projects included in the program linked together to create particular positive change.

The purpose and outline of the Program Description was included in the ChildFund Australia Program Handbook:

ChildFund Australia programs are documented and approved through the use of “Program Descriptions”.  All Program Descriptions must be submitted by the Country Director for review and approval by the Sydney International Program Director, via the International Program Coordinator.

For ChildFund Australia: a “program” is an integrated set of projects that, together, have direct or indirect impact on one or more of our agreed organisational outcome indicators.   Programs normally span several geographical areas, but do not need to be implemented in all locations; this will depend on the geographical context.  Programs are integrated and holistic. They are designed to achieve outcomes related to ChildFund Australia’s mission, over longer periods, while projects are meant to produce outputs over shorter timeframes.

Program Descriptions were summarized in the CSP, contained a listing of the types of projects (#5 in Figure 1) that would be implemented, and were reviewed every 3 or 4 years (Program Review, #4 in Figure 1).

To write a Program Description, ChildFund staff (usually program managers in a particular Country Office) were expected to review our program implementation to date, and to carry out extensive situational analyses: of government policies, plans, and activities in the sector, and of communities’ needs in terms of assets, aspirations, and ability to work productively with the local government officials responsible for service provision.  The results of ChildFund’s own Outcome Indicator surveys and community engagement events obviously provided very useful evidence in this regard.

Staff then proposed a general approach for responding to the situation, and specific strategies which could be delivered through a set of projects.  They would also show that the approach and strategies proposed were consistent with evidence of good practice both globally and in-country, demonstrating that their choices were evidence-based.

Here are two examples of Program Descriptions:

Producing good, high-quality Program Descriptions was a surprising challenge for us, and I’m not sure we ever really got this component of the DEF right.  Probably the reason that we struggled was that these documents were rather abstract, and our staff weren’t used to operating at this level of abstraction.

Most of the initial draft Program Descriptions were quite superficial, and were approved only as place-holders.  Once we started to carry out “Program Reviews” (see below), however, where more rigor was meant to be injected into the documents, we struggled.  It was a positive, productive struggle, but a struggle nonetheless!

We persisted, however, because I strongly believed that our teams should be able to articulate why they were doing what they were doing, and the Program Descriptions were the basic tool for exactly that explanation.  We persevered, hoping that the effort would result in better programs, more sophisticated and holistic work, and more impact on children living in poverty.

*

Program Reviews

For the same reasons outlined above, in my discussion of the “Program Descriptions” component of the DEF, we also struggled with the “Program Review” (#4 in Figure 1, above).  In these workshops, our teams would consider an approved “Program Description” (#1 in Figure 1) every three or four years, subjecting the document to a formal process of peer review.

ChildFund staff from other countries visited the host country to participate in the review process and then wrote a report making recommendations for how the Program under review might be improved.  The host country accepted (or debated and adjusted) the recommendations, acted on them, and applied them to a revision of the Program Description: improving it, tightening the logic, incorporating lessons learned from implementation, etc.

Program Reviews were therefore fundamentally about learning and improvement, so we made sure that, in addition to peers from other countries, the host Country Office invited in-country partners and relevant experts.  And International Program Coordinators from Sydney were asked to always attend Program Reviews in the countries that they were supporting, again for learning and improvement purposes.

The Program Reviews that I attended were useful and constructive, but I certainly sensed a degree of frustration.  In addition to struggling with the relatively high levels of abstraction required, our teams were not used to having outsiders (even their peers from other ChildFund offices) critique their efforts.  So, overall, this was a good and very important component of the DEF, designed correctly, but our teams needed more time to learn how to manage this process and to be open to such a public form of review.

*

Projects and Quarterly Reports

As shown on the right-hand side of Figure 1, ChildFund’s field staff and partners carried out routine monitoring of projects (#6 in the Figure) to ensure that they were on track, and this monitoring formed the basis of their reporting on activities and outputs.  Project staff summarized their monitoring in formal Quarterly Reports (#7) on each project, documenting progress against project plans, budgets, and targets to ensure projects were well managed.  These Quarterly Reports were reviewed in each Country Office, and most were also forwarded to ChildFund’s head office in Sydney (and, often, to donors) for review.

When I arrived, ChildFund Australia’s Quarterly reporting was well-developed and of high quality, so I didn’t need to focus on this aspect of our work.  We simply incorporated it into the more-comprehensive DEF.

*

Quarterly Output Tracking

As described last time, ChildFund developed and defined a set of Outputs which became standard across the organization in FY 2011-12.  Outputs in each project were coded and tracked from Quarter to Quarter, by project.  Some of the organizational Outputs were specific to a sector (such as education, health, or water and sanitation) or to a particular target group (such as children, youth, or adults).  Other Outputs were generic and might be found in any project: for example, training, awareness raising, materials production, and consultation.

Organizational Outputs were summarized for all projects in each country each Quarter and country totals were aggregated in Sydney for submission to our Board of Directors (#8 in Figure 1, above).  In March 2014 there were a total of 47 organizational Outputs – they were listed in my last article in this series.

One purpose of this tracking was to enhance our accountability, so a summary was reviewed every Quarter in Sydney by the International Program Team and our Program Review Committee.

Here is an example of how we tracked outputs: this is a section of a Quarterly Report produced by the International Program Team for our Board and Program Review Committee: Output Report – Q4FY15.
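The mechanics of this Quarterly roll-up were simple: each project reported counts against coded Outputs, and country and organizational totals were just sums across those codes.  A minimal sketch in Python – the output codes and figures below are invented for illustration, not ChildFund’s actual codes:

```python
from collections import defaultdict

# Hypothetical coded output records: (country, quarter, output_code, count)
records = [
    ("Laos", "Q4FY15", "TRN", 12),     # training events
    ("Laos", "Q4FY15", "AWR", 5),      # awareness-raising events
    ("Cambodia", "Q4FY15", "TRN", 8),
    ("Cambodia", "Q3FY15", "TRN", 6),
]

def aggregate_outputs(records, quarter):
    """Sum each coded output across all countries for one Quarter."""
    totals = defaultdict(int)
    for country, q, code, count in records:
        if q == quarter:
            totals[code] += count
    return dict(totals)

# For Q4FY15, this yields {"TRN": 20, "AWR": 5}
q4_totals = aggregate_outputs(records, "Q4FY15")
```

Because every country used the same codes, the same roll-up worked at project, country, and organizational levels.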

*

Project Evaluations

ChildFund also conducted reviews or evaluations of all projects (#9 in Figure 1, above) – in different ways.  External evaluators were employed under detailed terms of reference to evaluate multi-year projects with more substantial budgets or which were significant for learning or to a particular donor.  Smaller projects were generally evaluated internally.  All evaluators were expected to gather evidence of results against output targets and performance indicators written against objectives.

*

All development effectiveness systems have, at their heart, mechanisms for translating operational experience into learning and program improvement.  In ChildFund’s DEF, this was represented by the central circle in Figure 1, which fed evidence from a variety of sources back into our organizational and Country Strategy Papers, Program Descriptions, and project planning and design.

Our program staff found that their most effective learning often occurred during routine monitoring through observation of project activities and conversations in communities with development partners.  Through thoughtful questioning and attentive listening, staff could make the immediate decisions and quick adjustments which kept project activities relevant and efficient.

Staff also had more formal opportunities to document and reflect on learning.  The tracking of outputs and aggregation each Quarter drew attention to progress and sometimes signaled the need to vary plans or redirect resources.

Project evaluations (#9 in Figure 1, above) provided major opportunities for learning, especially when external evaluators brought their different experiences to bear and offered fresh perspectives on a ChildFund project.

*

The reader can easily grasp that, for me, the DEF was a great success: a significant asset that enabled ChildFund Australia to be more accountable and effective.  Some more technically-focused agencies were busy carrying out sophisticated impact evaluations, using control groups and so forth, but that kind of effort didn’t suit the vast majority of INGOs.  We could benefit from the learnings that came from those scientific evaluations, but we didn’t have the resources to introduce such methodologies ourselves.  And so, though the DEF was not perfect, I am not aware of any comparable organization that succeeded as we did.

While the system built on what I had learned over nearly 30 years, and even though I felt that it was designed comprehensively and working very well, that was merely my opinion!

Given the importance of the system, relying on my opinion (no matter how sound!) wasn’t good enough.  So we sought expert review, commissioning two independent, expert external reviews of the DEF.

*

The first review, concluded in November 2012, took place before we had fully implemented the system.  In particular, since Outcome Indicator Surveys and Statements of Impact (to be covered in an upcoming blog article) were implemented only after three years (and every three years thereafter), we had not yet reached that stage.  But we certainly were quite advanced in the implementation of most of the DEF, so it was a good time to reflect on how it was going.

In that light, this first external review of the DEF concluded the following:

The development of the DEF places ChildFund Australia in a sound position within the sector in the area of development effectiveness. The particular strength of ChildFund Australia’s framework is that it binds the whole organisation to a set of common indicators and outputs. This provides a basis for focussing the organisation’s efforts and ensuring that programming is strategically aligned to common objectives. The other particular strength that ChildFund Australia’s framework offers is that it provides a basis for aggregating its achievements across programs, thereby strengthening the organisation’s overall claims of effectiveness.

Within ChildFund Australia, there is strong support for the DEF and broad agreement among key DEF stakeholders and users that the DEF unites the agency on a performance agenda. This is in large part due to dedicated resources having been invested and the development of a data collection system has been integrated into the project management system (budgeting and planning, and reporting), thereby making DEF a living and breathing function throughout the organisation. Importantly, the definition of outcomes and outputs indicators provides clarity of expectations across ChildFund Australia.

One of the strengths of the DEF recognised by in-country staff particularly is that the DEF provides a basis for stakeholders to share their perspectives. Stakeholders are involved in identifying benefits and their perspectives are heard through case studies. This has already provided a rich source of information that has prompted reflection by in-country teams, the Sydney based programs team and the ChildFund Australia Board.

Significantly, the DEF signals a focus on effectiveness to donors and the sector. One of the benefits already felt by ChildFund Australia is that it is able to refer to its effectiveness framework in funding submissions and in communication with its major donors who have an increasing interest on performance information.

Overall, the review found that the pilot of the DEF has been implemented well, with lots of consultation and engagement with country offices, and lots of opportunity for refinement. Its features are strong, enabling ChildFund to both measure how much it is doing, and the changes that are experienced by communities over time. The first phase of the DEF has focused on integrating effectiveness measurement mechanisms within program management and broader work practices, while the second phase of the DEF will look at the analysis, reflection and learning aspects of effectiveness. This second phase is likely to assist various stakeholders involved in collecting effectiveness information better understand and appreciate the linkages between their work and broader organisational learning and development. This is an important second phase and will require ongoing investment to maximise the potential of the DEF. It places ChildFund Australia in a strong position within the Australian NGO sector to engage in the discourse around development effectiveness and demonstrate its achievements.

A full copy of this first review, removing only the name of the author, is attached here: External DEF Review – November 2012.

In early 2015 we carried out a second review.  This time, we had implemented the entire DEF, carrying out (for example) Statement of Impact workshops in several locations.  The whole system was now working.

At that point, we were very confident in the DEF – from our point of view, all components were working well, producing good and reliable information that was being used to improve our development work.  Our board, program-review committee, and donors were all enthusiastic.  More importantly, local staff and communities were positive.

The only major concern that remained related to the methodology we used in the Outcome Indicator Surveys.  I will examine this issue in more detail in an upcoming blog article in this series; but the reader will notice that this second formal, external evaluation focuses very much on the use of the LQAS methodology in gathering information for our Outcome Indicator workshops and Statements of Impact.

That’s why the external evaluator we engaged to carry out this second review was an expert in survey methodologies in general, and in LQAS in particular.
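A brief technical aside for curious readers: in classic LQAS (“Lot Quality Assurance Sampling”), each “lot” (for example, a district) is classified by taking a small sample and comparing the count of positive responses against a pre-computed decision threshold.  Here is a rough sketch in Python of how such a threshold can be derived from the binomial distribution.  This is my own generic illustration of the method, with invented coverage thresholds – not the evaluator’s or ChildFund’s actual procedure:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_decision_rule(n, upper, lower, alpha=0.10, beta=0.10):
    """Find the smallest decision threshold d such that both
    misclassification risks are acceptably low:
      alpha = risk of failing a lot whose true coverage is `upper`;
      beta  = risk of passing a lot whose true coverage is `lower`.
    Returns None if no threshold works for this sample size."""
    for d in range(n + 1):
        fail_good = binom_cdf(d - 1, n, upper)      # P(X < d | p = upper)
        pass_bad = 1 - binom_cdf(d - 1, n, lower)   # P(X >= d | p = lower)
        if fail_good <= alpha and pass_bad <= beta:
            return d
    return None

# The classic LQAS sample of 19, with an 80% coverage benchmark and a
# 50% lower threshold, yields a decision rule of 13: a lot "passes" if
# at least 13 of the 19 respondents report the outcome.
d = lqas_decision_rule(19, upper=0.80, lower=0.50)
```

In practice, published LQAS tables supply these thresholds ready-made, but deriving one directly shows why a sample as small as 19 per lot can still support a defensible pass/fail judgment.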

In that light, this second external review of the DEF concluded the following:

ChildFund Australia is to be commended for its commitment to implementing a comprehensive and rigorous monitoring and evaluation framework with learning at its centre to support and demonstrate development effectiveness. Over the past five years, DEL managers in Cambodia, Laos, Papua New Guinea and Vietnam, with support and assistance from ChildFund Australia, country directors and program managers and staff, have worked hard to pilot, refine and embed the DEF in the broader country programs.  Implementing the DEF, in particular the Outcome Indicator Survey using LQAS, has presented several challenges.  With time, many of the early issues have been resolved, tools improved and guidelines developed.  Nevertheless, a few issues remain that must be addressed if the potential benefits are to be fully realised at the organisational, country and program levels.

Overall, the DEF is well suited for supporting long-term development activities in a defined geographic area.  The methodologies, scope and tools employed to facilitate Outcome Indicator Surveys and to conduct Community Engagement and Attribution of Impact processes are mostly fit for purpose, although there is considerable room for improvement.  Not all of the outcome indicators lend themselves to assessment via survey; those that are difficult to conceptualise and measure being most problematic. For some indicators in some places, a ceiling effect is apparent limiting their value for repeated assessment. While outcome indicators may be broadly similar across countries, both the indicators and the targets with which they are to be compared should be locally meaningful if the survey results are to be useful—and used—locally.

Used properly, LQAS is an effective and relatively inexpensive probability sampling method.  Areas for improvement in its application by ChildFund include definition of the lots, identification of the sampling frame, sample selection, data analysis and interpretation, and setting targets for repeated surveys.

Community Engagement and the Attribution of Impact processes have clearly engaged the community and local stakeholders.  Experience to date suggests that they can be streamlined to some extent, reducing the burden on staff as well as communities.  These events are an important opportunity to bring local stakeholders together to discuss local development needs and set future directions and priorities.  Their major weakness lies in the quality of the survey results that are presented for discussion, and their interpretation.  This, in turn, affects the value of the Statement of Impact and other documents that are produced.

The DEF participatory processes have undoubtedly contributed to the empowerment of community members involved. Reporting survey results in an appropriate format, together with other relevant data, in a range of inviting and succinct documents that will meet the needs of program staff and partners is likely to increase their influence.

A full copy of this second review, removing only the name of the author, is attached here: DEF Evaluation – April 2015.

*

Great credit is due to the ChildFund staff who contributed to the conceptualization, development, and implementation of the DEF.  In particular, Richard Geeves and Rouena Getigan in the International Program Team in Sydney worked very hard to translate my sometimes overly-ambitious concepts into practical guidelines, and ably supported our Country Offices.

One of the keys to the success of the DEF was that we budgeted for dedicated in-country support, with each Country Office able to hire a DEL Manager (two in Viet Nam, given the scale of our program there).

Many thanks to Solin in Cambodia, Marieke in Laos, Joe in Papua New Guinea, and Thuy and Dung in Viet Nam: they worked very hard to make the DEF function in their complex realities.  I admire how they made it work so well.

*

In this article, I’ve outlined how ChildFund Australia designed a comprehensive and very robust Development Effectiveness Framework.  Stay tuned for next time, when I describe climbing Mt Bond, and then go into much more depth on one particular component (the Case Studies, #3 in Figure 1, above).

After that, in the following article, I plan to cover reaching the top of West Bond and descending back across Mt Bond and Bondcliff (and losing toenails on both big toes!) and go into some depth to describe how we carried out Outcome Indicator Surveys (#2 in Figure 1) and Statements of Impact (#12) – in many ways, the culmination of the DEF.

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owl’s Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change.

 

 

Owl’s Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change

May, 2018

I began a new journey just over two years ago (May, 2016), tracing two long arcs in my life:

  • During those two years, I’ve been climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers.  I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Alongside descriptions of those climbs, I’ve been sharing what it was like working in international development during the MDG era: as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980’s through to the conclusion of the Millennium Development Goals in 2015.

So, in each article in this series, I am writing about climbing each of those mountains and, each time, I reflect a bit on the journey since I began to work in social justice, nearly 34 years ago: on development, human rights, conflict, experiences along the way, etc.

*

In 2009 Jean and I moved to Sydney, where I took up a new role as International Program Director for ChildFund Australia.  On my way towards Sydney, I was thinking a lot about how to build a great program, and how I would approach building a strong team with clarity, trust, and inspiration.  Last time I described the role and staffing and structural iterations of the International Program Team there.

This time, I want to begin to unpack the program approach that we put in place, building on what was already there, and on the lessons I had learned in the previous 25 years.

But first…

*

Owl’s Head (4025ft, 1227m) is described by many hikers as uninteresting, boring, and challenging – something that “should not be left to the end” of the 48 peaks.  I guess that’s because climbers want to finish their long voyage up so many great mountains in a blaze of glory, but they find Owl’s Head to be a letdown after the challenges and thrills of the other 47 4000-footers.

I climbed Owl’s Head on 26 July, 2017, and enjoyed every minute of it!

Yes, it’s long and mostly in the forest.  Yes, getting up the rock slide on the western side of Owl’s Head is tough going.  Yes, there are several river crossings which can be problematic when the water’s high.  And, yes, it’s not a ridge walk, so the views are (mostly) obstructed.  But on this late-July day, the walking was fantastic, the river crossings were nerve-wracking but doable, and the views going up (and coming down) the rock slide, looking across at Franconia Ridge, were fantastic.

I left Durham at about 6am, getting an early start because my calculations were that the ascent would be over 6 hours, just getting to the top.  Figuring in a descent of 4 hours, at least, made me want to get walking as soon as possible.  As has been my normal routine these days, I stopped in Tilton for coffee, and I bought a sandwich for lunch in Lincoln, very near the trailhead.

Screen Shot 2017-07-29 at 5.12.23 PM.png

 

I had brought sandals to carry with me for the river crossings, just in case.

After parking at the Lincoln Woods Visitor Center, I started the hike at 8:10am.

IMG_1546.jpg

 

It was a beautiful, cool, sunny day.  Just beyond the Visitor Center, two trails head up the East Branch of the Pemigewasset River: the Pemi East Side Trail and the Lincoln Woods Trail.  To get to the Lincoln Woods Trail, which I would take, I crossed a suspension bridge and took a right turn to head north:

IMG_1558.jpg

IMG_1561.jpg

 

The Lincoln Woods Trail runs along an old forest railway, and is wide and straight for over two miles: dappled, high forest on a gorgeous, crisp day.  Nervous about how long I thought it would take me to reach Owl’s Head and return, I flew up this first easy part, almost trotting up the gentle incline:

IMG_1566.jpg

Lincoln Woods Trail – Formerly a Forest Railway, Straight and Wide

 

Old railway ties can be seen in the image, above.  Here is an image of one of the nails in a tie:

IMG_1710.jpg

 

There were a few other hikers heading up the Lincoln Woods Trail along with me, more than I expected on a summer Wednesday, but it wasn’t crowded.  I reached the junction with the Osseo Trail at 8:33am, and Black Point Trail at 8:53am:

 

Just before 9am, I arrived at the junction with Franconia Brook Trail.  So it had taken me about 50 minutes to walk up the 2.6 miles from the Lincoln Woods Visitor Center.  It had been gently uphill the whole way so far.

Here, just after a small footbridge over Franconia Brook, I would turn left, up the Franconia Brook Trail:

IMG_1572.jpg

Footbridge Over Franconia Brook

IMG_1576.jpg

 

(A few weeks later I would come to this junction once again, but would continue straight on the Bondcliff Trail.)

Franconia Brook Trail was a real trail, at least at the beginning, but soon, as I headed north up the Franconia Brook, there were long sections that must have also been old railway – straight, and wide, and gradually uphill.  Pleasant walking!  I thought that coming down would be even faster.

From here, the water level in Franconia Brook didn’t look too high:

IMG_1574.jpg

 

I hiked up Franconia Brook Trail, 1.7 miles, and reached the junction with Lincoln Brook Trail at 9:33am.  I was still making very good time – 1.7 miles in about 30 minutes.  But I didn’t feel that I was rushing; it was very nice hiking through the woods on the wide trail!

Here I would swing west to walk around Owl’s Head in a clockwise sense, following (and repeatedly crossing) the Lincoln Brook until reaching Owl’s Head Path:

IMG_1580.jpg

 

I would cross Lincoln Brook four times going up the west side of Owl’s Head, and four times coming back down, retracing my steps.  The first crossing, at 9:44am, was the most difficult, and I almost gave my boots a good bath that time.  It was a little dicey…

Of course, as I climbed up the valley, the Brook became smaller as I passed, one by one, the streams feeding into it.  So the first crossing (and the last, when returning) had the most water.

IMG_1583.jpg

IMG_1671.jpg

IMG_1588.jpg

IMG_1680.jpg

IMG_1593.jpg

 

 

The trail was less maintained here, certainly not an old forest railway, though I did see two trail crews working on it that day.

I reached the turnoff for Owl’s Head Path at 11:08am.  There were no signs, and I had become nervous that I had passed it, feeling that I should have reached the turnoff some time before; by the time I reached the cairns marking it I was quite anxious, and was thinking vaguely about turning back.  But, luckily, as I approached the cairns that can be seen in the next image, a young woman came down from Owl’s Head and confirmed that I had reached the junction!

IMG_1656.jpg

The Junction With Owl’s Head Path – Steeply Up From Here!

 

So it had taken me nearly an hour and a half to walk Lincoln Brook Trail, from Franconia Brook Trail to Owl’s Head Path, including four stream crossings.  Since Owl’s Head Path was supposed to be quite steep for some time, up a rock slide, I decided to leave some weight here at the bottom; so I took a quart of water and my sandals out of my pack and hid them at the junction.

I started up Owl’s Head at 11:17am, a bit lighter, after having a snack.  Soon I reached the famous rock slide, which was very steep indeed – mostly gravel, so lots of sliding downward, which made for heavy going.

IMG_1647.jpg

 

It was slippery and challenging – and did I mention that it was very steep?  I crossed paths with another young hiker coming down; she had turned back before reaching the summit, finding it too dangerous with her full pack, and was vocal about how unpleasant it was.  It would have been summit number 29 for her.  It was very heavy going, relentless and challenging!

But the views from the rock slide were fantastic: looking back towards Franconia Ridge, I could see all four of the 4000-footers there – Flume, Liberty, Lincoln and Lafayette.  The light was still good, not yet noon, so the sun shone on the ridge from the east:

IMG_1642.jpg

Flume Is On The Far Left, Then Liberty, Lincoln, And Then Lafayette.

IMG_1643.jpg

 

Here is a video of that view from the rock slide, looking over to Franconia Ridge:

The White Mountain Guide indicates that the top of Owl’s Head is not very accessible, and that the end of Owl’s Head Path, which is just short of the actual summit, qualifies as reaching the top.  Apparently, at least when my edition of the Guide was published, reaching the actual summit involved a fair amount of bush-whacking.

Owl’s Head Path began to flatten out at about 12:09pm, and I reached what (I think) was the former end of the Path at about 12:15pm.

IMG_1628.jpg

The End Of Owl’s Head Path – Not The Summit!

 

Here I was able to turn left, to the north, and there was a path heading towards the actual summit – not a very wide path, switching back and forth a lot, but certainly not bush-whacking.

I got to the actual top at about 12:30pm, and had lunch.  Though I had seen a few other climbers after I passed the discouraged young woman, I had the summit to myself for lunch – it was very pleasant!

IMG_1615

Owl’s Head Summit

IMG_1620

IMG_1617.jpg

Some Vestiges Of Lunch Are Visible!

 

I had really, really enjoyed the walk so far… maybe partly because my expectations had been so low?

After that nice lunch, and still wet with sweat, I left the summit at about 12:45pm.  I could see Franconia Ridge to the west, through the forest:

IMG_1630.jpg

 

And there were some views to the east, towards the Bonds, but the Owl’s Head ridge was more forested that way, so no photos were possible.  I got back to the top of Owl’s Head Path at about 1pm, and to the beginning of the rock slide about 20 minutes later.  I dropped down the slide, taking care and many photos, reaching the junction with Lincoln Brook Trail at about 2pm.  So, about an hour to descend carefully.

The walk back down Lincoln Brook Trail was pleasant:

IMG_1662

IMG_1663

 

Recrossing Lincoln Brook four times – simpler this time – and passing the trail-maintenance crews again, I got back to the junction with Franconia Brook Trail at about 3:36pm.  Here I turned south and walked back down that old railway line:

 

There was a bit of old railway hardware along the side of the trail:

IMG_1678.jpg

 

For much of this section, there were a few mosquitoes, but the walking was pleasant, on a soft bed of pine needles:

IMG_1688.jpg

 

I passed a young woman resting on the side of the trail, with a very full pack.  “You’re carrying a lot!” I said, and she replied: “I’m ready to let it go!” in a resigned tone of voice…

Ups and downs … mostly gently downward.  Long and level and wide.  I reached the junction with Lincoln Woods Trail at about 4:11pm, and the Trail got even wider and straighter and easier.  Funnily enough, there is a measured section of trail here – 200 yards – which (of course) I had also passed on the way up; the idea is to count how many paces it takes.  On the way up I counted 41 (double) paces, and 44 on the way back.  So I was walking with shorter paces on the way down!
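That little bit of pace arithmetic can be checked in a few lines (a quick sketch in Python; the 200-yard length and the pace counts are the figures given above):

```python
# Measured section length and my double-pace counts from the walk.
section_yards = 200
paces_up, paces_down = 41, 44

# Yards covered per double pace in each direction.
stride_up = section_yards / paces_up
stride_down = section_yards / paces_down

print(f"up: {stride_up:.2f} yd, down: {stride_down:.2f} yd per double pace")
# More paces over the same distance means a shorter stride.
assert stride_down < stride_up
```

About 4.9 yards per double pace heading up, versus about 4.5 coming down – shorter paces on the descent, as noted.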

 

I reached the Lincoln Woods Visitor Center, and my car, at about 5:15pm.  It had taken me just over 9 hours to climb Owl’s Head – substantially less than I had calculated: according to the White Mountain Guide, the ascent alone should have taken about 6 1/2 hours.

But it was a great hike on a wonderful day.  I enjoyed every minute of it!

*

As I arrived in Sydney to take up the newly-created position of International Program Director, one of my biggest priorities was to clarify our program approach.  This would involve lots of internal discussion, research and reflection, and I was determined to bring to this task the lessons I had learned in the previous 25 years of working in the sector (and described in the articles in this series!).

I understood that our program approach needed to be built on a clear understanding of what we were going to achieve, and why.  After completing the staffing of the first iteration of the International Program Team in Sydney, getting to know our programs in Cambodia, Papua New Guinea, and Viet Nam, and settling in with other Sydney-based senior managers and our board, I got going!

*

I had first heard of the concept of “Theory of Change” when I asked Alan Fowler to critique an early draft of the UUSC Strategic Plan in 2005.  He had, quite rightly, pointed out that the draft Strategy was good, but that it didn’t really clarify why we wanted to do what we were describing: how did we understand the links between our actions and our vision and mission?

Reflecting on Alan’s observation, I understood that we should put together a clear statement of causality, linking our actions with the impact we sought in the world.  So we did that, and ended up with a very important statement that really helped UUSC be clear about things:

Human rights and social justice have never advanced without struggle. It is increasingly clear that sustained, positive change is built through the work of organized, transparent and democratic civic actors, who courageously and steadfastly challenge and confront oppression. 

UUSC’s strategy derived from that statement in a powerful way.

Perhaps a better definition of the concept comes from the “Theory of Change Community”:

Theory of Change is essentially a comprehensive description and illustration of how and why a desired change is expected to happen in a particular context. It is focused in particular on mapping out or “filling in” what has been described as the “missing middle” between what a program or change initiative does (its activities or interventions) and how these lead to desired goals being achieved. It does this by first identifying the desired long-term goals and then works back from these to identify all the conditions (outcomes) that must be in place (and how these related to one another causally) for the goals to occur. These are all mapped out in an Outcomes Framework.

The Outcomes Framework then provides the basis for identifying what type of activity or intervention will lead to the outcomes identified as preconditions for achieving the long-term goal. Through this approach the precise link between activities and the achievement of the long-term goals are more fully understood. This leads to better planning, in that activities are linked to a detailed understanding of how change actually happens. It also leads to better evaluation, as it is possible to measure progress towards the achievement of longer-term goals that goes beyond the identification of program outputs.

At ChildFund Australia, one of my earliest actions was to develop and finalize a Theory of Change and the associated Outcomes Framework and Outputs.  In this article, I want to describe how we did this, and what we achieved.

*

First, some definitions.  Strangely, my experience is that when we in the INGO community try to agree on a common set of definitions, we usually end up arguing intensely and never agreeing!  The concepts we seek to define can be viewed productively in different ways; for me, it seemed most useful to find definitions that we could all live with, and use them, rather than trying to reach full consensus (which, over time, seemed to be an impossible dream!)

Here are the visual framework and definitions that we used at ChildFund Australia:

Screen Shot 2018-05-28 at 2.16.30 PM.png

 

A set of Inputs producing a consistent set of Outputs is a Project; a set of Projects producing a consistent set of Outcomes is a Program; a set of Programs producing a consistent set of Impacts is a Strategic Plan.

Note that:

  • “Inputs” are usually time or money;
  • “Outputs” are tangible and concrete products delivered by or through ChildFund: for example, a training course, a trip or meeting, a publication, rent, a latrine – see below;
  • “Outcomes” are changes in the Outcome Indicators that we developed – see below;
  • “Impact” is the highest-level of organisational achievement, related directly to the achievement of our mission.

This is pretty standard stuff, nothing particularly innovative.  But ChildFund Australia hadn’t formally adopted these definitions, which now began to provide a common language for our program work.
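The nesting in those definitions (Inputs and Outputs make up a Project, Projects a Program, Programs a Strategic Plan) can be sketched as a simple data structure.  This is purely illustrative – the class and field names are my own, not any actual ChildFund system:

```python
from dataclasses import dataclass, field

# Illustrative sketch of the Project -> Program -> Strategic Plan nesting.

@dataclass
class Project:
    name: str
    inputs: list[str] = field(default_factory=list)   # usually time or money
    outputs: list[str] = field(default_factory=list)  # tangible, concrete products

@dataclass
class Program:
    name: str
    projects: list[Project] = field(default_factory=list)
    outcomes: list[str] = field(default_factory=list)  # changes in Outcome Indicators

@dataclass
class StrategicPlan:
    name: str
    programs: list[Program] = field(default_factory=list)
    impacts: list[str] = field(default_factory=list)   # mission-level achievement

# Hypothetical example of the nesting in use:
wells = Project("Village wells", inputs=["budget", "staff time"],
                outputs=["wells drilled"])
wash = Program("Water and sanitation", projects=[wells],
               outcomes=["access to safe water"])
plan = StrategicPlan("Country strategy", programs=[wash],
                     impacts=["reduced child poverty"])
```

The point of the hierarchy is simply that each level aggregates the one below it, and is held accountable at a different level of result.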

*

When we began to develop ChildFund Australia’s Theory of Change, Outcomes Framework, and Outputs, I took care to bring into the process several important lessons I had learned from previous experiences:

  • As mentioned above, from my experience at UUSC I had learned that the creation of a Theory of Change had the potential to be energizing and unifying, if it was carried out in a participatory manner;
  • Along the way, as the loyal reader of this series will have seen, my own view of development and poverty had grown to incorporate elements of social justice, collective action, and human rights.  I wanted to incorporate these important elements into ChildFund Australia’s understanding of child poverty and development;
  • I recognized the significant complexity and cost associated with crafting and measuring Outcome Indicators, which would essentially articulate how we would hold ourselves accountable to our purpose.  Outcome Indicators are complex to use and expensive to measure, so I felt that we should rely on the work done by technical agencies (the UNDP and UNICEF, other INGOs, and other ChildFund members) whenever possible, and rely on national-government measurement systems when available and credible.  That meant that using MDG-related indicators, where appropriate, would be our first priority, because of the enormous effort that had been put into creating and measuring them around most of the world;
  • From my work with CCF, especially having participated in their child-poverty study, I had learned that children experience poverty in a more complex way than we had earlier recognized: as deprivation, certainly; but also as exclusion and vulnerability.  We would now incorporate this “DEV” framework in Australia;
  • In my next blog article, I will describe how we created a “Development Effectiveness Framework” for ChildFund Australia.  The “DEF” would describe and detail the processes and products through which we would use the Theory of Change, Outcomes Framework, and Outputs to operationally improve the effectiveness of our development work.  Twice, during my career with Plan International, we had tried to produce such a system, and failed comprehensively (and at great expense).  We had failed due to several fundamental mistakes that I was determined to avoid making in Australia:
    • At Plan, we fell into the trap of designing a system whose purpose was mostly the demonstration of impact, rather than learning and the improvement of programming.  This led to a complex, highly technical system that could never actually be implemented.  I wanted, this time, to do both – to demonstrate impact and to improve programs – but fundamentally to create a practical system that could be implemented in the reality of our organization;
    • One of the consequences of the complexity of the systems we tried to design at Plan was that community members were simply not able to participate in the system in any meaningful way, except by using the data to participate in project planning.  We would change this at ChildFund, and build in many more, meaningful areas for community involvement;
    • Another mistake we made at Plan was to allow the creation of hundreds of “outputs.”  It seemed that everybody in that large organization felt that their work was unique, and had to have unique descriptors.  I was determined to keep the DEF as simple and practical as possible;
    • The Plan system was entirely quantitative, in keeping with its underlying (and fallacious) pseudo-scientific purpose.  But I had learned that qualitative information was just as valid as quantitative information, illustrating a range of areas for program improvement that complemented and extended the purely quantitative.  So I was going to work hard to include elements in the DEF that captured the human experience of change in narrative ways;
    • Both times we tried to create a DEF-like system in Plan, we never quite finished; the result was never fully finalized and rolled out to the organization.  So, on top of the mistakes we made in developing the system, at great expense, the waste was even more appalling because little good came of the effort of so many people, and the spending of so much time and money.  At ChildFund, we would not let “the best be the enemy of the good,” and I would make sure to rapidly prototype, implement, and improve the system;
  • Finally, I had learned of the advantages and disadvantages of introducing this kind of fundamental change quickly, or slowly:
    • Moving slowly enables more participation and ownership, but risks getting bogged down and losing windows of opportunity for change, which are often short-lived;
    • Moving quickly allows the organization to make the change and learn from it within that short window of enthusiasm and patience.  The risk is that, at least for organizations that are jaded by too many change initiatives, the process can be over before people actually take it seriously, which can lead to a perception that participation was lacking.

I decided to move quickly, and our CEO (Nigel Spence) and board of directors seemed comfortable with that choice.

*

The ChildFund Australia Theory of Change

Upon arrival in Sydney in July of 2009, I moved quickly to put in place the basic foundation of the whole system: our Theory of Change.  Once staffing in the IPT was in place, we began.  First, since we knew that effective programs address the causes of the situation they seek to change, and building on the work of Amartya Sen, we defined poverty as the deprivation of the capabilities and freedoms people need to live the life they value.

Then I began to draft and circulate versions of a Theory of Change statement, incorporating input from our board, senior managers (in Sydney and in our Country Offices in Cambodia, Papua New Guinea and Viet Nam), and program staff across the agency.

This process went very well, perhaps because it felt very new to our teams.  Quickly we settled on the following statement:

Theory of Change.001

The ChildFund Australia “Theory of Change”

 

Note here that we had included a sense of social justice and activism in the Theory of Change, by incorporating “power” (which, practically, would mean “collective action”) as one central pillar.  And it’s clear that the CCF “DEV” framework was also incorporated explicitly.

The four dot-points at the end of the Theory of Change would come to fundamentally underpin our new program approach.  We would:

  • Build human, capital, natural and social assets around the child, including the caregiver.  This phrasing echoed the Ford Foundation’s work on asset-based development, and clarified what we would do to address child deprivation;
  • Build the voice and agency of poor people and poor children.  This pillar incorporated elements of “empowerment,” a concept we had pioneered in Plan South America long before, along with notions of stages of child and human development; and
  • Build the power of poor people and poor children.  Here we were incorporating the sense that development is related to human rights, and that human rights don’t advance without struggle and collective action; and we would
  • Work to ensure that children and youth are protected from risks in their environments.  Our research had shown that poverty was increasingly being experienced by children as related to vulnerability, and that building their resilience and the resilience of the caregivers and communities around them was crucial in the modern context.

This Theory of Change would serve admirably, and endure unchanged, through the next five years of program development and implementation.

*

Output Indicators

Now, how would we measure our accomplishment of the lofty aims articulated in the Theory of Change?  We would need to develop a set of Outcome and Output Indicators.

Recall that, according to the definitions that we had agreed earlier, Outputs were seen as: tangible and concrete products delivered by or through ChildFund: for example, a training course, a trip or meeting, a publication, rent, a latrine.

Defining Outputs was an important step for several reasons, mostly related to accountability.  Project planning and monitoring, in a classical sense, focuses on determining the outputs that are to be delivered, tracking whether or not they are actually produced, and adjusting implementation along the way.

For ChildFund Australia, and for our public stakeholders, being able to accurately plan and track the production of outputs represented a basic test of competence: did we know what we were doing?  Did we know what we had done?  Being able to answer those questions (for example, “we planned to drill 18 wells, and train 246 new mothers, and ended up drilling 16 wells and training 279 new mothers”) would build our credibility.  Perhaps more pointedly, if we could not answer those questions (“we wanted to do the best we could, but don’t really know where our time and the budget went…”!) our credibility would suffer.  Of course, we wanted to know much more than that – our DEF would measure much more – but tracking outputs was basic and fundamental.
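That wells-and-training example amounts to a simple plan-versus-actual comparison, which could be sketched like this (the data structure is illustrative, not the actual DEF model):

```python
# Planned vs. delivered Outputs, using the example figures from the text.
# A flat dictionary is used purely for illustration.
planned = {"wells drilled": 18, "new mothers trained": 246}
delivered = {"wells drilled": 16, "new mothers trained": 279}

for output, target in planned.items():
    actual = delivered[output]
    pct = 100 * actual / target
    print(f"{output}: planned {target}, delivered {actual} ({pct:.0f}% of plan)")
```

Being able to produce even this simple report (89% of planned wells, 113% of planned training) was exactly the basic test of competence described above.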

To avoid the trap we had fallen into at Plan, where we ended up with many hundreds of Outputs, I was determined to keep things simple.  We had already planned to bring all our Program Managers to Sydney in October of 2009 for another purpose, and I managed to commandeer this key group: I locked them in a meeting room for a day with the task of listing all the outputs they were producing, and agreeing on a short and comprehensive list.  We would then work with this draft and use it as a starting point.

The process worked very well.  Our Program Managers produced a list of around 35 Output Indicators that covered, well-enough, pretty much all the work they were doing.  Over the next three years, as our programming evolved and matured, we ended up adding about 15 more Output Indicators, with the final list (as of March, 2014) as follows:

Screen Shot 2018-05-28 at 3.01.27 PM.png

 

This listing worked very well, enabling us to design, approve, monitor and manage project activities in an accountable way.  As will be seen when I describe our Development Effectiveness Framework, in the next article in this series, we incorporated processes for documenting ChildFund Australia’s planning for Output production through the project-development process, and for tracking actual Output delivery.

Outcome Indicators

Designing Outcome Indicators was a bigger challenge.  Several of our colleague ChildFund agencies (mostly the US member) had developed indicators that were Outcome-like, and I was aware of the work of several other INGOs that we could “borrow.”  Most importantly, as outlined above, I wanted to align our child-focused Outcome Indicators with the Millennium Development Goals as much as possible.  These were robust, scientific, reliable and, in most countries, measured fairly accurately.

As we drafted sets of Outcome Indicators and circulated them for comment with our Board Program Review Committee, Senior Management, and program staff, our CEO (Nigel Spence) was insistent that we keep the number of Outcome Indicators as small as possible.

I agreed with Nigel, in general (“keep things simple”) and in particular (in Plan we had been swamped by too many indicators, and never actually implemented either system).  But it was a big challenge to measure the lofty concepts included in our Theory of Change with just a few indicators!

When we finalized the first iteration, approved by our Board of Directors in June of 2010, we had only 16 Outcome Indicators:

Screen Shot 2018-05-28 at 3.16.59 PM.png

Screen Shot 2018-05-28 at 3.17.10 PM.png

 

 

Nigel thought this was too many; I thought we had missed covering several crucial areas.  So it seemed a good compromise!

It would take some time to work out the exact mechanism for measuring these Indicators in our field work, but in the end we were able to keep things fairly simple and we began to work with communities to assess change and determine attribution (more on that in the next article in this series.)

Additional Outcome Indicators were introduced over the next few years, elaborating especially the domains of “Protection” and “Power,” which were relatively undeveloped in that initial package of 16, finalized in June of 2010.

*

So, by the time I was celebrating one year at ChildFund Australia, we had agreed and approved a clear and comprehensive Theory of Change, a coherent and concise set of robust Outcome Indicators, and a complete set of (not too many) Output Indicators.

*

Looking back, I think we got this right.  The process was very inclusive and participatory, yet agile and productive.  The results were of high quality, reflecting the state of the art of our sector, and my own learning through the years.  It was a big step forward for ChildFund Australia.

This meant that the foundation for a strong Development Effectiveness Framework was in place, a framework which would help us make our program work as effective as possible in building brighter futures for children.  This was (if I do say so myself!) a huge achievement in such a complex organization, especially given that we accomplished it in only one year.

From the perspective of 2018, there is little I would change about how we took on this challenge, and what we produced.

*

My next article in this series will describe how we built the ChildFund Australia Development Effectiveness Framework on the foundation of our Theory of Change and Outcome and Output Indicators.  Stay tuned!

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (1): the ChildFund Australia International Program Team.