South Twin (41) – Disaster Risk Reduction

September, 2018

During my six years at ChildFund Australia, the program approach we developed early in my tenure made reducing vulnerability one of our biggest priorities.  This was new territory for us, with lots of learning and testing: what did reducing child vulnerability mean for ChildFund Australia?  What kinds of vulnerability would we address?  And where?

In this article I will focus on one aspect of vulnerability that we worked on: “disaster risk reduction” (DRR).  And, in particular, I want to highlight our participation in the United Nations World Conference on Disaster Risk Reduction, which took place in March, 2015, in Sendai, Japan.

*

I’ve been writing a series of blog posts about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall.  And, each time, I’ve also been reflecting a bit on the journey since I joined Peace Corps, 34 years ago: on development, social justice, conflict, experiences along the way, etc.

But first…

*

Last time I described how our Granddaughter V and I had climbed North Twin Mountain on 2 September 2017.  Since it was V’s first real mountain climb, we had agreed that we would decide about continuing to South Twin when we got to the top of North Twin.

We had arrived at the outlook near the top of North Twin at about 1:30pm and had lunch, getting to the top just after 2pm.  V was enthusiastic about continuing, and she was certainly keeping up with me on the climb – no problems at all – so we decided to continue on towards the top of South Twin (4902ft, 1494m) – the 8th-highest 4000-footer, and my 41st on this round.  It would be V’s second 4000-footer that day!


We took the North Twin Spur, leaving North Twin a bit after 2pm.  We dropped down into a saddle between North and South Twin, and walked through pleasant forest for 1.3 miles:

[photo]

Here the top of South Twin is visible in the near distance, with a few people at the summit:

[photo]

We arrived at the top of South Twin at 2:45pm, and spent some time there taking pictures.  It was clear and spectacular, and there weren’t many people.  These were probably the very best views of the White Mountains I’ve ever seen – the Presidentials to the northeast; Mt Bond and West Bond (Bondcliff was obscured by West Bond from this viewpoint) and Waterville Valley to the south; Franconia Ridge and Garfield over to the west:

 

[Photos: Summit Of South Twin – Franconia Ridge In The Background]

Here we turned around, leaving South Twin at around 3:15pm, heading back towards the trail-head.  In this section V briefly developed a painful cramp in her knee, but she was able to shake it off.  So we kept going…


We arrived back at the top of North Twin about an hour after leaving South Twin:

[photos]

And then we began the long walk back down North Twin trail, dropping down fairly steeply at first, then more gradually as we arrived at Little River.  I filmed V crossing the river at about 6pm:

[video]

I had estimated that we’d be back at the trail-head, where the car was parked, by 6:30pm. It turned out to be 6:51pm by the time we got there – a long, 8-hour hike, around 11 miles total, but a spectacular day.


South Twin was number 41 for me, and V’s second 4000-footer in one day!  She did a great job on her first 4000-footers.

Now I had only seven more to go.

*

Very early in my time at ChildFund Australia, we developed a program approach founded upon a comprehensive “theory of change.”  I’ve written about that earlier in this series.

We developed that program approach through a great process of reflection and collaboration.  In the end, our experience, learning, and reflection led us to understand that people are poor because:

  1. they are deprived of assets (human, capital, natural, and social);
  2. they are excluded from their societies, and are invisible (voice and agency);
  3. of power differentials in their families, communities, societies, and across nations.

And we understood that (4) children and youth are particularly vulnerable to risks in their environment, which can result in dramatic increases in poverty; they therefore require protection from physical and psycho-social threats, sexual abuse, natural and human-caused emergencies, slow-onset disasters, civil conflict, and so on.

Because we understood that these are the four causes of child poverty, we set ourselves the collective challenge of improving children’s futures by:

  • building human, capital, natural, and social assets around the child, including the caregiver;
  • building the voice and agency of poor people and poor children;
  • building the power of poor people and poor children; and
  • working to ensure that children and youth are protected from risks in their environments.

*

A few weeks ago I wrote about the third domain of our work, building the power of poor people and poor children.  That was a very new area of work for us.

Risk reduction was also new, though we had some experience with child protection.

We had designed outcome indicators which we would use, through our Development Effectiveness Framework, to measure the impact of all our work; they also gave a sense of our priorities.  There was one outcome indicator that corresponded to risk reduction:

Indicator 11: % of communities with a disaster preparedness plan based on a survey of risks, including those related to adaptation to anticipated climate change, relevant to local conditions, known to the community, and consistent with national standards.

I liked this outcome indicator.  It would enable us to reflect the results of our DRR work in a broad sense.  It showed connectivity with efforts related to climate change, and linked nicely with the work of others at country level.  Along with other areas of child protection, disaster risk reduction would become a priority for us.
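Concretely, the indicator reduces to a conjunction of criteria, assessed community by community.  Here is a minimal sketch in Python; the field names are my own illustration, not ChildFund’s actual survey schema:

```python
# Hypothetical per-community records; the real survey instrument was richer.
communities = [
    {"plan_exists": True, "based_on_risk_survey": True, "covers_climate_adaptation": True,
     "relevant_to_local_conditions": True, "known_to_community": True, "meets_national_standards": True},
    {"plan_exists": True, "based_on_risk_survey": False, "covers_climate_adaptation": False,
     "relevant_to_local_conditions": True, "known_to_community": False, "meets_national_standards": True},
]

def meets_indicator_11(community):
    """A community counts toward the indicator only if every criterion holds."""
    return all(community.values())

pct = 100 * sum(meets_indicator_11(c) for c in communities) / len(communities)
print(f"Indicator 11: {pct:.0f}% of communities")  # 50% in this toy example
```

The real instrument gathered far more detail, but the percentage-of-communities arithmetic is this simple.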

Here’s an illustration of why this made sense: this graph shows the dramatic increase in the number of people affected by natural disasters from 1900 through to 2011.

[Graph: the number of people affected by natural disasters, 1900–2011]

No organization working to create better futures for children – for human beings – can afford to ignore the fact that vulnerability is increasing very quickly, and dramatically.

*

And it was becoming very apparent that the Australian government was quite keen on both disaster response and tackling climate change.  AusAID had already set up a pre-approval mechanism for emergency-response work, through which a designated group of NGOs could respond very quickly to emergencies; ChildFund Australia wasn’t in that group.

But it all meant that, if we developed capacity, and a strong record of working in DRR and emergency response, there might be opportunities for collaboration with the Australian government, including the possibility of funding …

*

Our ChildFund Alliance partner in the US, ChildFund International, had been doing some good emergency response work back when I was doing my “Bright Futures”-related consulting work, around 2003, but the leadership that came on board later had (mistakenly, in my view) wound up that effort soon after.

(For me, the decision to exit emergency-response work was a mistake for several reasons.  Firstly, they were already doing good work in that space, which was helping children in very difficult situations.  Secondly, it was an area with great potential for fundraising, because of the urgency involved as well as the priority being put on this work by donors, particularly government bi-lateral donors.  Thirdly, because raising funds in this space was less costly than other revenue sources, overheads could be used to support program development in other areas.  And finally, as general child-poverty levels dropped and our climate changed, child poverty was becoming much more related to vulnerability, as we in Australia had determined in the design of our program approach.

ChildFund International would later get back into the emergency-response business, and we would begin to structure a formal collaboration with them, and with our Canadian partners, in the humanitarian space.  But we all lost time – for example, if ChildFund International had stayed engaged in emergency response, perhaps we in Australia might have qualified for AusAID’s pre-approval pool …)

And, likewise, there was little DRR-related work going on across the Alliance, and certainly we at ChildFund Australia had no significant track record in that area.  So if we were to build up our expertise, we needed to start by bringing it in from outside.

Happily, at around that time Nigel Spence (the ChildFund Australia CEO) and I were in a meeting in Canberra, talking about these issues.  A very senior AusAID leader told us that if we sent her an ambitious plan to help communities in southeast Asia prepare for disasters, at scale, it would be funded.  Especially if it involved a consortium.

So I went back to Sydney and got to work.  In short order I recruited two other preeminent Australian INGOs (Plan International Australia and Save the Children Australia) and wrote up an extremely ambitious proposal: we would reduce climate- and disaster-related risks in 1,000 communities in five countries across Asia, reaching approximately 362,500 children and adults directly, and over 1.5 million people indirectly.  Among the expected outcomes of the project, we would also seek to empower children and youth:

  • Children and youth will be recognised within communities as effective agents of positive change;
  • Children and youth will have increased knowledge and understanding, and thus capacity, to anticipate, plan and react appropriately to short-term risks and longer-term threats and trends.

These aims are very consistent with our presentation at the United Nations World Conference on Disaster Risk Reduction  that took place in Japan in March, 2015.  More on that conference below!

*

The response from AusAID was quite positive.  But:

Mistake #1: I didn’t get anything in writing!

However, given that we had gotten such a strong green light from Canberra, Nigel and I agreed that we’d go ahead and recruit a DRR expert to help us finalize the proposal and begin to prepare for the work.

*

We had a good response to our recruitment outreach and, in the end, I was lucky to recruit Sanwar Ali from Oxfam Australia to be our first “Senior Advisor for ER and DRR.”  Sanwar had long experience in emergency response across Asia and parts of Africa, and was also deeply experienced with DRR.  At the same time, I felt the Sydney International Program Team could benefit from Sanwar’s background – he would round out the team on several dimensions...

[Photo: Sanwar Ali, ChildFund Australia’s Senior Advisor for Emergency Response and Disaster Risk Reduction]

Soon after Sanwar joined, AusAID let us know that their “green light” was actually just encouragement to include DRR work in our normal project portfolio, so there would be no ambitious funding for scaling-up across Asia!

That was very frustrating.

No matter.  Our program approach had committed us to working to help protect children from disasters, and Sanwar would be key to that program development effort.  It was just unfortunate that the funding for his position evaporated!

*

Soon Sanwar led the development of policies for Emergency Response and Disaster Risk Reduction, which we would incorporate into the ChildFund Australia Program Handbook.

The ER Policy was an expanded update of our previous policy, but the DRR policy was new.  Like all our program policies, this one was rather brief.  Its introduction and policy statement were succinct:

Introduction

The frequency with which disasters are occurring is increasing dramatically, in part because of human-induced climate change. This trend represents a threat to children, youth, and caregivers, and has the potential to undermine progress made in improving wellbeing and reducing poverty.

At the same time, however, thanks to efforts of local communities, national governments, the international community, and INGOs such as ChildFund, the human impact of these disasters has been reduced over time.

Reducing risks for children, youth, and caregivers is central to ChildFund’s program approach, because communities which are resilient to risks are best positioned to provide security and ensure continued wellbeing of vulnerable children.

This policy provides an organisational framework for action related to disaster risk reduction.

Policy Statement

ChildFund Australia will work to ensure that disaster risk reduction (DRR) plans, known as Community Action Plans (CAPs), are in place in all communities where we work. These CAPs will be developed in a participatory manner, consistent with the Hyogo Framework for Action, and according to relevant guidelines in each country.

DRR efforts will be mainstreamed in our development, humanitarian and advocacy activities whenever appropriate.

The rest of the policy document outlined “Key Actions” required by ChildFund Australia staff at various levels and locations, and outlined how work in this area was connected to our organizational Outcome Indicators.

*

Sanwar and I worked together on two ChildFund Alliance-wide projects.  I had proposed that the operational Members of the Alliance (which, initially, meant Australia, Canada, and the US, with Japan and Korea observing) work together in Emergency Response and DRR, partly because we would be able to show global reach that way.  So we planned to develop a set of common policies and procedures through which we would respond to humanitarian disasters jointly.

I want to describe the other project that Sanwar and I worked on in a bit more detail.  I led the ChildFund Alliance delegation to the United Nations World Conference on Disaster Risk Reduction that took place in Japan in March, 2015; ChildFund had a big presence at that conference.  And, the week before the UN conference, I visited areas of Fukushima Prefecture that had been affected by the earthquake, tsunami, and nuclear disaster that had devastated the area exactly four years before.  My visit was part of the JCC2015 (Japan CSO Coalition for 2015 WCDRR) conference, which concluded the day after the field visit with an important program of sessions.

I will bring in some content from blogs I published here in 2015, just after my trip to Japan…

*

The visit to Fukushima was unforgettable. The impact of the horrific events of four years ago was still very apparent, as was the strong and continuing resilience of the local people, even those who (at that point) remained in “temporary” camps.

A good summary of the events of 2011, along with some lessons we should learn, is contained in the publication “Ten Lessons from Fukushima.” In brief, a massive (magnitude 9) earthquake, at 2:46pm on March 11, 2011, caused extensive damage across northern Japan, and triggered an enormous tsunami.  This tsunami struck coastal zones of northern Japan an hour after the earthquake, destroying vast areas and killing many. The Fukushima “Dai-Ichi” (number one) nuclear plant, located on the coast, was severely damaged by the tsunami. The next day at 3:36pm, core meltdown and a massive explosion destroyed reactor unit 1. Other reactors subsequently failed.

It has been estimated that the equivalent of 168 Hiroshima bombs’ worth of radiation was released when Fukushima Dai-Ichi reactor unit 1 exploded.

Evacuation orders were slow to come, partly due to the loss of communications facilities, but also due to startling management and leadership errors.   Eventually, after suffering serious exposure to radiation, some 300,000 people were evacuated from areas inside a 30km radius around the reactor complex.

This map shows the 30km evacuation zone, and the radiation plume.

[Map: the 30km evacuation zone and the radiation plume]

We would visit areas well within the red area during our trip.

Radiation drifted with the wind, falling onto land and people and animals, leaving extremely high levels of iodine, cesium, and other radioactive elements. Radioactivity fell across inhabited, farmed, and forested areas according to the wind direction at the time, and radioactive water was released into the sea – even four years later, when I visited, something like 600 tons of radioactive water were being released into the ocean each and every day. “Safe” levels of radiation were repeatedly raised.

Investigations found that the nuclear disaster was preventable; appropriate safety mechanisms – existing at the time – were not incorporated into the Fukushima Dai-Ichi reactors when they were built.

Another good document covering the disaster and its aftermath was created by the Citizens’ Commission on Nuclear Energy.

I had not realised that vast areas of Fukushima Prefecture were still closed due to extremely high radiation levels, and that the Fukushima Dai-Ichi reactor complex was dangerously unstable, and events could have spun out of control again at any time. Wreckage littered a vast area, and radiation in many of the places we visited was startlingly higher than what is considered to be safe, if “safe” levels even exist.  Radiation contamination seriously impeded recovery efforts, as workers could not stay in the area for very long. (I was struck by the contrast with Hurricane Katrina where, even with the bungled response, cleanup was far more advanced four years after the storm than what we saw in Fukushima… The difference?  The radiation.)

We visited Iitate village, a place where some of the highest levels of radiation were found just after the meltdown – it’s right near the center of the darkest plume in the map above. We visited Namie town, where we met with local officials who were doing their best to deal with the situation, spending their days in highly-contaminated areas.  We visited tsunami-affected areas of Namie, where we could see vast areas of wreckage, damaged housing, and vehicles crushed by the power of the waves.


And we drove to within 4km of the Fukushima Dai-Ichi plant.  We finished the day by visiting a group of evacuees from Namie town, at that point still in “temporary” housing in Kohri town. Their courage and resilience were powerful and inspiring.

Radiation levels on our bus were high. This reading, taken during our lunch break, is 0.82 microsieverts per hour, which is considered “safe” for short-term visits only.

[Photo: radiation meter reading]
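To put that reading in perspective, some back-of-envelope arithmetic; the 1 mSv/year figure below is the commonly cited ICRP guideline for additional public exposure, so treat the comparison as indicative:

```python
# What a 0.82 µSv/h dose rate would mean if sustained year-round.
rate_uSv_per_hour = 0.82
hours_per_year = 24 * 365
annual_dose_mSv = rate_uSv_per_hour * hours_per_year / 1000
print(round(annual_dose_mSv, 1))  # ~7.2 mSv/year
# Roughly seven times the ~1 mSv/year ICRP guideline for additional
# public exposure - hence "safe" only for short visits.
```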

Throughout the area, thousands of one-ton bags of soil, wood, debris, etc., were piling up. Topsoil from farms was being removed to a depth of 20cm; areas within 20m of houses were similarly being cleaned. I was told that these bags, nearly a million of them by then, would be taken to gigantic incinerators for processing.  Nineteen hugely-expensive incinerators were being built, each with a short lifetime of just two years.  Meanwhile, the residual ash was being stored, “temporarily”, at the Fukushima plant.


Radiation “hotspots” remained, however.

The catastrophe took place four years before our visit.  The situation when we were there is described in the “Ten Lessons from Fukushima” publication noted above, and in an article in The Economist at the time. There were still over 120,000 “nuclear refugees” who could not return to their communities, their homes, the areas we visited.  They probably never will, despite financial incentives being offered, because levels of contamination were still so high.

We met with a group of displaced people, from Namie town, now living in Kohri town in “temporary” housing far from their homes, to which they will never return. They lived virtually on top of each other, able to hear the softest sounds from their neighbours (such as snoring!) – quite a change from Namie, where their ancestral homes and farms were.


We were lucky to be presented with a narrative, using a hand-made storyboard, of their experience in the days after the tragedy.  It was a powerful and moving description of their suffering and, in the telling of the story, of their resilience and spirit.


Discussing Fukushima was sensitive in Japan. I found it interesting, the next week at the WCDRR conference in Sendai, that the many references to the Fukushima disaster referred to it as the “Great East Japan Earthquake” or the “Great East Japan Earthquake and Tsunami” disaster.  There was no reference to the horrific and ongoing events at the nuclear power plant, which were somehow deemed not relevant, or too sensitive to mention.

Some final reflections:

  • Several times during the day in Fukushima, we heard local people use a striking visual metaphor: “nuclear power is like a house without a toilet.” Until there is a safe and permanent place for nuclear waste, we should not be using this source of power;
  • In our increasingly-unstable world, the use of a technology with such potential for devastation and tragedy – Fukushima, Chernobyl – seems foolish. The precautionary principle should be followed;
  • Let’s try to remember that the impact of these mega-disasters still persists.  Four years after the disaster, there were 120,000 people still displaced, even in Japan, such a rich country.  And cleanup and stabilization in Chernobyl, even more than 30 years later, is still far from finished.

Ultimately, I took a sense of inspiration from the stories of the people affected, of those who responded and who are responding still, and those who are living lives of advocacy to ensure that the injustices that took place in Fukushima are corrected, and not repeated.

Reflecting on what we saw, those (government, TEPCO, etc.) who were saying that the area had recovered, or was on the road to recovery, were either not seeing what we saw, or not telling the truth.

*

I want to share links to two articles from The Guardian, describing the situation in 2017, two years after my visit to Fukushima: this article describes how radiation levels inside the collapsed reactor had risen to “extraordinary” levels, unexpectedly; and this article provides a lengthy update of what are described as “faltering” cleanup efforts.  They make for depressing reading.

*

On a more positive note, and with the importance of DRR clearly in my mind after the visit to Fukushima, I led the ChildFund delegation to the United Nations World Conference on Disaster Risk Reduction in Sendai.

I want to share content from a summary article I published in Devex soon after the conference ended:

After several sleepless nights of negotiations, representatives from 187 governments agreed the Sendai Framework for Disaster Risk Reduction 2015-2030. As one of more than 6,500 participants in the Sendai conference, I can attest to the exhausted sense of relief that many of us felt when the framework was finally announced.

The framework that emerged contains seven global targets — nonbinding and with funding left unspecified — focused on reducing disaster risk and loss of lives and livelihoods from disasters.

Sendai was a monumental effort, involving government representatives, senior-most leaders of the United Nations and several of its specialized agencies, staff from hundreds of civil society organizations and private sector businesses, and the media, with venues scattered across the city. The urgent need for international action to reduce disaster risks was thrown into stark relief when Cyclone Pam — one of the most intense storms ever to occur in the Pacific — tore across Vanuatu just as the conference began.

Why does what happened in Sendai matter? Because hazards — both man-made and natural — are growing, as our climate changes and growing inequality contributes to a sense of injustice in many populations. Doing nothing to prepare for these increased risks is not a viable option for our future.

Also, Sendai matters because the conference was the first of four crucial U.N. gatherings this year. What happened in Sendai will influence the Third International Conference on Financing for Development, coming in Ethiopia in July; the post-2015 sustainable development goals that will be discussed at the global U.N. summit in September; and the U.N. Climate Change Conference in Paris in December.

ChildFund’s delegation, part of the “Children in a Changing Climate” coalition, had several objectives in Sendai. In particular, we worked to make the case that children and young people should be seen as agents of change in any new DRR framework.

Children and young people are normally seen as helpless, passive victims of disasters. During and after emergencies, the mainstream media, even many organizations in our own international NGO sector, portray children and young people as needing protection and rescue. Of course, children and young people do need protection. When disasters strike they need rescue and care. But what such images fail to show is that children also have the capacity — and the right — to participate, not only in preparing for disasters but in the recovery process.

Since the last U.N. agreement on DRR, in 2005, we have learned that children and young people must be actively engaged so that they understand the risks of disasters in their communities and can play a role in reducing those risks. Children’s participation in matters that concern them is their right — enshrined in the 1989 Convention on the Rights of the Child — and strengthens their resilience and self-esteem.

And, crucially, we know that young people’s participation in DRR activities leads to better preparation within families and in communities.

My presentation at Sendai included examples of how youth brigades in the Visayas region in the Philippines helped in the Typhoon Haiyan response.

 

The example we used to make our case came from ChildFund in the Philippines.  In 2011, with support from local government and the Australian government, ChildFund worked with several youth groups to help them prepare for disasters, and to help them help their communities prepare. We engaged young people in Iloilo and Zamboanga del Norte provinces to identify hazards, to develop local DRR and disaster risk management plans, to train children and young people in disaster risk management, and to raise awareness of DRR in eight communities.

Little did we know that, just 18 months after the project concluded, this effort would really pay off.  Many of us remember vividly the images of Typhoon Haiyan barreling across the Philippines in November 2013, just north of where our project was carried out.  As local and national government in the Philippines began to respond to the typhoon, with massive support from the international community, we could see that the efforts of children and young people we had worked with were proving to be important elements in managing the impact of the storm.

Advocacy by children and young people during the project had led to the local government investing more in preparedness and mitigation, which was crucial as the storm hit. Young people trained in the project trained other groups of parents and youths, building the capacity of people who were affected by, and responded to, Haiyan. Local government units mobilized disaster risk reduction committees, including youth members, who were involved in evacuating families living in high-risk areas. Youth volunteers helped prepare emergency supplies and facilitated sessions for children in child-centered spaces that were set up after the typhoon passed.

This experience led ChildFund to strongly support elements of the Sendai Framework that recognize the importance of the meaningful participation of children and youth in DRR activities. We are happy to see the text calling for governments to engage with children and youth and to promote their leadership, and recognizing children and young people as agents of change who should be given the space and modalities to contribute to disaster risk reduction.

But two major weaknesses can be seen in the Sendai Framework: Its targets are not binding and are not quantified; and no global commitments to funding DRR actions were made. Many observers feel that governments were keen to establish (or not establish!) precedents at Sendai that would bind them (or not bind them!) in the high-stakes conferences to come. These weaknesses are serious, and greatly undercut the urgency of our task and likely effectiveness of our response.

Still, on balance, the Sendai Framework is good for children and youth, certainly better than failure to agree would have been. Let’s hope for even stronger action in Addis Ababa, New York and Paris, with binding targets and clear financial commitments.

Then our children, and grandchildren, will look back at Sendai as a milestone in building a better, fairer and safer world.

*

After the success of Sendai, Felipe Cala (then working with the ChildFund Alliance secretariat) and I took a couple of days to visit Kyoto, a marvelous place full of culture and beautiful scenery.  And we enjoyed the modern “bullet” trains, which made our countries look so backward.

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owls’ Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change;
  35. Bondcliff (35) – ChildFund Australia’s Development Effectiveness System;
  36. West Bond (36) – “Case Studies” in ChildFund Australia’s Development Effectiveness System;
  37. Mt Bond (37) – Impact Assessment in ChildFund Australia’s Development Effectiveness System;
  38. Mt Waumbek (38) – “Building the Power of Poor People and Poor Children…”
  39. Mt Cabot (39) – ChildFund Australia’s Teams In Cambodia, Laos, Myanmar, Papua New Guinea, and Viet Nam;
  40. North Twin (40) – Value for Money.

 

 

West Bond (37) – Impact Assessment in ChildFund Australia’s Development Effectiveness Framework

June, 2018

International NGOs do their best to demonstrate the impact of their work, to be accountable, to learn and improve.  But it’s very challenging and complicated to measure change in social-justice work, and even harder to prove attribution.  At least, to do these things in affordable and participatory ways…

Two times in Plan International, earlier in my career, I had worked to develop and implement systems that would demonstrate impact – and both times, we had failed.

In this article I want to describe how, in ChildFund Australia, we succeeded, and were able to build and implement a robust and participatory system for measuring and attributing impact in our work.

Call it the Holy Grail!

*

I’ve been writing a series of blog posts about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall.  And, each time, I’ve also been reflecting a bit on the journey since I joined Peace Corps, 33 years ago: on development, social justice, conflict, experiences along the way, etc.

So far, I’ve described climbing 36 of the 48 peaks, and covered my journey from Peace Corps in Ecuador (1984-86) through to my arrival in Sydney in 2009, where I joined ChildFund Australia as the first “International Program Director.”  This is my 37th post in the series.

In recent posts in this series I’ve been describing aspects of the ChildFund Australia “Development Effectiveness Framework” (“DEF”), the system that would help us make sure we were doing what we said we were going to do and, crucially, verify that we were making a difference in the lives of children and young people living in poverty.  So we could learn and improve our work…

There are three particular components of the overall DEF that I am detailing in more depth, because I think they were especially interesting and innovative.  In my previous blog I described how we used Case Studies to complement the more quantitative aspects of the system.  These Case Studies were qualitative narratives of the lived experience of people experiencing change related to ChildFund’s work, which we used to gain human insights, and to reconnect ourselves to the passions that brought us to the social-justice sector in the first place.

This time, I want to go into more depth on two final, interrelated components of the ChildFund Australia DEF: Outcome Indicator Surveys and Statements of Impact.  Together, these two components of the DEF enabled us to understand the impact that ChildFund Australia was making, consistent with our Theory of Change and organizational vision and mission.  Important stuff!

But first…

*

Last time I described climbing to the top of Mt Bond on 10 August 2017, after having gotten to the top of Bondcliff.  After Mt Bond, I continued on to West Bond (4540ft, 1384m), the last of three 4000-footers I would scale that day.  (But, since this was an up-and-back trip, I would climb Mt Bond and Bondcliff twice!  It would be a very long day.)

As I described last time, I had arrived at the top of Bondcliff at about 10:30am, having left the trail-head at Lincoln Woods Visitor Center just after 6:30am.  This early start was enabled by staying the night before at Hancock Campground on the Kancamagus road, just outside of Lincoln, New Hampshire.  From Bondcliff, I had reached the summit of Mt Bond at about 11:30am.

Now I would continue to the top of West Bond, and then retrace my steps to Lincoln Woods:

[Map: the route over Bondcliff, Mt Bond, and West Bond]

So, picking up the story from the top of Mt Bond, the Bondcliff Trail drops down fairly quickly, entering high-altitude forest, mostly pine and ferns.


After 20 minutes I reached the junction with the spur trail that would take me to the top of West Bond.  I took a left turn here.  The spur trail continues through forest for some distance:

[photos]

I reached the top of West Bond at 12:30pm.  The views were remarkable, and I was fortunate to have the summit to myself, so I had lunch and took my time there.

[Photo: Bondcliff From West Bond]

[Photo: At The Summit Of West Bond.  Franconia Ridge And Mt Garfield In The Background.  A Bit Tired!]

[Photo: Mt Bond, On The Left, And Bondcliff On The Right]

Here are two spectacular videos from the top of West Bond.  The first simply shows Bondcliff, with the southern White Mountains in the background:

[video]

And this second video is more of a full panorama, looking across to Owl’s Head, Franconia Ridge, Garfield, the Twins, Zealand, and back:

[video]

Isn’t that spectacular?!

After eating lunch at the top of West Bond, I left at a bit before 1pm, and began to retrace my steps towards Lincoln Woods.  To get there, I had to re-climb Mt Bond and Bondcliff.

I reached the top of Mt Bond, for the second time, at 1:20pm.  The view down towards Bondcliff was great!:

[Photo: Bondcliff From The Top Of Mt Bond, Now Descending…]

Here is a view from near the saddle between Mt Bond and Bondcliff, looking up at the latter:

[Photo: Looking Up At Bondcliff]

As I passed over Bondcliff, at 2:15pm, I was slowing down, and my feet were starting to be quite sore.  I was beginning to dread the descent down Bondcliff, Wilderness, and Lincoln Woods Trails… it would be a long slog.

Here’s a view from there back up towards Mt Bond:

[Photo: A Glorious White Mountain Day – Mt Bond And West Bond, From Bondcliff]

But there were still 8 or 9 miles to go!  And since I had declined the kind offer I had received to ferry my car up to Zealand trail-head, which would have saved me 3 miles, I had no other option but to walk back to Lincoln Woods.

It was nearly 5pm by the time I reached the junction with the Wilderness Trail.  By that time, I was truly exhausted, and my feet were in great pain, but (as I said) I had no option but to continue to the car: no tent or sleeping bag, and no phone service there.

The Lincoln Woods Trail, as I’ve described in more detail elsewhere, is long and flat and wide, following the remnants of an old forest railway:

[Photos: Sleepers From The Old Forestry Railway]

[Photo: scratches from walking poles?]

It was around 5:30pm when I got to the intersection with the Franconia Brook Trail, which leads towards Owl’s Head.


It was a very long slog down Lincoln Woods Trail – put one foot in front of the other, and repeat!  And repeat and repeat and repeat and repeat …

It was 6:40pm when I finally reached the Lincoln Woods Visitor Center, where I had parked my car at 6:30am that morning – three 4000-footers, 22 miles, and a pair of injured feet, all in just over 12 hours.

Looking back, I had accomplished a great deal, and the views from the tops of three of New Hampshire’s highest and most beautiful mountains were amazing.  But, at the time, I had little feeling of accomplishment!

[Photo: Knackered!]

*

Here is the diagram I’ve been using to describe the ChildFund Australia DEF:

[Figure 1: The ChildFund Australia Development Effectiveness Framework]

 

In this article I want to describe two components of the DEF: #2, the Outcome Indicator Surveys; and #12, how we produced “Statements of Impact.”  Together, these two components enabled us to measure the impact of our work.

First, some terminology: as presented in an earlier blog article in this series, we had adopted fairly standard definitions of some related terms, consistent with the logical framework approach used in most mature INGOs:

[Table: definitions of Project, Program, and Impact]

According to this way of defining things (sketched in code after the list):

  • A Project is a set of Inputs (time, money, technology) producing a consistent set of Outputs (countable things delivered in a community);
  • A Program is a set of Projects producing a consistent set of Outcomes (measurable changes in human conditions related to the organization’s Theory of Change);
  • Impact is a set of Programs producing a consistent set of changes to Outcome Indicators as set forth in the organization’s Strategic Plan.
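A minimal sketch of this hierarchy in Python; the class and field names are mine, for illustration, not an actual ChildFund schema:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Project:
    inputs: List[str]    # time, money, technology
    outputs: List[str]   # countable things delivered in a community

@dataclass
class Program:
    projects: List[Project]
    outcomes: List[str]  # measurable changes in human conditions

@dataclass
class StrategicPlan:
    programs: List[Program]
    outcome_indicators: List[str]  # where Impact is ultimately measured
```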

But that definition of “Impact,” though clear and correct, wasn’t nuanced enough for us to design a system to measure it.  More specifically, before figuring out how to measure “Impact,” we needed to grapple with two fundamental questions:

  • How “scientific” did we want to be in measuring impact?  In other words, were we going to build the infrastructure needed to run randomized control group trials, or would we simply measure change in our Outcome Indicators?  Or somewhere in between?;
  • How would we gather data about change in the communities where we worked?  A census, surveying everybody in a community, which would be relatively costly?  If not, what method for sampling would we use that would enable us to claim that our results were accurate (enough)?

*

The question “how ‘scientific’ did we want to be” when we assessed our impact was a fascinating one, getting right to the heart of the purpose of the DEF.  The “gold standard” at that time, in technical INGOs and academic institutions, was to devise “randomized control group” trials, in which you would: implement your intervention in some places, with some populations; identify ahead of time a comparable population that would serve as a “control group” where you would not implement that intervention; and then compare the two groups after the intervention had concluded.

For ChildFund Australia, we needed to decide if we would invest in the capability to run randomized control group trials.  It seemed complex and expensive but, on the other hand, it would have the virtue of placing us at the forefront of the sector and, therefore, appealing to technical donors.

When we looked at other comparable INGOs, in Australia and beyond, there were a couple that had gone that direction.  When I spoke with my peers in some of those organizations, they were generally quite cautious about the randomized control trial (“RCT”) approach: though appealing in principle, in practice it was complex, requiring sophisticated technical staff to design and oversee the measurements, and to interpret results.  So RCTs were very expensive.  Because of the cost, people with practical experience in the matter recommended using RCTs, if at all, only for particular interventions that were either expensive or of special interest for other reasons.

For ChildFund Australia, this didn’t seem suitable, mainly because we were designing a comprehensive system that we hoped would allow us to improve the effectiveness of our development practice, while also involving our local partners, authorities, and people in communities where we worked.  Incorporating RCTs into such a comprehensive system would be very expensive, and would not involve local people in any meaningful way.

The other option we considered, and ultimately adopted, hinged upon an operational definition of “Impact.”  Building on the general definition shown above (“Impact is a set of Programs producing a consistent set of changes to Outcome Indicators as set forth in the organization’s Strategic Plan”), operationally we decided that:

[Operational definition of impact]

In other words, we felt that ChildFund could claim that we had made a significant impact in the lives of children in a particular area if, and only if:

  1. There had been a significant, measured, positive change in a ChildFund Australia Outcome Indicator; and
  2. Local people (community members, organizations, and government staff) determined in a rigorous manner that ChildFund had contributed to a significant degree to that positive change.

In other words:

  • If there was no positive change in a ChildFund Australia Outcome Indicator over three years (see below for a discussion of why we chose three years), we would not be able to claim impact;
  • If there was a positive change in a ChildFund Australia Outcome Indicator over three years, and local people determined that we had contributed to that positive change, we would be able to claim impact.

(Of course, sometimes there might be a negative change in a ChildFund Australia Outcome Indicator, which would have been worse if we hadn’t been working in the community.  We were able to handle that situation in practice, in community workshops.)
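Putting the pieces together, here is a minimal sketch of that decision logic, assuming a significance flag from the survey analysis and an attribution rating from the workshop; the names and thresholds are illustrative, not ChildFund’s documented code:

```python
def can_claim_impact(baseline, endline, change_significant, attribution):
    """ChildFund claims impact on an indicator only if (1) the indicator
    improved significantly between the baseline and the three-year
    re-survey, and (2) the community workshop attributed at least some
    of that change to ChildFund."""
    improved = endline > baseline and change_significant
    attributed = attribution in ("some", "a lot", "completely")
    return improved and attributed

print(can_claim_impact(0.52, 0.64, True, "a lot"))     # True
print(can_claim_impact(0.52, 0.64, True, "a little"))  # False: change wasn't ours
print(can_claim_impact(0.64, 0.52, True, "a lot"))     # False: no positive change
```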

I felt that, if we approached measuring impact in this way, it would be “good enough” for us – perhaps not as academically robust as using RCT methods, but (if we did it right) certainly good enough for us to work with local people to make informed decisions, together, about improving the effectiveness of our work, and to make public claims about the impact of our work.

So that’s what we did!

*

As a reminder, soon after I had arrived in Sydney we had agreed a “Theory of Change” which enabled us to design a set of organization-wide Outcome Indicators.  These indicators, designed to measure the status of children related to our Theory of Change, were described in a previous article, and are listed here:

[Tables: the ChildFund Australia Outcome Indicators]

These Outcome Indicators had been designed with technical rigor, and were therefore robust.  And they had been derived from the ChildFund Australia Vision, Mission, and Program Approach, so they measured changes that were organically related to the claims we were making in the world.

So we needed to set up a system to measure these Outcome Indicators; this would become component #2 in the DEF (see Figure 1, above).  And we had to design a way for local partners, authorities, and (most importantly) people from the communities where we worked to assess changes to these Outcome Indicators and reach informed conclusions about who was responsible for causing the changes.

First, let me outline how we measured the ChildFund Australia Outcome Indicators.

*

Outcome Indicator Surveys (Component #2 in Figure 1, Above)

Because impact comes rather slowly, an initial, baseline survey was carried out in each location and then, three years later, another measurement was carried out.  A three-year gap was somewhat arbitrary: one year was too short, but five years seemed a bit long.  So we settled on three years!

Even though we had decided not to attempt to measure impact using complex randomized control trials, these survey exercises were still quite complicated, and we wanted the measurements to be reliable.  This was why we ended up hiring a “Development Effectiveness and Learning Manager” in each Country Office – to support the overall implementation of the DEF and, in particular, to manage the Outcome Indicator Surveys.  And these surveys were expensive and tricky to carry out, so we usually hired students from local universities to do the actual surveying.

Then we needed to decide what kind of survey to carry out.  Given the number of people in the communities where we worked, we quickly determined that a “census,” that is, interviewing everybody, was not feasible.

So I contacted a colleague at the US Member of the ChildFund Alliance, who was an expert in this kind of statistical methodology.  She strongly advised me to use the survey method that they (the US ChildFund) were using, called “Lot Quality Assurance Sampling” (LQAS), which also seemed to be less expensive than other survey methodologies.
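For readers unfamiliar with LQAS: each “lot” (a supervision area) gets a small fixed sample, often 19 respondents, and a simple binomial decision rule classifies the lot as reaching a coverage benchmark or not.  A minimal sketch of how such a rule can be derived; the parameters are illustrative, and our actual survey designs may have differed:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_decision_rule(n, p_target, p_low, alpha=0.10, beta=0.10):
    """Smallest d such that a lot with true coverage p_target is wrongly
    'failed' with probability <= alpha, and a lot with true coverage
    p_low is wrongly 'passed' with probability <= beta."""
    for d in range(n + 1):
        fail_good = binom_cdf(d - 1, n, p_target)   # P(X < d | p_target)
        pass_bad = 1 - binom_cdf(d - 1, n, p_low)   # P(X >= d | p_low)
        if fail_good <= alpha and pass_bad <= beta:
            return d
    return None  # no rule satisfies both risks at this sample size

# Example: 19 respondents per lot, 80% coverage benchmark, 50% lower threshold.
print(lqas_decision_rule(19, 0.80, 0.50))  # 13: pass a lot if >= 13 of 19 meet the indicator
```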

(In many cases, during this period, we relied on technical recommendations from ChildFund US.  They were much bigger than the Australian Member, with excellent technical staff, so this seemed logical and smart.  But, as with Plan International during my time there, the US ChildFund Member had very high staff turnover, which led to many changes in approach.  In practice this meant that, although ChildFund Australia had adopted several of the Outcome Indicators that ChildFund US was using, in the interests of commonality, and had begun to use LQAS for the same reason, the US Member was soon changing its Indicators and abandoning LQAS because new staff felt it wasn’t the right approach.  This led to the US Member expressing some disagreement with how we, in Australia, were measuring impact – even though we were following their (previous) recommendations!  Sigh.)

Our next step was to carry out baseline LQAS surveys in each field location.  It took time to accomplish this, as even the relatively-simple LQAS was a more complex exercise than we were typically used to.  Surveys were supervised by the DEL Managers, and usually carried out by students from local universities.  Finally, the DEL Managers prepared baseline reports summarizing the status of each of the ChildFund Australia Outcome Indicators.

Then we waited three years and repeated the same survey in each location.

(In an earlier article I described how Plan International, where I had worked for 15 years, had failed twice to implement a DEF-like system, at great expense.  One of the several mistakes that Plan had made was that they never held their system constant enough to be comparable over time.  In other words, in the intervening years after measuring a baseline, they tinkered with [“improved”] the system so much that the second measurement couldn’t be compared to the first one!  So it was all for naught, useless.  I was determined to avoid this mistake, so I was very reluctant to change our Outcome Indicators after they were set, in 2010; we did add a few Indicators as we deepened our understanding of our Theory of Change, but that didn’t get in the way of re-surveying the Indicators that we had started with, which didn’t change.)

Once the second LQAS survey was done, three years after the baseline, the DEL Manager would analyze differences and prepare a report, along with a translation of the report that could be shared with local communities, partners, and government staff.  The DEL Manager, at this point, did not attempt to attribute changes to any particular development actor (local government, other NGOs, the community themselves, ChildFund, etc.), but did share the results with the communities for validation.
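One simple way to flag whether an indicator changed significantly between the two rounds is a two-proportion z-test.  This sketch is illustrative only, under the simplifying assumption that the aggregated LQAS responses behave like simple random samples; it is not necessarily the exact test our DEL Managers used:

```python
from math import sqrt, erf

def two_proportion_z(p1, n1, p2, n2):
    """Two-sided p-value for a change in an indicator between a baseline
    (proportion p1 from n1 respondents) and a re-survey (p2 from n2)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # 2 * (1 - Phi(|z|))
    return z, p_value

# e.g. primary-school completion moving from 52% to 64%, 380 respondents per round
z, p = two_proportion_z(0.52, 380, 0.64, 380)
print(round(z, 2), round(p, 4))  # z ~ 3.35, p ~ 0.0008: a significant change
```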

Attribution, rather, was handled by the final DEF component I want to describe.

*

Statements of Impact (Component #12 in Figure 1, Above)

The most exciting part of this process was how we used the changes measured over three years in the Outcome Indicators to assess Impact (defined, as described above, as change plus attribution).

The heart of this process was a several-day-long workshop at which local people would review and discuss changes in the Outcome Indicators, and attribute the changes to different actors in the area.  In other words, if a particular indicator (say, the percentage of boys and girls between 12 and 16 years of age who had completed primary school) had changed significantly, people at the workshop would discuss why the change had occurred – had the local education department done something to cause the change?  Had ChildFund had an impact?  Other NGOs?  The local community members themselves?

Finally, people in the workshop would decide the level of ChildFund’s contribution to the change (“attribution”) on a five-point scale: none, a little, some, a lot, or completely.  This assessment, made by local people in an informed and considered way, would then serve as the basic content for a “Statement of Impact” that would be finalized by the DEL Manager together with his or her senior colleagues in-country, Sydney-based IPT staff and, finally, myself.
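Here is a minimal sketch of tallying those ballots; the tie-breaking rule (toward the lower rating) is my own conservative assumption, not ChildFund’s documented procedure:

```python
from collections import Counter

SCALE = ["none", "a little", "some", "a lot", "completely"]

def attribution_result(ballots):
    """Return the modal attribution rating from a list of workshop ballots,
    breaking ties toward the lower (more conservative) rating."""
    counts = Counter(ballots)
    return max(SCALE, key=lambda s: (counts[s], -SCALE.index(s))), counts

rating, tally = attribution_result(["some", "a lot", "a lot", "some", "a lot", "a little"])
print(rating, dict(tally))  # a lot {'some': 2, 'a lot': 3, 'a little': 1}
```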

*

We carried out the very first of these “Impact” workshops in Svay Rieng, Cambodia, in February 2014.  Because this was the first of these important workshops, DEL Managers from Laos and Viet Nam attended, to learn, along with three of us from Sydney.

Here are some images of the ChildFund team as we gathered and prepared for the workshop in Svay Rieng:

[photos]

Here are images of the workshop.  First, I’m opening the session:

[photo]

Lots of group discussion:

[photo]

The DEL Manager in Cambodia, Chan Solin, prepared a summary booklet for each participant in the workshop.  These booklets were a challenge to prepare, because they would be used by local government, partners, and community members; but Solin did an outstanding job.  (He also prepared the overall workshop, with Richard Geeves, and managed proceedings very capably.)  The booklet presented the results of the re-survey of the Outcome Indicators as compared with the baseline:

[photos]

Here participants are discussing results, and attribution to different organizations that had worked in Svay Rieng District over the three years:

[photo]

Subgroups would then present their discussions and recommendations for attribution.  Note the headphones – since this was our first Impact Workshop, and ChildFund staff were attending from Laos, Viet Nam, and Australia in addition to Cambodia, we provided simultaneous translation:

[photo]

Here changes in several Outcome Indicators over the three years (in blue and red) can be seen.  The speaker is describing subgroup deliberations on attribution of impact to the plenary group:

[photos]

Finally, a vote was taken to agree the attribution of positive changes to Outcome Indicators.  Participants voted according to their sense of ChildFund’s contribution to the change: none, a little, some, a lot, or completely.  Here is a ballot and a tabulation sheet:

[photo]

Finally, here is an image of the participants in that first Statement of Impact workshop: local community members, government staff, and ChildFund staff from the local area, the Country Office, Sydney, and neighboring Viet Nam:

[photo]

*

Once the community workshops were finished, our local Senior Management would review the findings and propose adjustments to our work.  Then the DEL Managers would prepare a final report, which we described as “Statements of Impact.”

Generally speaking, these reports would include:

  • An introduction from the Country Director;
  • A description of the location where the Statement of Impact was produced, and a summary of work that ChildFund had done there;
  • An outline of how the report was produced, noting the three-year gap between baseline and repeat survey;
  • Findings agreed by the community regarding changes to each Outcome Indicator along with any attribution of positive change to ChildFund Australia;
  • Concluding comments and a plan of action for improvement, agreed by the local Country Office team and myself.

Examples of these reports are shared below.

*

This process took some time to get going, because of the three-year delay to allow for re-surveying, but once it commenced it was very exciting.  Seeing the “Statement of Impact” reports come through to Sydney, in draft, from different program countries, was incredible.  They showed, conclusively, that ChildFund was really making a difference in the lives of children, in ways that were consistent with our Theory of Change.

Importantly, they were credible, at least to me, because they showed some areas where we were not making a difference, either because we had chosen not to work in a particular domain (to focus on higher priorities) or because we needed to improve our work.

*

I’m able to share four ChildFund Australia Statements of Impact, downloaded recently from the organization’s website; they were produced as described in this blog article.

*

Here are a few of the findings from that first “Statement of Impact” in Svay Chrum:

  • ChildFund made a major contribution to the increase in primary-school completion in the district:

[chart]

  • Although the understanding of diarrhea management had improved dramatically, it was concluded that ChildFund had not contributed to this, because we hadn’t implemented any related projects.  “Many development actors contributed to the change”:

[chart]

  • ChildFund had a major responsibility for the improvement in access to hygienic toilets in the district:

[chart]

  • ChildFund made a significant contribution to the increase in access to improved, affordable water in the district:

[chart]

  • ChildFund made a major contribution to large increases in the percentage of children and youth who reported having opportunities to voice their opinions:

Screen Shot 2018-06-27 at 8.56.08 AM.png

  • Although the percentage of women of child-bearing age in the district who were knowledgeable about how to prevent HIV infection had increased, it was determined that ChildFund had made only a minor contribution to this improvement.  And the group made recommendations regarding youth knowledge, which had actually declined:

Screen Shot 2018-06-27 at 8.57.47 AM.png

 

To me, this is fantastic stuff, especially given that the results emerged from deep and informed consultations with the community, local partners, and local authorities.  Really, this was the Holy Grail – accountability, and lots of opportunity for learning.  The results were credible to me, because they seemed to reflect the reality of what ChildFund had worked on, and pointed out areas where we needed to improve; the report wasn’t all positive!

*

For me, the way that the Outcome Indicator Surveys and Statements of Impact worked was a big step forward, and a major accomplishment.  ChildFund Australia now had a robust and participatory way of assessing impact, so that we could confidently take steps to improve our work.  With these last two components of the DEF coming online, we had managed to put in place a comprehensive development-effectiveness system, the kind of system that we had not been able to implement in Plan.

As I shared the DEF – its design, and the documents and reports it produced – with our teams, partners, the Australian government, and donors, I began to get lots of positive feedback.  At least for its time, in Australia, the ChildFund Australia DEF was the most comprehensive, robust, participatory, and useful system that anybody had seen put into place.  Not the most scientific, perhaps, but something much better: usable, useful, and empowering.

*

My congratulations and thanks to the people who played central roles in creating, implementing, and supporting the DEF:

  • In Sydney: Richard Geeves and Rouena Getigan;
  • And the DEL Managers in our Country Offices: Chan Solin (Cambodia), Joe Pasen (PNG), Marieke Charlet (Laos), and Luu Ngoc Thuy and Bui Van Dung (Viet Nam).

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owls’ Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change;
  35. Bondcliff (35) – ChildFund Australia’s Development Effectiveness System;
  36. Mt Bond (36) – “Case Studies” in ChildFund Australia’s Development Effectiveness System.

 

 

Mt Bond (36) – “Case Studies” In ChildFund Australia’s Development Effectiveness Framework

June, 2018

I’ve been writing a series of blog posts about climbing each of the 48 mountains in New Hampshire that are at least 4000 feet tall.  And, each time, I’ve also been reflecting a bit on the journey since I joined Peace Corps, 33 years ago: on development, social justice, conflict, experiences along the way, etc.

So far, I’ve described climbing 35 of the 48 peaks, and covered my journey from Peace Corps in Ecuador (1984-86) through to my arrival in Sydney in 2009, where I joined ChildFund Australia as the first “International Program Director.”

Last time I described the ChildFund Australia “Development Effectiveness Framework,” the system that would help us make sure we were doing what we said we were going to do and, crucially, verifying that we were making a difference in the lives of children and young people living in poverty.  So we could learn and improve our work…

This time, I want to go into more depth on one component of the DEF, the “Case Studies” that described the lived experience of people that we worked with.  Next time, I’ll describe how we measured the impact of our work.

But first…

*

On 10 August, 2017, I climbed three 4000-footers in one very long day: Bondcliff (4265ft, 1300m), Mt Bond (4698ft, 1432m), and West Bond (4540ft, 1384m).  This was a tough day, covering 22 miles and climbing three very big mountains.  At the end of the hike, I felt like I was going to lose the toenails on both big toes (which, in fact, I did!) … it was a bit much!

Last time I wrote about climbing to the top of Bondcliff, the first summit of that day.  This time, I will describe the brief walk from there to the top of Mt Bond, the tallest of the three Bonds.  And next time I’ll finish describing that day, with the ascent of West Bond and the return to the trail-head at Lincoln Woods.

*

As I described last time, I arrived at the top of Bondcliff at about 10:30am, having left the trail-head at Lincoln Woods Visitor Center just after 6:30am.  I was able to get an early start because I had stayed the night before at Hancock Campground on the Kancamagus road, just outside of Lincoln, New Hampshire.

It was a bright and mostly-sunny day, with just a few clouds and some haze.  The path between Bondcliff and Mt Bond is quite short – really just dropping down to a saddle, and then back up again, only 1.2 miles:

Bond Map - 6b

 

It took me about an hour to cover that distance and reach the top of Mt Bond from Bondcliff at 11:30am.  The path was rocky as it descended from Bondcliff, in the alpine zone, with many large boulders as I began to go back up towards Mt Bond – some scrambling required.

This photo was taken at the saddle between Bondcliff and Mt Bond: on the left is Bondcliff, on the right is West Bond, and in the middle, in the distance, is Franconia Ridge; Mt Bond is behind me.  A glorious view on an amazing day for climbing:

IMG_1929.jpg

From the Left: Bondcliff, Franconia Ridge, West Bond

 

It got even steeper climbing up from the saddle to the summit, passing through some small pine shrubs, until just before the top.

The views were spectacular at the summit of Mt Bond, despite the sky being slightly hazy – I could see the four 4000-footers of the Franconia Ridge to the west and Owl’s Head in the foreground, the Presidential Range to the east, and several other 4000-footers to the south and south-west:

IMG_1948 (1)

Looking To The West From The Summit Of Mt Bond

 

And I had a nice view back down the short path from the top of Bondcliff:

IMG_1943 (1)

 

There were a few people at the top, and I had a brief conversation with a couple who were walking from the Zealand trailhead across the same three mountains I was climbing, finishing at Lincoln Woods.  This one-way version of my up-and-back route was possible because they had left a car at Lincoln Woods and driven to the Zealand trailhead in a second vehicle.  They would then ferry themselves back to Zealand from Lincoln Woods.

Kindly, they offered to pick up my car down at Lincoln Woods and drive it to Zealand, which would have saved me three miles.  I should have accepted, because finishing what became 22 miles, and three 4000-foot peaks, would end up hobbling me for a while, and causing two toenails to come off!  But I didn’t have a clear sense of how the day would go, so I declined their offer, with sincere thanks…

Getting to the top of Mt Bond was my 36th 4000-footer – just 12 more to go!

I didn’t stay too long at the top of Mt Bond on the way up, continuing towards West Bond… stay tuned for that next time!

*

Jean and I had moved to Sydney in July of 2009, where I would take up the newly-created position of International Program Director for ChildFund Australia.  It was an exciting opportunity for me to work in a part of the world I knew and loved (Southeast Asia: Cambodia, Laos, Myanmar and Viet Nam) and in a challenging new country (Papua New Guinea).  It was a great chance to work with some really amazing people – in Sydney and in our Country Offices… to use what I had learned to help build and lead effective teams.  Living in Sydney would not be a hardship post, either!  Finally, it was a priceless chance for me to put together a program approach that incorporated everything I had learned to that point, over 25 years working in poverty reduction and social justice.

In the previous article in this series, I described how we developed the “Development Effectiveness Framework” (“DEF”) for ChildFund Australia, and I went through most of the components of the DEF in great detail.

My ambition for the DEF was to bring together our work into one comprehensive system – building on our Theory of Change and organizational Vision and Mission, creating a consistent set of tools and processes for program design and assessment, and making sure to close the loop with defined opportunities for learning, reflection, and improvement.

Here is the graphic that we used to describe the system:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework (2014)

 

As I said last time, I felt that three components of the DEF were particularly innovative, and worth exploring in more detail in separate blog articles:

  • I will describe components #2 (“Outcome Indicator Surveys”) and #12 (“Statement of Impact”) in my next article.  Together, these components of the DEF were meant to enable us to measure the impact of our work in a robust, participatory way, so that we could learn and improve;
  • This time, I want to explore component #3 of the DEF: “Case Studies.”

*

It might seem strange to say it this way, but the “Case Studies” were probably my favorite of all the components of the DEF!  I loved them because they offered direct, personal accounts of the impact of projects and programs from children, youth, men and women in the communities where ChildFund worked, and from the staff and officials of the local agencies and government offices with whom ChildFund partnered.  We didn’t claim that the Case Studies were random or representative samples; rather, their value was simply as stories of human experience, offering insights that would not have been readily gained from quantitative data.

Why was this important?  Why did it appeal to me so much?

*

Over my years working with international NGOs, I had become uneasy with the trend towards exclusive reliance on linear logic and quantitative measurement in our international development sector.  This is perhaps a little ironic, since I had joined the NGO world having been educated as an engineer, schooled in the application of scientific logic and numerical analysis to practical problems in the world.

Linear logic is important, because it introduces rigor in our thinking, something that had been weak or lacking when I joined the sector in the mid-1980s.  And quantitative measurement, likewise, forced us to face evidence of what we had or had not achieved. So both of these trends were positive…

But I had come to appreciate that human development was far more complex than building a water system (for example), much more complicated than we could fully capture in linear models.  Yes, a logical, data-driven approach was helpful in many ways, perhaps nearly all of the time, but it didn’t seem to fit every situation in the communities that I came to know in Latin America, Africa, and Asia.  In fact, I began to see that an over-emphasis on linear approaches to human development was blinding us to the ways that more qualitative, non-linear thinking could help; we seemed to be dismissing the qualitative, narrative insights that should also have been at the heart of our reflections.  There was no reason not to include both quantitative and qualitative measures.  But we weren’t including both.

My career in international development began at a time when private-sector business culture started to influence our organizations in a big way: after the Ethiopian famine of the mid-1980’s, INGOs were booming and, as a result, professionalizing.  All the big INGOs started to bring in people from the business world, helping to “professionalize” our work.

I’ve written elsewhere about the positive and negative effects that business culture had on NGOs: on the positive side, we benefited from systems and approaches that improved the internal management of our agencies, such as clear delegations of authority, financial planning and audit, etc.  Overall, it was a very good, and very necessary, evolution.

But there were some negatives.  In particular, the influx of private-sector culture into our organizations meant that:

  • We began increasingly to view the world as a linear, logical place;
  • We came to embrace the belief that bigger is always better;
  • “Accountability” to donors became so fundamental that sometimes it seemed to be our highest priority;
  • Our understanding of human nature, of human poverty, evolved towards the purely material, things that we could measure quantitatively.

I will attach a copy of the article I wrote on this topic here:  mcpeak-trojan-horse.

In effect, this cultural shift emphasized linear logic and quantitative measures to such a degree, and with such force, that narrative, qualitative approaches were sidelined as, somehow, not business-like enough.

As I thought about the overall design of the DEF, I wanted to make 100% sure that we were able to measure the quantitative side of our work, the concrete outputs that we produced and the measurable impact that we achieved (more on that next time).  Because the great majority of our work was amenable to that form of measurement, and being accountable for delivering the outputs (projects, funding) that we had promised was hugely important.

But I was equally determined that we would include qualitative elements that would enable us to capture the lived experience of people facing poverty.  In other words, because poverty is experienced holistically by people, including children, in ways that can be captured both quantitatively and qualitatively, we needed to incorporate both measurement approaches if we were to be truly effective.

The DEF “Case Studies” were one of the ways that we accomplished this goal.  It made me proud that we were successful in this regard.

*

There was another reason that I felt the DEF Case Studies were so valuable, perhaps just as important as the way they enabled us to measure poverty more holistically.  Observing our organizations, and seeing my own response to how we were evolving, I clearly saw that the influence of private-sector business culture was having both positive and negative effects.

One of the most negative impacts I saw was an increasing alienation of our people from the basic motivations that had led them to join the NGO sector, a decline in the passion for social justice that had characterized us.  Not to exaggerate, but it seemed that we were perhaps losing our human connection with the hope and courage and justice that, when we were successful, we helped create for individual women and men, girls and boys.  The difference we were making in the lives of individual human beings was becoming obscured behind the statistics we used and the mechanical approaches we took to our work.

Therefore, I was determined to use the DEF Case Studies as tools for reconnecting us, ChildFund Australia staff and board, to the reason that we joined in the first place.  All of us.

*

So, what were the DEF Case Studies, and how were they produced and used?

In practice, Development Effectiveness and Learning Managers in ChildFund’s program countries worked with other program staff and partners to write up Case Studies that depicted the lived experience of people involved in activities supported by ChildFund.  The Case Studies were presented as narratives, with photos, which sought to capture the experiences, opinions and ideas of the people concerned, in their own words, without commentary.  They were not edited to fit a success-story format.  As time went by, our Country teams started to add a summary of their reflections to the Case Studies, describing their own responses to the stories told there.

Initially we found that field staff had a hard time grasping the idea, because they were so used to reporting their work in the dry, linear, quantitative style that had become standard.  Perhaps program staff felt that narrative reports were the territory of our Communications teams, meant for public-relations purposes, describing our successes in a way that could attract support for our work.  Nothing wrong with that, they seemed to feel, but not a program thing!

Staff seemed at a loss, unable to get going.  So we prepared a very structured template for the Case Studies, specifying length and tone and approach in detail.  This was a mistake: we really wanted to encourage creativity while keeping the documents brief, emphasizing the “voice” of people in communities rather than our own views, and covering failures as much as successes, but the template tended to lead our program staff into a rigid, mechanical view of our work.  So, once staff became more comfortable with the Case Studies and we gained some experience using them, we abandoned the rigid template and encouraged innovation.

*

So these Case Studies were a primary source of qualitative information on the successes and failures of ChildFund Australia’s work, offering insights from children, youth and adults from communities where we worked and the staff of local agencies and government offices with whom ChildFund Australia partnered.

In-country staff reviewed the Case Studies, accepting or contesting the opinions of informants about ChildFund Australia’s projects.  These debates often led to adjustments to existing projects, but they also triggered new thinking – at the level of project activities, of programs, and even of the overall program approach.

Case Studies were forwarded to Sydney, where they were reviewed by the DEF Manager; some were selected for a similar process of review by International Program staff, members of the Program Review Committee and, on occasion, by the ChildFund Australia Board.

The resulting documents were stored in a simple cloud-based archive, accessible by password to anyone within the organization.  Some Case Studies were also included on ChildFund Australia’s website; we encouraged staff from our Communications team in Sydney to review the Case Studies and, if suitable, to re-purpose them for public purposes.  Of course, we were careful to obtain informed consent from people included in the documents.
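I don’t recall the technical details of that archive, but the idea is easy to sketch: each Case Study carried simple metadata, such as country and themes, so that it could be searched.  Here is a minimal sketch, with invented records and tags (the real system, as the external review below notes, indexed Case Studies by country and by theme):

from dataclasses import dataclass

@dataclass
class CaseStudy:
    title: str
    country: str
    themes: list

# A few invented records standing in for the real archive.
archive = [
    CaseStudy("Vegetable farming and family income", "PNG", ["livelihoods", "nutrition"]),
    CaseStudy("Ethnic girls' access to education", "Laos", ["education", "gender"]),
    CaseStudy("Citizens engaging authorities on water", "Viet Nam", ["watsan", "voice"]),
]

def search(country=None, theme=None):
    """Return Case Studies matching an optional country and/or theme."""
    return [cs for cs in archive
            if (country is None or cs.country == country)
            and (theme is None or theme in cs.themes)]

for cs in search(theme="education"):
    print(cs.title, "-", cs.country)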

*

Through the Case Studies, as noted above, local informants were able to pass critical judgement on the appropriateness of ChildFund’s strategies and to show us how community members perceived our aims and purposes (not necessarily as we intended); they could also alert us to unexpected consequences (both positive and negative) of what we did.

For example, one of the first Case Studies written up in Papua New Guinea revealed that home garden vegetable cultivation not only resulted in increased family income for the villager concerned (and positive impact on children in terms of nutrition and education), it also enhanced his social standing through increasing his capacity to contribute to traditional cultural events.

Here are three images from that Case Study:

Screen Shot 2018-06-09 at 3.07.54 PM

Screen Shot 2018-06-09 at 3.07.27 PM

Screen Shot 2018-06-09 at 3.07.41 PM

 

And here is a copy of the Case Study itself:  PNG Case Study #1 Hillary Vegetable farming RG edit 260111.  Later I was able to visit Hillary at his farm!

Another Case Study came from the ChildFund Connect project, an exciting effort led by my former colleagues Raúl Caceres and Kelly Royds, who relocated from Sydney to Boston in 2016.  I climbed Mt Moriah with them in July, 2017, and also Mt Pierce and Mt Eisenhower in August of 2016.  ChildFund Connect was an innovative project that linked children across Laos, Viet Nam, Australia and Sri Lanka, giving them a direct channel for building understanding of each other’s realities.  This Case Study on their project came from Laos: LAO Case Study #3 Connect DRAFT 2012.

In a future article in this series, I plan on describing work we carried out building the power (collective action) of people living in poverty.  It can be a sensitive topic, particularly in areas of Southeast Asia without traditions of citizen engagement.  Here is a Case Study from Viet Nam describing how ChildFund helped local citizens connect productively with authorities to resolve issues related to access to potable water: VTM Case Study #21 Policy and exclusion (watsan)-FINAL.

*

Dozens of Case Studies were produced, illustrating a wide range of experiences with the development processes supported by ChildFund in all of the countries where we managed program implementation.  Reflections from many of these documents helped us improve our development practice, and at the same time helped us stay in touch with the deeper purpose of our having chosen to work to promote social justice, accompanying people living in poverty as they built better futures.

*

A few of the DEF Case Studies focused, to some extent, on ChildFund Australia itself.  For example, here is the story of three generations of Hmong women in Nonghet District in Xieng Khouang Province in Laos.  It describes how access to education has evolved across those generations:  LAO Case Study #5 Ethnic Girls DRAFT 2012.  It’s a powerful description of change and progress, notable also because one of the women featured in the Case Study was a ChildFund employee, along with her mother and daughter!

Two other influential Case Studies came from Cambodia, both of which touched on how ChildFund was attempting to reconcile our child-sponsorship mechanisms with our programmatic commitments.  I’ve written separately, some time ago, about the advantages of child sponsorship when managed well (as we did in Plan and especially in ChildFund Australia); these two Case Studies evocatively illustrated the challenge, and the ways that staff in Cambodia were making it all work well.

One Case Study describes some of the tensions implicit in the relationship between child sponsorship and programming, and the ways that we were making progress in reconciling these differing priorities: CAM Case Study 6 Sponsorship DRAFT 2012.  This Case Study was very influential, with our staff in Cambodia and beyond, with program staff in Sydney, and with our board.  It powerfully communicated a reality that our staff, and families in communities, were facing.

A second Case Study discussed how sponsorship and programs were successfully integrated in the field in Cambodia: CAM Case Study #10 Program-SR Integration Final.

*

As I mentioned last time, given the importance of the system, relying on our own feeling that the DEF was a great success wasn’t good enough.  So we commissioned two independent, external expert reviews of the DEF.

The first review (attached here: External DEF Review – November 2012), which was concluded in November of 2012, took place before we had fully implemented the system.  In particular, since Outcome Indicator Surveys and Statements of Impact (to be covered in my next blog article) were implemented only after three years (and every three years thereafter), we had not yet reached that stage.  But we certainly were quite advanced in the implementation of most of the DEF, so it was a good time to reflect on how it was going.

I included an overview of the conclusions reached by both reviewers last time.  Here I want to quote from the first evaluation, with particular reference to the DEF Case Studies:

One of the primary benefits of the DEF is that it equips ChildFund Australia with an increased quantity and quality of evidence-based information for communications with key stakeholders including the Board and a public audience. In particular, there is consolidated output data that can be easily accessed by the communications team; there is now a bank of high quality Case Studies that can be drawn on for communication and reflection; and there are now dedicated resources in-country who have been trained and are required to generate information that has potential for communications purposes. The increase in quantity and quality of information equips ChildFund Australia to communicate with a wide range of stakeholders.

One of the strengths of the DEF recognized by in-country staff particularly is that the DEF provides a basis for stakeholders to share their perspectives. Stakeholders are involved in identifying benefits and their perspectives are heard through Case Studies. This has already provided a rich source of information that has prompted reflection by in-country teams, the Sydney based programs team and the ChildFund Australia Board.

This focus on building tools, systems and the overall capacity of the organization places ChildFund Australia in a strong position to tackle a second phase of the DEF which looks at how the organization will use performance information for learning and development. It has already started on this journey, with various parts of the organization using Case Studies for reflection. ChildFund Australia has already undertaken an exercise of coding the bank of Case Studies to assist further analysis and learning. There is lots of scope for next steps with this bank of Case Studies, including thematic reflections. Again, the benefits of this aspect have not been realised yet as the first stages of the DEF roll-out have been focused on data collection and embedding the system in CF practices.

In most Country Offices, Case Studies have provided a new formal opportunity for country program staff to reflect on their work and this has been used as a really constructive process. The Laos Country Office is currently in the process of translating Case Studies so that they can be used to prompt discussion and learning at the country level. In PNG, the team is also interested in using the Case Studies as a communication tool with local communities to demonstrate some of the achievements of ChildFund Australia programs.

In some cases, program staff have found Case Studies confronting when they have highlighted program challenges or weaknesses. The culture of critical reflection may take time to embed in some country offices and may be facilitated by cross-country reflection opportunities. Currently, however, Country Office staff do not know how to access Case Studies from other country programs. ChildFund Australia is exploring how the ‘bank’ of DEF Case Studies would be most accessible and useful to country office personnel.

One of the uses of Case Studies has been as a prompt for discussion and reflection by the programs team in Sydney and by the Board. Case Studies have been seen as a really useful way to provide an insight into a program, practice and ChildFund Australia achievements.

At an organizational level, an indexing and cross-referencing system has been implemented which enables Case Studies to be searched by country and by theme. The system is yet to be introduced to MEL and Program users, but has potential to be a very useful bank of qualitative data for reflection and learning. It also provides a bank of data from which to undertake thematic reflections across and between countries. One idea for consideration is that ChildFund draw on groups of Case Studies to develop practice notes.

In general Case Studies are considered to be the most ‘successful’ part of the DEF by those involved in collecting information.

The second reviewer concentrated on other components, mainly aspects I will describe in more detail in my next article, not so much the Case Studies…

*

So the Case Studies were a very important element in the overall DEF.  I tried very hard to incorporate brief reflections on selected Case Studies at every formal meeting of the International Program Team, of ChildFund Australia’s Program Review Committee, and (less frequently) at meetings of our Board of Directors.  More often than not, time pressures on the agendas of these meetings led to us dropping the Case Studies from discussion, but often enough we did spend time (usually at the beginning of the meetings) reflecting on what we saw in them.

At the beginning, when we first began to use the Case Studies, our discussion tended to be mechanical: pointing out errors in the use of English, or questioning how valid the observations might be, challenging the statistical reliability of the conclusions.  But, over time, I noticed that our teams began to use the Case Studies as they were designed: to gain insight into the lived experience of particular human beings, and to reconnect with the realities of people’s struggle for better lives for themselves and their children.

This was a great success, and really worked as I had hoped.  The Case Studies complemented the more rigorous, quantitative components of the DEF, helping the system be holistic, enabling us to see more deeply into the effect that our work was having while also enhancing our accountability.

*

Next time, I will describe getting to the top of West Bond, and all the way down the 11 miles from there to the Lincoln Woods parking lot, where I staggered back to my car with such damage to my feet that I soon would lose the toenails on both my big toes!  And I will share details of the final two components of the DEF that I want to highlight, the Outcome Indicator Surveys and Statements of Impact, which were probably the culmination of the whole system.

So, stay tuned!

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owls’ Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change;
  35. Bondcliff (35) – ChildFund Australia’s Development Effectiveness System.

 

 

Bondcliff (35) – ChildFund Australia’s Development Effectiveness Framework

June, 2018

I began a new journey just over two years ago, in May, 2016, tracing two long arcs in my life:

  • During those two years, I’ve been climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers. I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Alongside descriptions of those climbs, I’ve been sharing what it was like working in international development during the MDG era: as it boomed, and evolved, from the response to the Ethiopian crisis in the mid-1980’s through to the conclusion of the Millennium Development Goals in 2015.

In each article, I am writing about climbing each of those mountains and, each time, I reflect a bit on the journey since I began to work in social justice, nearly 34 years ago: on development, human rights, conflict, experiences along the way, etc.

So, when I wrap things up in this series, there should be 48 articles…

*

In 2009 Jean and I moved to Sydney, where I took up a new role as International Program Director for ChildFund Australia, a newly-created position.  On my way towards Sydney, I was thinking a lot about how to build a great program, and how I would approach building a strong team – my intention was to lead and manage with clarity, trust, and inspiration.  A few weeks ago, I wrote describing the role and staffing and structural iterations of ChildFund’s International Program Team and, last time, I outlined the foundational program approach we put in place – a Theory of Change and Outcome and Output Indicators.

Once the program approach was in place, as a strong foundation, we moved forward to build a structured approach to development effectiveness.  I am very proud of what we achieved: the resulting ChildFund Australia “Development Effectiveness Framework” (“DEF”) was, I think, state-of-the-art for international NGOs at the time.  Certainly few (if any) other INGOs in Australia had such a comprehensive, practical, useful system for ensuring the accountability and improvement of their work.

Since the DEF was so significant, I’m going to write three articles about it:

  1. In this article I will describe the DEF – its components, some examples of products generated by the DEF, and how each part of the system worked with the other parts.  I will also share results of external evaluations that we commissioned on the DEF itself;
  2. Next time, I will highlight one particular component of the DEF, the qualitative “Case Studies” of the lived experience of human change.  I was especially excited to see these Case Studies when they started arriving in Sydney from the field, so I want to take a deep dive into what these important documents looked like, and how we attempted to use them;
  3. Finally, I will describe the last two DEF components that came online (Outcome Indicator Surveys and Statements of Impact), the culmination of the system, where we assessed the impact of our work.

So there will be, in total, three articles focused on the DEF.  This is fitting, because I climbed three mountains on one day in August of 2017…

*

On 10 August, 2017, I climbed three 4000-footers in one day: Bondcliff (4265ft, 1300m), Mt Bond (4698ft, 1432m), and West Bond (4540ft, 1384m).  This was a very long, very tough day, covering 22 miles and climbing three mountains in one go.  At the end of the hike, I felt like I was going to lose the toenails on both big toes… and, in fact, that’s what happened.  As a result, for the rest of the season I would be unable to hike in boots and had to use hiking shoes instead!

Knowing that the day would be challenging, I drove up from Durham the afternoon before and camped, so I could get the earliest start possible the next morning.  I got a spot at Hancock Campground, right near the trailhead where I would start the climb:

IMG_1871.jpg

 

The East Branch of the Pemigewasset River runs alongside this campground.  I spent a pleasant late afternoon there reading a book by John Paul Lederach and, when it was dark, I crawled into my sleeping bag and got a good night’s sleep.

IMG_1868

IMG_1869

 

Here is a map of the long ascent that awaited me the next morning, getting to the top of Bondcliff:

Bond Map - 3.jpg

 

After Bondcliff, the plan was that I would continue on to climb Mt Bond and West Bond, and to then return to Lincoln Woods… more on that in the next two articles in this series.  In this one I will describe climbing the first 4000-footer of that day, Bondcliff.

I got an early start on 10 August, packing up my tent-site and arriving at the trailhead at Lincoln Woods at about 6:30am:

IMG_1873.jpg

 

It was just two weeks earlier that I had parked here to climb Owl’s Head, which I had enjoyed a lot.  This time, I would begin the same way – walking up the old, abandoned forestry railway for about 2.6 miles on Lincoln Woods Trail, to where I had turned left up the Franconia Brook Trail towards Owl’s Head.  I arrived at that junction at about 7:30am:

IMG_1883.jpg

IMG_1891.jpg

 

 

This time I would continue straight through that intersection onto the Wilderness Trail, which winds through forest for a short distance before opening out again along another old logging railway, complete with abandoned hardware along the way, discarded over 130 years ago:

IMG_1893.jpg

 

At the former (and now abandoned) Camp 16 (around 4.4 miles from the parking lot at Lincoln Woods), I took a sharp left and joined a more normal trail – no more old railway.  I began to ascend moderately, going up alongside Black Brook: now I was on the Bondcliff Trail.

 

I crossed Black Brook twice on the way up after leaving the Wilderness Trail, and then crossed two dry beds of rock, which were either rock slides or upper reaches of Black Brook that were dry that day.

IMG_1898.jpg

 

It’s a long climb up Black Brook; after the second dry crossing, Bondcliff Trail takes a sharp left turn and continues ascending steadily.  Just before reaching the alpine area, and the summit of Bondcliff, there is a short steep section, where I had to scramble up some bigger boulders.  Slow going…

But then came the reward: spectacular views to the west, across Owl’s Head to Franconia Ridge, up to Mt Garfield, and over to West Bond and Mt Bond.  Here Mt Lincoln and Mt Lafayette are on the left, above Owl’s Head, with Mt Garfield to the right:

IMG_1905

Lincoln and Lafayette In The Distance On The Left, Mt Garfield In The Distance On The Right

 

Here is a view looking to the southwest from the top of Bondcliff:

IMG_1907

From The Summit Of Bondcliff

IMG_1920

From The Summit Of Bondcliff

 

And this is the view towards Mt Bond, looking up from the top of Bondcliff:

IMG_1925

West Bond Is On The Left, And Mt Bond On The Right

 

I got to the top of Bondcliff at about 10:30am, just about four hours from the start of the hike.  Feeling good … at this point!  Here is a spectacular view back down towards Bondcliff, taken later in the day, from the top of West Bond:

IMG_1964.jpg

 

I would soon continue the climb, with a short hop from Bondcliff up to the top of Mt Bond.  Stay tuned!

*

Last time I wrote about how we built the foundations for ChildFund Australia’s new program approach: a comprehensive and robust “Theory of Change” that described what we were going to accomplish at a high level, and why; a small number of reliable, measurable, and meaningful “Outcome Indicators” that would enable us to demonstrate the impact of our work; and a set of “Output Indicators” that would allow us to track our activities in a consistent and comparable manner across all our programs: in Cambodia, Laos, Papua New Guinea, and Viet Nam.  (Myanmar was a slightly different story, as I will explain later…)

Next, on that foundation, we needed a way of thinking holistically about the effectiveness of our development work: a framework for planning our work in each location, each year; for tracking whether we were doing what we had planned; for understanding how well we were performing; and improving the quality and impact of our work.  And doing all this in partnership with local communities, organizations, and governments.

This meant being able to answer five basic questions:

  1. In light of our organizational Theory of Change, what are we going to do in each location, each year?
  2. How will we know that we are doing what we planned to do?
  3. How will we know that our work makes a difference and gets results consistent with our Theory of Change?
  4. How will we learn from our experience, to improve the way we work?
  5. How can community members and local partners directly participate in the planning, implementation, and evaluation of the development projects that ChildFund Australia supports?

Looking back, I feel that what we built and implemented to answer those questions – the ChildFund Australia “Development Effectiveness Framework” (“DEF”) – was our agency’s most important system.  Because what could be more important than the answers to those five questions?

*

I mentioned last time that twice, during my career with Plan International, we had tried to produce such a system, and failed (at great expense).  We had fallen into several traps that I was determined to avoid repeating this time, in ChildFund Australia, as we developed and implemented the DEF:

  • We would build a system that could be used by our teams with the informed participation of local partners and staff, practically – that was “good enough” for its purpose, instead of a system that had to be managed by experts, as we had done in Plan;
  • We would include both quantitative and qualitative information, serving the needs of head and heart, instead of building a wholly-quantitative system for scientific or academic purposes, as we had done in Plan;
  • We would not let “the best be the enemy of the good,” and I would make sure that we moved to rapidly prototype, implement, and improve the system instead of tinkering endlessly, as we had done in Plan.

I go into more detail about the reasons for Plan’s lack of success in that earlier article.

*

Here is a graphic that Caroline Pinney helped me create, which I used very frequently to explain how the DEF was designed, functioned, and performed:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework (2014)

 

In this article, I will describe each component of the DEF, outlining how the components relate to one another and to the five questions outlined above.

However, I’m going to reserve discussion of three of those components for my next two articles:

  • Next time, I will cover #3 in Figure 1, the “Case Studies” that we produced.  These documents helped us broaden our focus from the purely quantitative to include consideration of the lived experience of people touched by the programs supported by ChildFund Australia.  In the same way, the Case Studies served as valuable tools for our staff, management, and board to retain a human connection to the spirit that motivated us to dedicate our careers to social justice;
  • And, after that, I will devote an article to our “Outcome Indicator Surveys” (#2 in Figure 1, above) and Statements of Impact (#12 in Figure 1). The approach we took to demonstrating impact was innovative and very participatory, and successful.  So I want to go into a bit of depth describing the two DEF components involved.

Note: I prepared most of what follows.  But I have included and adapted some descriptive material produced by the two DEF Managers who worked in the International Program Team:  Richard Geeves and Rouena Getigan.  Many thanks to them!

*

Starting Points

The DEF was based on two fundamental statements of organizational identity.  As such, it was built to focus us on, and enable us to be accountable for, what we were telling the world we were:

  1. On the bottom left of the DEF schematic (Figure 1, above) we reference the basic documents describing ChildFund’s identity: our Vision, Mission, Strategic Plan, Program Approach, and Policies – all agreed and approved by our CEO (Nigel Spence) and Board of Directors.  The idea was that the logic underlying our approach to Development Effectiveness would therefore be grounded in our basic purpose as an organization, overall.  I was determined that the DEF would serve to bring us together around that purpose, because I had seen Plan tend to atomize, with each field location working towards rather different aims.  Sadly, Plan’s diversity seemed to be far greater than required if it were simply responding to the different conditions we worked in.  For example, two Field Offices within 20 km of each other in the same country might have very different programs.  This excessive diversity seemed to relate more to the personal preferences of Field Office leadership than to any difference in the conditions of child poverty or the local context.  The DEF would help ChildFund Australia cohere, because our starting point was our organizational identity;
  2. But each field location did need a degree of flexibility to respond to its reality, within ChildFund’s global identity, so at the bottom of the diagram we placed the Country Strategy Paper (“CSP”), quite centrally.  This meant that, in addition to building on ChildFund Australia’s overall purpose and identity globally, we would also build our approach to Development Effectiveness on how we chose to advance that basic purpose in each particular country where we worked, with that country’s particular characteristics.

Country Strategy Paper

The purpose and outline of the CSP was included in the ChildFund Australia Program Handbook:

To clarify, define, communicate and share the role, purpose and structure of ChildFund in-country – our approach, operations and focus. The CSP aims to build a unity of purpose and contribute to the effectiveness of our organisation.

When we develop the CSP we are making choices, about how we will work and what we will focus on as an organisation. We will be accountable for the commitments we make in the CSP – to communities, partners, donors and to ourselves.

While each CSP will be different and reflect the work and priorities of the country program, each CSP will use the same format and will be consistent with ChildFund Australia’s recent program development work.

During the development of the CSP it is important that we reflect on the purpose of the document. It should be a useful and practical resource that can inform our development work. It should be equally relevant to both our internal and external stakeholders. The CSP should be clear, concise and accessible while maintaining a strategic perspective. It should reflect clear thinking and communicate our work and our mission. It should reflect the voice of children.  Our annual work plans and budgets will be drawn from the CSP and we will use it to reflect on and review our performance over the three year period.

Implementation of the DEF flowed from each country’s CSP.

More details are found in Chapter 5 of the Program Handbook, available here: Program Handbook – 3.3 DRAFT.  Two examples of actual ChildFund Australia Country Strategy Papers from my time with the organization are attached here:

For me, these are clear, concise documents that demonstrate coherence with ChildFund’s overall purpose along with choices driven by the situation in each country.

*

Beginning from the Country Strategy Paper, the DEF branches in two inter-related (in fact, nested) streams, covering programs (on the left side) and projects (on the right side).  Of course, projects form part of programs, consistent with our program framework:

Screen Shot 2018-05-28 at 2.16.30 PM

Figure 2: ChildFund Australia Program Framework

 

But it was difficult to depict this embedding on the two dimensions of a graphic!  So Figure 1 showed programs on one side and projects on the other.

Taking the “program” (left) side first:

Program Description

Moving to the left side of Figure 1: derived from the Country Strategy Paper, and summarized in the CSP, each Country Office defined a handful of “Program Descriptions” (#1 in Figure 1; some countries had 3, others ended up with 5), each one describing how a particular set of projects would create impact, together, as measured using ChildFund Australia’s Outcome Indicators – in other words, a “Theory of Change” detailing how the projects included in the program linked together to create particular positive change.

The purpose and outline of the Program Description was included in the ChildFund Australia Program Handbook:

ChildFund Australia programs are documented and approved through the use of “Program Descriptions”.  All Program Descriptions must be submitted by the Country Director for review and approval by the Sydney International Program Director, via the International Program Coordinator.

For ChildFund Australia: a “program” is an integrated set of projects that, together, have direct or indirect impact on one or more of our agreed organisational outcome indicators.   Programs normally span several geographical areas, but do not need to be implemented in all locations; this will depend on the geographical context.  Programs are integrated and holistic. They are designed to achieve outcomes related to ChildFund Australia’s mission, over longer periods, while projects are meant to produce outputs over shorter timeframes.

Program Descriptions were summarized in the CSP, contained a listing of the types of projects (#5 in Figure 1) that would be implemented, and were reviewed every 3 or 4 years (Program Review, #4 in Figure 1).

To write a Program Description, ChildFund staff (usually program managers in a particular Country Office) were expected to review our program implementation to date and carry out extensive situational analyses: of government policies, plans and activities in the sector, and of communities’ needs in terms of assets, aspirations, and ability to work productively with the local government officials responsible for service provision.  The results of ChildFund’s own Outcome Indicator surveys and community engagement events obviously provided very useful evidence in this regard.

Staff then proposed a general approach for responding to the situation, and specific strategies which could be delivered through a set of projects.  They would also show that the approach and strategies proposed were consistent with evidence of good practice, both globally and in-country, demonstrating that their choices were evidence-based.

Here are two examples of Program Descriptions:

Producing high-quality Program Descriptions was a surprising challenge for us, and I’m not sure we ever really got this component of the DEF right.  Probably the reason we struggled was that these documents were rather abstract, and our staff weren’t used to operating at that level of abstraction.

Most of the initial draft Program Descriptions were quite superficial, and were approved only as place-holders.  Once we started to carry out “Program Reviews” (see below), however, where more rigor was meant to be injected into the documents, we struggled.  It was a positive, productive struggle, but a struggle nonetheless!

We persisted, however, because I strongly believed that our teams should be able to articulate why they were doing what they were doing, and the Program Descriptions were the basic tool for exactly that explanation.  So we persevered, hoping that the effort would result in better programs, more sophisticated and holistic work, and more impact on children living in poverty.

*

 

 

Program Reviews

For the same reasons outlined above, in my discussion of the “Program Descriptions” component of the DEF, we also struggled with the “Program Review” (#4 in Figure 1, above).  In these workshops, our teams would consider an approved “Program Description” (#1 in Figure 1) every three or four years, subjecting the document to a formal process of peer review.

ChildFund staff from other countries visited the host country to participate in the review process and then wrote a report making recommendations for how the Program under review might be improved.  The host country accepted (or debated and adjusted) the recommendations, acted on them, and applied them to a revision of the Program Description: improving it, tightening up the logic, incorporating lessons learned from implementation, etc.

Program Reviews were therefore fundamentally about learning and improvement, so we made sure that, in addition to peers from other countries, the host Country Office invited in-country partners and relevant experts.  And International Program Coordinators from Sydney were asked to always attend Program Reviews in the countries that they were supporting, again for learning and improvement purposes.

The Program Reviews that I attended were useful and constructive, but I certainly sensed a degree of frustration.  In addition to struggling with the relatively high levels of abstraction required, our teams were not used to having outsiders (even their peers from other ChildFund offices) critique their efforts.  So, overall, this was a good and very important component of the DEF, designed correctly, but our teams needed more time to learn how to manage this process and to be open to such a public form of review.

*

Projects and Quarterly Reports

As shown on the right-hand side of Figure 1, ChildFund’s field staff and partners carried out routine monitoring of projects (#6 in the Figure) to ensure that they were on track, and based their reporting of activities and outputs on this monitoring.  Project staff summarized their monitoring through formal Quarterly Reports (#7) on each project, documenting progress against project plans, budgets, and targets to ensure projects were well managed.  These Quarterly Reports were reviewed in each Country Office, and most were also forwarded to ChildFund’s head office in Sydney (and, often, to donors) for review.

When I arrived, ChildFund Australia’s Quarterly reporting was well-developed and of high quality, so I didn’t need to focus on this aspect of our work.  We simply incorporated it into the more-comprehensive DEF.

*

Quarterly Output Tracking

As described last time, ChildFund developed and defined a set of Outputs which became standard across the organization in FY 2011-12.  Outputs in each project were coded and tracked from Quarter to Quarter.  Some of the organizational Outputs were specific to a sector (such as education, health, or water and sanitation) or to a particular target group (such as children, youth, or adults).  Other Outputs were generic and might be found in any project: for example, training, awareness-raising, materials production, and consultation.

Organizational Outputs were summarized for all projects in each country each Quarter, and country totals were aggregated in Sydney for submission to our Board of Directors (#8 in Figure 1, above).  As of March 2014 there were 47 organizational Outputs in total – they were listed in my last article in this series.
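
To make the mechanics concrete, here is a minimal sketch in Python of how that roll-up could work – the column names and output codes here are invented for illustration, not our actual coding scheme:

```python
import pandas as pd

# Each row is one output line from one project's Quarterly Report.
# Codes like "EDU-01" are invented for illustration.
records = pd.DataFrame([
    {"country": "Laos",     "project": "P-101", "output_code": "EDU-01", "quarter": "Q4FY15", "count": 12},
    {"country": "Laos",     "project": "P-102", "output_code": "EDU-01", "quarter": "Q4FY15", "count": 8},
    {"country": "Viet Nam", "project": "P-201", "output_code": "GEN-03", "quarter": "Q4FY15", "count": 30},
])

# Country totals: sum each output code across all projects in a country.
country_totals = (records
                  .groupby(["country", "output_code", "quarter"], as_index=False)["count"]
                  .sum())

# Organization-wide totals: aggregate the country totals for the Board report.
org_totals = (country_totals
              .groupby(["output_code", "quarter"], as_index=False)["count"]
              .sum())

print(org_totals)
```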

One purpose of this tracking was to enhance our accountability, so a summary was reviewed every Quarter in Sydney by the International Program Team and our Program Review Committee.

Here is an example of how we tracked outputs: this is a section of a Quarterly Report produced by the International Program Team for our Board and Program Review Committee: Output Report – Q4FY15.

*

Project Evaluations

ChildFund also conducted reviews or evaluations of all projects (#9 in Figure 1, above), in different ways.  External evaluators were engaged, under detailed terms of reference, to evaluate multi-year projects with more substantial budgets, or projects that were significant for learning or to a particular donor.  Smaller projects were generally evaluated internally.  All evaluators were expected to gather evidence of results against output targets and against the performance indicators written for each objective.

*

All development effectiveness systems have, at their heart, mechanisms for translating operational experience into learning and program improvement.  In Figure 1, this was represented by the central circle in the schematic, which fed evidence from a variety of sources back into our organizational and Country Strategy Papers, Program Descriptions, and project planning and design.

Our program staff found that their most effective learning often occurred during routine monitoring through observation of project activities and conversations in communities with development partners.  Through thoughtful questioning and attentive listening, staff could make the immediate decisions and quick adjustments which kept project activities relevant and efficient.

Staff also had more formal opportunities to document and reflect on learning.  The quarterly tracking and aggregation of outputs drew attention to progress and sometimes signaled the need to vary plans or redirect resources.

Project evaluations (#9 in Figure 1, above) provided major opportunities for learning, especially when external evaluators brought their different experiences to bear and offered fresh perspectives on a ChildFund project.

*

The reader can easily grasp that, for me, the DEF was a great success: a significant asset for ChildFund Australia that enabled us to be more accountable and effective.  Some more technically-focused agencies were busy carrying out sophisticated impact evaluations, using control groups and so forth, but that kind of effort didn’t suit the vast majority of INGOs.  We could benefit from the learnings that came from those scientific evaluations, but we didn’t have the resources to introduce such methodologies ourselves.  And so, though the DEF was not perfect, I am not aware of any comparable organization that succeeded as we did.

While the system built on what I had learned over nearly 30 years, and while I felt that it was comprehensively designed and working very well, that was merely my opinion!

Given the importance of the system, relying on my opinion (no matter how sound!) wasn’t good enough.  So we commissioned two independent, external expert reviews of the DEF.

*

The first review, which was concluded in November of 2012, took place before we had fully implemented the system.  In particular, since Outcome Indicator Surveys and Statements of Impact (to be covered in an upcoming blog article) were implemented only after three years (and every three years thereafter), we had not yet reached that stage.  But we were certainly quite advanced in implementing most of the DEF, so it was a good time to reflect on how it was going.

In that light, this first external review of the DEF concluded the following:

The development of the DEF places ChildFund Australia in a sound position within the sector in the area of development effectiveness. The particular strength of ChildFund Australia’s framework is that it binds the whole organisation to a set of common indicators and outputs. This provides a basis for focussing the organisation’s efforts and ensuring that programming is strategically aligned to common objectives. The other particular strength that ChildFund Australia’s framework offers is that it provides a basis for aggregating its achievements across programs, thereby strengthening the organisation’s overall claims of effectiveness.

Within ChildFund Australia, there is strong support for the DEF and broad agreement among key DEF stakeholders and users that the DEF unites the agency on a performance agenda. This is in large part due to dedicated resources having been invested and the development of a data collection system has been integrated into the project management system (budgeting and planning, and reporting), thereby making DEF a living and breathing function throughout the organisation. Importantly, the definition of outcomes and outputs indicators provides clarity of expectations across ChildFund Australia.

One of the strengths of the DEF recognised by in-country staff particularly is that the DEF provides a basis for stakeholders to share their perspectives. Stakeholders are involved in identifying benefits and their perspectives are heard through case studies. This has already provided a rich source of information that has prompted reflection by in-country teams, the Sydney based programs team and the ChildFund Australia Board.

Significantly, the DEF signals a focus on effectiveness to donors and the sector. One of the benefits already felt by ChildFund Australia is that it is able to refer to its effectiveness framework in funding submissions and in communication with its major donors who have an increasing interest on performance information.

Overall, the review found that the pilot of the DEF has been implemented well, with lots of consultation and engagement with country offices, and lots of opportunity for refinement. Its features are strong, enabling ChildFund to both measure how much it is doing, and the changes that are experienced by communities over time. The first phase of the DEF has focused on integrating effectiveness measurement mechanisms within program management and broader work practices, while the second phase of the DEF will look at the analysis, reflection and learning aspects of effectiveness. This second phase is likely to assist various stakeholders involved in collecting effectiveness information better understand and appreciate the linkages between their work and broader organisational learning and development. This is an important second phase and will require ongoing investment to maximise the potential of the DEF. It places ChildFund Australia in a strong position within the Australian NGO sector to engage in the discourse around development effectiveness and demonstrate its achievements.

A full copy of this first review, removing only the name of the author, is attached here: External DEF Review – November 2012.

In early 2015 we carried out a second review.  This time, we had implemented the entire DEF, carrying out (for example) Statement of Impact workshops in several locations.  The whole system was now working.

At that point, we were very confident in the DEF – from our point of view, all components were working well, producing good and reliable information that was being used to improve our development work.  Our Board, Program Review Committee, and donors were all enthusiastic.  More importantly, local staff and communities were positive.

The only major concern that remained related to the methodology we used in the Outcome Indicator Surveys.  I will examine this issue in more detail in an upcoming blog article in this series; but the reader will notice that this second formal, external evaluation focuses very much on the use of the LQAS methodology in gathering information for our Outcome Indicator workshops and Statements of Impact.

That’s why the external evaluator we engaged to carry out this second review was an expert in survey methodologies in general, and in LQAS in particular.
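
For readers unfamiliar with LQAS (Lot Quality Assurance Sampling): it classifies each “lot” – for example, a supervision area – by interviewing a small random sample and applying a simple pass/fail decision rule.  As a rough sketch of the statistics involved (my illustration, with example thresholds – not the evaluator’s analysis), the misclassification risks of such a rule can be checked with a few lines of Python:

```python
from scipy.stats import binom

def lqas_errors(n, d, p_upper, p_lower):
    """Misclassification risks for an LQAS decision rule.

    A lot is classified as reaching the target if at least d of the
    n sampled respondents report the outcome.
      alpha: chance a lot truly at p_upper coverage is wrongly failed
      beta:  chance a lot truly at p_lower coverage is wrongly passed
    """
    alpha = binom.cdf(d - 1, n, p_upper)     # P(fewer than d successes | p_upper)
    beta = 1 - binom.cdf(d - 1, n, p_lower)  # P(at least d successes | p_lower)
    return alpha, beta

# The classic 19-respondent lot with a decision rule of 13, for an 80%
# coverage benchmark and a 50% lower threshold (illustrative numbers).
alpha, beta = lqas_errors(n=19, d=13, p_upper=0.80, p_lower=0.50)
print(f"alpha = {alpha:.3f}, beta = {beta:.3f}")
```

With these example numbers, both risks come out under ten percent – one reason samples of 19 are so common in LQAS-based field surveys.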

In that light, this second external review of the DEF concluded the following:

ChildFund Australia is to be commended for its commitment to implementing a comprehensive and rigorous monitoring and evaluation framework with learning at its centre to support and demonstrate development effectiveness. Over the past five years, DEL managers in Cambodia, Laos, Papua New Guinea and Vietnam, with support and assistance from ChildFund Australia, country directors and program managers and staff, have worked hard to pilot, refine and embed the DEF in the broader country programs.  Implementing the DEF, in particular the Outcome Indicator Survey using LQAS, has presented several challenges.  With time, many of the early issues have been resolved, tools improved and guidelines developed.  Nevertheless, a few issues remain that must be addressed if the potential benefits are to be fully realised at the organisational, country and program levels.

Overall, the DEF is well suited for supporting long-term development activities in a defined geographic area.  The methodologies, scope and tools employed to facilitate Outcome Indicator Surveys and to conduct Community Engagement and Attribution of Impact processes are mostly fit for purpose, although there is considerable room for improvement.  Not all of the outcome indicators lend themselves to assessment via survey; those that are difficult to conceptualise and measure being most problematic. For some indicators in some places, a ceiling effect is apparent limiting their value for repeated assessment. While outcome indicators may be broadly similar across countries, both the indicators and the targets with which they are to be compared should be locally meaningful if the survey results are to be useful—and used—locally.

Used properly, LQAS is an effective and relatively inexpensive probability sampling method.  Areas for improvement in its application by ChildFund include definition of the lots, identification of the sampling frame, sample selection, data analysis and interpretation, and setting targets for repeated surveys.

Community Engagement and the Attribution of Impact processes have clearly engaged the community and local stakeholders.  Experience to date suggests that they can be streamlined to some extent, reducing the burden on staff as well as communities.  These events are an important opportunity to bring local stakeholders together to discuss local development needs and set future directions and priorities.  Their major weakness lies in the quality of the survey results that are presented for discussion, and their interpretation.  This, in turn, affects the value of the Statement of Impact and other documents that are produced.

The DEF participatory processes have undoubtedly contributed to the empowerment of community members involved. Reporting survey results in an appropriate format, together with other relevant data, in a range of inviting and succinct documents that will meet the needs of program staff and partners is likely to increase their influence.

A full copy of this second review, removing only the name of the author, is attached here: DEF Evaluation – April 2015.

*

Great credit is due to the ChildFund staff who contributed to the conceptualization, development, and implementation of the DEF.  In particular, Richard Geeves and Rouena Getigan in the International Program Team in Sydney worked very hard to translate my sometimes overly-ambitious concepts into practical guidelines, and ably supported our Country Offices.

One of the keys to the success of the DEF was that we budgeted for dedicated in-country support, with each Country Office able to hire a DEL Manager (two in Viet Nam, given the scale of our program there).

Many thanks to Solin in Cambodia, Marieke in Laos, Joe in Papua New Guinea, and Thuy and Dung in Viet Nam: they worked very hard to make the DEF function in their complex realities.  I admire how they made it work so well.

*

In this article, I’ve outlined how ChildFund Australia designed a comprehensive and very robust Development Effectiveness Framework.  Stay tuned next time, when I describe climbing Mt Bond, and then go into much more depth on one particular component (the Case Studies, #3 in Figure 1, above).

After that, in the following article, I plan to cover reaching the top of West Bond and descending back across Mt Bond and Bondcliff (and losing toenails on both big toes!) and go into some depth to describe how we carried out Outcome Indicator Surveys (#2 in Figure 1) and Statements of Impact (#12) – in many ways, the culmination of the DEF.

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owls’ Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change.