Bondcliff (35) – ChildFund Australia’s Development Effectiveness Framework

June, 2018

I began a new journey just over two years ago, in May, 2016, tracing two long arcs in my life:

  • During those two years, I’ve been climbing all 48 mountains in New Hampshire that are at least 4000 feet tall (1219m), what is called “peak-bagging” by local climbers. I’m describing, in words and images, the ascent of each of these peaks – mostly done solo, but sometimes with a friend or two;
  • Alongside descriptions of those climbs, I’ve been sharing what it was like working in international development during the MDG era: as it boomed and evolved, from the response to the Ethiopian crisis in the mid-1980s through to the conclusion of the Millennium Development Goals in 2015.

In each article, I write about climbing one of those mountains and, each time, I reflect a bit on my journey since I began working in social justice nearly 34 years ago: on development, human rights, conflict, experiences along the way, etc.

So, when I wrap things up in this series, there should be 48 articles…

*

In 2009 Jean and I moved to Sydney, where I took up a newly-created role as International Program Director for ChildFund Australia.  On my way to Sydney, I was thinking a lot about how to build a great program and a strong team – my intention was to lead and manage with clarity, trust, and inspiration.  A few weeks ago, I described the role, staffing, and structural iterations of ChildFund’s International Program Team and, last time, I outlined the foundational program approach we put in place – a Theory of Change plus Outcome and Output Indicators.

Once the program approach was in place, as a strong foundation, we moved forward to build a structured approach to development effectiveness.  I am very proud of what we achieved: the resulting ChildFund Australia “Development Effectiveness Framework” (“DEF”) was, I think, state-of-the-art for international NGOs at the time.  Certainly few (if any) other INGOs in Australia had such a comprehensive, practical, useful system for ensuring the accountability and improvement of their work.

Since the DEF was so significant, I’m going to write three articles about it:

  1. In this article I will describe the DEF – its components, some examples of the products it generated, and how the parts of the system worked together.  I will also share the results of external evaluations that we commissioned on the DEF itself;
  2. Next time, I will highlight one particular component of the DEF, the qualitative “Case Studies” of the lived experience of human change.  I was especially excited to see these Case Studies when they started arriving in Sydney from the field, so I want to take a deep dive into what these important documents looked like, and how we attempted to use them;
  3. Finally, I will cover the last two DEF components that came online (Outcome Indicator Surveys and Statements of Impact), the culmination of the system, where we assessed the impact of our work.

So there will be, in total, three articles focused on the DEF.  This is fitting, because I climbed three mountains on one day in August of 2017…

*

On 10 August, 2017, I climbed three 4000-footers in one day: Bondcliff (4265ft, 1300m), Mt Bond (4698ft, 1432m), and West Bond (4540ft, 1384m).  This was a very long, very tough day, covering 22 miles.  At the end of the hike, I felt like I was going to lose the toenails on both big toes… and, in fact, that’s what happened.  As a result, for the rest of the season I was unable to hike in boots and had to use hiking shoes instead!

Knowing that the day would be challenging, I drove up from Durham the afternoon before and camped, so I could get the earliest start possible the next morning.  I got a spot at Hancock Campground, right near the trailhead where I would start the climb:

IMG_1871.jpg

The East Branch of the Pemigewasset River runs alongside this campground.  I spent a pleasant late afternoon there, reading a book by John Paul Lederach, and when it was dark I crawled into my sleeping bag and got a good night’s sleep.

IMG_1868

IMG_1869

Here is a map of the long ascent that awaited me the next morning, getting to the top of Bondcliff:

Bond Map - 3.jpg

After Bondcliff, the plan was to continue on to climb Mt Bond and West Bond, and then return to Lincoln Woods… more on that in the next two articles in this series.  In this one I will describe climbing the first 4000-footer of that day: Bondcliff.

I got an early start on 10 August, packing up my tent-site and arriving at the trailhead at Lincoln Woods at about 6:30am:

IMG_1873.jpg

It was just two weeks earlier that I had parked here to climb Owl’s Head, which I had enjoyed a lot.  This time, I would begin the same way – walking up the old, abandoned forestry railway for about 2.6 miles on Lincoln Woods Trail, to where I had turned left up the Franconia Brook Trail towards Owl’s Head.  I arrived at that junction at about 7:30am:

IMG_1883.jpg

IMG_1891.jpg

This time I continued straight at that junction, onto the Wilderness Trail, which winds through forest for a short distance before opening out along another old logging railway, complete with abandoned hardware discarded over 130 years ago:

IMG_1893.jpg

At the former (and now abandoned) Camp 16, around 4.4 miles from the parking lot at Lincoln Woods, I took a sharp left and joined a more conventional trail – no more old railway.  Now on the Bondcliff Trail, I began to ascend moderately alongside Black Brook.

I crossed Black Brook twice on the way up after leaving the Wilderness Trail, and then crossed two dry beds of rock, which were either rock slides or upper reaches of Black Brook that were dry that day.

IMG_1898.jpg

It’s a long climb up Black Brook; after the second dry crossing, Bondcliff Trail takes a sharp left turn and continues ascending steadily.  Just before reaching the alpine area, and the summit of Bondcliff, there is a short steep section, where I had to scramble up some bigger boulders.  Slow going…

But then came the reward: spectacular views to the west, across Owl’s Head to Franconia Ridge, up to Mt Garfield, and over to West Bond and Mt Bond.  Here Mt Lincoln and Mt Lafayette are on the left, above Owl’s Head, with Mt Garfield to the right:

IMG_1905

Lincoln and Lafayette In The Distance On The Left, Mt Garfield In The Distance On The Right

Here is a view looking to the southwest from the top of Bondcliff:

IMG_1907

From The Summit Of Bondcliff

IMG_1920

From The Summit Of Bondcliff

And this is the view towards Mt Bond, looking up from the top of Bondcliff:

IMG_1925

West Bond Is On The Left, And Mt Bond On The Right

I got to the top of Bondcliff at about 10:30am, just about four hours from the start of the hike.  Feeling good … at this point!  Here is a spectacular view back down towards Bondcliff, taken later in the day, from the top of West Bond:

IMG_1964.jpg

I would soon continue the climb, with a short hop from Bondcliff up to the top of Mt Bond.  Stay tuned!

*

Last time I wrote about how we built the foundations for ChildFund Australia’s new program approach: a comprehensive and robust “Theory of Change” that described what we were going to accomplish at a high level, and why; a small number of reliable, measurable, and meaningful “Outcome Indicators” that would enable us to demonstrate the impact of our work; and a set of “Output Indicators” that would allow us to track our activities in a consistent and comparable manner across all our programs: in Cambodia, Laos, Papua New Guinea, and Viet Nam.  (Myanmar was a slightly different story, as I will explain later…)

Next, on that foundation, we needed a way of thinking holistically about the effectiveness of our development work: a framework for planning our work in each location, each year; for tracking whether we were doing what we had planned; for understanding how well we were performing; and for improving the quality and impact of our work.  And for doing all this in partnership with local communities, organizations, and governments.

This meant being able to answer five basic questions:

  1. In light of our organizational Theory of Change, what are we going to do in each location, each year?
  2. How will we know that we are doing what we planned to do?
  3. How will we know that our work makes a difference and gets results consistent with our Theory of Change?
  4. How will we learn from our experience, to improve the way we work?
  5. How can community members and local partners directly participate in the planning, implementation, and evaluation of the development projects that ChildFund Australia supports?

Looking back, I feel that what we built and implemented to answer those questions – the ChildFund Australia “Development Effectiveness Framework” (“DEF”) – was our agency’s most important system.  Because what could be more important than the answers to those five questions?

*

I mentioned last time that twice, during my career with Plan International, we had tried to produce such a system, and failed (at great expense).  We had fallen into several traps that I was determined to avoid repeating this time, in ChildFund Australia, as we developed and implemented the DEF:

  • We would build a system that could be used by our teams with the informed participation of local partners and staff, practically – that was “good enough” for its purpose, instead of a system that had to be managed by experts, as we had done in Plan;
  • We would include both quantitative and qualitative information, serving the needs of head and heart, instead of building a wholly-quantitative system for scientific or academic purposes, as we had done in Plan;
  • We would not let “the best be the enemy of the good,” and I would make sure that we moved to rapidly prototype, implement, and improve the system instead of tinkering endlessly, as we had done in Plan.

I go into more detail about the reasons for Plan’s lack of success in that earlier article.

*

Here is a graphic that Caroline Pinney helped me create, which I used very frequently to explain how the DEF was designed, functioned, and performed:

Slide1

Figure 1: The ChildFund Australia Development Effectiveness Framework (2014)

In this article, I will describe each component of the DEF, outlining how the components relate to each other and to the five questions listed above.

However, I’m going to reserve discussion of three of those components for my next two articles:

  • Next time, I will cover #3 in Figure 1, the “Case Studies” that we produced.  These documents helped us broaden our focus from the purely quantitative to include the lived experience of people touched by the programs supported by ChildFund Australia.  Just as importantly, the Case Studies served as valuable tools for our staff, management, and board to retain a human connection to the spirit that motivated us to dedicate our careers to social justice;
  • And, after that, I will devote an article to our “Outcome Indicator Surveys” (#2 in Figure 1, above) and Statements of Impact (#12 in Figure 1).  The approach we took to demonstrating impact was innovative, very participatory, and successful, so I want to go into a bit of depth describing the two DEF components involved.

Note: I prepared most of what follows, but I have included and adapted some descriptive material produced by the two DEF Managers who worked in the International Program Team, Richard Geeves and Rouena Getigan.  Many thanks to them!

*

Starting Points

The DEF was based on two fundamental statements of organizational identity.  As such, it was built to focus us on, and enable us to be accountable for, what we were telling the world we were:

  1. On the bottom left of the DEF schematic (Figure 1, above) we reference the basic documents describing ChildFund’s identity: our Vision, Mission, Strategic Plan, Program Approach, and Policies – all agreed and approved by our CEO (Nigel Spence) and Board of Directors.  The idea was that the logic underlying our approach to Development Effectiveness would therefore be grounded in our basic purpose as an organization, overall.  I was determined that the DEF would serve to bring us together around that purpose, because I had seen Plan tend to atomize, with each field location working towards rather different aims.  Sadly, Plan’s diversity seemed to be far greater than required if it were simply responding to the different conditions we worked in.  For example, two Field Offices within 20 km of each other in the same country might have very different programs.  This excessive diversity seemed to relate more to the personal preferences of Field Office leadership than to any difference in the conditions of child poverty or the local context.  The DEF would help ChildFund Australia cohere, because our starting point was our organizational identity;
  2. But each field location did need a degree of flexibility to respond to its reality, within ChildFund’s global identity, so at the bottom of the diagram we placed the Country Strategy Paper (“CSP”), quite centrally.  This meant that, in addition to building on ChildFund Australia’s overall purpose and identity globally, we would also build our approach to Development Effectiveness on how we chose to advance that basic purpose in each particular country where we worked, with that country’s particular characteristics.

Country Strategy Paper

The purpose and outline of the CSP was included in the ChildFund Australia Program Handbook:

To clarify, define, communicate and share the role, purpose and structure of ChildFund in-country – our approach, operations and focus. The CSP aims to build a unity of purpose and contribute to the effectiveness of our organisation.

When we develop the CSP we are making choices, about how we will work and what we will focus on as an organisation. We will be accountable for the commitments we make in the CSP – to communities, partners, donors and to ourselves.

While each CSP will be different and reflect the work and priorities of the country program, each CSP will use the same format and will be consistent with ChildFund Australia’s recent program development work.

During the development of the CSP it is important that we reflect on the purpose of the document. It should be a useful and practical resource that can inform our development work. It should be equally relevant to both our internal and external stakeholders. The CSP should be clear, concise and accessible while maintaining a strategic perspective. It should reflect clear thinking and communicate our work and our mission. It should reflect the voice of children.  Our annual work plans and budgets will be drawn from the CSP and we will use it to reflect on and review our performance over the three year period.

Implementation of the DEF flowed from each country’s CSP.

More details are found in Chapter 5 of the Program Handbook, available here: Program Handbook – 3.3 DRAFT.  Two examples of actual ChildFund Australia Country Strategy Papers from my time with the organization are attached here:

For me, these are clear, concise documents that demonstrate coherence with ChildFund’s overall purpose along with choices driven by the situation in each country.

*

Beginning from the Country Strategy Paper, the DEF branches in two inter-related (in fact, nested) streams, covering programs (on the left side) and projects (on the right side).  Of course, projects form part of programs, consistent with our program framework:

Screen Shot 2018-05-28 at 2.16.30 PM

Figure 2: ChildFund Australia Program Framework

But it was difficult to depict this nesting in the two dimensions of a graphic!  So Figure 1 showed programs on one side and projects on the other.

Taking the “program” (left) side first:

Program Description

On the left side of Figure 1: derived from, and summarized in, the Country Strategy Paper, each Country Office defined a handful of “Program Descriptions” (#1 in Figure 1; some countries had three, others ended up with five).  Each one described how a particular set of projects would create impact together, as measured using ChildFund Australia’s Outcome Indicators – in other words, each was a “Theory of Change,” detailing how the projects included in the program linked together to create particular positive change.

The purpose and outline of the Program Description was included in the ChildFund Australia Program Handbook:

ChildFund Australia programs are documented and approved through the use of “Program Descriptions”.  All Program Descriptions must be submitted by the Country Director for review and approval by the Sydney International Program Director, via the International Program Coordinator.

For ChildFund Australia: a “program” is an integrated set of projects that, together, have direct or indirect impact on one or more of our agreed organisational outcome indicators.   Programs normally span several geographical areas, but do not need to be implemented in all locations; this will depend on the geographical context.  Programs are integrated and holistic. They are designed to achieve outcomes related to ChildFund Australia’s mission, over longer periods, while projects are meant to produce outputs over shorter timeframes.

Program Descriptions were summarized in the CSP, contained a listing of the types of projects (#5 in Figure 1) that would be implemented, and were reviewed every 3 or 4 years (Program Review, #4 in Figure 1).

To write a Program Description, ChildFund staff (usually program managers in a particular Country Office) were expected to review our program implementation to-date, carry out extensive situational analyses of government policies, plans and activities in the sector and of communities’ needs in terms of assets, aspirations and ability to work productively with local government officials responsible for service provision. The results of ChildFund’s own Outcome Indicator surveys and community engagement events obviously provided very useful evidence in this regard.

Staff then proposed a general approach for responding to the situation, and specific strategies which could be delivered through a set of projects.  They also had to show that the approach and strategies proposed were consistent with evidence of good practice, both globally and in-country, demonstrating that their choices were evidence-based.

Here are two examples of Program Descriptions:

Producing high-quality Program Descriptions was a surprising challenge for us, and I’m not sure we ever really got this component of the DEF right.  We probably struggled because these documents were rather abstract, and our staff weren’t used to operating at that level of abstraction.

Most of the initial draft Program Descriptions were quite superficial, and were approved only as place-holders.  Once we started to carry out “Program Reviews” (see below), however, where more rigor was meant to be injected into the documents, we struggled.  It was a positive, productive struggle, but a struggle nonetheless!

We persisted, however, because I strongly believed that our teams should be able to articulate why they were doing what they were doing, and the Program Descriptions were the basic tool for exactly that explanation.  So we persevered, hoping that the effort would result in better programs, more sophisticated and holistic work, and more impact on children living in poverty.

*

Program Reviews

For the same reasons outlined above, in my discussion of the “Program Descriptions” component of the DEF, we also struggled with the “Program Review” (#4 in Figure 1, above).  In these workshops, our teams would consider an approved “Program Description” (#1 in Figure 1) every three or four years, subjecting the document to a formal process of peer review.

ChildFund staff from other countries visited the host country to participate in the review process and then wrote a report making recommendations for how the Program under review might be improved.  The host country accepted (or debated and adjusted) the  recommendations, acted on them and applied them to a revision of the Program Description: improving it, tightening up the logic, incorporating lessons learned from implementation, etc.

Program Reviews were therefore fundamentally about learning and improvement, so we made sure that, in addition to peers from other countries, the host Country Office invited in-country partners and relevant experts.  And International Program Coordinators from Sydney were asked to always attend Program Reviews in the countries that they were supporting, again for learning and improvement purposes.

The Program Reviews that I attended were useful and constructive, but I certainly sensed a degree of frustration.  In addition to struggling with the relatively high levels of abstraction required, our teams were not used to having outsiders (even their peers from other ChildFund offices) critique their efforts.  So, overall, this was a good and very important component of the DEF, designed correctly, but our teams needed more time to learn how to manage the process and to be open to such a public form of review.

*

Projects and Quarterly Reports

As shown on the right-hand side of Figure 1, ChildFund’s field staff and partners carried out routine monitoring of projects (#6 in the Figure) to ensure that they were on track; this monitoring was the basis of their reporting on activities and outputs.  Project staff summarized their monitoring in formal Quarterly Reports (#7) on each project, documenting progress against project plans, budgets, and targets to ensure that projects were well managed.  These Quarterly Reports were reviewed in each Country Office, and most were also forwarded to ChildFund’s head office in Sydney (and, often, to donors) for review.

When I arrived, ChildFund Australia’s Quarterly reporting was well-developed and of high quality, so I didn’t need to focus on this aspect of our work.  We simply incorporated it into the more-comprehensive DEF.

*

Quarterly Output Tracking

As described last time, ChildFund developed and defined a set of Outputs which became standard across the organization in FY 2011-12.  Outputs were coded and tracked from Quarter to Quarter, project by project.  Some of the organizational Outputs were specific to a sector (such as education, health, or water and sanitation) or to a particular target group (such as children, youth, or adults).  Other Outputs were generic and might be found in any project: for example, training or awareness-raising, materials production, and consultation.

Organizational Outputs were summarized for all projects in each country each Quarter, and country totals were aggregated in Sydney for submission to our Board of Directors (#8 in Figure 1, above).  In March 2014 there were a total of 47 organizational Outputs – they were listed in my last article in this series.

One purpose of this tracking was to enhance our accountability, so a summary was reviewed every Quarter in Sydney by the International Program Team and our Program Review Committee.

Here is an example of how we tracked outputs: this is a section of a Quarterly Report produced by the International Program Team for our Board and Program Review Committee: Output Report – Q4FY15.
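To give a concrete sense of the mechanics, here is a minimal sketch of this kind of Quarterly roll-up, written in Python with pandas.  The output codes, project names, and counts are hypothetical, invented purely for illustration – they are not ChildFund’s actual coding scheme or figures:

```python
import pandas as pd

# Hypothetical quarterly output records: one row per coded Output, per project.
records = pd.DataFrame([
    {"country": "Cambodia", "project": "CB-012", "output_code": "EDU-03",
     "quarter": "Q4FY15", "count": 120},   # e.g. teachers trained
    {"country": "Laos", "project": "LA-007", "output_code": "EDU-03",
     "quarter": "Q4FY15", "count": 45},
    {"country": "Viet Nam", "project": "VN-021", "output_code": "WAT-01",
     "quarter": "Q4FY15", "count": 16},    # e.g. water points built
])

# Country totals for each Output code, each Quarter...
by_country = (records
              .groupby(["quarter", "output_code", "country"])["count"]
              .sum())

# ...then aggregated across countries for the Board-level summary.
org_totals = (records
              .groupby(["quarter", "output_code"])["count"]
              .sum())

print(by_country)
print(org_totals)
```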

*

Project Evaluations

ChildFund also conducted reviews or evaluations of all projects (#9 in Figure 1, above) – in different ways.  External evaluators were employed under detailed terms of reference to evaluate multi-year projects with more substantial budgets or which were significant for learning or to a particular donor.  Smaller projects were generally evaluated internally.  All evaluators were expected to gather evidence of results against output targets and performance indicators written against objectives.

*

All development effectiveness systems have, at their heart, mechanisms for translating operational experience into learning and program improvement.  In Figure 1, this is the central circle of the schematic, which feeds evidence from a variety of sources back into our organizational and Country Strategy Papers, Program Descriptions, and project planning and design.

Our program staff found that their most effective learning often occurred during routine monitoring through observation of project activities and conversations in communities with development partners.  Through thoughtful questioning and attentive listening, staff could make the immediate decisions and quick adjustments which kept project activities relevant and efficient.

Staff also had more formal opportunities to document and reflect on learning.  The tracking of outputs and aggregation each Quarter drew attention to progress and sometimes signaled the need to vary plans or redirect resources.

Project evaluations (#9 in Figure 1, above) provided major opportunities for learning, especially when external evaluators brought their different experiences to bear and offered fresh perspectives on a ChildFund project.

*

The reader can easily grasp that, for me, the DEF was a great success: a significant asset for ChildFund Australia that enabled us to be more accountable and effective.  Some more technically-focused agencies were busy carrying out sophisticated impact evaluations, using control groups and so forth, but that kind of effort didn’t suit the vast majority of INGOs.  We could benefit from the learnings that came from those scientific evaluations, but we didn’t have the resources to introduce such methodologies ourselves.  And so, though the DEF was not perfect, I am not aware of any comparable organization that succeeded as we did.

The system built on what I had learned over nearly 30 years, and I felt that it was designed comprehensively and working very well – but that was merely my opinion!

Given the importance of the system, relying on my opinion (no matter how sound!) wasn’t good enough.  So we sought expert review, commissioning two independent, expert external reviews of the DEF.

*

The first review, which was concluded in November of 2012, took place before we had fully implemented the system.  In particular, since Outcome Indicator Surveys and Statements of Impact (to be covered in an upcoming blog article) were implemented only after three years (and every three years thereafter), we had not yet reached that stage.  But we certainly were quite advanced in the implementation of most of the DEF, so it was a good time to reflect on how it was going.

In that light, this first external review of the DEF concluded the following:

The development of the DEF places ChildFund Australia in a sound position within the sector in the area of development effectiveness. The particular strength of ChildFund Australia’s framework is that it binds the whole organisation to a set of common indicators and outputs. This provides a basis for focussing the organisation’s efforts and ensuring that programming is strategically aligned to common objectives. The other particular strength that ChildFund Australia’s framework offers is that it provides a basis for aggregating its achievements across programs, thereby strengthening the organisation’s overall claims of effectiveness.

Within ChildFund Australia, there is strong support for the DEF and broad agreement among key DEF stakeholders and users that the DEF unites the agency on a performance agenda. This is in large part due to dedicated resources having been invested and the development of a data collection system has been integrated into the project management system (budgeting and planning, and reporting), thereby making DEF a living and breathing function throughout the organisation. Importantly, the definition of outcomes and outputs indicators provides clarity of expectations across ChildFund Australia.

One of the strengths of the DEF recognised by in-country staff particularly is that the DEF provides a basis for stakeholders to share their perspectives. Stakeholders are involved in identifying benefits and their perspectives are heard through case studies. This has already provided a rich source of information that has prompted reflection by in-country teams, the Sydney based programs team and the ChildFund Australia Board.

Significantly, the DEF signals a focus on effectiveness to donors and the sector. One of the benefits already felt by ChildFund Australia is that it is able to refer to its effectiveness framework in funding submissions and in communication with its major donors who have an increasing interest on performance information.

Overall, the review found that the pilot of the DEF has been implemented well, with lots of consultation and engagement with country offices, and lots of opportunity for refinement. Its features are strong, enabling ChildFund to both measure how much it is doing, and the changes that are experienced by communities over time. The first phase of the DEF has focused on integrating effectiveness measurement mechanisms within program management and broader work practices, while the second phase of the DEF will look at the analysis, reflection and learning aspects of effectiveness. This second phase is likely to assist various stakeholders involved in collecting effectiveness information better understand and appreciate the linkages between their work and broader organisational learning and development. This is an important second phase and will require ongoing investment to maximise the potential of the DEF. It places ChildFund Australia in a strong position within the Australian NGO sector to engage in the discourse around development effectiveness and demonstrate its achievements.

A full copy of this first review, removing only the name of the author, is attached here: External DEF Review – November 2012.

In early 2015 we carried out a second review.  This time, we had implemented the entire DEF, carrying out (for example) Statement of Impact workshops in several locations.  The whole system was now working.

At that point, we were very confident in the DEF – from our point of view, all components were working well, producing good and reliable information that was being used to improve our development work.  Our Board, Program Review Committee, and donors were all enthusiastic.  More importantly, local staff and communities were positive.

The only major concern that remained related to the methodology we used in the Outcome Indicator Surveys.  I will examine this issue in more detail in an upcoming article in this series; but the reader will notice that this second formal, external evaluation focused very much on the use of the LQAS methodology in gathering information for our Outcome Indicator workshops and Statements of Impact.

That’s why the external evaluator we engaged to carry out this second review was an expert in survey methodologies in general, and in LQAS in particular.

In that light, this second external review of the DEF concluded the following:

ChildFund Australia is to be commended for its commitment to implementing a comprehensive and rigorous monitoring and evaluation framework with learning at its centre to support and demonstrate development effectiveness. Over the past five years, DEL managers in Cambodia, Laos, Papua New Guinea and Vietnam, with support and assistance from ChildFund Australia, country directors and program managers and staff, have worked hard to pilot, refine and embed the DEF in the broader country programs.  Implementing the DEF, in particular the Outcome Indicator Survey using LQAS, has presented several challenges.  With time, many of the early issues have been resolved, tools improved and guidelines developed.  Nevertheless, a few issues remain that must be addressed if the potential benefits are to be fully realised at the organisational, country and program levels.

Overall, the DEF is well suited for supporting long-term development activities in a defined geographic area.  The methodologies, scope and tools employed to facilitate Outcome Indicator Surveys and to conduct Community Engagement and Attribution of Impact processes are mostly fit for purpose, although there is considerable room for improvement.  Not all of the outcome indicators lend themselves to assessment via survey; those that are difficult to conceptualise and measure being most problematic. For some indicators in some places, a ceiling effect is apparent limiting their value for repeated assessment. While outcome indicators may be broadly similar across countries, both the indicators and the targets with which they are to be compared should be locally meaningful if the survey results are to be useful—and used—locally.

Used properly, LQAS is an effective and relatively inexpensive probability sampling method.  Areas for improvement in its application by ChildFund include definition of the lots, identification of the sampling frame, sample selection, data analysis and interpretation, and setting targets for repeated surveys.

Community Engagement and the Attribution of Impact processes have clearly engaged the community and local stakeholders.  Experience to date suggests that they can be streamlined to some extent, reducing the burden on staff as well as communities.  These events are an important opportunity to bring local stakeholders together to discuss local development needs and set future directions and priorities.  Their major weakness lies in the quality of the survey results that are presented for discussion, and their interpretation.  This, in turn, affects the value of the Statement of Impact and other documents that are produced.

The DEF participatory processes have undoubtedly contributed to the empowerment of community members involved. Reporting survey results in an appropriate format, together with other relevant data, in a range of inviting and succinct documents that will meet the needs of program staff and partners is likely to increase their influence.

A full copy of this second review, removing only the name of the author, is attached here: DEF Evaluation – April 2015.
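For readers unfamiliar with it, LQAS (Lot Quality Assurance Sampling) classifies each “lot” (a supervision or program area) against a coverage target using a small sample and a binomial decision rule.  Here is a toy sketch of that classic rule, assuming Python with scipy; the sample size, thresholds, and error levels are illustrative textbook values, not the parameters ChildFund actually used:

```python
from scipy.stats import binom

def lqas_decision_rules(n, p_upper, p_lower, alpha=0.10, beta=0.10):
    """Find decision rules d for classifying a lot from n sampled respondents.

    A lot is classified as "reaching the target" if at least d respondents
    report the desired outcome.  A rule d is acceptable if:
      - a lot truly at coverage p_upper is rarely misclassified as failing
        (probability <= alpha), and
      - a lot truly at coverage p_lower is rarely misclassified as passing
        (probability <= beta).
    """
    rules = []
    for d in range(n + 1):
        risk_good_lot_fails = binom.cdf(d - 1, n, p_upper)      # P(X < d | p_upper)
        risk_bad_lot_passes = 1 - binom.cdf(d - 1, n, p_lower)  # P(X >= d | p_lower)
        if risk_good_lot_fails <= alpha and risk_bad_lot_passes <= beta:
            rules.append(d)
    return rules

# Classic textbook example: 19 respondents per lot, an 80% coverage target,
# and a 50% lower threshold.  This yields the well-known rule d = 13.
print(lqas_decision_rules(19, 0.80, 0.50))
```

The small fixed sample per lot (often 19) is what makes LQAS “relatively inexpensive”: it cannot estimate coverage precisely within a single lot, but it can classify each lot as above or below the target, and the lot samples can be pooled for a point estimate across the whole program area.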

*

Great credit is due to the ChildFund staff who contributed to the conceptualization, development, and implementation of the DEF.  In particular, Richard Geeves and Rouena Getigan in the International Program Team in Sydney worked very hard to translate my sometimes overly-ambitious concepts into practical guidelines, and ably supported our Country Offices.

One of the keys to the success of the DEF was that we budgeted for dedicated in-country support, with each Country Office able to hire a DEL Manager (two in Viet Nam, given the scale of our program there).

Many thanks to Solin in Cambodia, Marieke in Laos, Joe in Papua New Guinea, and Thuy and Dung in Viet Nam: they worked very hard to make the DEF function in their complex realities.  I admire how they made it work so well.

*

In this article, I’ve outlined how ChildFund Australia designed a comprehensive and very robust Development Effectiveness Framework.  Stay tuned for next time, when I describe climbing Mt Bond, and then go into much more depth on one particular component of the DEF (the Case Studies, #3 in Figure 1, above).

After that, in the following article, I plan to cover reaching the top of West Bond and descending back across Mt Bond and Bondcliff (and losing toenails on both big toes!) and go into some depth to describe how we carried out Outcome Indicator Surveys (#2 in Figure 1) and Statements of Impact (#12) – in many ways, the culmination of the DEF.

*

Here are links to earlier blogs in this series.  Eventually there will be 48 articles, each one about climbing one of New Hampshire’s 4000-footers, and also reflecting on a career in international development:

  1. Mt Tom (1) – A New Journey;
  2. Mt Field (2) – Potable Water in Ecuador;
  3. Mt Moosilauke (3) – A Water System for San Rafael (part 1);
  4. Mt Flume (4) – A Windmill for San Rafael (part 2);
  5. Mt Liberty (5) – Onward to Colombia, Plan International in Tuluá;
  6. Mt Osceola (6) – Three Years in Tuluá;
  7. East Osceola (7) – Potable Water for Cienegueta;
  8. Mt Passaconaway (8) – The South America Regional Office;
  9. Mt Whiteface (9) – Empowerment!;
  10. North Tripyramid (10) – Total Quality Management for Plan International;
  11. Middle Tripyramid (11) – To International Headquarters!;
  12. North Kinsman (12) – Fighting Fragmentation and Building Unity: New Program Goals and Principles for Plan International;
  13. South Kinsman (13) – A Growth Plan for Plan International;
  14. Mt Carrigain (14) – Restructuring Plan International;
  15. Mt Eisenhower (15) – A Guest Blog: Max van der Schalk Reflects on 5 Years at Plan’s International Headquarters;
  16. Mt Pierce (16) – Four Years At Plan’s International Headquarters;
  17. Mt Hancock (17) – Hanoi, 1998;
  18. South Hancock (18) – Plan’s Team in Viet Nam (1998-2002);
  19. Wildcat “D” Peak (19) – Plan’s Work in Viet Nam;
  20. Wildcat Mountain (20) – The Large Grants Implementation Unit in Viet Nam;
  21. Middle Carter (21) – Things Had Changed;
  22. South Carter (22) – CCF’s Organizational Capacity Assessment and Child Poverty Study;
  23. Mt Tecumseh (23) – Researching CCF’s New Program Approach;
  24. Mt Jackson (24) – The Bright Futures Program Approach;
  25. Mt Isolation (25) – Pilot Testing Bright Futures;
  26. Mt Lincoln (26) – Change, Strategy and Culture: Bright Futures 101;
  27. Mt Lafayette (27) – Collective Action for Human Rights;
  28. Mt Willey (28) – Navigating Principle and Pragmatism, Working With UUSC’s Bargaining Unit;
  29. Cannon Mountain (29) – UUSC Just Democracy;
  30. Carter Dome (30) – A (Failed) Merger In the INGO Sector (1997);
  31. Galehead Mountain (31) – What We Think About When We Think About A Great INGO Program;
  32. Mt Garfield (32) – Building Strong INGO Teams: Clarity, Trust, Inspiration;
  33. Mt Moriah (33) – Putting It All Together (Part 1): the ChildFund Australia International Program Team;
  34. Owl’s Head (34) – Putting It All Together (Part 2): ChildFund Australia’s Theory of Change.