UNFPA: Review of Evaluation Policy

9/5/2012 // Policy Director Berit Fladby presented this statement on behalf of Denmark, Finland, Sweden and Norway during the Second Regular Session of the Executive Board of UNDP, UNFPA and UNOPS, held 4-10 September 2012.

President,

I am delivering this statement on behalf of Denmark, Finland, Sweden and my own country Norway.

We take this opportunity to thank the Division of Oversight Services for the biennial report on evaluation and the UNFPA senior management for the management response. We commend the Executive Director for engaging the United Nations Office of Internal Oversight Services (OIOS) to undertake a review of UNFPA’s evaluation policy, and we thank OIOS for its report and UNFPA for the management response. We also appreciate the submission of the biennial evaluation plan 2012-13.

First to the review of the evaluation policy:

The adoption of UNFPA’s evaluation policy in 2009 was an important step forward, although the Executive Board expressed some concern regarding its clarity. The OIOS report confirms the need to revise UNFPA’s evaluation policy, and we appreciate the Executive Director’s commitment to do so. We expect the revision process to be guided by the norms and standards established by the United Nations Evaluation Group, especially the fundamental principle of independence, as well as by international best practices. We would welcome information on how the revised evaluation policy will be prepared and when in 2013 it is expected to be presented to the Executive Board.

We concur with the analysis and most of the recommendations made by OIOS. In particular, we would underline the need to establish a commonly understood vision for evaluation in UNFPA, based on the mandate of the organization and the strategic priorities defined in the revised strategic plan, and to clarify the respective purposes of centralized and decentralized (embedded) evaluations. It is equally important to define how evaluation findings will inform policy development, the strategic direction of country programmes and ways of doing business.

We further share the view that it is crucial to clarify the roles and responsibilities of the Division of Oversight Services and the Programme Division, as well as the collaboration between them. We are concerned that insufficient clarity in this regard has led to different interpretations, hampered effective collaboration and caused duplication of efforts.

We have a few specific comments on the role and independence of the central evaluation function, which we urge UNFPA to consider in the process ahead:

First, we would like to see a more efficient central evaluation function. The current fragmentation of the evaluation function, where two units at headquarters (the Evaluation Branch and the Programme Division) perform similar tasks, is neither efficient nor cost-effective. Core evaluation functions (such as conducting strategic evaluations, evaluation planning and guidance) should be carried out by the Evaluation Branch. This core function should be supported and facilitated by a stronger emphasis on establishing evaluability in programming. The Programme Division should enhance the planning of programmes so that they can be better evaluated, and strengthen results-based monitoring systems.

Second, we would like to recall decision 2009/18 (para 8d), in which the Executive Board expects that the “Chief of the Evaluation Branch in the Division of Oversight Services has the final say on the content of evaluation reports issued by the Division of Oversight Services.” In revising the evaluation policy, we expect this aspect to be included, as well as a proposal on whether the Director of DOS or the Chief of the Evaluation Branch should present the biennial evaluation report to the Executive Board. We recommend paying due attention to the fact that the purpose and methodologies of evaluation differ from those of auditing.

Third, funding is a fundamental criterion for ensuring the functional independence of the central evaluation function. In our view, it is the responsibility of the Executive Board to ensure that the central evaluation function receives sufficient core resources and has sufficient capacity to carry out its tasks and responsibilities. Unfortunately, the OIOS report does not include an assessment of the adequacy of the human and financial resources allocated to evaluation, as requested by the Executive Board in its decision 2010/26 (para 18). Our expectation for the future is that the Executive Board gains easier access to this kind of information, and we therefore recommend that evaluation be dealt with separately from auditing and investigation in the Integrated Budget 2014-2017, preferably through a separate budget line.

In addition to the above comments on the report, we would like to stress that an important purpose of the central evaluation function, besides ensuring the quality of decentralized evaluations by establishing criteria and assessing the quality of the evaluations undertaken by country and regional offices, is to undertake evaluations of organization-wide and cross-cutting issues. Such global or thematic evaluations are of immense importance to us as members of the Executive Board. In addition to their accountability aspect, global and thematic evaluation reports provide a good opportunity to engage in strategic discussions. It is therefore important that thematic evaluation reports are included in the agenda of the Executive Board, as called for in decision 2010/26. As also discussed during the informal session on evaluation in Geneva, our expectation is that the ongoing evaluation on maternal health is included in the agenda of the first regular session 2013.

Finally, we would like to stress the importance of integrating human rights-based approaches and gender equality into the evaluation function, using the UNEG guidance.

President,

Let me make a few comments on the Biennial report and the management response:

We are pleased to note that the coverage of country programme evaluations reached 100 per cent in 2011, a tremendous increase compared to 2009. This is a major achievement. One question in this regard: how many of the country programme evaluations were undertaken in the second-to-last year of the programme cycle, as previously committed to by UNFPA and supported by the Executive Board through decision 2010/26 (para 7)? We believe this is crucial in order to ensure learning that can feed into the formulation of strategic priorities and approaches in the next country programme cycle.

Clearly, UNFPA – like other UN agencies – struggles with the quality of decentralized evaluations. One prerequisite for good evaluations is results-based programming. We are confident that UNFPA’s recent efforts to strengthen results-based programming will also contribute to better quality in the decentralized evaluations. Equally important, however, are commonly accepted guidelines and training of staff based on those guidelines. We encourage UNFPA to take a second look at these issues when the evaluation policy is revised, and we expect the biennial report on evaluation to address the progress made.

Our last comment on the biennial evaluation report is that we request a more comprehensive description and analysis of the findings and recommendations of the decentralized evaluations, as called for in Executive Board decision 2009/18 (para 9).

President,

UNFPA, under the strong leadership of the Executive Director, has proven its willingness and ability to reform and to rectify shortcomings in the organization. We have full confidence that this will also be the case when it comes to making the necessary improvements in the evaluation function of the organization.

Thank you.
