4B: Evaluation Implementation: Nurturing Growth through Innovation

4B1 A Discussion on Capacity-Building in Evaluation

Evaluation Capacity Building (ECB) refers to the systematic and intentional efforts to enhance the ability of individuals, organizations, or systems to conduct, use, and sustain evaluation. It involves developing the knowledge, skills, attitudes, and structures needed to design, implement, and utilize evaluations to improve programs, policies, and decision-making processes. This fireside chat will provide concrete examples of how evaluation can meaningfully contribute to organizational success as well as how evaluation capacity can be strengthened in public, international, and community-based organizations.

Dr. Isabelle Bourgeois, Full Professor, Faculty of Education, University of Ottawa
Isabelle Bourgeois, Ph.D., holds the position of Full Professor at the Faculty of Education, University of Ottawa. Her ongoing research focuses on measuring and building organizational evaluation capacity (EC) in the public and community sectors. Her main contributions in this field include an organizational framework of evaluation capacity, an online organizational EC assessment instrument, and an integrative review of the literature on evaluation capacity spanning 20 years. Dr. Bourgeois was the Editor-in-Chief of the Canadian Journal of Program Evaluation from 2017 to 2022. In 2017, she received the Karl-Boudreault Award for Leadership in Evaluation from the National Capital Chapter of the Canadian Evaluation Society, and in 2021, she received the Parenteau Award from Canadian Public Administration for best French-language article. She received the Contribution to Evaluation Award from the Canadian Evaluation Society in 2024.

Dr. Eric Champagne, Full Professor, Public Administration, School of Political Studies and Director, Centre on Governance, University of Ottawa
Eric Champagne, Ph.D., is Full Professor in Public Administration at the School of Political Studies and Director of the Centre on Governance at the University of Ottawa. He teaches courses on the theory and practice of public management and governance, public policy development, program evaluation, and risk management. His current research focuses on infrastructure financing and asset management in the context of multilevel governance (basic infrastructure, transport, water, and housing) and on transformations of the public sector in the aftermath of the pandemic (digitalization, telework, and hybrid policies). Before joining the University of Ottawa, Professor Champagne spent about ten years as a public sector reform specialist at the World Bank and as a senior strategic advisor to the Government of Canada. He is the Vice-President of the Performance and Planning Exchange (PPX), a not-for-profit organization dedicated to improving knowledge and practice in results-based management.

Dr. Robert Shepherd, Full Professor, School of Public Policy and Administration, Carleton University
Robert Shepherd, Ph.D., holds the position of Full Professor in the School of Public Policy and Administration at Carleton University. His ongoing research focuses on public sector reform, ethics in government, and understanding the conditions necessary for more effective governmental evaluation. He routinely teaches courses in policy and program evaluation, ethics, and advanced public management. Dr. Shepherd is the incoming Editor of Canadian Public Administration. He is also Program Supervisor of the Graduate Diploma in Public Policy & Program Evaluation. He is a former President of the Canadian Association of Programs in Public Administration and Chair of the Consortium of Universities for Evaluation Education.

 

4B2 Lessons Learned from a Clarificative and Implementation Evaluation of the City of Kingston’s Municipal Services

We conducted a clarificative and implementation evaluation of the City of Kingston’s municipal service processes (i.e., reach and quality of services, internal communication, and collection and use of service data). Our evaluation questions focused on 1) identifying the current service processes in place and the extent to which they were being implemented across departments, 2) what internal communication methods the City could leverage to improve service, 3) who municipal services are currently reaching, 4) how the City could improve its collection and use of service-related data, and 5) to what extent services are meeting resident needs. Challenges to our approach included the complexity of creating a program theory with both internal (City staff) and external (resident) components and the difficulty of procuring data to adequately assess the reach of services. Our findings highlighted gaps in the clarity, consistency, and documentation of internal processes related to service delivery, and showed that efficiency often competed with effectiveness in internal communication to complete requests. We also identified issues with adequately closing the loop on requests, according to both staff and residents, and found that resident engagement with online services is likely lower in regions outside the City core, where there are fewer in-person service centers. We learned that effective evaluation relies on effective project management, which we would aim to improve in future work, and on adequate scoping aided by the use of program theories.

Dr. Valerie Wood, Research and Evaluation Specialist, Canadian Armed Forces
Dr. Valerie Wood is a Research and Evaluation Specialist for the Canadian Armed Forces (CAF) Road to Mental Readiness (R2MR) program, a mental health and performance training program. She is responsible for evaluating the effectiveness and impact of the R2MR program and for leading its ongoing monitoring and evaluation activities. She also leads internal and external research projects that inform R2MR programming and resources and that focus on supporting the mental health and performance of CAF members.

 

4B3 From Theory to Practice: Examining the Development, Implementation, and Impact of Evaluation Policy

Over the past 50 years, evaluation has been a central function of the Government of Canada given its key role in the government’s expenditure management system and public reporting activities. This has resulted in the development of centralized federal evaluation policies to direct evaluation practice across federal departments and agencies. Such policies outline legislative requirements, as well as general expectations and guidelines for the federal evaluation function. In 2009, Bill Trochim argued that developing well-informed evaluation policies that can guide evaluation practice may be the most important issue facing the field of evaluation. This presentation shares the preliminary results of an empirical study examining the interpretation and implementation of the federal government’s Policy on Results by evaluation practitioners and users, its role in and impact on the capacity to do and use evaluations, and how contextual factors shape the policy-practice relationship. The findings provide a better understanding of the relationship between evaluation policy and organizational evaluation capacity, as well as the importance of adapting foundational practices to meet diverse contextual needs.

Élyse McCall-Thomas, Evaluation Manager, Natural Sciences and Engineering Research Council
Élyse McCall-Thomas is a Ph.D. student at the University of Ottawa. Her research focuses on the theory-policy-practice connection, with a specific focus on the implementation and impact of the Government of Canada’s 2016 Policy on Results on the capacity to do and use evaluations. Élyse is also an Evaluation Manager with the Natural Sciences and Engineering Research Council and has over 16 years of experience in research and evaluation. Her work spans many areas, including research funding, education, public health, mental health and addictions, and homelessness. Through her work, Élyse emphasizes considerations of capacity building, equity and inclusion, as well as the use of evaluation to support learning and evidence-based decision-making.

 

4B4 Obstacles to the Institutionalization of Program Evaluation in Africa: A Case Study from the Democratic Republic of Congo

In the contemporary paradigm of good public governance, program evaluation has become an indispensable tool, particularly in the field of international development. Its application is of paramount importance in the African context, where socio-economic challenges and the imperatives of public sector modernization call for rigorous, evidence-based approaches. Although adopted in the 2000s in many African countries in the name of good governance and program-based budgeting, program evaluation is struggling to become institutionalized on the continent. Yet its preponderant role in the democratization of public action, the rationalization of budgetary choices, and the strengthening of accountability and responsibility for public actions is unequivocal. Although several explanatory factors are highlighted at the continental level, analysis of institutionalization at the country and sector levels seems to offer a more interesting reading of causality. Using the case of the Democratic Republic of Congo, this analysis seeks to understand the challenges and obstacles to institutionalizing program evaluation in the African context.

Samuel Batumike Kabagale, Doctoral Student, Public Administration, University of Ottawa
Samuel Batumike Kabagale is a doctoral student in Public Administration at the University of Ottawa. He is an affiliated researcher with the Centre on Governance and the Centre de Recherche en Études Évaluatives at the University of Kinshasa. Professionally, he is a graduate of the École Nationale d’Administration of the DRC and has been a civil servant at the Ministry of Finance since 2015, where he has served as head of the office responsible for programming within the implementation of program-based budgeting. He is currently conducting research on the institutionalization of program evaluation, public finance reforms, and the prospects for incorporating the sustainability dimension into the evaluation process.