The intermediate-level courses below have recently been offered by the CES-NCC.
See the Chapter's list of current offerings.
After Data Analysis: from Report Writing to Knowledge Translation
This full-day, intermediate-level workshop explores theory and techniques that can be applied to maximize the impact of evaluation results for clients. Going beyond writing techniques, the workshop discusses how to present evaluation findings in a way that is relevant and useful for clients, and how to produce meaningful conclusions for decision-making. Case studies and small-group work will be used to explore the following learning objectives:
- Reporting findings in comprehensive and effective ways (includes: audience identification, importance of ethics, limitations, methods, and findings);
- Crafting conclusions and developing recommendations for greatest impact (includes: issues of generalizability, political significance of findings, emphasis selection for conclusions, considerations of a policy perspective for recommendations); and,
- Fostering knowledge translation after the report is finalized (emerging practices and strategies in knowledge translation for disseminating evaluation findings for impact).
Participants will take away fresh ideas and useful tools and techniques to increase the use of evaluation findings for decision-making in their own program evaluation contexts.
Gail Barrington has more than 25 years of practical experience running her own consulting practice, Barrington Research Group, Inc. She has conducted over 130 program evaluations and applied research studies. In 2008, she won the Canadian Evaluation Society award for her Contribution to Evaluation in Canada. For nearly ten years, Gail has taught Master’s-level program evaluation at Athabasca University and she was recently appointed as an Assistant Professor at Michigan State University where she is designing a course on qualitative and mixed methods.
Her recent book, Consulting Start-up and Management: A Guide for Evaluators and Applied Researchers (SAGE, 2012), has been very well received, and she continues to provide training and to write on this topic.
Evaluation on a Shoestring Budget
Have you ever been in a situation where you have limited time, or a limited budget, to complete an evaluation? Of course you have – we’ve all been there! Are you interested in learning some techniques to overcome the constraints of limited time and/or money? This intermediate-level course, Evaluation on a Shoestring Budget, looks at practical, hands-on tools and techniques for conducting successful evaluations when you are under time, budget, and resource pressures.
Dr. Patricia Rogers is a professor of Public Sector Evaluation at RMIT University (the Royal Melbourne Institute of Technology), Australia, with more than 25 years of experience in public sector evaluation and research. Dr. Rogers has been recognized for her contributions to the field of evaluation through a number of prestigious awards from the Australasian Evaluation Society and the American Evaluation Association, and she co-authored the book Purposeful Program Theory: Effective Use of Theories of Change and Logic Models with Sue Funnell.
Qualitative Methods in Realist Evaluations
Instead of asking whether or not a program or intervention ‘works’, realist evaluation provides methods for determining “what works for whom in what contexts, in what respects, and how”. This type of theory-based evaluation is particularly important when new interventions are being developed; when interventions are being considered for replication or scaling up; when programs are complex or are being introduced in complex settings; or when previous evaluations of programs have found mixed outcomes.
Qualitative data is used in multiple ways in realist evaluations. This workshop will outline those ways, and the particular requirements that a realist approach implies for the nature of the information to be collected and analysed. Realist interviews are intended to be theory-building and/or theory-testing and to be iterative in nature. The implications for evaluation design, research ethics applications, interview techniques and data management will be explored.
This one-day workshop will focus on qualitative data (e.g. interviews and focus groups, qualitative components of surveys). The workshop will provide:
- A brief introduction to realist evaluation, its key assumptions, and the implications of taking a realist approach for evaluation design, implementation and analysis;
- An introduction to realist interviewing, including Pawson and Tilley’s idea of ‘teacher learner interviews’. Exercises will include writing and testing interview questions for different purposes in realist interviews;
- An introduction to issues and skills in realist qualitative analysis, with opportunities to practice realist qualitative analysis;
- Practical issues to be managed in the use of qualitative data for realist evaluations, including sample design, research ethics, data management, and other issues identified by participants.
This course is offered in English only.
Dr Gill Westhorp is a specialist in realist research and evaluation methodologies. She is Director of a small research and evaluation consultancy company specialising in realist approaches; a Professorial Research Fellow at Charles Darwin University, Darwin, Australia; an Associate at RMIT University, Melbourne, Australia; a member of the core team for the RAMESES I (standards for realist synthesis) and RAMESES II (standards for realist evaluation) projects based in Oxford, UK; and a member of the Advisory Committee for the Centre for the Advancement of Realist Evaluation and Synthesis (CARES) at Liverpool University, UK. She undertakes and leads realist evaluation and synthesis projects, and provides training in realist approaches. Much of her current work involves advising research and evaluation projects as a methodologist. She has been using realist approaches since 2002 and undertook her PhD with Prof Nick Tilley, co-author of the seminal text Realistic Evaluation.
Kim Grey is an internal evaluation manager and advisor for the Australian Government with 20 years’ experience designing and running complex evaluations involving impact, participatory and realist approaches. These have covered Indigenous and mainstream cross-government and community sector programs in safety, wellbeing, employment and education. Projects have often involved the intersection of inter-cultural research practice and social science theory, in support of policy development and decision-making. Over recent years, Kim has explored the benefits of a realist approach to cross-sectoral place-based programs and local research to generate transferable knowledge that can be used and reused in sensitive areas. Kim is a Fellow at Charles Darwin University and is currently undertaking postgraduate research into the use of substantive theory in evaluation for a Master's in Evaluation with the Centre for Program Evaluation at the University of Melbourne.