Evaluation is a dynamic practice. New tools and methods are constantly emerging. What’s a busy evaluator to do? Take a day to come learn from your peers! The CESBC 2018 Evaluation Conference is being held in Vancouver, BC, on Friday, November 30.

This conference is about opening up our evaluation toolboxes and sharing what we know and what we’ve learned. Every attendee should come away from the conference with something new to use in their work. This is an opportunity to share your most trusted tools, expand your toolbox, and build new connections.

Take a peek at our draft program to see what’s in store!

Speakers Program

OPENING WELCOME

Elder Roberta Price

Elder Roberta Price, from the Snuneymuxw and Cowichan First Nations, will lead us with the opening welcome. Roberta is the mother of four children and grandmother to eight beloved grandchildren. She is an Elder Advisor and research partner with many organizations and projects throughout our communities, including Vancouver Coastal Health, the UBC Learning Exchange and UBC School of Nursing, the National Indigenous Council on Diabetes, the Evaluation for Social Change and Transformational Learning certificate at SFU, and the Urban Indigenous Health and Healing Cooperative. She was also the Elder Advisor for the 2015 CES BC & Yukon and 2017 CES conferences, and we are honoured and pleased to have her join us again.

KEYNOTE SPEAKER

Dr. Sarah Schulman

Sarah is a Founder and Social Impact Lead of InWithForward, "a social design organization that makes human services more human". As a sociologist, Sarah is fascinated by what makes individuals, families, and policymakers tick. She’s worked with federal, regional, and local governments in six countries to shift how policies are made and measured. Since 2008, she's been working with social design organizations to launch award-winning social solutions, such as Burnaby-based Kudoz. She holds a Doctorate in Social Policy from Oxford University, where she was a Rhodes Scholar, and a Master of Education from Stanford University. In 2018, she's exploring how to bring stories of people on the margins to a broader audience as a Global Journalism Fellow.

Short Presentations Program

Short presentations are 20-minute presentations by one or more speakers on a specific tool, idea, or experience. Short presentations on related topics will be paired in 50-minute concurrent sessions. Sixteen short presentations have been selected for the conference, listed here alphabetically by title.

10+ Free or Low-cost Web-based Tools to extend your Evaluation Practice

As our field evolves, we are challenged to add more and more services to our evaluation toolkits. Data visualization, GIS mapping, and participatory methodologies have now become standard practices. How can a new practitioner or lone evaluation consultant provide these value-add services within our limited resources and existing expertise? In this presentation I will talk about 10+ tools that help me extend my evaluation practice without having to hire specialists or pay for ongoing subscriptions. I have found that there are many web-based platforms and apps that support almost every aspect of evaluation, including collaborative planning, data collection, data analysis, and reporting. The session will also enable participants to share their best go-to web-based tools.

Marla Steinberg is a professional evaluator living and working in Vancouver, British Columbia. For over twenty-five years she has been helping funders, foundations, governments, community-based organizations and practitioners increase their evaluation skills, measure their impact, and find ways to improve their programming.

Client Centred Design: Blending Developmental Evaluation + Service Design for Emergent Programs

Sehat is an innovative community-based health promotion program with the South Asian community in the Fraser Health Region. The program has experienced remarkable success in its uptake by the community, but evaluating it during its emergent phase required a skill set that went beyond what’s typically used in a traditional health authority setting. For a fresh perspective, the team collaborated with an external consultant, Denise Withers, who shook things up by introducing an innovation lab to research, develop and test new ways to change behaviour. The lab brought in service design, ethnography and developmental evaluation approaches that challenged our typical ways of working and evaluating. Denise guided us through exercises like developing personas and mapping participant experiences and provided pragmatic (and fun!) tools for reflecting on and challenging our assumptions. This session will showcase how some of these tools were applied and what we learned along the way.

Rachel Douglas is an Evaluation Specialist in Population and Public Health at Fraser Health. She supports programs from across the department in their evaluation projects and capacity building. Rachel has a Master’s in Public Health and a passion for research driven by real-world health system, clinical, and community contexts.

Deljit Bains has held health care roles in acute care, education, and population-based community care. As Leader of the South Asian Health Institute, her role involves helping to improve health and health outcomes for the South Asian population in a culturally appropriate way through innovative, evidence-based care.

Denise Withers is a design consultant who uses story as a framework for innovation. Drawing on her background as an award-winning filmmaker and educator, she’s worked for over 100 clients, won 8 international awards and started 4 innovation labs. Her latest book is “Story Design: The Creative Way to Innovate.”

Convergent Participation Model

This presentation will introduce a convergent participation model for evaluation, in which evaluators engage in a two-stage process to converge toward more similar descriptions and institutional quality ratings. The model will be presented and assessed in relation to institutional educational quality audit models. Its strengths and weaknesses will be discussed with respect to its support for the main goals of institutional quality assurance evaluation: (1) formative evaluation, (2) influence on practices, (3) institutional development, (4) social recognition, and (5) economic exchange.

Karen Belfer oversees the self-regulatory mechanisms for the public colleges in Ontario and is responsible for the operation of the Credentials Validation Service and the College Quality Assurance Audit Process. Prior to joining OCQAS, Karen was Dean of the Centre for Instructional Development and Applied Research at Vancouver Community College (VCC).

Data Collection Tools that are Accessible to Everyone - Using Arts-Based Methods in Evaluation

Arts-based methods have been shown to be an effective way to collect data from a wide variety of stakeholder groups and yield results that are not often captured by traditional evaluation data collection methods. This presentation will first share a rationale for using arts-based methods within an evaluation. Next, it will present three arts-based data collection methods that evaluators can use within their own practice: Photovoice, drawing, and Lego. For each method, this presentation will describe the methodology with an illustrative example from previous evaluation work, as well as the resources needed and the outcomes previously achieved. The three methods presented require limited artistic skill on the part of the evaluator or stakeholder and therefore can be used without significant training. Each method is a great way to get started using arts-based methods within one’s own evaluation practice.

Jennica Nichols is a Ph.D. student in Interdisciplinary Studies at UBC, with interests in mental health, evaluation, research-based theatre, and implementation science. She holds a Master in Public Health (Epidemiology, Global Health) from the University of Toronto (2012) and the Credentialed Evaluator designation from the Canadian Evaluation Society (2015).

Data Placemats for Different Tables

This presentation will discuss the use of multiple data placemat sessions in a recent evaluation of a fare enforcement program for bus rapid transit in King County, Washington. The first data placemat session was with the program staff who worked with and maintained the data, so the approach focused on getting feedback on more raw numbers. The second session was with department leadership and focused on sharing and getting feedback on preliminary analysis; program staff from the first session were present as well. This use of the tool met stakeholders at the level at which they generally engage with data, which allowed for more useful discussions and sharing of information. The presentation will discuss the approach and lessons learned about adapting this tool to the context of a large local government department, where staff at different levels have different concerns and potential uses for the evaluation.

Elise Garvey is a Senior Management Auditor with the King County Auditor’s Office in Seattle, Washington. Her work has included evaluations in public health, climate action planning, major capital projects and green building, law enforcement, and transit. Elise is the co-chair of the Government Evaluation TIG of the American Evaluation Association.

Engaging Indigenous Communities in Evaluation: Strategies, Tools and Approaches for Evaluators

The Truth and Reconciliation Commission of Canada (TRC) stressed the importance of monitoring, assessment, and reporting on various aspects of Indigenous health, welfare, and education. Evaluating the relevance and effectiveness of the social, educational, and health care programs and services implemented in Indigenous communities is necessary to ensure continuous program improvement and can play an important role in the reconciliation process. Undertaking program evaluations in Indigenous community contexts requires specific approaches that evaluators should implement. In this presentation, Eyyub will reflect on his own experience of undertaking program evaluation assignments in Indigenous community contexts and discuss some of the approaches and lessons he has learned along the way.

Eyyub Hajiyev, MSW, MBA is an independent management consultant who provides evaluation, research and performance management services to public, private and non-profit sector organizations. Over the last 10 years, Eyyub has conducted dozens of evaluations for a wide range of federal and provincial government departments, involving extensive consultations with representatives of Indigenous communities. He has visited over 60 communities and conducted numerous in-person interviews and focus group discussions with service recipients, community leadership, service delivery staff members and other stakeholders.

Help! Our Logic Model Contains 147 Potential Indicators – What Do We Do Now?

Developing an indicator set for dynamic, system-change initiatives presents unique challenges. You don’t quite know where you’re going or how you’re going to get there, but at the end of the day your stakeholders or funders still want to know whether you’ve arrived. So how do you develop an indicator set that is practical for all its users and reflective of an evolving initiative? How do you adjust it when the context changes? And what do you do when you realize course corrections are needed? Using examples from the evaluation of a major health research capacity-building initiative, we will reflect on the “journey” of an indicator set, from early development to final form. Drawing from both the evaluation and program implementation perspectives, we will reflect on lessons learned and describe a systematic process for generating, evaluating, and refining an indicator set for dynamic system-change initiatives.

Penny Cooper operates an independent evaluation consultancy based in Vancouver. She has twenty-five years’ experience in evaluation and research in Canada and Australia, and has worked in the not-for-profit, university and private sectors as both an external consultant and internal evaluator, primarily in health.

Our Developmental Evaluation Journey: Navigating Supports for Newcomer Refugees

As evaluation practitioners, how often do we reflect upon our own experiences? How can these reflections go hand-in-hand with an evaluation in progress? How can we integrate real-time client learning into coaching interventions while still maintaining the integrity of an evaluation exercise? What are the benefits and considerations for capturing and sharing these reflections? Presenters will engage participants in navigating these questions, based on a recent developmental evaluation (DE) of an initiative to support newcomer refugees. Discussions will focus on key watershed moments from this evaluation, including organizational readiness; project design, flexibility and adaptability; core evaluation questions; relationship building; power dynamics; and the influence of cultural differences. This presentation will explore why evaluators’ reflections are important, which evaluation components are worth monitoring, and when sharing evaluator experiences would be of value to partners and participants. Up to three significant lessons or tools will be identified for future application.

Kim Walker is a community development and environmental management consultant with a career spanning 35 years. Her developmental approach focusses on solutions and actions to build capacity and leadership. Evaluation topics have included abuse counselling, land use, Indigenous organizations, teleworking, international qualifications recognition, immigrant and refugee services, and environmental enterprise development.

Elaina Mack is committed to empowering children and their families to thrive through engaging, empathetic and evidence-based approaches. She has more than 10 years of experience in applied research, evaluation and project management to support initiatives across Canada and internationally related to child rights and well-being, community development, and newcomer settlement.

Pinning down context: using a new resource to define a slippery concept

Why are we stymied by context? Understanding the context in which interventions are embedded helps us better understand how, and how well, they work. We are urged to report more thoroughly on the context of interventions, yet given the cursory treatment in many studies, capturing context remains a challenge. The Canadian Evaluation Society’s 2017 national conference featured a lively workshop on context, which highlighted that there is no clear consensus on its definition. The Context and Implementation of Complex Interventions (CICI) framework (https://bit.ly/2ty5Jmt) was developed to provide guidance for reporting context. We recently used CICI to review interventions aimed at increasing urban bicycling—a behaviour highly sensitive to context. Our next step is to apply what we learned from our scholarly review and use the CICI framework to operationalize relevant aspects of context for a “BikeHost” multi-site intervention and evaluation plan. We share this process in our presentation.

Stephanie Sersli is a doctoral candidate at SFU with an interest in healthy and just cities. She uses applied mixed methods and critical lenses to investigate potential solutions that make it easier for all people, but especially women, to use bicycles more and cars less.

Alyshia Burak is the Bike Education Manager at HUB Cycling, a non-profit in Vancouver that aims to get more people bicycling for transportation through education, action and events. She oversees HUB’s urban bicycling education programming which reaches 7000+ participants annually with courses for children, adults, families, and newcomers to Canada.

Pivoting to participatory evaluation amid an opioid overdose epidemic: Perspectives from a community-based funder of mental health and substance use programming

How do you re-envision a funder’s approach to evaluation amid an overdose crisis, leadership changes and major policy shifts in BC’s community-based mental health and substance use sector? This presentation describes the fearless and winding evaluative journey undertaken by Community Action Initiative in learning to support and reflect the needs and challenges faced by those on the front-line of BC’s overdose crisis: community-based service providers.

Jennifer Alsop has a rich background in program evaluation and policy research in support of public health and health promotion initiatives. She currently works at Community Action Initiative, a funder of community-based mental health and substance use programming in BC, where she leads evaluation, research and knowledge exchange initiatives.

The use of the AMERQI Framework to bring together Evaluation, Monitoring, and Quality Improvement to promote the use of data and information in a complex healthcare transformation project

Commonly used approaches to assessing programs and services (audit, monitoring, evaluation, research, and quality improvement) are often ill-defined and conflated in healthcare, both in the literature and in practice. Our team created a framework highlighting the distinctive features of the approaches, and where they overlap, with the aim of fostering a more comprehensive understanding of the approaches and how they can be leveraged to provide value to healthcare organizations. In this presentation, we will reflect on our recent experience of bringing together these approaches during the implementation of new clinical practices and a shared electronic health record at Lions Gate and Squamish General Hospitals. We will discuss how the framework helped us to clarify roles and develop relationships with Decision Support, Quality, and Clinical Informatics to monitor the progress of the implementation, work towards quality improvement in how the system is implemented, and gather rich data for the evaluation.

Dr. Beth Snow is the Head of Program Evaluation at the Centre for Health Evaluation & Outcome Sciences (CHÉOS) in Vancouver and is leading the evaluation of the Clinical & Systems Transformation Project, a joint undertaking of Vancouver Coastal Health, Provincial Health Services Authority, and Providence Health Care. She is the president of the Canadian Evaluation Society BC & Yukon Chapter and a Credentialed Evaluator.

Dr. Alec Balasescu is an anthropologist, adjunct professor at Simon Fraser University, and Evaluation Specialist for the Clinical and Systems Transformation Project, BC. He has published extensively on ethnographic methods, material culture, human-machine interaction, and climate change. His current interests are at the intersection of healthcare, AI, and climate change.

Joyce Cheng is an Evaluation Specialist at Vancouver Coastal Health for the Clinical & Systems Transformation Project, a joint undertaking of Vancouver Coastal Health, Provincial Health Services Authority, and Providence Health Care. She is also a recent MSc graduate from the University of Victoria’s School of Health Information Science.

Abdulkader Kadernani completed his Master of Health Administration in 2016 from The University of British Columbia. He previously held executive positions at a community hospital in Kenya before moving to Canada to join the Clinical and Systems Transformation Project Evaluation Team.

Stephanie Parent has a Master of Public Health from Simon Fraser University. She is currently an Evaluation Specialist for Providence Health Care and is working on the evaluation of British Columbia’s Clinical and Systems Transformation Project.

Theory of Change: A Planning and Evaluation Tool for the Not-For-Profit Sector

Theory of Change is a planning and evaluation methodology that has proven particularly useful in the not-for-profit sector. The not-for-profit sector is characterized by two features: (1) activities are typically constrained by funders/donors, and (2) not-for-profit organizations are governed by volunteer boards who have the power to set organizational direction but may not have in-depth knowledge of the organization’s activities. Vantage Point has found Theory of Change to be a particularly helpful tool to support planning and evaluation with boards and staff teams, and to communicate program activities and impact to funders. This workshop will introduce Theory of Change and reflect on the Vantage Point experience to describe the benefits for the not-for-profit sector.

Kathleen Lane is the Manager, Learning & Evaluation at Vantage Point. She holds an MBA from UBC’s Sauder School of Business. Before joining Vantage Point, Kathleen spent several years consulting in the not-for-profit sector and managing research-to-action projects at UBC.

Amnit Litt holds a Bachelor of Arts in Economics (Honours) from SFU and a Master of Science in Economics from LSE. She is passionate about evidence-based policy, supported by rigorous quantitative evaluation. Currently, in her role as Evaluation Coordinator, Amnit helps evaluate the programs and services at Vantage Point.

Two Sides of the Same Coin? Reflections on Challenges and Opportunities in Evaluating Healthy Community Programs in Neighbouring Provinces

The BC Healthy Communities Society and Alberta Recreation and Parks Association both oversee well-established provincial healthy communities initiatives that are guided by principles of the WHO healthy cities/communities approach. Our programs have supported hundreds of urban, rural and Indigenous communities in each province to advance local action for healthy communities. While the complexity and diversity of this work makes it exciting, it sure keeps us on our toes when it comes to evaluation! In this session, we’ll engage participants in reflective dialogue as we share some of the lessons we have learned about measuring program outcomes, related to topics such as integrating program theories into evaluation frameworks; navigating the complexity of outcome attribution; evaluating long-term change processes in the context of short-term funding models; and deciding on methods and tools for assessing community change.

Lisa McLaughlin is Program Manager of the Communities ChooseWell initiative at the Alberta Recreation and Parks Association where she leads the development and implementation of program evaluations. Lisa has also worked in school health promotion and chaired a health authority community health council. Lisa is currently completing her Master in Public Health (Health Promotion).

Diana Gresku is the Research and Impact Specialist for BC Healthy Communities Society and is developing a provincial program evaluation framework. She also supported a national program evaluation for the Public Health Agency of Canada and is passionate about developing communities’ evaluation capacity. Diana has a Master in Public Health (Global Health).

Usability testing: An evaluator’s experience and lessons learned

With more organizations relying on websites and online resources for service efficiency and knowledge translation, evaluators may find themselves conducting usability evaluation. Usability refers to the quality of a user's experience when interacting with products or systems, including websites. In short, usability is about effectiveness, efficiency, and the overall experience of the user with the platform at hand. Traditionally, usability testing has been conducted by usability specialists and designers, but evaluators with the right tools can similarly perform usability tests and provide recommendations to enhance user experience in light of broader goals and anticipated outcomes. In this presentation, the concept of usability (what is it?), its benefits (why do it?), the processes and common methods (how to do it?), and lessons learned will be discussed. Particular examples will be provided from usability testing of a cultural-humility online resource tool, an online health appointment booking website, and an online health information directory.

Shabnam Ziabakhsh is an experienced program evaluator with more than 15 years of experience in conducting applied research and program evaluation. Shabnam is currently the Evaluation Specialist at BC Women’s Hospital + Health Centre. Shabnam has a PhD in Social Psychology from Simon Fraser University.

What do I need in my Evaluation Toolbox?

"What tools do I need to be an evaluator?" Is a question frequently asked – by both new and experienced evaluators. New evaluators may be looking for a basic skill set. They may want to know how it fits with the skills they already bring to evaluation. Experienced evaluators may be looking for more specialized skills for a specific evaluation. Or those they see as new/emerging and thus relevant to their work in a more general sense. This session offers evaluators a look at what tools employers are asking for. (And a chance to reflect on which they already have or would like to have.) It presents the analysis of 40 recent program evaluation job postings - 25 in BC and 15 elsewhere in Canada (including federal). It focuses on two areas: a) work to be performed and key responsibilities (what you would do) and b) qualifications, experience and skills (what you will bring).

Diana Tindall's research on performance has helped organizations strategically realign activities, fine-tune existing processes, and demonstrate to stakeholders the value of initiatives for more than 25 years. Diana has worked on federal, provincial and not-for-profit projects across multiple sectors. She is currently an Evaluator with the Rick Hansen Institute.

Would you support this program? Role playing for participatory evaluation reporting

This short presentation will demonstrate a role-playing exercise that can be used to assess the strengths, gaps, and lessons learned by internal and external program stakeholders. The evaluator presents findings that address the key evaluation questions. Small groups review the results as if they were donors, key policy makers, or other appropriate influencers. Using a structured process, they describe the key findings that they find most compelling and what further information they might need to influence future funding or programmatic decisions. By sketching out a business plan, stakeholders provide valuable interpretation and judgement on programmatic results in a participatory manner.

Karen Snyder, PhD MPH, has been improving policies and programs in anti-trafficking, global health and the environment for over 20 years. Her work with small and large civil society organizations, academia, and government agencies is grounded in rigorous and systematic processes that involve the participation and voice of all stakeholders.


Demonstrations Program

Demonstrations are 45-minute sessions providing in-depth instruction on how to use an evaluation tool or method. Seven demonstrations have been selected for the conference, listed here alphabetically by title.

Anecdote Circles: Reveling in the Messiness

For those of us working in complexity (most of us), interviews, surveys and focus groups can be frustrating because we are constrained by a pre-determined set of questions or lines of inquiry. How can we dig into the messiness and make sense of it? Anecdote Circles can be used to collect stories for evaluating intangible, “difficult-to-evaluate” work, as well as for developing insights into teams and organizations and for exploring themes through experiences and stories. This presentation will describe the Anecdote Circle methodology, share an example of how the presenters used it with Vancouver Foundation, and finally demonstrate how Anecdote Circles work.

Trilby Smith is the Director of Learning and Evaluation at Vancouver Foundation. She supports the Grants and Community Initiatives department with meaning-making and sticky note and marker usage. Trilby has been an evaluator for over 20 years and is driven by the core value of involving those most impacted by evaluation in the learning process.

Chris Corrigan is a process artist, a teacher and a facilitator of social technologies for face to face conversation in the service of emergence. He and his partner Caitlin Frost run Harvest Moon Consultants, Ltd., offering coaching and facilitating to organizations and leaders.

Exploring the Landscape of Complexity and Systems Thinking in Evaluation: Reflections on the application of concepts, methods, and tools (PANEL)

Although there is increasing interest in complexity and systems thinking in the world of evaluation, there are few examples of how such thinking can be applied in concrete ways. The aims of the session are to present three people’s perspectives on complexity and systems thinking approaches to evaluation and to engage conference participants in a lively discussion about their own reflections and experiences. This panel brings together three people from different backgrounds who have applied complexity and systems thinking to evaluation work. They will discuss: what literature has influenced them in understanding complexity and systems thinking; how they have applied various concepts, methods and tools (including new tools and/or the application of traditional tools in new ways) in evaluations; what has worked well/not well; and their overall reflections on complexity and systems thinking and recommendations for future evaluations.

Dr. Beth Snow is the Head of Program Evaluation at the Centre for Health Evaluation & Outcome Sciences (CHÉOS) and is leading the evaluation of the Clinical & Systems Transformation Project. She is the president of the Canadian Evaluation Society BC & Yukon Chapter and a Credentialed Evaluator.

Lori Baugh Littlejohns has spent more time in Australia than Canada for the past six years, completing a PhD and then working with The Australian Prevention Partnership Centre. Studying complex systems and complex health promotion policy and practice is her passion.

Amy Salmon is a Scientist and Program Head for Knowledge Translation at the Centre for Health Evaluation and Outcome Sciences, and a Clinical Assistant Professor at UBC’s School of Population and Public Health.

Rachel Douglas (panel moderator) is an Evaluation Specialist in Population and Public Health at Fraser Health. She supports programs from across the department in their evaluation projects and capacity building. Rachel has a Master’s in Public Health and a passion for research driven by real-world health system, clinical, and community contexts.

Participatory Approach to Evaluation: Democratizing Evaluation and Embodying Social Inclusion

The Pacific AIDS Network (PAN) is a community-based network supporting HIV/HCV organizations in BC. PAN uses a highly participatory approach to evaluation and will share learnings on the benefits, needed resources, and tools for this approach. PAN integrates community-based research (CBR) principles into evaluation, such as ensuring that projects are collaborative, change-oriented and inclusive; have built-in supports to ensure participation of people with lived experience (PWLE); and are grounded in methodological rigor and sound ethical practices. We have accomplished this by bringing people most affected to the decision-making tables, building accessible evaluation training modules, and supporting Peer Evaluators to lead the work. To be successful, teams need to allow for extra time and resources and develop a culture of trust and learning. Impacts include more relevant and nuanced evaluation questions, greater use of evaluation findings, empowering PWLE to lead evaluations, and democratizing evaluation and supporting social inclusion.

Janice Duddy is the Director of Evaluation and Community-Based Research at the Pacific AIDS Network (PAN). She has worked at PAN since 2013 and has been the Director of Evaluation and Community-Based Research (CBR) since 2016. Prior to her work at PAN she worked at the Provincial Health Services Authority.

Mona Lee, Manager of Evaluation at the Pacific AIDS Network, started her evaluation journey as a Master of Public Health student at SFU. Now championing a participatory approach and shared measurement frameworks in evaluation, she supports a number of evaluation projects in partnership with community-based organizations across BC and Canada.

Paul Kerber is the Evaluation Coordinator at the Pacific AIDS Network (PAN). He works on a diverse group of evaluation projects in British Columbia and Canada.

Partnering with the BC Health Care Industry using Business Intelligence (BI) and Tableau as key tools

WorkSafeBC, which previously relied on static reports, has leveraged business intelligence (BI) and Tableau to create an interactive scorecard, in partnership with the BC health authorities and the Health Employers Association of BC (HEABC), to evaluate occupational health and safety (OH&S) effectiveness for the health care industry. We intend to provide an overview of the data collection process and collaboration methods, give a live demonstration of the interactive tool used for evaluation (including best practices), discuss further opportunities, and engage the audience to consider BI and other practical applications and software for evaluation. The co-creation of a BI solution integrates technical and situational practice to evaluate workplace health and safety. The evaluation approach involves design, data sharing and collection, analysis, and interpretation, and has provided a framework for applying evaluative thinking with stakeholders to the unique issues of the health care industry.

Diana Liang is interested in applying business intelligence to create meaningful and actionable solutions, and in analyzing shared data to evaluate the effectiveness of programs using different techniques and emerging tools. She holds a Bachelor’s degree in Health Sciences and a Master of Business Administration. She is currently a Senior Research Analyst with WorkSafeBC.

Rob Sturrock has over 25 years of experience in program evaluation, analysis, and business intelligence. He is currently a manager in Business Intelligence & Analytics at WorkSafeBC. He has worked in criminal intelligence with the Organized Crime Agency of BC and the Correctional Service of Canada. He has a Master’s degree in Criminology and is a Certified Six Sigma Black Belt.

Dana Rugina was born in Romania and came to Canada in 1996. She holds a Master’s degree in Commerce from the University of British Columbia and a PhD from the University of Orsay, France. She has worked as a Senior Research Analyst with WorkSafeBC for 16 years.

Rapid prototyping as a research and evaluation tool

In many social and public services, evaluation often happens at the end of programs. In a complex world with many moving pieces and factors, this may not be the best strategy for understanding what works well for whom. Design methods, especially the rapid prototyping technique, offer a different way of evaluating. Rapid prototyping is a technique for understanding people's tacit and latent needs by making ideas and concepts tangible, something interviews and observation often can’t get to because they focus only on explicit and observable knowledge. It is also an iterative technique geared toward understanding “what could be”. With rapid prototyping, evaluation can run in tandem with programs and services (instead of at the end) and learning can be looped back into them to make quicker, cheaper and more effective improvements. This helps policy makers and service providers get closer to their goals of designing human-centred services and creating better outcomes.

Muryani Kasdani is passionate about designing things that inject inspiration and bring delight into people’s lives. For the past four years, she spent time doing qualitative research with marginalized populations and designing alternatives to enable human flourishing. The community learning exchange that she helped found, Kudoz, was showcased as one of the most disruptive social innovations in numerous publications.                         

Starting with the CES Program Evaluation Standards

The CES Program Evaluation Standards are one of three pillars of practice on which the CES professional designation program is based. Where did they come from? Who created them? Why are they important? How, where, and when do evaluators use them? This participatory session will enable participants with limited awareness of the standards to increase their knowledge and recognize ways of using them in their evaluation work. This demonstration will be a ‘must’ for students, new and emerging evaluators, and evaluators planning to apply for the Credentialed Evaluator designation. It will also be useful to practicing evaluators who want to make better use of the standards as a set of tools in their toolbox.

Sandra Sellick is an evaluation consultant (evaluationlink.ca) and associate faculty for Royal Roads University. As a CES member, she has volunteered as BC Interior coordinator, 2010 national conference co-chair, CESBC member at large, 2017 national conference program co-chair, National Board member, and book reviewer for the Canadian Journal of Program Evaluation.

The Evaluation Challenge: Thinking Fast and Learning Deep

In this fast-paced Evaluation Challenge, three seasoned evaluators will be presented with a case developed for the session. Audience participation will be encouraged to define the context of the Evaluation Challenge and add elements of complexity they feel are relevant. Challengers will be asked to present a conceptual framework and methodology. A panel of diverse evaluation voices will ask the challengers questions to understand why they made the choices they did, and what value it brings to the evaluation project. Panelists will include different sector perspectives and emerging evaluators. Audience members will gain an understanding of how evaluators adapt their approaches to suit contextual requirements and how the same case can be approached in a number of different ways.

Sarah Farina is the founder of Broadleaf Consulting, where she leads planning processes, creates evaluation frameworks, and conducts evaluations, primarily in health and community development. She is an adaptive leader and a skilled facilitator who uses collaborative approaches to help non-profit organizations, government and funders turn their aspirations into reality.

Marla Steinberg is a professional evaluator living and working in Vancouver, British Columbia. For over twenty-five years she has been helping funders, foundations, governments, community-based organizations and practitioners increase their evaluation skills, measure their impact, and find ways to improve their programming.

Posters Program

Posters share insights and information about a topic in an engaging graphic format. Posters will be displayed in a common, central area throughout the conference. Presenters will discuss and answer questions about their poster during the scheduled presentation session. Four posters will be presented at the conference.

Creation of the Employer Health and Safety Planning Toolkit to help employers evaluate their company’s health and safety performance for injury prevention and management and effectively create safety programs for their workers

The Business Intelligence & Analytics department at WorkSafeBC uses various innovative tools to encourage and help employers and workers prevent injuries and better manage current injuries. We intend to present a poster on a toolkit that includes a suite of applications used by employers to evaluate their health and safety performance. The poster will present high-quality visuals of the data collection method and process, and show how employers can use this solution to better equip themselves with health and safety tools.

Diana Liang is interested in applying business intelligence to create meaningful and actionable solutions, and in analyzing shared data to evaluate the effectiveness of programs using different techniques and emerging tools. She holds a Bachelor’s degree in Health Sciences and a Master of Business Administration. She is currently a Senior Research Analyst with WorkSafeBC.

Jimmy Lin is currently a Business Intelligence Solutions Specialist at WorkSafeBC. He has a strong passion for translating requirements into thoughtful design while leveraging various BI tools and techniques to deliver high-value solutions. He holds a Bachelor’s degree in Finance and Business Technology Management from UBC’s Sauder School of Business.

How We Learn New Things

Information and resources on evaluation, building and maintaining consulting businesses, data visualization, and other topics bombard us on a daily basis. Evaluators are often generalists and it can feel overwhelming to have to be experts at everything. This interactive poster will be an opportunity for attendees to share their go-to resources, as well as where attendees have contributed to evaluative learning. Collecting information on the resources that conference attendees utilize regularly will have several benefits. New evaluators will learn what resources are considered valuable by more experienced evaluators, while seasoned folks may learn about resources that they had not previously considered. Conference attendees who want to build their reputations and skills as presenters will learn where to share their ideas.

Karen Snyder, PhD MPH, has been improving policies and programs in anti-trafficking, global health and the environment for over 20 years. Her work with small and large civil society organizations, academia, and government agencies is grounded in rigorous and systematic processes that involve the participation and voice of all stakeholders.

Program Evaluation of Simulation Based Education in Surgical Training

This program evaluation examined the power of using simulation technologies in surgical education to improve the learning experience of residents. A survey was administered to 45 residents in the surgical skills lab of one of the top medical centers in the United States to examine residents’ perceived learning outcomes, self-confidence, improved examination scores, mastery of procedures related to patient safety, and possible limitations they face during training. Methodology: A paper survey was handed to surgical residents; the response rate was 97%. Results: Survey results indicated that residents are generally satisfied with clinical and cognitive learning outcomes, applying knowledge, and improving their self-confidence in treating real patients. However, they rated their experience at the skills lab as “average”. Recommendations: A list of actionable, resident-led recommendations for leaders of the surgery department included changing the current location of the skills lab and dedicating time for residents to work in the skills lab.

Dina Hesham Esmat Khorshed, MD, MHPE earned her master's degree in Health Professions Education from the Warner School of Education, USA. Her passion for program evaluation began during her master's studies, leading her to pursue a certificate in program evaluation to learn how to embrace unheard voices in education.

We Asked, Listened, Adapted, and Validated – A Culturally-Sensitive Approach to Evaluation

This presentation focuses on multiple strategies used to fit the evaluation within the culture of primary care providers and with the needs of the South Asian (SA) community. In 2015, Fraser Health embarked on a 3-year regional quality and innovation initiative to improve early diagnosis and management of early dementia through three main streams of activities: (1) integrated, interprofessional collaborative practice guidelines; (2) competency-based dementia education designed to support family physicians and nurse practitioners (Mentees) in their care of patients with early dementia by grouping them with Specialists (Mentors); and (3) a public awareness and education campaign targeted to the SA community. High-level summary of strategies: an evaluation lens was embedded from the planning stage and woven throughout the project; representatives of SA communities and physicians were engaged members of the evaluation working group and provided continuous input; and findings from process and outcome evaluation were used for decision making and quality improvement.

Golareh Habibi received her BSc in 2005 and her MSc in 2008 from UBC. She also received her Master of Public Health from SFU in 2015. Golareh has worked as a researcher since 2002; she joined Fraser Health Primary Care in 2012 as an Evaluation Specialist.

Jeevan Sangha received her Bachelor of Health Sciences in 2018 from Simon Fraser University. Jeevan has worked as a Research and Evaluation Assistant since March 2017 with Fraser Health on the Regional Dementia Strategy and as a Medical Staff Events Assistant with the Health System Redesign Funding team.



