What’s in Your Evaluation Toolbox 2.0?

“What’s in Your Evaluation Toolbox 2.0?” – such was the theme of the Canadian Evaluation Conference held in Vancouver earlier this year. Posed as a question, the theme challenged presenters and participants to share the evaluation tools they rely on to advance the field of evaluation. It also recognized that our evaluation toolbox is ever expanding, making evaluation a dynamic practice.

My main takeaways from the conference are based on a few (unfortunately, not all) of the sessions I attended.


1. Liberating Structures – Ecocycle

The hands-on presentation Assessing a portfolio of work using Ecocycle, a visual technique introduced participants to the world of Liberating Structures and walked through one such structure: Ecocycle Planning. Liberating Structures are innovative methods for engaging people creatively. Their premise is that our world is increasingly complex, interdependent, and culturally diverse, and that conventional structures, such as presentations, reports, open discussion, and brainstorming, are either too inhibiting or too disorganized for meaningful engagement.

Ecocycle Planning is a visual technique that allows groups “to view, organize, and prioritize current activities using four development phases: birth, maturity, creative destruction, and renewal”. By inviting participants to think of a project or plan and its related activities holistically rather than piecemeal, Ecocycle Planning helps everyone “to see the forest AND the trees”. In other words, the big picture does not get lost in the details, and the details do not overshadow the context.


2. Indigenizing Approaches to Public Health

This session consisted of two presentations focused on indigenizing approaches to public health. In the presentation Learning to “see with two eyes”: Insights from applying culturally-responsive evaluation strategies to an Indigenous health initiative, Zhang introduced the Two-Eyed Seeing approach. Etuaptmumk, or Two-Eyed Seeing, refers to learning to see from one eye with the strengths of Indigenous knowledges and ways of knowing, and from the other eye with the strengths of Western knowledges and ways of knowing … and learning to use both these eyes together, for the benefit of all. You can read more about the application of this approach in the Making it Work Project that PAN is co-leading with the AHA Centre at the Canadian Aboriginal AIDS Network (CAAN).

Zhang talked about Interior Health’s Aboriginal Health and Wellness Strategy and its focus on advancing cultural competency; ensuring meaningful participation of Indigenous populations in health care planning and decision making; improving health equity; and improving mental wellness for Indigenous people. One of the key components of culturally safe evaluation is the four “Rs”: Respect, Relevance, Reciprocity, and Responsibility (Kirkness and Barnhardt, 1991). As well, healing and storytelling need to be emphasized throughout the evaluation process.

A thought-provoking question raised after the second presentation, Strengthening community partnership through evaluation of the TB High Incidence strategy in Northern Saskatchewan, concerned the education burden: relying on local Indigenous communities to provide cultural translation for non-Indigenous researchers and evaluators, and the extent to which this shifts responsibility away from self-education and taking the time to get to know the community.


3. Real-Time Data Visualization

The presentation Real-time data visualization for developmental evaluation by Kischchuk was built around another hands-on activity and wowed attendees with the power of visualizing data as it is collected. By feeding data that attendees contributed through Google Forms into Tableau, we were able to see it visualized in real time. This minimal lag between collection and visualization opens up new ways of not only presenting evaluation results but also engaging stakeholders in reflecting on and interpreting those results, eventually incorporating elements of participatory evaluation.
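For the curious, here is a minimal sketch of the same “visualize as it arrives” idea in Python, not the presenters’ Google Forms + Tableau setup. It assumes the form’s responses land in a Google Sheet that has been published to the web as CSV; the sheet URL and the “answer” column name are placeholders, not details from the session.

```python
# A sketch of low-lag "collect and visualize" polling, assuming the form's
# responses are available as a published CSV. The URL and column name below
# are hypothetical placeholders.
import pandas as pd
import matplotlib.pyplot as plt

SHEET_CSV_URL = "https://docs.google.com/spreadsheets/d/<SHEET_ID>/export?format=csv"

plt.ion()  # interactive mode, so the chart refreshes without blocking
fig, ax = plt.subplots()

while True:
    responses = pd.read_csv(SHEET_CSV_URL)       # re-fetch the latest responses
    counts = responses["answer"].value_counts()  # assumes a column named "answer"
    ax.clear()
    counts.plot.bar(ax=ax)
    ax.set_title(f"Live responses (n={len(responses)})")
    plt.pause(30)  # poll every 30 seconds; the lag is just the polling interval
```

The design point is simply that the chart is redrawn from the live response set on every poll, so stakeholders can watch results form while data collection is still under way.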


4. Principles-Focused Developmental Evaluation

The last session of the day that I attended was titled Tools and Approaches for Principles-Focused, Developmental Evaluation: Lessons from Three Cases. It presented the results of studies conducted by the Centre for Health Evaluation & Outcome Sciences (CHEOS), including evaluations of three initiatives: Foundry, which provides low-barrier access to primary care, mental health, substance use, and social supports for youth aged 12 to 24; Megamorphosis (a made-up term), the transformation of residential care from an institutionalized medical model to a patient-centred model; and the Inpatient Psychiatry Redevelopment project. All of them utilized principles-focused developmental evaluation. Developmental evaluation suits situations that are highly emergent and volatile, difficult to plan, socially complex, and innovative. Combined with principles-focused evaluation, which makes the principles behind the programs, projects, and collaborations (e.g. patient-centred care, an integrated care model) the focus of evaluation, these evaluations ask: “Based on the evidence, how does the principle work, and with what results?”



5. Equity in Data Products

One of the highlights of the conference was the keynote presentation How to Avoid Using Data (Accidentally) Like a Racist – Equity in Data Products by Heather Krause, founder of We All Count, a project with a mission to demystify, democratize, and demonstrate data science. Heather’s presentation, focused on equity and ethics in data processes, debunked the myth of data objectivity and the perception that we as evaluators play no role in how data emerges. One of the main takeaways from the keynote is the importance of recognizing our own biases at every stage of evaluation (from the kinds of questions we ask to how we report the results) and being transparent about why we collect the data we collect, whom we collect it from, and how we will use it. Even calculating an average is not as straightforward as you might think. Watch this video to learn more.
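As a made-up illustration of that last point (the numbers and community names below are invented, not from the keynote): even choosing between the “average of group averages” and the “pooled average across respondents” weights people differently, and neither choice is neutral.

```python
# Invented numbers showing one choice hidden inside "the average":
# averaging each community's average weights communities equally, while
# pooling all respondents weights larger communities more heavily.
scores_by_community = {
    "Community A": [80, 85, 90],              # 3 respondents, mean 85
    "Community B": [50, 55, 60, 45, 50, 40],  # 6 respondents, mean 50
}

# Option 1: average of community averages (each community counts equally)
community_means = [sum(v) / len(v) for v in scores_by_community.values()]
average_of_averages = sum(community_means) / len(community_means)

# Option 2: pooled average (each respondent counts equally)
all_scores = [s for v in scores_by_community.values() for s in v]
pooled_average = sum(all_scores) / len(all_scores)

print(f"Average of averages: {average_of_averages:.1f}")  # 67.5
print(f"Pooled average:      {pooled_average:.1f}")       # 61.7
```

Which number is “the” average depends on whether you think communities or individuals should count equally, and that is exactly the kind of value judgment the keynote urged evaluators to make transparent.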


For more information about other sessions and presentations at CES BC 2020, check out the Twitter hashtag #EvalBC2020.



Questions? Feedback? Get in touch!
Alfiya Battalova, Evaluation Manager

[email protected]