Interpreting Data in Participatory Research and Evaluation Projects

How PAN’s values impact research and evaluation

The GIPA/MIPA principles of the inclusive and meaningful involvement of people living with HIV/AIDS are part of Pacific AIDS Network's (PAN) core values, as described in our strategic plan and organizational values and approaches. We try our best to embed these values in all of our work, including all stages of PAN-facilitated participatory research and evaluation initiatives.

We have discovered that there are many ways to engage people in participatory research and evaluation and to embrace the meaningful involvement of people with lived experience. This post describes some of the challenges we have faced in carrying out participatory data analysis and some strategies we have tested in response. We welcome your feedback and suggestions!

 

What is participatory data analysis?

A quick review of journal articles and grey literature (research published outside commercial or academic channels) on participatory research methods revealed that a lot has been said and written about participatory approaches to planning and implementing research projects. Less, however, has been written about participatory approaches to analyzing research data (Byrne et al., 2009; Pain & Francis, 2003). Even less has been produced about the involvement of peer researchers in the data analysis process.

The Community-Based Research (CBR) and Evaluation team at PAN shares the perspective of Rachel Pain and Peter Francis (2003, p. 51), who express "concerns over the lack of explicit frameworks for analysis within much [of the] practice of participatory research." In other words, you'll have a hard time finding recommendations on how to do data analysis in a participatory way that supports people from diverse backgrounds.

Community-based projects often bring together peer researchers and evaluators with various levels of experience in research and evaluation, not all of whom have formal research methods training. This diversity of skill levels has led us to try some new strategies for ensuring that our values of GIPA and MIPA weave through all components of our research and evaluation projects. This commitment means asking ourselves, on an ongoing basis, how best to support peers with varying degrees of experience in doing the different kinds of work required at different stages of a research and/or evaluation project.

 

Improving Analysis Processes

We are proud to say that we are devising creative and novel strategies for engaging peers and people with lived experience in the data analysis phases of research projects. So far, two strategies that have achieved excellent results are data parties and data synthesis activities.

There’s no party like a data party!

The Positive Living, Positive Homes (PLPH) study organizes “data parties,” a concept the study team first heard about in a webinar by Kylie Hutchinson and that has been written about by Nancy Franz (2013). A blog post on PLPH data parties explains what a data party is and how the team is adapting this idea for their project:

A “data party” is simply a gathering of people who are interested or who have a stake in a particular study. The goal is to make sense of research data in a collaborative way. Because PLPH is a community-based study, the idea of a data party was attractive because it would enable our team to bring together varied viewpoints from our community stakeholders. Having service providers, research participants, academics, and other community members in the room together to discuss our findings let us approach and use the data in a number of meaningful ways.

 

Synthesize before you analyze!

Nancy Franz (2013) notes that participatory research projects can run into challenges when there is a large amount of data or information that has not yet been analyzed (often called "raw data"). As she observes, "many people have low or selected tolerance for large amounts of raw data." To avoid this barrier to participation, prior to its data parties the PLPH research team surveys invitees (online and/or over the phone) to find out which of the study's topics and issues will be most interesting and useful to the community.

PAN evaluators recently encountered a similar challenge while working on an impact evaluation of the Positive Leadership Development Institute (PLDI). The project was supported by PAN and guided by a Steering Committee of peers and key stakeholders, and data collection was conducted by four peer evaluators who were new to evaluation work. As we neared the end of data collection, time was running out on the peer evaluators' contracts and the deadline for the project's final report was approaching. At this stage, the project faced two major challenges related to participatory data analysis: 1) the difficulty of training evaluators who were new to the work to conduct technical qualitative coding in a very short period of time, and 2) the difficulty of making sense of a large body of textual data in the limited time before the contracts ended.

 

The “A-Ha!” Moment in Sharing Ideas About Data

A PAN staff person was reading about design thinking and had an 'a-ha' moment when she realized there might be value in bringing the peer evaluators together to gather their perceptions of what they heard most often and/or most intensely during the data collection phase of the project. This 'synthesis' activity mimics something we do naturally when working independently on a project. When working alone, we often reflect, silently or with friends and colleagues and without much preconceived intention to do so, on what we just heard, what we heard prior to that, and how these pieces of data and our ideas about them might fit together.

In participatory research and evaluation, however, it is not common practice to discuss this understanding of the data as it is being gathered. In fact, it is likely quite rare for research or evaluation team members to bring forward their thoughts and reflections to create a shared understanding of the full data set before attempting to break the data down into thematically coded pieces. A data synthesis phase can therefore be very useful when differently positioned team members hold diverse contextual knowledge, both about the data collection process and about the experiences described by project participants.

The true test of this idea came when we facilitated a discussion with the four peer evaluators who collected data for the PLDI impact evaluation. To start, PAN staff asked the peer evaluators to think about and answer the following question: "If you were sitting down to have coffee with a friend and they asked you to tell them about the things you heard most frequently from participants during this project, what would you tell them?" We took notes on their initial responses, asking whether each theme was something they heard over and over again or something that was said only once or twice but very powerfully. This follow-up prompt was intended to get at things that might not have been said often but were expressed with a lot of emotion or meaning. This kind of information can't be learned just by counting how many times something was said; it requires attention to what the information meant to people. The peer evaluators were in an excellent position to reflect on these follow-up questions.

This quick brainstorm helped PAN staff better understand what was most critical in the data the peer evaluators had collected, and it provided insight into the community-specific meaning of certain phrases that were coming up again and again in the data (e.g. "connection to community", "giving voice"). The focus group discussion was recorded and a transcript was produced. Using the notes from this session, the PAN staff supporting the project were able to put together some questions based on this initial synthesis activity. The focus group transcript gave the evaluation team concrete data to discuss with the peer evaluators in an effort to apply and better understand the ideas generated during the brainstorm.

This exercise of tying the ideas generated during the brainstorm to concrete data helped us confirm whether the PAN staff who would be writing the final report were accurately understanding the ideas and experiences described by the peer evaluators. While not wholly participatory (the nitty-gritty data analysis was still conducted by the PAN staff who also wrote the final report), gathering this contextual understanding of the data collection process, along with an overview of its main and most critical themes, from those most closely involved in collecting the data was helpful for deepening the quality of the analysis.

 

Reflections on these processes

At PAN we’ve learned that conducting participatory data analysis takes time, careful planning, thoughtful reflection, as well as a willingness to try new ideas and activities. As we are attempting to find creative ways through these processes together, we welcome any questions, comments and suggestions for further reading that you may have for us on this topic.

 

References

Byrne, A., Canavan, J., & Millar, M. (2009). Participatory research and the voice‐centred relational method of data analysis: Is it worth it? International Journal of Social Research Methodology, 12(1), 67-77.

Chambers, R. (1997). Whose reality counts? Putting the first last. London: Intermediate Technology Publications.

Franz, N. (2013). The data party: Involving stakeholders in meaningful data analysis. Journal of Extension, 51(1), Article #1IAW2.

Pain, R., & Francis, P. (2003). Reflections on participatory research. Area, 35(1), 46-54.

 

 

Questions? Feedback? Get in touch!
Heather Holroyd, Community Based Research and Evaluation Coordinator
[email protected]
Image: Light Bulbs by jniittymaa0, Pixabay