Methods for community-engaged data collection and analysis in implementation research

Abstract

Background

Community engagement is widely recognized as critical to successful and equitable implementation of evidence-based practices, programs, and policies. However, there are no clear guidelines for community involvement in data collection and analysis in implementation research.

Methods

We describe three specific methods for engaging community members in data collection and analysis: concept mapping, rapid ethnographic assessment, and Photovoice. Common elements are identified from a case study of each method: 1) selection and adaptation of evidence-based strategies for improving adolescent HPV vaccine initiation rates in disadvantaged communities, 2) strategies for implementing medication for opioid use disorders among low-income Medicaid enrollees during natural disasters, and 3) interventions to improve the physical health of adults with severe mental illness living in supportive housing.

Results

In all three cases, community members assisted in participant recruitment, provided data, and validated preliminary findings generated by researchers. In the Photovoice case study, community members participated in both data collection and analysis, while in the concept mapping case study, community members also participated in the initial phase of data analysis by organizing and prioritizing evidence-based strategies.

Conclusions

Community involvement in implementation research data collection and analysis contributes to greater engagement and empowerment of community members and validation of study findings. Using methods that give implementation research both scientific rigor and community relevance also contributes to greater community investment in successful implementation outcomes. Nevertheless, the case studies point to the importance and efficiency of the division of labor embedded in community-engaged implementation research. Building capacity for community members to assume greater roles in obtaining and organizing data for preliminary analysis prior to interpretation is recommended.

Background

Community-engaged research (CEnR) is an approach to conducting research that requires development of partnerships, cooperation, negotiation, compromise, and a commitment to addressing health issues that are of interest to, and affect the well-being of, communities defined by geographic proximity, sociodemographic characteristics, or special interests [1,2,3,4]. Ideally, community input is incorporated into all aspects of the research, from development of the research question, implementation of the research project, and analysis of the results to the dissemination of the findings to community partners. The focus of CEnR is on addressing health care needs identified by the community itself as priorities [3,4,5,6], health disparities [7,8,9,10,11], and the social determinants of health. It does so by distributing responsibilities and benefits between researchers and community members.

Community-engaged dissemination and implementation research (CEDI) emphasizes engaging health services consumers, practitioners, policymakers, community organizations, and other community members with diverse perspectives, experience, and expertise regarding local context and circumstances that are likely to hinder or facilitate the implementation of evidence-based practices intended to improve health and well-being of all community members [4, 11,12,13,14,15,16,17,18]. Community engagement can be viewed as a strategy [19] and as a determinant [20, 21] of successful implementation and sustainment of programs, practices and policies designed to promote health equity [22].

To date, the literature has focused primarily on the characteristics of CEnR in general, such as shared responsibility, and on conducting specific research activities such as identifying the research question and disseminating the results. In contrast, relatively little has been written about community engagement in two critical components of conducting research: data collection and analysis. Several studies have noted the role of community partners in developing data collection tools [11, 23] and interpreting qualitative findings through member checking [24, 25]. Nevertheless, how these activities may be conducted in a manner that is both scientifically rigorous and relevant to community needs remains poorly understood.

Using a multiple case study approach [26], we describe three methods, reflecting the principles of CEnR, for engaging community members in collecting and analyzing data related to the implementation of innovative and evidence-based interventions and practices.

Methods

We selected three established methods that involve community members in data collection and/or data analysis: concept mapping, rapid ethnographic assessment, and Photovoice. These methods were selected based on our own experience with their use [27,28,29] and because they embody three of the core principles of community-engaged research [3, 4, 9, 30, 31]. First, they engage people with intimate knowledge of the setting in data collection or data analysis. Second, they enhance the validity of data and its interpretation. A team approach that involves academic researchers and community members can provide the insight necessary to support the internal and external validity of data (multiple observers). The validity of data obtained from interviews or focus groups is enhanced through triangulation with other qualitative data (e.g., ethnographic data) or quantitative data (multiple data sources). Further, qualitative data complement quantitative data by providing rich information on setting or context (depth vs. breadth). Third, all three methods empower participants by giving them agency to contribute to, and investment in the success of, implementation efforts, and by giving both researchers and community members the opportunity to learn from one another.

Our analysis of the application of these three methods occurred in the following steps. First, we reviewed the methods sections of publications describing each method [27,28,29], along with notes taken from team meetings that occurred during the conduct of each study. This information included procedures for data collection and analysis, including who was involved in each activity, what was done, and how the validity and reliability of each activity were assessed. Codes were assigned to each of these items. Second, using the method of constant comparison [32], we grouped activities into discrete categories of participants (e.g., academic researchers, community members), process (e.g., observations, pile-sorting tasks), outcomes (e.g., themes, rank orders), strengths, and limitations. Third, we compared these categories to identify those that were consistent across the three methods and those that were specific to each method.

Results

Concept mapping

Concept mapping is a structured conceptualization process and a participatory research method that yields a conceptual framework for how a group views a particular topic or aspect of a topic [33]. It uses inductive and structured group data collection processes to produce illustrative maps depicting relationships of ideas as clusters of topics, constructs, or elements. Concept mapping involves six steps: preparation, generation, structuring, representation, interpretation, and utilization [34]. In the preparation stage, focal areas are identified and criteria for participant selection/recruitment are determined. In the generation stage, participants address the focal question during “brainstorming” sessions and generate a list of items to be used in subsequent data collection and analysis. In the structuring stage, participants independently sort the items into piles based on perceived similarity. Each item is then rated in terms of its importance or usefulness to the focal question. In the representation stage, data are entered into specialized concept-mapping computer software (Groupwisdom™) [35], which generates quantitative summaries and visual representations, or concept maps, based on multidimensional scaling and hierarchical cluster analysis. In the interpretation stage, participants collectively review the concept maps, assessing and discussing the cluster domains, evaluating items that form each cluster, and discussing the content of each cluster. Finally, in the utilization stage, findings are discussed to determine how best they inform the original focal question [36].
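
To make the structuring and representation stages concrete, the sketch below shows how pile-sort data can be converted into a point map and clusters using multidimensional scaling and hierarchical clustering. This is a minimal illustration of the general technique rather than the Groupwisdom™ implementation, and the statements and pile sorts are hypothetical.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.manifold import MDS

# Hypothetical statements and pile sorts (one list of piles per participant);
# statements sorted into the same pile are treated as similar.
statements = ["Provider reminders", "Parent education", "Standing orders", "School outreach"]
pile_sorts = [
    [[0, 2], [1, 3]],
    [[0], [1, 3], [2]],
]

n = len(statements)
co_occurrence = np.zeros((n, n))
for piles in pile_sorts:
    for pile in piles:
        for i in pile:
            for j in pile:
                co_occurrence[i, j] += 1

# Convert similarity counts to dissimilarities and place statements on a 2-D point map.
dissimilarity = len(pile_sorts) - co_occurrence
coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(dissimilarity)

# Group nearby statements; the number of clusters is chosen by the team during interpretation.
clusters = fcluster(linkage(coords, method="ward"), t=2, criterion="maxclust")
for statement, cluster in zip(statements, clusters):
    print(f"Cluster {cluster}: {statement}")
```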

Concept mapping has been employed in several implementation research investigations [25, 37,38,39,40,41]. For instance, in collaboration with county mental health officials, agency directors, program managers, clinicians, administrative staff, and parents of children receiving mental health services, Aarons and colleagues [37] used concept mapping to identify clusters of barriers and facilitators to implementation of evidence-based mental health services for children and adolescents. Gullahorn and colleagues [25] engaged consumers and providers in a concept mapping study of barriers and facilitators to initiation and sustainment of Medication for Opioid Use Disorders. Gobin and colleagues [30] used concept mapping with 48 policy actors, healthcare practitioners and civil society representatives to co-develop a list of perceived actionable priorities for the implementation of a health advocate training intervention to facilitate access to primary care among vulnerable communities.

Case study

In an NIH-funded study, Tsui and colleagues [27] conducted a concept mapping exercise to facilitate the engagement of diverse stakeholders in prioritizing and selecting evidence-based strategies (EBS) for increasing HPV vaccination in medically underserved communities. The concept mapping was conducted in collaboration with 10 clinic members (providers, clinic leaders, and clinic staff) and 13 community members (advocates, parents, policy-level representatives, and payers) in Los Angeles and New Jersey drawn from a purposively selected sample who participated in a preliminary series of virtual semi-structured interviews and focus group discussions designed to elicit their perspectives on and experiences with HPV vaccination in safety-net primary care settings [42].

Researchers initially generated 20 pre-specified statements describing EBS for HPV vaccination from existing national sources and guidelines and identified 20 additional emerging strategies from the qualitative data collected in the preliminary study. The compiled statements were then reduced by eliminating duplicate strategies. The final 38 statements were sent to HPV vaccine community partners and advocates for review and further distillation.

Community members who agreed to participate were asked to complete two phases of concept mapping: 1) sorting and rating and 2) interpretation. Participants pile sorted statements according to their meaning and similarity with the meanings of other statements in the same pile. Each pile was given a name by the participant that described its contents. Participants then rated each statement by importance and feasibility for increasing HPV vaccination in their organization or region on a 4-point scale.

Using the Groupwisdom™ [35] concept mapping platform, researchers characterized how the named piles were clustered by participants. A point map was generated to position each EBS for HPV vaccination on a two-dimensional map with four poles, where strategies located close to each other carried a similar meaning and those further apart were less related. A similarity matrix was created to examine overall prioritization of EBS as well as configurations for specific participant groups. The analysis produced weighted and unweighted cluster maps, ladder graphs, and go-zone maps (e.g., most important and most feasible strategies). The research team compared maps of seven, eight, and nine clusters before reaching consensus that creating an additional cluster would not improve the meaningfulness of the data and deciding to use the map of eight clusters.
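
The go-zone classification can be illustrated with a short sketch: strategies whose mean importance and feasibility ratings both fall above the overall means land in the "go zone." The strategy names and ratings below are hypothetical and are not drawn from the case study.

```python
# Hypothetical mean ratings (importance, feasibility) on the study's 4-point scales.
ratings = {
    "Provider reminders": (3.6, 3.4),
    "Parent education": (3.2, 2.4),
    "Standing orders": (2.5, 3.1),
    "School outreach": (2.2, 2.0),
}

imp_mean = sum(imp for imp, _ in ratings.values()) / len(ratings)
fea_mean = sum(fea for _, fea in ratings.values()) / len(ratings)

for strategy, (imp, fea) in ratings.items():
    if imp >= imp_mean and fea >= fea_mean:
        quadrant = "go zone (more important, more feasible)"
    elif imp >= imp_mean:
        quadrant = "more important, less feasible"
    elif fea >= fea_mean:
        quadrant = "less important, more feasible"
    else:
        quadrant = "less important, less feasible"
    print(f"{strategy}: {quadrant}")
```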

All participants who completed the sorting and rating phases of concept mapping were subsequently contacted to participate in a one-hour virtual group interpretation meeting on the Zoom platform. Participants received a handout with preliminary findings via email, which included the eight-cluster map, ladder graphs, and a go-zone map comprising all responses in aggregate. Participants were then asked to reflect on and share their feedback about the concept mapping activity results using a discussion guide that focused on the following: 1) overall thoughts about the eight clusters that resulted from the sorting activity, 2) reactions to the relative ratings for importance and feasibility of strategy clusters, and 3) thoughts on how the go-zone map aligned with their organizations’ current approaches to HPV vaccination. The 60-min session was recorded and then transcribed by a third-party transcription service. Two research team members then read through the transcript and conducted a content analysis of overall themes, structured around the Practice Change Model [43], and of key areas of divergence among participants, if any.

Concept mapping results were then shared with system leaders from a large multi-site federally qualified health center (FQHC) system in Los Angeles and with physician and clinic champions from three clinic sites within the FQHC. Clinic leaders, champions, and research team members discussed the strategies prioritized from the concept mapping results, as well as the current clinical context and strategies already used within the FQHC, and selected eight strategies, which were finalized with physician champions and then implemented at each of the three clinics.

Rapid ethnographic assessments

Another tool for community-engaged data collection and analysis in implementation research is rapid ethnographic assessments, also known as Rapid Assessment Procedures (RAP). Distinguishing features of RAP include: 1) formation of a multidisciplinary research team including a member or members of the affected community; 2) development of materials to train community members; 3) use of several data collection methods (e.g., informal interviews, newspaper accounts, agency reports, statistics) to verify information through triangulation; 4) iterative data collection and analysis to facilitate continuous adjustment of the research question and methods to answer that question; and 5) rapid completion of the project, usually in four to six weeks [44, 45].

Rapid assessment procedures have been used in formative and summative evaluation studies of healthcare organization and delivery [46,47,48]. RAP has also been used in conducting evaluations of program implementation [49,50,51,52]. For instance, Holdsworth and colleagues [53] used the rapid assessment approach to evaluate the implementation of an intensive care unit (ICU) redesign initiative aimed at improving patient safety in four academic medical centers in the United States. Steps in their approach included 1) iteratively working with stakeholders to develop evaluation questions; 2) integration of implementation science frameworks into field guides and analytic tools; 3) selecting and training a multidisciplinary site visit team; 4) preparation and trust building for 2-day site visits; 5) engaging sites in a participatory approach to data collection; 6) rapid team analysis and triangulation of data sources and methods using a priori charts derived from implementation frameworks; and 7) validation of findings with sites. Martinez and colleagues [54] proposed to conduct a rapid ethnographic assessment during clinic site visits to collect information on potential barriers and facilitators to implementing measurement-based care to improve youth mental health outcomes in low resource settings.

A data collection and analysis protocol based on RAP principles designed specifically for implementation research is the Rapid Assessment Procedure-Informed Clinical/Community Ethnography (RAPICE), a methodological approach that combines clinical and/or community ethnography and rapid assessment procedures. Originally developed to meet the requirements for time-efficient data collection with minimal participant burden in pragmatic clinical trials [55], RAPICE was adapted for use in community settings to address implementation issues of importance to communities [56]. Both forms of RAPICE include an iterative, team-based approach to data collection and analysis, involving an interaction between ethnographically trained clinicians or community members who act as participant observers (PO) and clinically oriented social scientists and/or community members who act as external analysts [55].

RAPICE can be used to collect and analyze data to address important implementation science research questions, such as what factors act as barriers and facilitators to implementing a specific evidence-based policy, program or practice in a specific setting or context; and what strategies are associated with successful implementation [55]. RAPICE may also be used in conducting formative evaluations of implementation efforts, providing feedback that may be used to modify or supplement these efforts to increase the likelihood of successful implementation [57].

Case study

In a study funded by the Louisiana Department of Health, Springgate and colleagues [28] employed a community-based version of RAPICE to identify how environmental stressors such as hurricanes, floods, major storms, or the COVID-19 pandemic impacted the implementation of Medication for Opioid Use Disorders (MOUD) services for low-income Medicaid enrollees. Academic researchers and community partners concurrently assessed whether telehealth or other innovations in clinical services or coordination of care may be of value in improving implementation and resilience of evidence-based care practices for Opioid Use Disorders to reduce overdoses and improve health during episodes of increased environmental stress. Under the aegis of a community-academic partnership, the Community Resilience Learning Collaborative and Research Network (C-LEARN), the Promoting Resilience to Opioid Use Disorders in Louisiana (PROUD-LA) study employed a community-partnered participatory research (CPPR) framework [58, 59] to engage a diverse group of community leaders and researchers to co-lead study design, implementation, and analysis. C-LEARN was formed in 2017 to advance resilience in Southeast Louisiana communities threatened by climate change-related disasters and includes a collaboration of partners in health services delivery, public health, and community-based organizations such as churches, neighborhood associations, and social services providers [10].

Drawing on nominations from community members of C-LEARN’s Leadership Council, a purposive snowball sampling design [60] was used to identify and recruit members of five groups of stakeholders with knowledge of MOUD in Louisiana: Medicaid members between the ages of 25 and 65 years receiving MOUD (n = 17), advocates (n = 2), healthcare providers and pharmacists (n = 9), health care system administrators (n = 10), and public health agency officials in Louisiana with experience with climate-related disasters (n = 4). Participating stakeholders lived or worked in 22 parishes throughout the state of Louisiana. Members of each participant group were recruited for interviews until theoretical saturation was reached (i.e., no new information was obtained from participants) [61].
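
As a rough illustration of how a team might monitor theoretical saturation during recruitment, the sketch below tracks whether each successive interview contributes any codes not already observed. The interview identifiers and codes are hypothetical, and this is only one of several ways saturation can be assessed.

```python
# Hypothetical codes assigned to successive interviews within one participant group.
interview_codes = {
    "provider_01": {"refill flexibility", "telehealth access"},
    "provider_02": {"telehealth access", "evacuation planning"},
    "provider_03": {"refill flexibility", "evacuation planning"},
}

seen = set()
for interview, codes in interview_codes.items():
    new_codes = codes - seen  # codes not observed in any earlier interview
    seen |= codes
    print(f"{interview}: {len(new_codes)} new code(s)")
    if not new_codes:
        print("No new codes -- saturation likely reached for this group.")
```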

The PROUD-LA research team conducted virtual semi-structured interviews of participants between January and May 2023, following a guide that had been co-developed by members of the Leadership Council to reflect community knowledge, expertise, concerns, and priorities. The guide evolved iteratively to include a series of questions on six topical themes: 1) disaster planning and lessons from prior disasters; 2) Medicaid members’ engagement with providers, pharmacies, or health services organizations during disasters; 3) challenges experienced by Medicaid members due to hurricanes, floods, major storms, or the COVID-19 pandemic; 4) healthcare providers’ adaptations to these disaster events to ensure patient care; 5) use of telehealth during or following extreme weather events or COVID-19 pandemic surges; and 6) the effects of fentanyl in Louisiana and the United States during these periods of increased environmental stress. Questions were tailored to the unique experiences and perspectives of each participant group. The approximately one-hour interviews were recorded and transcribed for analysis.

Consistent with CPPR and RAPICE principles and practice, the community and academic research team used a rapid analysis approach [62, 63] to summarize content from each interview and identify common themes across interviews. This approach included iterative, team-based reviews of selected transcripts to generate a summary template including neutral domains corresponding to each interview question and tailored to the variety of respondents (e.g., administrator, healthcare provider, patient). Once consistency in use of the template had been established, a pair of academic team members summarized all interview transcripts into standard templates under supervision of the study PIs. The summary templates included representative and illustrative quotations from respondents. Completed summaries were transferred to an Excel spreadsheet to facilitate response comparisons across respondents [64]. The team reviewed templated summaries and matrices to synthesize and identify variations in responses to interview questions and develop written memos to track emerging patterns in data. Preliminary findings were presented to community stakeholder-members of the C-LEARN Leadership Council to inform/clarify key themes and enrich descriptions. A discussion then ensued until academic and community team members reached consensus as to the meaning and significance of the data. In some instances, the Leadership Council recommended the combination of some of the preliminary subthemes to facilitate interpretation or requested further explanation of the significance of some of the subthemes identified in the researchers’ preliminary analysis.
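
The matrix step of this rapid analysis approach can be sketched as follows: templated summaries become rows, interview domains become columns, and the resulting spreadsheet supports comparison across respondent types. The respondents, domains, and summary text below are hypothetical, and the pandas/Excel workflow is one possible implementation, not the study's own tooling.

```python
import pandas as pd

# Hypothetical templated summaries: one dict per respondent, one key per interview domain.
summaries = [
    {"respondent": "Medicaid member 01", "role": "patient",
     "disaster_planning": "Kept an extra week of medication before hurricane season.",
     "telehealth": "Used phone visits when the clinic flooded."},
    {"respondent": "Provider 01", "role": "healthcare provider",
     "disaster_planning": "Pre-authorized emergency refills for established patients.",
     "telehealth": "Shifted visits to video during COVID-19 surges."},
]

# Rows = respondents, columns = domains; scanning down a column compares responses across roles.
matrix = pd.DataFrame(summaries).set_index("respondent")
matrix.to_excel("summary_matrix.xlsx")  # hypothetical file name; writing to Excel requires openpyxl
print(matrix[["role", "telehealth"]])
```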

These partnered analyses demonstrated that prospective MOUD-specific disaster planning, flexible clinical procedures, and experience with telehealth to maintain contact and provide care are effective strategies to support implementation of MOUD treatment services during pandemic surges and climate-related extreme weather events. However, findings also highlighted several potential considerations for policies and practices of state Medicaid programs, managed care organizations, providers, and others to benefit members during hurricanes or major community stressors, including changes in Medicaid policies to enable access to MOUD by interstate evacuees, improvement of medication refill flexibilities, and potential incentivization of telehealth services to facilitate more systematic use [28].

Photovoice

Photovoice is a participatory methodology using photographic storytelling [65] in which participants take pictures around their homes and communities depicting their lives as affected by different health and social conditions and then use the images to initiate dialogue and advocate for change. Photovoice does not require any prior research or photography experience [66] and is adaptable across different groups and public health issues [65], making it ideal for use in low-resource settings.

Photovoice has been used primarily to design and evaluate interventions [67,68,69], and as an intervention itself (e.g., [70,71,72]). Photovoice has supported collaboration with Veterans, military families, and other key stakeholders to identify barriers to post-deployment care for those with traumatic brain injury and to propose solutions for improving community reintegration after separation from military service [73, 74]. A few studies have also relied on this method to identify and develop strategies for implementing an intervention. For instance, Kohrt and colleagues [75] are conducting a type 3 hybrid implementation-effectiveness cluster randomized controlled trial in Nepal comparing implementation-as-usual training for primary care providers (PCPs) with an alternative implementation strategy in which PCP training is facilitated by people with lived experience of mental illness (PWLE) and their caregivers using Photovoice. Brazg et al. [76] used the Photovoice methodology to engage high school youth in a community-based assessment of adolescent substance use and abuse. Youth were able to reflect on their community’s strengths and concerns with regard to adolescent substance abuse as they took photographs to answer the question, “What contributes to adolescents’ decisions to use or not to use alcohol and other drugs?” This information was seen by the authors as critical to the successful development and implementation of prevention curricula.

Case study

Cabassa and colleagues [29] used Photovoice to engage a purposive sample of 16 English-speaking adults with severe mental illness (SMI; e.g., schizophrenia, bipolar disorder) to participate in a six-week program in which they would learn to take photographs in their communities and discuss issues of health and wellness in their everyday lives. This study was conducted in partnership with two supportive housing agencies in New York City and funded by the New York State Office of Mental Health. Agency staff recommended that study participants should also be participating in the agency’s wellness programs (e.g., nutrition group) and/or have expressed interest in issues of health and wellness.

Weekly 90-min Photovoice groups were conducted over a 6-week period at each agency. During the first session, participants learned how to recruit community members, obtain permission to take their photograph if desired, and explain the purpose of the project and how the photograph would be used. Each participant was given a digital camera, instructed in its operation, and provided with the opportunity to practice taking photographs. Participants were then instructed to take photographs for the following session about what they did to stay healthy.

During sessions two to five, participants were directed to download the pictures they had taken for that session, pick one photograph that best represented the theme for that week, print the photo, and participate in a brief photo-elicitation interview conducted by researchers to discuss the meaning of the chosen photo. This was followed by a group dialogue about what the photographs showed and how they related to the life of the photographer. Dialogues were co-facilitated by members of the research staff and a peer leader. During the last 10 min of the session, participants voted and chose the theme for next week’s photo-assignment.

An analytical working group composed of three research team members conducted all qualitative data analyses for this project. Several member-checking activities, such as presentations to the staff, consumers, and executive boards at each agency, community photo-exhibits, and small group discussions with participants, were conducted to review emerging themes, receive feedback on preliminary interpretations of results, and validate study findings. Analysis trustworthiness and rigor were also ensured by generation of an audit trail consisting of analytical memos and meeting notes, prolonged engagement with participants, triangulation of visual and narrative data, and peer-debriefing sessions [77].

Pile sorting techniques [78] and the constant comparative method derived from grounded theory [32] were used by researchers to develop an integrated coding structure for the narrative and visual data. Five implementation themes were identified related to preferences for the format, content, and methods of health interventions. Community participants expressed a strong preference for using peer-based approaches to deliver health interventions in their housing agencies. The study demonstrated the value of Photovoice in engaging target population participants in implementation research and enabling them to represent and communicate their views of important implementation outcomes through images and narratives [29].

Comparisons of community engagement methods

A comparison of the roles assumed by community members and researchers in data collection and analysis activities in all three case studies is presented in Table 1. In all three studies, both groups were engaged in identification and assessment of implementation determinants and strategies. In all three studies, researchers contributed their knowledge of the methods employed to ensure scientific rigor in the collection and analysis of data. Community members contributed their knowledge of their respective communities to ensure the relevance of data collection and analysis, as well as of the data themselves, to community needs. Both researchers and community members acted as both teachers and learners, providing feedback to one another in an iterative fashion.

Table 1 Community member and researcher roles in implementation data collection and analysis activities by case study method

With respect to data collection, community members played an important role in identifying and recruiting potential participants and in determining what data were to be collected in all three studies. Researchers shared responsibility for participant recruitment and trained community members in data collection and analysis techniques. Community members also provided information about their communities, while researchers elicited this information using structured and unstructured interview techniques.

Data analysis occurred in three stages. In the first stage, community members in the Concept Mapping case study participated in the pile sort and ranking activities, whereas pile sorting in the Photovoice study was conducted by researchers. Researchers generated the concept maps in the Concept Mapping case study while community members generated photographs and labels in the Photovoice study. In the second stage, researchers coded and conducted thematic analyses of these data, along with data obtained from interviews, in all three studies. In the third stage, community members were responsible for validation and expansion, also known as member checking, of results in all three case studies.

Table 2 provides a comparison of the strengths and weaknesses of community engagement in each case study. All three studies provided high internal validity by using community member preferences to prioritize selection of feasible and acceptable implementation strategies (Concept Mapping study) and by obtaining insight into implementation context and determinants (Rapid Ethnographic Assessment and Photovoice studies). All three studies made efforts to address the power dynamics that exist between researchers and community members. In the Concept Mapping study, researchers had control over what was collected and how it was analyzed, while community members had control over how the results were validated. In the Rapid Ethnographic Assessment and Photovoice studies, community members had control over what was collected and how it was validated, while researchers had control over how it was analyzed. However, all three studies had limited external validity or generalizability due to their purposive sampling of study participants and limited geographic representation. The internal validity of the Rapid Ethnographic Assessment methodology was limited by the short timeframe in which data were collected, resulting in less depth of understanding than that afforded by traditional ethnographic methods [60]. Fidelity to the methodology was also challenged in all three studies: by the absence of a brainstorming session in the Concept Mapping study, of participant observation by community members in the Rapid Ethnographic Assessment study, and of community member participation in the Stage 2 coding and analysis of data, due to budget limitations, in the Photovoice study. The use of each method was also constrained by certain requirements, such as expensive software (Concept Mapping), access to study sites (Rapid Ethnographic Assessment), and cameras to collect data (Photovoice).

Table 2 Strengths and weaknesses of community engagement in implementation data collection and analysis by method

Beyond these strengths and weaknesses, there are a few additional considerations when contemplating using these methods to engage community members in data collection and analysis. One consideration is the potential for ethical issues when community members have access to identifiable human subjects data. Although such information was not collected in the Concept Mapping study, the participation of community members was subject to review and approval by an Institutional Review Board in all three case studies. In the Photovoice study, community members were given explicit instructions on how to use a permission form for identifiable images should they wish to photograph another community member. Community members taking the photographs were also required to provide written informed consent. It is recommended, however, that community members undergo human subjects training prior to engaging in the collection or analysis of identifiable information. A second consideration is that each of these methods requires a considerable time investment on the part of community members, although the method employed in the Rapid Ethnographic Assessment case study was explicitly designed to minimize participant burden. Community members in all three studies were offered some form of compensation (e.g., gift cards) in appreciation for their time. Innovations in methods designed to reduce participant burden, together with compensation of community members for their participation, are recommended to facilitate community engagement.

Discussion

CEnR recognizes and attempts to redress the imbalance of power in academic-community research partnerships [3, 4, 9]. Researchers typically control many of the resources needed to conduct research (e.g., access to funding, formal research skills). However, this power imbalance is typically addressed in the formulation of research questions and the dissemination of research results. Co-creation usually involves community engagement in an intervention’s development and implementation [79, 80]. It rarely involves community participation in data collection except as a source of data, and occasionally involves community participation in data analysis through member checking of data collected and analyzed by researchers. The process of data collection and analysis may also reflect this power imbalance, and research activities should ideally be used to correct it if CEnR is to be more impactful [3, 4].

The case studies highlighted the role of community members in identifying and recruiting study participants, in providing information related to the topic under investigation, in collecting that information (e.g., through taking photographs or acting as participant observers), in participating in the organization of that information in the first stage of the analysis process (e.g., in the concept mapping exercise), and in validating and interpreting the findings in the second stage of the analysis process. To increase their engagement in these data collection and stage one analysis activities, however, training of community members in data collection and analysis as part of all three methods is highly recommended.

The case studies also illustrate the use of two specific methods commonly found in community-engaged implementation research. First, all three studies included some form of member checking for validating qualitative data, in which results are presented to and reviewed by individuals who provided the data and/or individuals representing the communities from which the data were collected [81,82,83,84]. Second, all three studies used some form of Community Advisory Board (CAB) (or Leadership Council in the case of the PROUD-LA study) to offer support, nominate participants, and provide leadership and oversight on the conduct of data collection and analysis [85,86,87,88,89]. CABs have participated in the development of focus group agendas [85] and the interpretation of research findings [89, 90] in other implementation studies. However, CABs have generally been used as a source of data/information, for example through their participation in semi-structured interviews [89] and in meetings guided by the Delphi technique, an iterative approach for gaining group consensus on a topic [91, 92], rather than assisting in data collection.

The three case studies also reflect the division of labor that occurs in community-engaged data collection and analysis. In the first two cases presented, academic researchers collected the data and community members participated in their analyses. In the Photovoice case study, the community collected the data, and researchers participated in the analysis of the data collected. Shared responsibility and ownership do not mean that everyone needs to do the same activities/tasks. Rather, effective CEnR involves maximizing the unique strengths that researchers and community members bring to the partnership. All three methods are dependent upon the intimate knowledge of the community gained from lived experience. Data are collected and analyzed in ways that reflect the relevance of the research focus to the community. However, both the lived experience and assessment of research relevance may vary among community members, necessitating the identification and engagement of multiple groups of community partners. Data collection and analysis are also dependent on the researcher’s theoretical and methodological expertise, which can also vary, necessitating the identification and engagement of interdisciplinary teams of investigators. Data are collected and analyzed in ways that reflect both the rigor of scientific investigation and relevance to community needs [93].

Finally, all three case studies reflect the leveraging of implementation science to achieve health equity through neighborhood and policy interventions [18]. Each study targeted the delivery of an evidence-based program, policy or practice to a population experiencing health disparities (i.e., HPV vaccination of Latinx youth, medication for Opioid Use Disorders for Medicaid enrollees, and health promotion programs for adults with severe mental illness living in supportive housing). Reliance on members of these populations to collect and analyze data, however, occurred only in the Photovoice study. In all three studies, community participation in data collection and/or analysis led to a greater investment on the part of community members and organizations in the success of implementation efforts. If equity is to be achieved, greater engagement of the intended beneficiaries of innovative and evidence-based policies, programs and practices is recommended.

Limitations

The three case studies were not intended to be representative of all forms of data collection and analysis in community-engaged implementation research, or even the use of these three specific methods in such research. The HPV study, for instance, modified the first stage of the concept mapping process by presenting participants with a set of statements representing constructs previously elicited from semi-structured interviews and focus group discussions with the same participants. Concept mapping participants usually generate key topics for discussion and formulate statements during a brainstorming session in the first stage [33]. The community-based version of RAPICE in the PROUD-LA Study did not involve participant observation by community members. Previous research using the clinical version of RAPICE engaged participant observers who had dual roles as researchers and community members (in this case, as clinicians working in the study setting) [94, 95].

Conclusions

Despite these limitations, the findings point to several key considerations that should be included in all forms of community-engaged implementation research. These considerations include the following: creation of community advisory boards or leadership councils to advise researchers on whom to recruit to participate, what information should be collected, and how it should be collected; use of sampling strategies that enable community members to assist in participant recruitment; training of community members in collecting information and conducting preliminary (stage one) analyses of the information collected; and systematic use of member-checking activities to enable community members to interpret and validate study findings. Taking these considerations into account is recommended to ensure that implementation research is community-engaged, has internal and external validity, empowers the community and its members, and engenders a commitment on the part of the community to successful implementation outcomes.

Data availability

The study materials and data analyzed for this study are available from the corresponding author on reasonable request.

Abbreviations

CAB:

Community Advisory Board

CEnR:

Community-Engaged Research

CEDI:

Community-Engaged Dissemination and Implementation Research

C-LEARN:

Community Resilience Learning Collaborative and Research Network

HPV:

Human Papillomavirus

EBS:

Evidence-based Strategies

FQHC:

Federally Qualified Health Center

MOUD:

Medication for Opioid Use Disorders

PCP:

Primary care provider

PROUD-LA:

Promoting Resilience to Opioid Use Disorders in Louisiana

RAP:

Rapid Assessment Procedures

RAPICE:

Rapid Assessment Procedure-Informed Clinical/Community Ethnography

SMI:

Severe Mental Illness

References

  1. Centers for Disease Control and Prevention (CDC). Community engagement: definitions and organizing concepts from the literature. 1997. http://www.cdc.gov/phppo/pce/. Accessed 23 Sept 2024.

  2. Han HR, Xu A, Mendez KJW, Okoye S, Cudjoe J, Bahouth M, et al. Exploring community engaged research experiences and preferences: a multi-level qualitative investigation. Res Involv Engagem. 2021;7(1):19. https://doi.org/10.1186/s40900-021-00261-6. PMID: 33785074.

  3. Luger TM, Hamilton AB, True G. Measuring community-engaged research contexts, processes, and outcomes: a mapping review. Milbank Q. 2020;98(2):493–553.

  4. Epps F, Gore J, Flatt JD, Williams IC, Wiese L, Masoud SS, et al. Synthesizing best practices to promote health equity for older adults through community-engaged research. Res Gerontol Nurs. 2024;17(1):9–16. https://doi.org/10.3928/19404921-20231205-01.

  5. Fagan HB, Ortiz J, Riveros BT. Why CEnR matters for health equity. Del J Public Health. 2018;4(5):4–7. https://doi.org/10.32481/djph.2018.11.002.

  6. London RA, Claassen J. Playing for keeps: a long-term community-engaged research partnership to support safe and healthy elementary school recess. Sociol Forum. 2023;38(1):1063–81. https://doi.org/10.1111/socf.12906.

  7. Gilmore-Bykovskyi A, Croff R, Glover CM, Jackson JD, Resendez J, Perez A, et al. Traversing the aging research and health equity divide: toward intersectional frameworks of research justice and participation. Gerontologist. 2022;62(5):711–20. https://doi.org/10.1093/geront/gnab107.

  8. Key KD, Furr-Holden D, Lewis EY, Cunningham R, Zimmerman MA, Johnson-Lawrence, et al. The continuum of community engagement in research: a roadmap for understanding and assessing progress. Prog Community Health Partnersh. 2019;13(4):427–34. https://doi.org/10.1353/cpr.2019.0064.

  9. Wallerstein N, Duran B, Oetzel JG, Minkler M. Community-based participatory research for health: advancing social and health equity. New York: John Wiley & Sons; 2017.

  10. Springgate B, Sugarman OK, Wells KB, Palinkas LA, Meyers D, Wennerstrom A, et al. Community partnered participatory research in Southeast Louisiana communities threatened by climate change: the C-LEARN experience. Am J Bioethics. 2021;21(10):46–8. https://doi.org/10.1080/15265161.2021.1965248.

  11. Rabin BA, Cain KL, Salgin L, Watson PL Jr, Oswald W, Kaiser BN, et al. Using ethnographic approaches to document, evaluate and facilitate virtual community-engaged implementation research. BMC Public Health. 2023;23(1):409. https://doi.org/10.1186/s12889-023-15299-2.

  12. Goodman MS, Sanders Thompson VL, Johnson CA, Gennarelli R, Drake BF, Bajwa P, et al. Evaluating community engagement in research: quantitative measure development. J Commun Psychol. 2017;45(1):17–32. https://doi.org/10.1002/jcop.21828.

  13. Ramanadhan S, Davis MM, Armstrong R, Baquero B, Ko LK, Leng JC, et al. Participatory implementation science to increase the impact of evidence-based cancer prevention and control. Cancer Causes Control. 2018;29(3):363–9. https://doi.org/10.1007/s10552-018-1008-1.

  14. Hamilton AB, Farmer MM, Moin T, Finley EP, Lang AJ, Oishi SM, et al. Enhancing mental and physical health of women through engagement and retention (EMPOWER): a protocol for a program of research. Implement Sci. 2017;12(1):127. https://doi.org/10.1186/s13012-017-0658-9.

  15. Knapp AA, Carroll AJ, Mohanty N, Fu E, Powell BJ, Hamilton A, et al. A stakeholder-driven method for selecting implementation strategies: a case example of pediatric hypertension clinical practice guideline implementation. Implement Sci Commun. 2022;3:25. https://doi.org/10.1186/s43058-022-00276-4.

  16. Schlechter CR, Del Fiol G, Lam CY, Fernandez ME, Greene T, Yack M, et al. Application of community-engaged dissemination and implementation science to improve health equity. Prev Med Rep. 2021;24:101620. https://doi.org/10.1016/j.pmedr.2021.101620.

  17. Blachman-Demner DR, Wiley TRA, Chambers DA. Fostering integrated approaches to dissemination and implementation and community engaged research. Transl Behav Med. 2017;7:543–6.

  18. Ashcraft LE, Cabrera KI, Lane-Fall MB, South EC. Leveraging implementation science to advance environmental justice research and achieve health equity through neighborhood and policy interventions. Annu Rev Public Health. 2024;45(1):89–108. https://doi.org/10.1146/annurev-publhealth-060222-033003.

  19. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21. https://doi.org/10.1186/s13012-015-0209-1.

  20. Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated Consolidated Framework for Implementation Research based on user feedback. Implement Sci. 2022;17(1):75. https://doi.org/10.1186/s13012-022-01245-0.

  21. Palinkas LA, Chou CP, Spear SE, Mendon SJ, Villamar J, Brown CH. Measurement of sustainment of prevention programs and initiatives: the sustainment measurement system scale. Implement Sci. 2020;15(1):71. https://doi.org/10.1186/s13012-020-01030-x.

  22. Cooper C, Watson K, Alvarado F, Carroll AJ, Carson SL, Donenberg G, et al. Community engagement in implementation science: the impact of community engagement activities in the DECIPHeR Alliance. Ethn Dis. 2024;DECIPHeR(Spec Issue):52–9. https://doi.org/10.18865/ed.DECIPHeR.52.

  23. Petagna CN, Perez S, Hsu E, Greene BM, Banner I, Bednarczyk RA, et al. Facilitators and barriers of HPV vaccination: a qualitative study in rural Georgia. BMC Cancer. 2024;24(1):592. https://doi.org/10.1186/s12885-024-12351-1.

  24. Crabtree BF, Miller WL. Doing qualitative research. 3rd ed. Thousand Oaks: Sage; 2022.

  25. Gullahorn B, Kuo I, Robinson AM, Bailey J, Loken J, Taggart T. Identifying facilitators and barriers to the uptake of medication for opioid use disorder in Washington DC: a community-engaged concept mapping approach. PLoS ONE. 2024;19(7):e0306931. https://doi.org/10.1371/journal.pone.0306931.

  26. Yin RK. Case study research: design and methods. Thousand Oaks: Sage; 2003.

  27. Tsui J, Shin M, Sloan K, Mackie TI, Garcia S, Fehrenbacher AE, et al. Use of concept mapping to inform a participatory engagement approach for implementation of evidence-based HPV vaccination strategies in safety-net clinics. Implement Sci Commun. 2024;5(1):71. https://doi.org/10.1186/s43058-024-00607-7.

  28. Springgate B, Matta I, True G, Doran H, Torres WV, Stevens E, et al. Implementation of medication for opioid use disorder treatment during a natural disaster: the PROUD-LA study. J Subst Use Addict Treat. 2024;165:209469. https://doi.org/10.1016/j.josat.2024.209469.

  29. Cabassa LJ, Parcesepe A, Nicasio A, Baxter E, Tsemberis S, Lewis-Fernández R. Health and wellness photovoice project: engaging consumers with serious mental illness in health care interventions. Qual Health Res. 2013;23(5):618–30. https://doi.org/10.1177/1049732312470872.

  30. Israel BA, Eng E, Schulz AJ, Parker EA, editors. Methods in community-based participatory research for health. 2nd ed. San Francisco: Jossey-Bass; 2012.

  31. Minkler M, Wallerstein N, editors. Community-based participatory research for health: from processes to outcomes. 2nd ed. San Francisco: Jossey-Bass; 2008.

  32. Strauss AL, Corbin J. Basics of qualitative research: techniques and procedures for developing grounded theory. Thousand Oaks: Sage; 1998.

  33. Trochim WM. An introduction to concept mapping for planning and evaluation. Eval Program Plann. 1989;12:1–16.

  34. Burke JG, O’Campo P, Peak GL, Gielen AC, McDonnell KA, Trochim WM. An introduction to concept mapping as a participatory public health research methodology. Qual Health Res. 2005;15:1392–410.

  35. GroupWisdom. The Concept System® groupwisdom™ [web-based platform]. 2022. Available from: https://www.groupwisdom.tech.

  36. Green AE, Fettes DL, Aarons GA. A concept mapping approach to guide and understand dissemination and implementation. J Behav Health Serv Res. 2012;39(4):362–73.

  37. Aarons GA, Wells R, Zagursky K, Fettes D, Palinkas LA. Implementing evidence-based practice in community mental health agencies: multiple stakeholder perspectives. Am J Public Health. 2009;99(11):2087–95.

  38. Gobin R, Thomas T, Goberdhan S, Sharma M, Nasiiro R, Emmanuel R, et al. Readiness of primary care centres for a community-based intervention to prevent and control noncommunicable diseases in the Caribbean: a participatory mixed-methods study. PLoS ONE. 2024;19(4):e0301503. https://doi.org/10.1371/journal.pone.0301503.

  39. Ahmad F, Norman C, O’Campo P. What is needed to implement a computer-assisted health risk assessment tool? An exploratory concept mapping study. BMC Med Inform Decis Mak. 2012;12:149. https://doi.org/10.1186/1472-6947-12-149.

  40. Lobb R, Pinto AD, Lofters A. Using concept mapping in the knowledge-to-action process to compare stakeholder opinions on barriers to use of cancer screening among South Asians. Implement Sci. 2013;8:37.

  41. Saragosa AC, Flatt JD, Buccini G. Using concept mapping to co-create implementation strategies to address maternal-child food insecurity during the first 1000 days of life. Matern Child Nutr. 2024:e13739. https://doi.org/10.1111/mcn.13739. Online ahead of print.

  42. Tsui J, Shin M, Sloan K, Martinez B, Palinkas LA, Baezconde-Garbanati L, et al. Understanding clinic and community member experiences with implementation of evidence-based strategies for HPV vaccination in safety-net primary care settings. Prev Sci. 2023. https://doi.org/10.1007/s11121-023-01568-4. Online ahead of print.

  43. Cohen D, McDaniel RR Jr, Crabtree BF, Ruhe MC, Weyer SM, Tallia A, et al. A practice change model for quality improvement in primary care practice. J Healthc Manag. 2004;49(3):155–68; discussion 69–70.

  44. Beebe J. Basic concepts and techniques of rapid appraisal. Hum Organ. 1995;54:42–51.

  45. Harris KJ, Jerome NW, Fawcett SB. Rapid assessment procedures: a review and critique. Hum Organ. 1997;56(3):375–8.

  46. Scrimshaw SCM, Hurtado E. Rapid assessment procedures for nutrition and primary health care: anthropological approaches to improving programme effectiveness. Los Angeles: UCLA Latin American Center; 1987.

  47. Scrimshaw SC, Carballo M, Ramos L, Blair BA. The AIDS rapid assessment procedures: a tool for health education planning and evaluation. Health Educ Q. 1991;18(1):111–23.

  48. Vindrola-Padros C, Vindrola-Padros B. Quick and dirty? A systematic review of the use of rapid ethnographies in healthcare organization and delivery. BMJ Qual Saf. 2018;27(4):321–30. https://doi.org/10.1136/bmjqs-2017-007226.

  49. Goepp JG, Meykler S, Mooney NE, Lyon C, Raso R, Julliard K. Provider insights about palliative care barriers and facilitators: results of a rapid ethnographic assessment. Am J Hosp Palliat Care. 2008;25:309–14.

  50. Schwitters A, Lederer P, Zilversmit L, Gudo PS, Ramiro I, Cumba L, et al. Barriers to health care in rural Mozambique: a rapid assessment of planned mobile health clinics for ART. Glob Health Sci Pract. 2015;3:109–16.

  51. Wright A, Sittig DF, Ash JS, Erikson JL, Hickman TT, Paterno M, et al. Lessons learned from implementing service-oriented clinical decision support at four sites: a qualitative study. Int J Med Inform. 2015;84:901–11.

  52. Gertner AK, Franklin J, Roth I, Cruden GH, Haley AD, Finley EP, et al. A scoping review of the use of ethnographic approaches in implementation research and recommendations for reporting. Implement Res Pract. 2021;2:2633489521992743. https://doi.org/10.1177/2633489521992743.

  53. Holdsworth LM, Safaeinili N, Winget M, Lorenz KA, Lough M, Asch S, et al. Adapting rapid assessment procedures for implementation research using a team-based approach to analysis: a case example of patient quality and safety interventions in the ICU. Implement Sci. 2020;15:12.

  54. Martinez RG, Weiner BJ, Meza RD, Dorsey S, Palazzo LG, Matson A, et al. Study protocol: Novel Methods for Implementing Measurement-Based Care with youth in Low-Resource Environments (NIMBLE). Implement Sci Commun. 2023;4(1):152. https://doiorg.publicaciones.saludcastillayleon.es/10.1186/s43058-023-00526-z.

    Article  PubMed  PubMed Central  Google Scholar 

  55. Palinkas LA, Zatzick D. Rapid Assessment Procedure Informed Clinical Ethnography (RAPICE) in pragmatic clinical trials of mental health services implementation: methods and applied case study. Adm Policy Ment Health. 2019;46(2):255–70.

    PubMed  PubMed Central  Google Scholar 

  56. Palinkas LA, Springgate BF, Sugarman OK, Hancock J, Wennerstrom A, Haywood C, et al. A rapid assessment of disaster preparedness needs and resources during the COVID-19 pandemic. Int J Environ Res Public Health. 2021;18(2):425. https://doiorg.publicaciones.saludcastillayleon.es/10.3390/ijerph18020425.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  57. Palinkas LA, Mendon SJ, Hamilton AB. Innovations in mixed methods evaluations. Annu Rev Public Health. 2019;40:423–42. https://doiorg.publicaciones.saludcastillayleon.es/10.1146/annurev-publhealth-040218-044215.

    Article  PubMed  PubMed Central  Google Scholar 

  58. Jones L, Wells K. Strategies for academic and clinician engagement in community-participatory partnered research. JAMA. 2007;297(4):407–10. https://doiorg.publicaciones.saludcastillayleon.es/10.1001/jama.297.4.407.

    Article  CAS  PubMed  Google Scholar 

  59. Springgate BF, Wells KW. Partnered participatory research to build community capacity and address mental health disaster. Ethn Dis. 2011;21(3 Suppl 1):S1-3–4.

    Google Scholar 

  60. Palinkas LA, Horwitz SM, Green CA, Wisdom JP, Duan N, Hoagwood KE. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Adm Policy Ment Health. 2015;42:533–44. https://doiorg.publicaciones.saludcastillayleon.es/10.1007/s10488-013-0528-y.

    Article  PubMed  PubMed Central  Google Scholar 

  61. Guest G, Bunce A, Johnson L. How many interviews are enough? An experiment with data saturation and variability. Field Methods. 2006;18:59–82. https://doiorg.publicaciones.saludcastillayleon.es/10.1177/1525822X05279903.

    Article  Google Scholar 

  62. Hamilton AB, Finley EP. Qualitative methods in implementation research: an introduction. Psychiatry Res. 2019;280:112516. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.psychres.2019.112516.

    Article  PubMed  PubMed Central  Google Scholar 

  63. Vindrola-Padros C, Johnson GA. Rapid techniques in qualitative research: a critical review of the literature. Qual Health Res. 2020;30(10):1596–604. https://doiorg.publicaciones.saludcastillayleon.es/10.1177/1049732320921835.

    Article  PubMed  Google Scholar 

  64. Averill JB. Matrix analysis as a complementary analytic strategy in qualitative inquiry. Qual Health Res. 2002;12(6):855–66. https://doiorg.publicaciones.saludcastillayleon.es/10.1177/104973230201200611.

    Article  PubMed  Google Scholar 

  65. Wang C, Burris MA. Photovoice: concept, methodology, and use for participatory needs assessment. Health Educ Behav. 1997;24(3):369–87.

    CAS  PubMed  Google Scholar 

  66. Barry J, Higgins A. Photovoice: an ideal methodology for use within recovery-oriented mental health research. Issues Ment Health Nurs. 2021;42(7):676–81. https://doiorg.publicaciones.saludcastillayleon.es/10.1080/01612840.2020.1833120.

    Article  PubMed  Google Scholar 

  67. Nganda M, Luhaka P, Kukola J, Ding Y, Bulambo C, Kadima J, et al. Participatory development for collaborative with people with lived experience of mental health conditions to strengthen mental health services. Int Health. 2024;16(Supplement 1):i30–41. https://doiorg.publicaciones.saludcastillayleon.es/10.1093/inthealth/ihae008.

    Article  PubMed  PubMed Central  Google Scholar 

  68. Amenyah SD, Murphy J, Fenge LA. Evaluation of a health-related intervention to reduce overweight, obesity and increase employment in France and the United Kingdom: a mixed-methods realist evaluation protocol. BMC Public Health. 2021;21(1):582. https://doiorg.publicaciones.saludcastillayleon.es/10.1186/s12889-021-10523-3.

    Article  PubMed  PubMed Central  Google Scholar 

  69. Arora SRA, Shama W, Lucchetta S, Markowitz S, Yohan A. The cancer journey through the lens of a sibling: a photovoice intervention for siblings of children with cancer. Soc Work Health Care. 2021;60(5):430–47. https://doiorg.publicaciones.saludcastillayleon.es/10.1080/00981389.2021.1926397.

    Article  PubMed  Google Scholar 

  70. Sessford JD, Chan K, Kaiser A, Singh H, Munce S, Alavinia M, et al. Protocol for a single group, mixed methods study investigating the efficacy of photovoice to improve self-efficacy related to balance and falls for spinal cord injury BMJ Open. 2022;12(12):e065684. https://doiorg.publicaciones.saludcastillayleon.es/10.1136/bmjopen-2022-065684.

    Article  PubMed  Google Scholar 

  71. Gorbenko KO, Riggs AR, Koeppel B, Phlegar S, Dubinsky MC, Ungaro R, et al. Photovoice as a tool to improve ptient-provider communication in an inflammatory bowel disease clinic: a feasibility study. Eval Clin Pract. 2022;28(1):159–68. https://doiorg.publicaciones.saludcastillayleon.es/10.1111/jep.13609.

    Article  Google Scholar 

  72. Russinova Z, Mizock L, Bloch P. Photovoice as a tool to understand the experience of stigma among individuals with serious mental illnesses. Stigma Health. 2018;3(3):171. https://doiorg.publicaciones.saludcastillayleon.es/10.1037/sah0000080.

    Article  Google Scholar 

  73. True JG, Davidson L, Meyer DV, Urbina S, Ono SS. “Institutions don’t hug people:” a roadmap for building trust, connectedness, and purpose through Photo voice Collaboration. J Humanist Psychol. 2021;61(3):365–404.

    Google Scholar 

  74. True G, Facundo R, Urbina C, Sheldon S, Southhall JD, Ono SS. “If you don’t name the dragon, you can’t begin to slay it:” participatory action research to increase awareness around military-related traumatic brain injury. J Commun Engagem Scholarsh. 2021;13(4):3. https://digitalcommons.northgeorgia.edu/jces/vol13/iss4/3.

    Google Scholar 

  75. Kohrt BA, Turner EL, Gurung D, Wang X, Neupane M, Luitel NP, et al. Implementation strategy in collaboration with people with lived experience of mental illness to reduce stigma among primary care providers in Nepal (RESHAPE): protocol for a type 3 hybrid implementation effectiveness cluster randomized controlled trial. Implement Sci. 2022;17(1):39. https://doiorg.publicaciones.saludcastillayleon.es/10.1186/s13012-022-01202-x.

    Article  PubMed  PubMed Central  Google Scholar 

  76. Brazg T, Bekemeier B, Spigner C, Huebner CE. Our community in focus: the use of photovoice for youth-driven substance abuse assessment and health promotion. Health Promot Pract. 2011;12(4):502–11. https://doiorg.publicaciones.saludcastillayleon.es/10.1177/1524839909358659.

    Article  PubMed  Google Scholar 

  77. Padgett DK. Qualitative methods in social work research: challenges and rewards. Thousand Oaks: Sage; 1998.

    Google Scholar 

  78. Bernard H. Research methods in anthropology: qualitative and quantitative approaches. Walnut Creek: AltaMira Press; 2002.

    Google Scholar 

  79. Leask CF, Sandlund M, Skelton DA, Altenberg TM, Cardon G, Chinapaw MJM, et al. Framework, principles and recommendations for utilizing participatory methodologies in the co-creation and evaluation of public health interventions. Res Involv Engagem. 2019;5:2. https://doiorg.publicaciones.saludcastillayleon.es/10.1186/s-40900-018-0136-9.

    Article  PubMed  PubMed Central  Google Scholar 

  80. Vargas C, Whelan J, Brimblecombe J, Allender S. Co-creation, co-design and co-production for public health: a perspective on definitions and distinctions. Public Health Res Pract. 2022;32(2):e3222211. https://doiorg.publicaciones.saludcastillayleon.es/10.17061/phrp3222211.

    Article  Google Scholar 

  81. Patton MQ. Qualitative research and evaluation methods. 3rd ed. Thousand Oaks: Sage; 2002.

    Google Scholar 

  82. Birt L, Scott S, Cavers D, Campbell C, Walter F. Member checking: a tool to enhance trustworthiness or merely a nod to validation? Qual Health Res. 2016;26(13):1802–11. https://doiorg.publicaciones.saludcastillayleon.es/10.1177/1049732316654870.

    Article  PubMed  Google Scholar 

  83. Tshuma N, Elakpa DN, Moyo C, Soboyisi M, Moyo S, Mpofu S, et al. The transformative impact of community-led monitoring in the South African health system: a comprehensive analysis. Int J Public Health. 2024;69:1606591. https://doiorg.publicaciones.saludcastillayleon.es/10.3389/ijph.2024.1606591.

    Article  PubMed  PubMed Central  Google Scholar 

  84. De Poli C, Oyebode JR, Binns C, Glover R, Airoldi M. Effectiveness-implementation hybrid type 2 study evaluating an intervention to support ‘information work’ in dementia care: an implementation study protocol. BMJ Open. 2020;10(12):e038397. https://doiorg.publicaciones.saludcastillayleon.es/10.1136/bmjopen-2020-038397.

    Article  PubMed  PubMed Central  Google Scholar 

  85. Dulin MF, Tapp H, Smith HA, Urquieta de Hernandez B, Furuseth OJ. A community-based participatory approach to improving health in a Hispanic population. Implement Sci. 2011;6:38.

    PubMed  PubMed Central  Google Scholar 

  86. James AS, Richardson V, Wang JS, Proctor EK, Colditz GA. Systems intervention to promote colon cancer screening in safety net setting: protocol for a community-based participatory randomized controlled trial. Implement Sci. 2013;8:58.

    PubMed  PubMed Central  Google Scholar 

  87. Jurkowski JM, Green Mills LL, Lawson HA, Bovenzi MC, Quartimon R, Davison KK. Engaging low-income parents in childhood obesity prevention from start to finish: a case study. J Community Health. 2013;38(1):1–11. https://doiorg.publicaciones.saludcastillayleon.es/10.1007/s10900-012-9573-9.

    Article  PubMed  Google Scholar 

  88. Cho B, Kleven L, Woods-Jaeger B. Bringing out the best in parenting: translating community-engaged research on adversity and parenting to policy. Prog Community Health Partnersh. 2021;15(3):271–84. https://doiorg.publicaciones.saludcastillayleon.es/10.1353/cpr.2021.0031.

    Article  PubMed  Google Scholar 

  89. Stadnick NA, Laurent LC, Cain KL, Seifert M, Burola ML, Salgin L, et al. Community-engaged optimization of COVID-19 rapid evaluation and testing experience: roll-out implementation optimization trial. Implement Sci. 2023;18:46.

    PubMed  PubMed Central  Google Scholar 

  90. Valentine SE, Fuchs C, Carlson M, Elwy AR. Leveraging multistakeholder engagement to develop an implementation blueprint for a brief trauma-focused cognitive behavioral therapy in primary care. Psychol Trauma. 2022;14(6):914–23. https://doiorg.publicaciones.saludcastillayleon.es/10.1037/tra0001145.

    Article  PubMed  Google Scholar 

  91. Bilenduke E, Dwyer AJ, Staples ES, Kilbourn K, Valverde PA, Fernández ME, Risendal BC. A practical method for integrating community priorities in planning and implementing cancer control programs. Cancer Causes Control. 2023;34(Suppl 1):113–23. https://doiorg.publicaciones.saludcastillayleon.es/10.1007/s10552-023-01688-w.

    Article  PubMed  PubMed Central  Google Scholar 

  92. Thompson D, Callender C, Dave JM, Jibaja-Weiss ML, Montealegre JR. Health equity in action: using community-engaged research to update an intervention promoting a health home food environment to Black/African American families. Cancer Causes Control. 2024;35(2):311–21. https://doiorg.publicaciones.saludcastillayleon.es/10.1007/s10552-023-01753-4.

    Article  PubMed  Google Scholar 

  93. Palinkas LA. Rigor and relevance in social work science. In: Brekke J, Anastas J, editors. Shaping a science of social work: professional knowledge and identity. Oxford University Press; 2019. p. 176–98.

    Google Scholar 

  94. Palinkas LA, Whiteside L, Nehra D, Engstrom A, Taylor M, Moloney K, et al. Rapid ethnographic assessment of the COVID-19 pandemic April 2020 “Surge” and its impact on service delivery in an acute care medical emergency department and trauma center. BMJ Open. 2020;10(10):e041772. https://doiorg.publicaciones.saludcastillayleon.es/10.1136/bmjopen-2020-041772.

    Article  PubMed  PubMed Central  Google Scholar 

  95. Palinkas LA, Engstrom A, Whiteside L, Moloney K, Zatzick D. A rapid ethnographic assessment of the impact of the COVID-19 pandemic on mental health services delivery in an acute care medical emergency department and trauma center. Adm Policy Ment Health. 2021;49:157–67. https://doiorg.publicaciones.saludcastillayleon.es/10.1007/s10488-021-01154-2.

    Article  PubMed  PubMed Central  Google Scholar 

Download references

Funding

This work was funded by a National Cancer Institute Award (5R37CA242541; PI: Tsui). SG’s effort on this study was supported in part by the Multidisciplinary Training in Ethnic Diversity and Cancer Disparities (5T32CA229110-04; PI: Loic Le Marchand).

Author information

Contributions

LAP, JT, and BFC contributed to the conception and design of the study. LAP, JT, BS, LC, and BFC contributed to the analysis of the data. LAP wrote the first draft. JT, BFC, BS, LC, MS, and SG contributed to the interpretation of the data analysis and to manuscript revisions, and read and approved the final manuscript.

Corresponding author

Correspondence to Jennifer Tsui.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the University of Southern California’s Institutional Review Board (Protocol # UP-20-00541) under exempt status.

Consent for publication

We consent to publication.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

About this article

Cite this article

Palinkas, L.A., Springgate, B., Cabassa, L.J. et al. Methods for community-engaged data collection and analysis in implementation research. Implement Sci Commun 6, 38 (2025). https://doi.org/10.1186/s43058-025-00722-z

  • Received:

  • Accepted:

  • Published:

  • DOI: https://doi.org/10.1186/s43058-025-00722-z

Keywords