Prompt: I'm conducting a mixed methods feasibility or pilot study. Help me: 1) Use mixed methods to assess multiple feasibility domains
ChatGPT's Response
Conducting a mixed methods feasibility or pilot study involves integrating qualitative and quantitative research approaches to comprehensively assess the feasibility of an intervention, program, or research design. Here’s how to use mixed methods to assess multiple feasibility domains:
1. Defining Feasibility Domains
First, identify the specific feasibility domains to assess. Common domains include:
- Scientific Feasibility: Theoretical underpinning and preliminary evidence for the intervention.
- Ethical Feasibility: Ethical considerations, including informed consent and risk assessment.
- Implementation Feasibility: Practical aspects of how the intervention can be delivered in real-world settings.
- Resource Feasibility: Availability of financial, human, and material resources.
- Acceptability: Perspectives of participants and stakeholders regarding the intervention.
- Demand: Interest and willingness of the target population to engage with the intervention.
- Adaptation: Ability to modify the intervention based on feedback and contextual factors.
2. Designing the Study
Quantitative Component
- Surveys: Use structured questionnaires to assess quantitative measures of feasibility domains (e.g., acceptability, demand, implementation barriers). Include Likert scales to quantify participant responses.
- Metrics Collection: Record data on recruitment rates, retention rates, and compliance with the intervention protocols to quantitatively assess implementation feasibility (see the computation sketch after this list).
- Pilot Outcomes: Identify preliminary outcomes related to effectiveness to help gauge scientific feasibility.
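To make these metrics concrete, here is a minimal Python sketch of how recruitment rate, retention rate, and a mean acceptability score might be computed from a participant-tracking table. The table structure, column names, and all values are invented for illustration; adapt them to your own recruitment log and survey.

```python
import pandas as pd

# Hypothetical tracking data; in a real study this would come from your
# recruitment log and post-intervention survey (all values are invented).
df = pd.DataFrame({
    "screened":  [True] * 10,
    "enrolled":  [True, True, True, False, True, True, False, True, True, True],
    "completed": [True, True, False, False, True, True, False, True, False, True],
    "acceptability_0_10": [8, 7, None, None, 9, 6, None, 8, None, 7],
})

n_screened = df["screened"].sum()
n_enrolled = df["enrolled"].sum()
n_completed = df["completed"].sum()

print(f"Recruitment rate: {n_enrolled / n_screened:.0%}")   # enrolled / screened
print(f"Retention rate:   {n_completed / n_enrolled:.0%}")  # completed / enrolled
print(f"Mean acceptability (0-10): {df['acceptability_0_10'].mean():.1f}")
```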
Qualitative Component
- Interviews or Focus Groups: Conduct semi-structured interviews or focus groups with participants, stakeholders, and staff to gather in-depth insights on their experiences, perceptions, and attitudes towards the intervention's feasibility.
- Thematic Analysis: Analyze qualitative data to identify themes related to acceptability, implementation challenges, and contextual factors influencing feasibility.
- Field Notes: Keep observational field notes throughout the study to capture contextual factors that quantitative measures may overlook.
3. Data Integration
- Triangulation: Combine quantitative and qualitative findings to provide a richer understanding of each feasibility domain. For instance, if surveys reveal low acceptability scores, qualitative data can provide insights into why participants felt this way.
- Mixed Methods Analysis: Use mixed methods analysis techniques such as side-by-side comparisons, integration matrices, or joint displays to synthesize data and highlight convergent and divergent findings.
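As one way to build such a joint display, the sketch below lays out a simple table pairing each domain's headline quantitative result with its dominant qualitative theme and an integration judgment. Every cell is a placeholder; the domains and findings shown are illustrative, not prescriptive.

```python
import pandas as pd

# A joint display as a side-by-side table; every cell is a placeholder
# to be replaced with your own summarized QUAN and QUAL findings.
joint_display = pd.DataFrame({
    "domain": ["Acceptability", "Demand", "Implementation"],
    "quan_finding": [
        "Mean satisfaction 3.5/5",
        "62% expressed interest",
        "75% session attendance",
    ],
    "qual_theme": [
        "Valued social contact; disliked homework",
        "Interest contingent on transport",
        "Scheduling conflicts with caregiving",
    ],
    "integration": [
        "Divergent: moderate score masks mixed views",
        "Convergent: demand is conditional",
        "Expands: explains attendance gaps",
    ],
})
print(joint_display.to_string(index=False))
```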
4. Study Example
Intervention: A new physical activity program for older adults.
Quantitative Assessment:
- Surveys (pre- and post-intervention) to measure acceptability and satisfaction on a 0-10 scale.
- Recruitment/Retention Metrics: Track how many participants signed up and completed the program.
Qualitative Assessment:
- Focus Groups: Conduct sessions post-intervention to explore participants’ experiences and barriers.
- Interviews with implementers: Gather insights from staff and trainers about logistical challenges.
5. Reporting Findings
When reporting your findings, ensure you highlight:
- Complementary Insights: How qualitative data informs or explains quantitative results.
- Practical Recommendations: Based on the feasibility assessment, provide recommendations for future full-scale studies or for the practical implementation of the intervention.
- Limitations: Address limitations of both the qualitative and quantitative components in terms of generalizability and biases.
By using this mixed methods approach, you will be able to provide a thorough assessment of the multiple feasibility domains relevant to your study, leading to more informed decisions about your intervention or study design.
Gemini's Response
This is an excellent application of mixed methods! Feasibility/pilot studies are perfectly suited to a mixed methods approach because you're exploring complex issues (how well an intervention or study procedure works in a real-world context) that require both numbers and nuanced understanding.
Here's a breakdown of how to use mixed methods to assess multiple feasibility domains, complete with examples and practical tips:
Mixed Methods for Assessing Multiple Feasibility Domains
Mixed methods combine quantitative (QUAN) and qualitative (QUAL) approaches. In a feasibility study, this allows you to:
- Quantify specific metrics (e.g., recruitment rates, adherence).
- Explore the "why" behind those numbers (e.g., why participants drop out, what makes an intervention acceptable).
- Triangulate findings to get a more robust picture of feasibility.
- Develop or refine the intervention and study procedures based on rich data.
I. Key Feasibility Domains and Their Mixed Methods Assessment
Let's break down common feasibility domains and how to integrate QUAN and QUAL data for each.
1. Recruitment & Retention
- Definition: Can you recruit the target number of participants within the timeframe? Can you keep them engaged throughout the study?
- Quantitative Data (QUAN):
- Metrics: Recruitment rate (number enrolled / number eligible), screening logs (number approached, eligible, declined), consent rates, dropout rates, completion rates, duration from consent to first intervention, follow-up completion rates.
- Tools: Recruitment logs, participant tracking sheets, electronic health record data (if applicable).
- Qualitative Data (QUAL):
- Focus: Reasons for declining participation, barriers to enrollment, motivations for joining, reasons for dropping out, perceived benefits of staying, suggestions for improving recruitment materials/strategies.
- Tools:
- Interviews: With participants who declined/dropped out (if ethically permissible and feasible), with participants who completed, with recruitment staff.
- Focus Groups: With potential participants from the target population (pre-study) to gauge appeal of recruitment methods.
- Open-ended questions: On post-study questionnaires (e.g., "What was difficult about participating?").
- Mixed Methods Integration Example:
- If your QUAN data shows a low recruitment rate (e.g., 20% of eligible participants enrolled), your QUAL data (interviews with decliners or recruitment staff) can explain why (e.g., "too much time commitment," "concerned about privacy," "study description was confusing"). This informs how to modify recruitment strategies for the main trial.
- If QUAN shows a high dropout rate, QUAL data (exit interviews) can identify specific pain points in the intervention or study procedures.
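To make the recruitment arithmetic in these examples concrete, here is a minimal sketch of a screening-log funnel that also tallies recorded decline reasons, so the QUAN rates and QUAL reasons can be read side by side. The log structure and all entries are hypothetical.

```python
import pandas as pd

# Hypothetical screening log: one row per person approached.
log = pd.DataFrame({
    "eligible":  [True, True, True, False, True, True, True, False, True, True],
    "consented": [True, False, True, False, False, True, True, False, False, True],
    "decline_reason": [None, "time commitment", None, None, "privacy concerns",
                       None, None, None, "time commitment", None],
})

eligible = log[log["eligible"]]
print(f"Approached: {len(log)}, eligible: {len(eligible)}, "
      f"consented: {eligible['consented'].sum()} "
      f"({eligible['consented'].mean():.0%} of eligible)")

# Tally of stated decline reasons, to read alongside interview themes.
print(log["decline_reason"].value_counts())
```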
2. Acceptability
- Definition: How well is the intervention/study perceived by participants, staff, and other stakeholders? Is it welcome, satisfactory, and appropriate?
- Quantitative Data (QUAN):
- Metrics: Satisfaction scores (e.g., Likert scales on usefulness, enjoyability, perceived burden), adherence rates (e.g., session attendance, homework completion), perceived benefit ratings.
- Tools: Standardized satisfaction questionnaires, Likert scales, logs of completed tasks.
- Qualitative Data (QUAL):
- Focus: Participants' subjective experiences, perceived value, perceived barriers/facilitators to engagement, emotional responses, suggestions for improvement, congruence with personal values or lifestyle.
- Tools:
- Semi-structured interviews: With participants, intervention facilitators, and relevant stakeholders.
- Focus groups: To explore shared experiences and perceptions.
- Open-ended questions: On questionnaires (e.g., "What did you like most/least about the intervention?").
- Observational notes: From intervention sessions (e.g., participant engagement, non-verbal cues).
- Mixed Methods Integration Example:
- QUAN data might show moderate satisfaction scores (e.g., an average of 3.5/5). QUAL data (interviews) can uncover that while the intervention was "okay" overall, some components were highly valued and others were seen as irrelevant or burdensome, allowing for targeted refinement.
- If QUAN shows low adherence, QUAL can uncover that the schedule was inflexible or materials were culturally insensitive, explaining the lack of acceptability.
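Here is a small sketch of how adherence might be quantified from session logs, flagging participants whose attendance falls below a threshold you would then prioritize for qualitative follow-up. The 70% threshold and the column names are assumptions for illustration.

```python
import pandas as pd

# Hypothetical session log: one row per participant-session.
sessions = pd.DataFrame({
    "participant": ["P1", "P1", "P1", "P2", "P2", "P3", "P3", "P3"],
    "attended":    [True, True, False, False, False, True, True, True],
})

adherence = sessions.groupby("participant")["attended"].mean()
print(adherence)

# Participants below an (arbitrary, illustrative) 70% threshold are
# candidates for exit or follow-up interviews about barriers.
print("Low adherence:", list(adherence[adherence < 0.70].index))
```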
3. Practicality / Deliverability
- Definition: Can the intervention and study procedures be realistically delivered in the intended setting with available resources?
- Quantitative Data (QUAN):
- Metrics: Time spent per intervention session, resource utilization (e.g., number of staff hours, material costs), staff workload metrics, number of logistical challenges encountered.
- Tools: Staff time logs, resource tracking, problem logs.
- Qualitative Data (QUAL):
- Focus: Challenges faced by staff in delivering the intervention, logistical hurdles, required training, resources perceived as lacking, suggestions for streamlining processes, insights into organizational buy-in.
- Tools:
- Interviews: With intervention facilitators, study coordinators, administrators, and other key staff.
- Field notes/Reflective journals: By staff members.
- Process mapping: To identify bottlenecks.
- Mixed Methods Integration Example:
- QUAN data shows that staff are consistently exceeding the allocated time for intervention delivery. QUAL data (interviews with staff) reveals that the training was insufficient, or the materials were too complex, leading to delays and frustration.
- QUAL data might identify unexpected barriers (e.g., difficulty booking rooms, lack of administrative support). QUAN can then measure the impact of these barriers on time/cost.
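As a sketch of the time-tracking side, assuming a simple per-session staff time log, you might compare delivered session length against the protocol's allocation and count overruns; the counts then become concrete prompts for staff interviews. The allocation and log values are invented.

```python
import pandas as pd

ALLOCATED_MINUTES = 60  # per-protocol session length (illustrative)

# Hypothetical time log kept by facilitators.
time_log = pd.DataFrame({
    "session": [1, 2, 3, 4, 5],
    "minutes": [58, 75, 82, 61, 90],
})

overruns = time_log[time_log["minutes"] > ALLOCATED_MINUTES]
print(f"Mean session length: {time_log['minutes'].mean():.0f} min "
      f"(allocated: {ALLOCATED_MINUTES} min)")
print(f"Sessions over time: {len(overruns)} of {len(time_log)}")
```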
4. Fidelity
- Definition: Is the intervention delivered as intended, according to the protocol?
- Quantitative Data (QUAN):
- Metrics: Checklist completion rates (percentage of core components delivered), dosage (number of sessions delivered vs. planned), duration of sessions.
- Tools: Fidelity checklists, session logs.
- Qualitative Data (QUAL):
- Focus: Staff perceptions of adhering to the protocol (e.g., "was it easy to follow?"), reasons for deviation, challenges in delivering specific components, perceived need for adaptation, experiences of training and supervision.
- Tools:
- Interviews: With intervention facilitators and supervisors.
- Observation: Direct observation of intervention delivery (structured notes, field notes).
- Reflective journals: By facilitators.
- Mixed Methods Integration Example:
- QUAN data shows that 80% of intervention components are delivered. QUAL data (interviews and observations) might reveal that the missing 20% are consistently skipped because facilitators perceive them as irrelevant or due to time constraints, leading to protocol refinement.
- QUAL data could highlight variations in delivery style or interpretation among facilitators, which could then be quantified (if possible) or addressed in further training.
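Here is a minimal sketch of fidelity-checklist scoring, assuming each observed session is recorded as a row of yes/no flags per core component; component-level rates show which elements are consistently skipped and therefore worth probing in facilitator interviews. The component names are invented.

```python
import pandas as pd

# Hypothetical fidelity checklist: one row per observed session,
# one column per core component (True = delivered as intended).
checklist = pd.DataFrame({
    "warm_up":     [True, True, True, True],
    "core_skills": [True, True, True, False],
    "mindfulness": [False, True, False, False],
    "wrap_up":     [True, True, True, True],
})

# Per-component delivery rate: low values flag components to probe
# in facilitator interviews (why are they skipped?).
print(checklist.mean().sort_values())
print(f"Overall fidelity: {checklist.values.mean():.0%}")
```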
5. Data Collection Procedures
- Definition: Are the proposed outcome measures, questionnaires, and other data collection methods feasible, appropriate, and reliable for the target population and setting?
- Quantitative Data (QUAN):
- Metrics: Completion rates for questionnaires, missing data rates for specific items, time taken to complete measures, ceiling/floor effects, preliminary psychometric properties (if sample allows).
- Tools: Data entry logs, database analysis.
- Qualitative Data (QUAL):
- Focus: Participant perceptions of survey burden, clarity of questions, sensitivity of topics, ease of technology use, suggestions for alternative wording or formats.
- Tools:
- Cognitive interviews: Asking participants to "think aloud" while completing questionnaires.
- Semi-structured interviews: With participants and data collectors.
- Debriefings: With staff about challenges in data collection.
- Open-ended questions: On feedback forms about the data collection process.
- Mixed Methods Integration Example:
- QUAN data reveals high missingness for a specific questionnaire section. QUAL data (cognitive interviews) demonstrates that the questions are confusingly worded or too sensitive, prompting revision or removal.
- QUAL insights from data collectors reveal that a particular measurement device is unreliable or difficult to use, prompting a search for alternatives, and QUAN can track the frequency of device errors.
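As a sketch, item-level missing-data rates can be computed directly from the response matrix; items above some threshold (30% here, chosen arbitrarily for illustration) become natural targets for cognitive interviews. Item names and responses are invented.

```python
import numpy as np
import pandas as pd

# Hypothetical questionnaire responses; NaN marks a skipped item.
responses = pd.DataFrame({
    "q1_mobility": [3, 4, 2, 5, 3, 4],
    "q2_mood":     [2, np.nan, 3, 4, np.nan, 3],
    "q3_finances": [np.nan, np.nan, 1, np.nan, np.nan, 2],
})

missing_rate = responses.isna().mean().sort_values(ascending=False)
print(missing_rate)

# Items above an illustrative 30% threshold get flagged for review.
print("Flag for cognitive interviews:",
      list(missing_rate[missing_rate > 0.30].index))
```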
6. Preliminary Efficacy / Effectiveness (Signal Detection)
- Definition: Is there any indication that the intervention might have the desired effect? (Crucially, not definitive proof, but a signal to warrant a larger trial).
- Quantitative Data (QUAN):
- Metrics: Descriptive statistics for primary outcome measures (means, standard deviations, confidence intervals), pre-post changes, effect sizes (even if small sample).
- Tools: Outcome measures, statistical software.
- Qualitative Data (QUAL):
- Focus: Perceived changes or improvements by participants, changes in behavior or well-being attributed to the intervention, unexpected positive or negative effects, "mechanism of change" theories from participants' perspectives.
- Tools:
- Interviews/Focus Groups: Exploring perceived benefits, shifts in perspective, behavioral changes.
- Reflective journals: By participants documenting their journey.
- Mixed Methods Integration Example:
- QUAN data shows a small but positive trend in the primary outcome measure. QUAL data (participant interviews) provides rich narratives of specific improvements in their daily lives, validating the quantitative trend and providing context for how those changes occurred.
- If QUAN data shows no clear trend, QUAL data might uncover that participants were not fully engaged or that the intervention didn't address their most pressing needs, helping to explain the lack of effect and guide intervention modification.
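For signal detection, a paired pre-post summary with a standardized effect size is often enough at this stage. The sketch below computes Cohen's d for paired samples (d_z, mean change divided by the SD of the change scores) on invented data; with pilot-sized samples, treat any such estimate as a rough signal, never as evidence of efficacy.

```python
import numpy as np

# Hypothetical pre/post scores on the primary outcome (higher = better).
pre = np.array([12, 15, 11, 14, 13, 10, 16, 12])
post = np.array([13, 14, 12, 15, 14, 10, 17, 13])

diff = post - pre
d_z = diff.mean() / diff.std(ddof=1)  # Cohen's d for paired samples
print(f"Mean change: {diff.mean():.2f} "
      f"(SD {diff.std(ddof=1):.2f}), d_z = {d_z:.2f}")
```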
II. Choosing a Mixed Methods Design for Feasibility
The choice of design depends on your primary feasibility questions. Common designs for feasibility include:
- Convergent Parallel Design (Concurrent, Triangulation):
- Approach: QUAN and QUAL data are collected independently and concurrently during the feasibility study. The results are then merged and compared during interpretation.
- Best for: Comprehensive understanding, triangulation of findings on specific domains (e.g., acceptability, preliminary efficacy).
- Example: Simultaneously collect recruitment rates (QUAN) and conduct interviews with potential participants about recruitment barriers (QUAL), then compare and explain.
- Explanatory Sequential Design (QUAN → QUAL):
- Approach: QUAN data is collected and analyzed first. Then, QUAL data is collected to explain or elaborate on the initial quantitative findings.
- Best for: Explaining unexpected QUAN results or delving deeper into particular trends.
- Example: If your QUAN data reveals a low adherence rate, you then conduct QUAL interviews to understand the specific reasons why participants did not adhere.
- Exploratory Sequential Design (QUAL → QUAN):
- Approach: QUAL data is collected and analyzed first to explore a phenomenon or generate hypotheses. The findings then inform the development of QUAN measures or procedures for the main study (or even the feasibility study itself if iterative).
- Best for: Early-stage intervention development, understanding nuanced experiences before measuring them.
- Example: Conduct QUAL focus groups with the target population to understand their needs and preferences for an intervention, then use these insights to design survey questions or intervention components that will be tested quantitatively in the feasibility study.
III. Integration Strategies
The "mixed" in mixed methods isn't just about collecting both types of data; it's about integrating them.
- Connecting the Data:
- Explaining: Use qualitative findings to explain puzzling or unexpected quantitative results.
- Elaborating: Use qualitative data to add depth and nuance to quantitative findings.
- Confirming/Contradicting: See if findings from one type of data confirm or contradict the other, leading to a richer understanding.
- Developing: Use qualitative insights to develop new quantitative measures or modify existing ones.
- Joint Displays: Create tables or matrices that visually represent both quantitative and qualitative data side-by-side for each domain, highlighting areas of convergence, divergence, and complementarity.
- Narrative Integration: In your discussion, weave together the quantitative and qualitative findings, telling a coherent story of feasibility for each domain. Avoid just presenting them separately.
IV. Reporting Feasibility Findings
When reporting, ensure you:
- Clearly state your mixed methods design and rationale.
- Present quantitative findings for each domain (e.g., "Recruitment rate was 60% (n=18/30)").
- Present qualitative findings for each domain (e.g., "Interviews revealed that time commitment was the primary barrier, as illustrated by Participant 3's comment: 'I just don't have an hour free twice a week.'").
- Explicitly discuss the integration of the two data types, showing how they inform each other for each domain (e.g., "While the quantitative adherence rate was 75%, qualitative data revealed that participants frequently skipped the mindfulness component, finding it 'too abstract' and 'not practical for daily life,' suggesting a need to re-evaluate this specific element.").
- Conclude with a clear statement about the overall feasibility, identified areas for refinement, and recommendations for a larger trial.
V. Tips for Success
- Start with Clear Feasibility Questions: For each domain, define what you want to know (e.g., "What is the acceptable range for participant burden?" "What are the key barriers to fidelity?").
- Pilot Everything: Pilot your recruitment materials, interview guides, questionnaires, and intervention procedures on a small scale before your main feasibility study.
- Allocate Sufficient Resources: Mixed methods can be resource-intensive, requiring expertise in both QUAN and QUAL methodologies, and adequate time for data collection and integration.
- Be Flexible: Feasibility studies are exploratory. Be prepared to adapt your methods or even your intervention based on initial findings.
- Think About the "So What?": For every piece of data, consider what it means for the main trial. Does it suggest a need for modification, abandonment, or progression?
By systematically applying mixed methods to each feasibility domain, you'll gain a far richer and more actionable understanding of your intervention and study procedures, setting a strong foundation for your larger-scale trial.
Analysis
This comparison demonstrates the different approaches each AI model takes when responding to the same prompt. Here are the key differences observed:
Response Characteristics
ChatGPT: Provides a concise, structured response (35 sentences).
Gemini: Provides a far more detailed, domain-by-domain response (167 sentences).
Key Takeaways
- Each model brings different strengths: ChatGPT offers a quick structured overview, while Gemini works through each feasibility domain in depth
- Response styles vary significantly between models
- Consider your specific use case when choosing between these models
This comparison was generated using the SNEOS AI Comparison Tool. Published: October 15, 2025 | Models: ChatGPT, Gemini