In the next step, the protocols and transcripts are coded, that is, marked or tagged and labelled with one or more short descriptors of the content of a sentence or paragraph [2, 15, 23]. In a more practical sense, coding makes raw data sortable. This makes it possible to extract and examine all segments describing, say, a tele-neurology consultation, from multiple data sources, e.g. SOPs, emergency room observations, and staff and patient interviews. The end product of the coding or analysis process is a descriptive theory of the behavioural pattern under investigation [20]. It should be noted that these are data management tools which support the analysis performed by the researcher(s) [14].
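To illustrate how coding makes raw data sortable, the following minimal Python sketch shows the underlying code-and-retrieve principle that dedicated qualitative data analysis software implements on a much larger scale. The codes, sources and text segments are hypothetical and not taken from any actual study.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    source: str       # data source, e.g. "SOP", "ER observation", "patient interview"
    text: str         # the sentence or paragraph that was coded
    codes: list[str]  # one or more short descriptors assigned during coding

# Hypothetical coded segments from different data sources
segments = [
    Segment("patient interview",
            "The video link to the neurologist dropped twice during the consultation.",
            ["tele-neurology consultation", "technical problems"]),
    Segment("ER observation",
            "Nurse sets up the tele-neurology cart at the bedside.",
            ["tele-neurology consultation", "workflow"]),
    Segment("SOP",
            "Door-to-needle time must be documented for every case.",
            ["documentation", "process times"]),
]

def retrieve(segments: list[Segment], code: str) -> list[Segment]:
    """Return all segments tagged with a given code, across all data sources."""
    return [s for s in segments if code in s.codes]

# Extract everything describing a tele-neurology consultation
for s in retrieve(segments, "tele-neurology consultation"):
    print(f"[{s.source}] {s.text}")
```

Retrieving all segments carrying a given code in this way is what allows, for example, a comparison of how an SOP describes a process step with how that step was observed and experienced in practice.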
Protocols of qualitative research can be published separately and in advance of the study results. However, the aim is not the same as in RCT protocols.
In the methods section, the focus is on transparency of the methods used, including why, how and by whom they were implemented in the specific study setting, so as to enable a discussion of whether and how this may have influenced data collection, analysis and interpretation.
The results section usually starts with a paragraph outlining the main findings, followed by more detailed descriptions of, for example, the commonalities, discrepancies or exceptions per category [20].
Here it is important to support the main findings with relevant quotations, which may add information, context, emphasis or real-life examples [20, 23]. It is subject to debate in the field whether it is relevant to state the exact number or percentage of respondents supporting a certain statement.
Reasons for combining methods can be diverse, including triangulation for corroboration of findings, complementarity for illustration and clarification of results, expansion to extend the breadth and range of the study, explanation of unexpected results generated with one method with the help of another, or offsetting the weakness of one method with the strength of another [1, 17, 24, 25, 26].
The three most common types of mixed method designs are the convergent parallel design, the explanatory sequential design and the exploratory sequential design. The designs with examples are shown in Fig.
In the convergent parallel design, a qualitative study is conducted in parallel to and independently of a quantitative study, and the results of both studies are compared and combined at the stage of interpretation of results.
Using the above example of EVT provision, this could entail setting up a quantitative EVT registry to measure process times and patient outcomes in parallel to conducting the qualitative research outlined above, and then comparing results. In the explanatory sequential design, a quantitative study is carried out first, followed by a qualitative study to help explain the results from the quantitative study.
This would be an appropriate design if the registry alone had revealed relevant delays in door-to-needle times; the qualitative study would then be used to understand where and why these delays occurred and how they could be reduced.
In the exploratory sequential design, the qualitative study is carried out first and its results help to inform and build the quantitative study in the next step [26]. If the qualitative study around EVT provision had shown a high level of dissatisfaction among the staff members involved, a quantitative questionnaire investigating staff satisfaction could be set up in the next step, informed by the qualitative findings on the topics about which dissatisfaction had been expressed.
Amongst other things, the questionnaire design would make it possible to widen the reach of the research to more respondents from different types of hospitals, regions, countries or settings, and to conduct sub-group analyses for different professional groups.

A variety of assessment criteria and lists have been developed for qualitative research, ranging in their focus and comprehensiveness [14, 17, 27]. In the following, we therefore focus on a set of commonly used assessment criteria that, from a practical standpoint, a researcher can look for when assessing a qualitative research report or paper.
One such criterion is the use of a reporting checklist, for example the Standards for Reporting Qualitative Research (SRQR), to make sure that all items relevant for this type of research are addressed [23, 28]. Discussions of quantitative quality measures in addition to, or instead of, these qualitative criteria can be a sign of lower quality of the research paper. Providing and adhering to a checklist for qualitative research contributes to an important quality criterion for qualitative research, namely transparency [15, 17, 23]. While methodological transparency and complete reporting are relevant for all types of research, some additional criteria must be taken into account for qualitative research.
This includes what is called reflexivity, i.e. making transparent which characteristics of the researchers, and which aspects of their relationship to the research topic and participants, may have influenced data collection and analysis. Depending on the research question and the population to be researched, this can be limited to professional experience, but it may also include gender, age or ethnicity [17, 27].
These details are relevant because in qualitative research, as opposed to quantitative research, the researcher as a person cannot be isolated from the research process [23]. It may influence the conversation when an interviewed patient speaks to an interviewer who is a physician, or when an interviewee is asked to discuss a gynaecological procedure with a male interviewer, and therefore the reader must be made aware of these details [19].
An iterative sampling approach is advised, in which data collection and analysis are carried out in alternating, cyclical steps so that interim findings can inform further sampling. This process continues until no new relevant information can be found and further sampling becomes redundant, which is called saturation [1, 15]. In other words: qualitative data collection finds its end point not a priori, but when the research team determines that saturation has been reached [29, 30].
This is also the reason why most qualitative studies use deliberate (purposive) instead of random sampling strategies. Types of purposive sampling include, but are not limited to, maximum variation sampling, critical case sampling, and extreme or deviant case sampling [2]. Assessors of qualitative research should check whether the considerations underlying the sampling strategy were sound and whether or how the researchers tried to adapt and improve their strategies in stepwise or cyclical approaches between data collection and analysis to achieve saturation [14].
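As a purely illustrative aid, the short Python sketch below shows one way a research team might track whether consecutive interviews still yield previously unseen codes. The interview data and codes are hypothetical, and such a tally can only support, not replace, the team's judgement that saturation has been reached.

```python
def new_codes_per_interview(interview_codes):
    """Track how many previously unseen codes each successive interview adds."""
    seen = set()
    counts = []
    for codes in interview_codes:
        fresh = set(codes) - seen
        counts.append(len(fresh))
        seen |= fresh
    return counts

# Hypothetical codes identified in six consecutive interviews
interview_codes = [
    {"delays", "communication", "staffing"},
    {"delays", "equipment", "training"},
    {"communication", "handover"},
    {"delays", "handover"},
    {"staffing", "communication"},
    {"delays"},
]

print(new_codes_per_interview(interview_codes))
# [3, 2, 1, 0, 0, 0]: no new codes in the last interviews may support,
# but does not replace, the team's judgement that saturation is reached
```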
Good qualitative research is iterative in nature, i.e. earlier steps of the research process are revisited and refined on the basis of what is learned in later steps. One example of this is the pilot interview, in which different aspects of the interview (especially the interview guide, but also, for example, the site of the interview or whether the interview can be audio-recorded) are tested with a small number of respondents, evaluated and revised [19].
In doing so, the interviewer learns which wording or types of questions work best, or what the most suitable length of an interview is for patients who have trouble concentrating for an extended time.
Of course, the same reasoning applies to observations or focus groups, which can also be piloted.

Ideally, coding should be performed by at least two researchers, especially at the beginning of the coding process, when a common approach must be defined, including the establishment of a useful coding list (or coding tree) and a common understanding of the meaning of individual codes [23].
An initial sub-set of transcripts, or all of them, can be coded independently by the coders and then compared and consolidated after regular discussions in the research team. This is to make sure that codes are applied consistently to the research data.
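A minimal sketch of what such a comparison might look like in practice is given below. The segment identifiers and codes are hypothetical; the output simply flags the segments on which the two coders disagree, as input for the consolidation discussion.

```python
# Codes assigned independently by two researchers to the same transcript segments
# (hypothetical example data)
coder_a = {
    "seg01": {"delays", "communication"},
    "seg02": {"workflow"},
    "seg03": {"training"},
}
coder_b = {
    "seg01": {"delays"},
    "seg02": {"workflow", "equipment"},
    "seg03": {"training"},
}

# Flag segments with diverging code assignments for discussion in the research team
for segment in sorted(coder_a):
    if coder_a[segment] != coder_b[segment]:
        print(segment,
              "| coder A:", sorted(coder_a[segment]),
              "| coder B:", sorted(coder_b[segment]))
```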
Member checking, also called respondent validation, refers to the practice of checking back with study respondents to see if the research is in line with their views [14, 27]. This can happen after data collection or analysis or when first results are available [23]. For example, interviewees can be provided with summaries of their transcripts and asked whether they believe this to be a complete representation of their views or whether they would like to clarify or elaborate on their responses [17].
In those niches where qualitative approaches have been able to evolve and grow, a new trend has seen the inclusion of patients and their representatives not only as study participants, i.e. as sources of data, but also as active contributors to the design and conduct of the research. The underlying assumption is that patients and other stakeholders hold unique perspectives and experiences that add value beyond their own single story, making the research more relevant and beneficial to researchers, study participants and future patients alike [34, 35].
In this sense, the involvement of the relevant stakeholders, especially patients and relatives, is increasingly being seen as a quality indicator in and of itself.

The above overview does not include certain items that are routine in assessments of quantitative research. What follows is a non-exhaustive, non-representative, experience-based list of the quantitative criteria often applied to the assessment of qualitative research, as well as an explanation of their limited usefulness in this context.
Given the openness and flexibility of qualitative research, it should not be assessed by how well it adheres to pre-determined and fixed strategies — in other words: its rigidity. Instead, the assessor should look for signs of adaptation and refinement based on lessons learned from earlier steps in the research process.
For the reasons explained above, qualitative research does not require specific sample sizes, nor does it require that the sample size be determined a priori [1, 14, 27, 37, 38, 39].
Sample size can only be a useful quality indicator when related to the research purpose, the chosen methodology and the composition of the sample, i.e. who was included and why. While some authors argue that randomisation can be used in qualitative research, this is not commonly the case, as neither its feasibility nor its necessity or usefulness has been convincingly established for qualitative research [13, 27].
Qualitative studies do not use control groups, either. The agreement between coders can be quantified with a score such as Cohen's kappa; however, it is not clear what this measure tells us about the quality of the analysis [23]. This means that such scores can be included in qualitative research reports, preferably with some additional information on what the score means for the analysis, but they are not a requirement.
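If a team does decide to report such a score, the following minimal sketch shows how Cohen's kappa could be computed for two coders, assuming, as a simplification, that each coder assigned exactly one code per segment. The code labels are hypothetical and the function comes from scikit-learn.

```python
from sklearn.metrics import cohen_kappa_score

# One code per transcript segment from each of two coders (hypothetical data)
coder_a = ["delays", "communication", "workflow", "delays", "training", "workflow"]
coder_b = ["delays", "communication", "workflow", "staffing", "training", "delays"]

# Kappa corrects raw agreement for agreement expected by chance;
# what the value means for the analysis still has to be discussed in the report
kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")
```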
Experience even shows that it might be better to have the same person or team perform all of the tasks involved, from recruitment to data collection, transcription and analysis [20]. First, when researchers introduce themselves during recruitment, this can enhance trust when the interview takes place days or weeks later with the same researcher. Second, when the audio-recording is transcribed for analysis, the researcher conducting the interviews will usually remember the interviewee and the specific interview situation during data analysis.
This might be helpful in providing additional context information for the interpretation of the data.

The mere fact that a study is qualitative rather than quantitative should not be used as an assessment criterion if this criterion is applied irrespective of the research problem at hand.
Similarly, qualitative research should not be required to be combined with quantitative research per se — unless mixed methods research is judged as inherently better than single-method research. In this case, the same criterion should be applied for quantitative studies without a qualitative component.
The main take-away points of this paper are summarised in Table 1. We aimed to show that, if conducted well, qualitative research can answer specific research questions that cannot be adequately answered using only quantitative designs. It also provides us with a greater range of tools to tackle a greater range of research problems more appropriately and successfully, filling in the blind spots on one half of the methodological spectrum to better address the whole complexity of neurological research and practice.
References

Philipsen, H. Kwalitatief onderzoek: nuttig, onmisbaar en uitdagend [Qualitative research: useful, indispensable and challenging]. In: Qualitative research: Practical methods for medical practice. Houten: Bohn Stafleu van Loghum.
Punch, K. Introduction to social research: Quantitative and qualitative approaches. London: Sage.
Kelly, J. Travelling to the city for hospital care: Access factors in country Aboriginal patient journeys. Australian Journal of Rural Health, 22(3).
Nilsen, P. Never the twain shall meet? Implementation Science, 8(1).
Oxford Center for Evidence Based Medicine.
Eakin, J. Educating critical qualitative health researchers in the land of the randomized controlled trial. Qualitative Inquiry, 22(2).
May, A. Alternatieven voor RCT bij de evaluatie van effectiviteit van interventies!? [Alternatives for RCTs in the evaluation of effectiveness of interventions!?] Final report.
Berwick, D. The science of improvement. Journal of the American Medical Association, 10.
Christ, T. Alternative paradigms and mixed methodologies. Qualitative Inquiry, 20(1).
Lamont, T. New approaches to evaluating complex health and care systems. BMJ.
Drabble, S. In: Johnson (Eds.).

When looking for people to talk to for a qualitative interview, consider your goal. If you want to expand a product line, interview existing customers about their needs. Match interview subjects with the goal of the interview.
The setting of a qualitative interview also affects the quality of the interview. Consider the needs of the subject. Some cultures may not value direct eye contact.
For long interviews, offer water and breaks to participants. Be polite and respectful when interacting with interview subjects. Let interview participants know the purpose of the research.
Address terms of confidentiality if necessary. Thank participants after the interview and let them know what to expect next. This helps you optimize future interviews.
Transcribe the interview word for word. Note non-verbal interactions in your transcription; interactions like pauses and laughter can provide deeper insights into responses. Analyze your qualitative research data early, so that you can identify emerging themes to shape future interviews, and consider adding such observations to each interview report. Each interview can help you improve the efficiency and effectiveness of future ones.
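As a small, purely illustrative aid for this early-analysis step, the sketch below tallies how often each code has appeared across the first few interviews, which can help surface emerging themes to probe in later interviews. The interview identifiers and codes are hypothetical.

```python
from collections import Counter

# Hypothetical codes applied to the first few interview transcripts
coded_interviews = {
    "interview_01": ["scheduling", "trust", "travel burden"],
    "interview_02": ["trust", "travel burden", "costs"],
    "interview_03": ["travel burden", "family support"],
}

# Tally how often each code has appeared so far to surface emerging themes
theme_counts = Counter(code for codes in coded_interviews.values() for code in codes)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```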
Adjust your interview guide based on insights from each previous interview. Keep all versions of your transcriptions and interview guides, with notes on them; you can refer back to these in future qualitative research.

Interviewing is the most common format of data collection in qualitative research.
Observation is a type of qualitative research method which not only includes participant observation, but also covers ethnography and fieldwork. In conclusion, research can be understood as a painstaking, methodical effort to examine, investigate and restructure realities, theories and applications.