Instrument Development in the Context of Mixed Methods Framework
This overview of instrument development within a mixed methods framework, presented by Vanessa Scherman of the University of South Africa, covers mixed methods research, methodological norms, and closing the loop in the research process, with attention to the pragmatic standpoint, mixed method designs, validity, and trustworthiness.
INSTRUMENT DEVELOPMENT IN THE CONTEXT OF A MIXED METHODS FRAMEWORK PRESENTED BY VANESSA SCHERMAN UNIVERSITY OF SOUTH AFRICA
OVERVIEW
- Brief overview of mixed methods
- Elaboration of methodological norms in relation to approaches
- Instrument development
- Closing the loop
INTRODUCTION
- Mixed methods research can account for what Mertens (2018) calls "wicked problems"
- Need to account for culture and context (Nastasi & Hitchcock, 2016)
- Planning includes both qualitative and quantitative methods (Nastasi & Hitchcock, 2016)
- Various methods, both qualitative and quantitative, are used to tease out nuances
- An iterative process exists between the purpose of the research and the research questions
- But, as Shannon-Baker (2018) has indicated, we need to be able to defend the choices we make
FROM A PRAGMATIC STANDPOINT
- Commonly associated with mixed methods (Creamer, 2018)
- Represents a practical and applied research philosophy
- An umbrella term including dialectical pluralism, critical realism and transformative-emancipatory paradigms
- Investigation of the perceived problem is possible without imposing constraints on methods
- Lends itself to the use of mixed methods, which provides the researcher with the opportunity to answer the research questions adequately (Teddlie & Tashakkori, 2003)
MIXED METHOD DESIGNS
Thought of in terms of:
- Priority: equivalent, qualitative-dominant or quantitative-dominant
- Timing: sequential or concurrent
MIXED METHOD DESIGNS
- Convergent designs
- Explanatory sequential designs
- Exploratory sequential designs
(Creswell, 2014)
VALIDITY
- Content-related validity
- Construct-related validity
- Criterion-related validity
- Ecological validity
- Statistical conclusion validity
(Scherman, 2016)
TRUSTWORTHINESS
- Credibility
- Dependability
- Confirmability
- Triangulation
(Lincoln & Guba, 1985)
LEGITIMATION
Legitimation refers to the threats to internal and external validity (in quantitative research) or credibility (in qualitative research).
"Of the four rationales for mixing qualitative and quantitative approaches, instrument fidelity most lacks adequate development. Indeed, with very few exceptions (e.g., Collins et al., 2006; Hitchcock et al., 2005, 2006), scant guidance has been given to help researchers use mixed research techniques to optimize the development of either qualitative or quantitative instruments" (Onwuegbuzie, Bustamante & Nelson, 2010).
STARTING AT THE BEGINNING
Research questions → Literature review → Conceptual framework → Specific research questions → Data-level questions
Instrument development is not divorced from the research questions, literature and analysis.
REMEMBER: DEDUCTIVE REASONING MOVES FROM THE GENERAL TO THE SPECIFIC
Theory → Hypothesis → Observation → Confirmation
MEASUREMENT INSTRUMENTS
- Must be valid and reliable
- Instruments should be attractive, brief and easy to respond to
- Carefully plan the format and content
- Do not include items that do not relate to your topic
- Structured, selection-type questions are preferable
TYPES OF INSTRUMENTS
- Cognitive: measuring intellectual processes such as thinking, memorizing, problem solving, analyzing, or reasoning
- Aptitude: measuring general mental ability, usually for predicting future performance
TYPES OF INSTRUMENTS
- Affective: assessing individual feelings, values, attitudes, beliefs, etc.
Typical affective characteristics of interest:
- Values: deeply held beliefs about ideas, persons, or objects
- Attitudes: dispositions to act favorably or unfavorably toward things
- Interests: inclinations to seek out or participate in particular activities, objects, ideas, etc.
- Personality: characteristics that represent a person's …
TYPES OF INSTRUMENTS
Scales used for responding to items on affective tests:
- Likert
- Semantic differential
- Thurstone
- Guttman
- Rating scales
QUESTIONNAIRE BLUEPRINTS
- Description component
- Table of specifications
- Items or tasks
- Plans for the scoring procedures
- Analysis
QUESTIONNAIRE BLUEPRINT EXAMPLE
Construct: Professional development/improving practice
Definition: Good vocational training is encouraged for the further development of staff (Sammons, 1999), as articulated by in-service training opportunities, updating of policies and the introduction of new programmes (Taggart & Sammons, 1999)
Number of items: 26 items
Item type: Dichotomous items; Likert-scale items
Coding: No = 1, Yes = 2; 4 = Strongly agree, 3 = Agree, 2 = Disagree, 1 = Strongly disagree
Possible analysis: Frequencies; cross-tabulations; scale analysis (including reliability analysis); correlations
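The scale analysis anticipated in a blueprint like this typically includes an internal-consistency check such as Cronbach's alpha. A minimal sketch in plain Python, using hypothetical Likert responses rather than data from the questionnaire above:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a set of Likert items.

    items: one list of scores per item; all lists cover the same respondents.
    """
    k = len(items)                      # number of items in the scale

    def variance(xs):                   # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    n_respondents = len(items[0])
    totals = [sum(item[r] for item in items) for r in range(n_respondents)]
    item_var_sum = sum(variance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Hypothetical responses to three 4-point Likert items
# (1 = Strongly disagree ... 4 = Strongly agree), one list per item.
likert_items = [
    [4, 3, 2, 4, 3],
    [3, 3, 2, 4, 2],
    [4, 2, 3, 4, 3],
]
print(round(cronbach_alpha(likert_items), 2))  # → 0.83
```

Note that reverse-scored items would need recoding first (e.g., `5 - score` on a 4-point scale) before the totals are formed.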
CONSTRUCTING ITEMS
- Only include items which relate to the topic
- Collect demographic information
- Each question should deal with a single concept
- Avoid jargon
- Be specific: short, simple items are best
- Avoid leading questions
- Avoid sensitive or touchy questions
CONSTRUCTING ITEMS
- Avoid double-barrelled questions
- Respondents should be competent to answer
- Avoid negatively phrased items
PRE-TESTING
- Piloting the cover letter and instrument
- Identify weaknesses and strengths
- Face and content validity
ISSUES: COGNITIVE, APTITUDE, OR AFFECTIVE INSTRUMENTS
- Bias: distortions of a respondent's performance or responses based on ethnicity, race, gender, language, etc.
- Responses to affective test items: socially acceptable responses, accuracy of responses, response sets
- Problems inherent in the use of self-report measures and the use of projective tests
ISSUES: SELECTING INSTRUMENTS
Non-psychometric issues:
- Cost
- Administrative time
- Objections to content by parents or others
- Duplication of testing
DESIGNING YOUR OWN INSTRUMENTS
- Get help from others with experience developing tests
Item writing guidelines:
- Avoid ambiguous and confusing wording and sentence structure
- Use appropriate vocabulary
- Write items that have only one correct answer
- Give information about the nature of the desired answer
- Do not provide clues to the correct answer
TEST ADMINISTRATION GUIDELINES
- Plan ahead
- Be certain that there is consistency across administration sessions
- Be familiar with any and all procedures necessary to administer the instrument
STATISTICAL CONSIDERATIONS
- Analysis cannot be divorced from the development process
- Method of analysis should be established before items are written
- Analysis should direct format
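To illustrate why analysis should direct item format: if the planned analysis is frequencies and cross-tabulations, the items must yield discrete codes. A minimal plain-Python sketch with hypothetical coded responses (the coding scheme mirrors the earlier blueprint; the data are invented):

```python
from collections import Counter

# Hypothetical coded responses: each tuple pairs a dichotomous item
# (e.g. "policies updated?": No = 1, Yes = 2) with a 4-point Likert item.
responses = [(2, 4), (2, 3), (1, 2), (2, 3), (1, 3)]

# Frequencies for the dichotomous item
freq = Counter(policy for policy, _ in responses)
print(freq)              # counts of code 1 vs code 2

# Cross-tabulation of the two items
crosstab = Counter(responses)
print(crosstab[(2, 3)])  # respondents coded Yes and Agree → 2
```

Open-ended responses would not support this directly; they would first need a coding frame, which is exactly the design-phase decision the slide points to.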
INTERVIEW GUIDES
- Let the participants tell their own story
- Questions should be simple
- Do not ask more than one question at a time
INTERVIEW GUIDES
Different types of questions:
- Direct questions
- Indirect questions
- Structuring questions
- Follow-up questions
- Probing questions
- Specifying questions
- Interpreting questions
Developing the guide:
- Outline the broad knowledge areas related to the research questions
- Develop questions within these broad areas
- Take care to phrase questions in a manner that will encourage honesty
- Think about your respondents and use appropriate language
- Use "how" questions
- Think about the logical flow of the interview and structure it accordingly
- Begin with warm-up questions
- Ask difficult questions only after rapport has been built
- Develop probes that will elicit more detail
- The last question should provide closure
INTERVIEW BLUEPRINT EXAMPLE
Blueprint columns: major area; definition; type of questions and probes; type of coding strategy; alignment with quantitative data
Major area: Professional development/improving practice
Definition: Good vocational training is encouraged for the further development of staff (Sammons, 1999), as articulated by in-service training opportunities, updating of policies and the introduction of new programmes (Taggart & Sammons, 1999)
BRINGING THE TWO TOGETHER
- The design phase has to be thought through carefully
- There has to be alignment between the quantitative and qualitative phases
- Instrument fidelity is key and has to be made explicit
- Methodological norms for each approach have to be articulated, and there has to be alignment with strategies
BRINGING THE TWO TOGETHER
- Discussions on validity are not new
- Focus has normally been on quantitative instruments, but the use of qualitative data is increasingly being discussed (Koskey et al., 2018)
- Onwuegbuzie, Bustamante & Nelson (2010) have proposed an Instrument Development and Construct Validation (IDCV) process
INSTRUMENT DEVELOPMENT AND CONSTRUCT VALIDATION
1. Conceptualization of the construct
2. Identify and describe behaviors
3. Develop the initial instrument
4. Pilot testing
5. Design and field-test the revised instrument
6. Validate the revised instrument: quantitative analysis
7. Validate the revised instrument: qualitative analysis
8. Validate the revised instrument: qualitative-dominant cross-over analysis
9. Validate the revised instrument: quantitative-dominant cross-over analysis
10. Evaluate
BRINGING THE TWO TOGETHER
- More research is required to develop adequate guidelines
- The use of innovative analysis techniques is required, but this has to be considered during the design phase
REFERENCES
Creamer, E. G. (2018). An introduction to fully integrated mixed methods research. London: SAGE.
Creswell, J. W. (2002). Educational research. Boston: Pearson.
Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks: Sage Publications.
Creswell, J. W. (2014). Educational research: Planning, conducting and evaluating quantitative and qualitative research (4th ed.). Boston: Pearson.
Creswell, J. W. (2014). A concise introduction to mixed methods research. London: SAGE.
Gay, L. R., Mills, G. E., & Airasian, P. W. (2011). Educational research. Boston: Pearson.
Koskey, K. L. K., Sondergeld, T. A., Stewart, V. C., & Pugh, K. J. (2018). Applying the mixed methods instrument and construct validation process: The transformative experience questionnaire. Journal of Mixed Methods Research, 12(1), 95-122.
Mertens, D. M. (2018). Mixed methods to address wicked problems. Retrieved 20 September from https://www.ualberta.ca/international-institute-for-qualitative-methodology/webinars/mixed-methods-webinar/archived-webinars
Nastasi, B. K., & Hitchcock, J. H. (2016). Mixed methods research and culture-specific interventions: Program design and evaluation. London: SAGE.
O'Leary, Z. (2014). Doing your research project. Thousand Oaks: Sage Publications.
Onwuegbuzie, A. J., Bustamante, R. M., & Nelson, J. A. (2010). Mixed research as a tool for developing quantitative research instruments. Journal of Mixed Methods Research, 4(1), 56-78.
Scherman, V. (2016). Methodological standards and fit for purpose: Criteria to evaluate psychological tests and assessments. In R. Ferreira (Ed.), Psychological assessment: Thinking innovatively in contexts of diversity (pp. 72-85). Pretoria: Juta.
Shannon-Baker, P. (2018). Introducing mixed methods in courses on research design. Retrieved 20 September from https://www.ualberta.ca/international-institute-for-qualitative-methodology/webinars/mixed-methods-webinar/archived-webinars
Whitley, B. E. (2002). Principles of research in behavioral science. Boston: McGraw Hill.