Evaluating Computing Outreach Activities: A Common Framework
This presentation proposes a common framework for evaluating computing outreach activities. It covers the motivations for outreach (declining enrollments, lack of diversity, and a labor shortage) and the systematic literature review process, including steps such as framing the question, identifying relevant work, and reviewing candidate articles.
Towards a Common Framework for Evaluating Computing Outreach Activities Adrienne Decker Adrienne.Decker@rit.edu Monica M. McGill mmcgill@bradley.edu Amber Settle asettle@cdm.depaul.edu
OUTREACH: motivated by declining enrollments, lack of diversity, and a labor shortage
A Question Does it work?
This Talk Systematic Literature Review Framework Discussion of our results Recommendations for the community based on our findings
Systematic Literature Review Khan, Kunz, Kleijnen, and Antes (2003), Petticrew and Roberts (2006) Five steps: Frame the question Identify relevant work Assess the quality of the studies Summarize the evidence Interpret the findings
Step 1: Frame the Question What type of data has been collected in formal, peer-reviewed research on computing outreach activities in recent years? Overarching characteristics: Populations Studied Students enrolled in computing outreach programs as defined by the researchers Interventions Programs that exposed students to computing concepts Study designs Quantitative, qualitative, or mixed methods studies Outcomes Effects of the program on participants' behaviors, attitudes, skills, knowledge, or dispositions
Step 2: Identify Relevant Work Years: 2009 to 2015 SIGCSE Technical Symposium on Computer Science Education (SIGCSE) Frontiers in Education (FIE) [*** 2015 proceedings not available when review completed ***] Innovation and Technology in Computer Science Education (ITiCSE) International Computing Education Research Workshop (ICER) Taylor & Francis Computer Science Education (CSE) Transactions on Computing Education (TOCE)
Step 2: Identify Relevant Work 3,949 candidate articles to be reviewed for relevance Keywords/Concepts: K-12, elementary school, high school, secondary school, after school clubs, summer camp 3,837 papers did not fit the criteria and were deemed irrelevant, leaving 112 articles to undergo a more thorough review
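The keyword screening in Step 2 can be sketched in a few lines. This is a minimal illustration, not the authors' actual tooling: the titles, the `is_candidate` helper, and the exact keyword matching are illustrative assumptions (the review screened titles and abstracts manually).

```python
# Hypothetical sketch of Step 2 keyword screening; helper name,
# titles, and substring matching are illustrative assumptions.
KEYWORDS = [
    "k-12", "elementary school", "high school", "secondary school",
    "after school", "summer camp",
]

def is_candidate(title: str) -> bool:
    """Return True if the title mentions any screening keyword."""
    lowered = title.lower()
    return any(kw in lowered for kw in KEYWORDS)

titles = [
    "A Summer Camp for Introducing Girls to Computing",
    "Cache Performance of Modern Processors",
    "Teaching Recursion in High School Classrooms",
]
# Keep only titles that match at least one outreach keyword
candidates = [t for t in titles if is_candidate(t)]
print(candidates)
```

A real screening pass would also examine abstracts of ambiguous titles, which is exactly the limitation the presentation notes later.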
Step 3: Assess the Quality of the Studies Read 112 articles; 32 removed (papers that described the activity/curriculum only in general terms, or work-in-progress papers that did not include any data or findings). Characteristics and collection points: Populations Studied: participant characteristics (age and/or grade in school, gender, ethnicity, location); number of participants in study. Interventions: goals and facets of the program. Study Designs: research question; quantitative, qualitative, mixed methods, or other; longitudinal, cross-sectional, experimental, quasi-experimental, etc. Outcomes: type of data collected (participants' behaviors, attitudes, skills, knowledge, or dispositions); results of the study.
Number of articles meeting criteria

Venue     '09  '10  '11  '12  '13  '14  '15  Total
SIGCSE     10    5    3    4    7    5    3     37
FIE         2    3    2    3    0    0  n/a     10
ITiCSE      3    2    2    2    1    2    0     12
ICER        0    0    1    0    2    0    3      6
CSE         0    0    0    1    0    1    2      4
TOCE        0    0    9    0    1    1    0     11
Totals     15   10   17   10   11    9    8     80
Where did the activities take place? US: 60 (72%); Other: 23 (28%). Breakdown of Other: Finland 5 (6%), Scotland 4 (5%), Israel 3 (4%), Argentina 2 (3%), Australia 2 (3%), Hong Kong 2 (2%), Switzerland 1 (1%), South Africa 1 (1%), New Zealand 1 (1%), England 1 (1%), Canada 1 (1%)
[Bar chart: When outreach activities were offered (number of activities per time slot). The two largest categories account for 28 and 27 activities, followed by 10, 7, 2, and several categories of 1; category labels were lost in extraction.]
[Bar chart: Study Participants, number of studies (0 to 11) by number-of-participants bucket, ranging from 1 to 9 up to 9000 to 9999; per-bucket counts were lost in extraction.]
Gender of Participants (number of studies) Male and female participants: 40 Not specified: 22 Only female participants: 17 Only male participants: 1
Ethnicity of Participants Reported by 28 (35%) of the studies. Three studies had strictly minority participation: two strictly Hispanic/Latino/Latina and one American Indian. 25 studies reported mixed ethnicity with some minority participation (American Indian or Alaskan Native; Asian; Asian/Pacific Islander; Black or African American; Filipino; Hispanic, Latino, or Latina; multi-racial; other). 52 studies (65%) did not indicate participant ethnicity.
Step 5: Interpret the Findings (Discussion) Lack of longitudinal studies: only 7 (8%) of the studies were longitudinal. Majority took place in the US (location was the only category without missing data, though it was not always reported in the article itself). Majority target middle and high school students, and many target multiple age ranges or allow students to progress through them as they age.
Step 5: Interpret the Findings (Discussion) 49% of the overall studies indicated that they were designed to increase gender diversity; 31% indicated that increasing ethnic diversity was a goal. Gender is being tackled more often than ethnicity. Over half of the studies had fewer than 100 participants, and 45% had fewer than 50 participants.
Step 5: Interpret the Findings (Discussion) Measured: Participant attitudes about computing (31%) Potential further study of computing and interest in computing careers (23%) Participant knowledge of concepts (23%)
Step 5: Interpret the Findings (Discussion) It is not always clear how data was collected, and in most cases the instruments were not provided; most commonly, data came from surveys and other instruments created by the leaders of the interventions/studies. Did low numbers of participants lead to qualitative methods? Unfortunately, no, in the vast majority of cases.
Step 5: Interpret the Findings (Discussion) Missing data: For some studies, intentions regarding increasing diversity were not clear. Among those that did indicate diversity as a goal, a significant portion did not report the exact gender or ethnic breakdown of participants. 20% of the studies did not clearly indicate the total number of participants in the interventions in any way.
Limitations Only venues for academic researchers within the computing community were considered; outreach reported through organizations such as the National Center for Women & Information Technology (NCWIT) or the Girl Scouts was not. Other academic venues were not considered. If a title did not seem to indicate an association with outreach activities, its abstract was not examined.
Call to Action - Framework Preliminary steps Define overarching research question(s) to be studied Define data to be collected to provide answers to the research question
Framework Preliminary Steps Ensure that the data collection and reporting of data has been approved by your local institutional review board Consider variables outside of the study that may influence the outcomes and include these as part of your report
Framework Data to be Collected Collect basic demographic data on the participants, including gender, ethnicity, age, grade in school Collect any other unique characteristics about the participants that may influence the study (participated in previous activities, were all gifted students, etc.)
Framework Data to be Collected Use reliable, validated survey instruments, when possible, to gauge participant attitudes, self-efficacy, and skills if one or more of these are used to answer your research question(s) Consider the number of students in the group; statistical analysis such as a t-test typically requires 26 or more to be considered valid
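The sample-size caveat above can be illustrated with a short sketch of Welch's two-sample t statistic (the standard unequal-variances formula; the data, group labels, and `welch_t` helper are illustrative assumptions, not from the review):

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    # Standard error built from each group's sample variance
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

# Illustrative pre/post-style comparison: with only n = 5 per group,
# the standard error term is driven by tiny samples, so conclusions
# drawn from such small groups are fragile.
pre = [2.0, 3.0, 2.0, 3.0, 2.0]
post = [3.0, 4.0, 5.0, 4.0, 4.0]
print(welch_t(post, pre))  # 4.0 for this data
```

The point is not the particular statistic but the denominator: each group's variance is divided by its size, so small groups inflate the standard error, which is why the framework flags very small samples as unsuitable for this kind of analysis.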
Framework - Reporting Provide the research question and/or the purpose of the intervention (computing activity) Describe type of activity and where activity was held (including country) Provide amount of time participants were engaged in the activity (hours/days/weeks)
Framework - Reporting Provide information on who ran the activity Provide data that was collected, reporting at a minimum the gender, ethnicity, age, and grade in school of participants, both in count and in percentages. For each piece of data collected, report count and percentages
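The count-and-percentage reporting convention above can be sketched in a few lines. This is a minimal illustration; the `report` helper and the category labels are assumptions for demonstration (the gender counts mirror the review's own breakdown):

```python
from collections import Counter

def report(values):
    """Map each category to (count, percentage of total)."""
    counts = Counter(values)
    total = sum(counts.values())
    return {k: (n, round(100 * n / total, 1)) for k, n in counts.items()}

# Illustrative gender breakdown in the review's reporting style
genders = (["M+F"] * 40 + ["not specified"] * 22
           + ["F only"] * 17 + ["M only"] * 1)
print(report(genders)["M+F"])  # (40, 50.0)
```

Reporting both the raw count and the percentage, as the framework recommends, lets readers judge absolute sample size and relative composition at the same time.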
Qualitative? Consider qualitative methods for studies with low numbers of participants Rigorously applied qualitative techniques can provide our community with more information than trying to apply inappropriate quantitative techniques