Assessment and Course Redesign in Community College Geosciences
This presentation describes the process of course design, assessment of student outcomes, and the use of test results in a community college geosciences department. It emphasizes evaluating student success, course content, and teaching methodology to improve diversity and transfer opportunities. Assessment methods include homework assignments, classroom and laboratory exercises, term papers, and tests. It also explores how the wording of a question affects student performance and whether that performance can be predicted. The overall goal is to improve teaching practices and ensure student success.
Assessment and Course Redesign in Community College Geosciences
Course Design, Improving Diversity, and Transfer Opportunities in Geoscience
Wake Technical Community College, Geology Department
November 18, 2017
Dr. Kenneth L. Howard, LG
Course Design and Evaluation
- Design: select a textbook, write the syllabus, and develop presentation methods.
- Assessment of student outcomes: measure outcomes by numerous means and assign a grade.
- Course evaluation: we evaluate student success, course content, methodology, and our own success based on results.
- Redesign: we use the data gathered to modify and improve our teaching and assessment methods.
Assessment of Student Outcomes
- Homework assignments
- Classroom and laboratory exercises
- Term papers
- Testing: quizzes, unit tests, and a final exam
How Do We Use Test Results?
The primary purpose of testing is to assign a grade: students are evaluated against a grade scale with a 70% correct-response level selected as the success criterion. But we must also use our testing to assess our own course content, our testing procedures, and our success as instructors.
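The 70% success criterion above can be expressed as a simple check. The function below is an illustrative sketch, not the department's actual grading code; only the 70% cutoff comes from the slide.

```python
def meets_objective(correct: int, total: int, threshold: float = 0.70) -> bool:
    """True if the percent-correct meets the 70% success level described above."""
    return (correct / total) >= threshold

# Example: 21 of 30 questions correct is exactly 70%, which meets the criterion.
print(meets_objective(21, 30))  # -> True
print(meets_objective(20, 30))  # -> False
```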
How Do We Ask a Question?
Does it matter how we ask? We commonly ask a variety of questions reflecting different levels of student understanding and difficulty of content: multiple-choice questions, short-answer questions, matching, and essay questions. Generally, we evaluate questions for content and level of difficulty using Bloom's Taxonomy as our basic guide:
- Recall
- Understand
- Apply
- Evaluate
Can you predict student performance on these comparable questions?
1. On the accompanying figure, which letter is likely over oceanic crust that is similar in age to that beneath letter E? ______
   a. A and B   b. B   c. C   d. D
2. On the accompanying figure, which letter is over the oldest oceanic crust? ______
   a. A   b. B   c. C   d. D   e. E
(Bloom Level 2.7, Inferring)
Did you get the right question? Success rates, in the order asked: 85% on the first question, 52% on the second.
Short answer versus multiple choice
Short answer: Clay minerals formed from destruction of feldspars illustrate which weathering process? ___________ (Bloom Level 1.2, Recalling)
Multiple choice: Conversion of feldspar to clay illustrates which weathering process?
   a. oxidation   b. mechanical   c. hydrolysis   d. syntropical
(Bloom Level 1.1, Recognition)
Which question has a higher student success rate?
Did you pick the easiest? Short answer: 43%. Multiple choice: 57%.
What can one little word do?
Which of the following features is younger than Fault 1? _______
   a. the lava flow
   b. the granite
   c. tilted layers
   d. lava flow and granite
   e. tilted layers and granite
Student success rate: 11%. What do you think the modification was? What do you think the change in student success was?
The word is ONLY
Which of the following features is younger than Fault 1? _______
   a. only the lava flow
   b. only the granite
   c. tilted layers
   d. lava flow and granite
   e. tilted layers and granite
Student success rate after modification: 45%
How Do We Use Test Results to Assess Our Course Content?
We assess a number of issues with tests: course content, course learning objectives, student understanding, our own performance, and student performance.
- What is the difficulty of the question?
- Is the question asked in such a way that the student should understand the information requested?
- Do we collect and track sufficient data to make sure we can measure success and change?
Testing and Assessment
Common objectives and exam questions at the course level: since 2010, the WTCC Geology Group has tested students on 12 course objectives with multiple-choice questions on final exams. Data are compiled and reviewed at the end of each semester to establish long-term trends.
Common questions aimed at specific content knowledge and comprehension levels: more narrowly targeted learning outcomes are evaluated for specific content and specific Bloom levels. Each assessment targets general knowledge, critical thinking, and scientific literacy for each content area.
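The semester-by-semester compilation described above can be sketched as a small tallying script. The record layout (objective ID, semester, correct/incorrect flag) and the sample data are assumptions for illustration, not the department's actual data format.

```python
from collections import defaultdict

# Hypothetical records: (objective_id, semester, answered_correctly)
responses = [
    ("LO2", "Fall 2010", True), ("LO2", "Fall 2010", False),
    ("LO2", "Spring 2017", False), ("LO2", "Spring 2017", False),
]

def success_rates(records):
    """Percent correct per (objective, semester), for long-term trend review."""
    tally = defaultdict(lambda: [0, 0])  # (objective, semester) -> [correct, total]
    for obj, sem, ok in records:
        tally[(obj, sem)][0] += int(ok)
        tally[(obj, sem)][1] += 1
    return {key: 100.0 * correct / total for key, (correct, total) in tally.items()}

print(success_rates(responses))
# e.g. {('LO2', 'Fall 2010'): 50.0, ('LO2', 'Spring 2017'): 0.0}
```

Comparing the resulting per-semester rates against the 70% criterion is what produces trend statements like those on the next slide.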
Common Exam Question Results
Since Fall 2010, test scores on 9 objectives have fallen (1-15%), one objective has not changed, and two objectives have increased (1-3%). Students currently achieve 70% success on only 5 of the twelve objectives.
[Chart: LO #2 - Andes Mountains - down 10%; percent success by semester, Fall 2010 through Spring 2017]
[Chart: LO #6b2 - Shield Volcanoes - down 11%; percent success by semester]
Common Questions Aimed at Specific Content or Understanding
We target a specific area of course content with (1) low-level questions to measure general knowledge and (2) questions that specifically assess critical-thinking skills and scientific literacy. These are presented to students as online quizzes of ten questions each. Results are analyzed at the end of each semester, and one particular area of low performance is selected for follow-on intervention.
Scientific Literacy Intervention
Initial finding (from SLO #2, PLO #6 quiz results): students do not understand sorting of sediments and answer a scientific-literacy question at a 35.6% level of proficiency.
First-level intervention: modify the question; the result (student success 36.8%) shows the problem is not with the question.
Second-level intervention (ongoing): increase teacher effort to explain the concept in either lecture or laboratory format. First-semester results are encouraging (student success 41.6%) but not decisive.
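Whether a shift like 35.6% to 36.8% reflects a real change depends on how many students were tested. A two-proportion z-test is one way to make that explicit; the test is not mentioned on the slide, and the cohort sizes below are hypothetical, so this is only an illustrative sketch.

```python
from math import sqrt

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for the difference between two success proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Hypothetical cohorts of 120 students at roughly the reported success levels.
z = two_proportion_z(43, 120, 44, 120)  # ~35.8% before vs ~36.7% after
print(round(z, 2))  # |z| is far below 1.96, so no evidence of a real change
```

A check like this supports the slide's conclusion that the first intervention did not move the needle, while the larger jump to 41.6% merits continued tracking.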
Assessment of Our Teaching Prowess
Tracking pre- and post-course student knowledge is essential to understanding the effectiveness of our methodology.
- Scientific literacy: a test of general science knowledge (general science, physics, chemistry, biology, and earth sciences) given on the first and last day of class.
- Repeating questions from semester tests verbatim on the final exam and examining the difference in success.
Changes in Scientific Literacy
[Chart: Science Literacy Test results at the start and end of each semester; percent correct, 40-80% scale, with the semester-end series above the semester-start series]
Repeat Questions on Final Exam (percent correct)

Topic                   Semester   Final   Difference
Regression                  43.1    59.1        +16.0
Crystallization             47.5    62.2        +14.7
Hydrolysis                  57.8    61.8         +4.0
Perched H2O Table           26.8    26.7         -0.1
Continental Rift            68.4    85.0        +16.6
Metamorphism                52.4    58.2         +5.8
Age Dating                  78.8    71.7         -7.1
Rock Forming Minerals       87.1    91.0         +3.9
Angle of Repose             50.2    67.3        +13.1
Earthquake Distance         68.8    92.4        +23.6
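The difference column above is simply the final-exam score minus the mid-semester score. A short script (using a subset of the slide's values) reproduces that calculation and flags topics where students regressed, which are the natural candidates for redesign.

```python
# Percent-correct values copied from the repeat-question table (subset).
semester = {"Regression": 43.1, "Hydrolysis": 57.8, "Age Dating": 78.8}
final    = {"Regression": 59.1, "Hydrolysis": 61.8, "Age Dating": 71.7}

gains = {topic: round(final[topic] - semester[topic], 1) for topic in semester}
regressed = [topic for topic, gain in gains.items() if gain < 0]

print(gains)      # {'Regression': 16.0, 'Hydrolysis': 4.0, 'Age Dating': -7.1}
print(regressed)  # ['Age Dating']
```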
Redesign Based on Results
How do we use the information that we gather? Interventions directed at specific topics are designed to give students a better understanding of the material:
- Change lecture emphasis and presentation
- Create a classroom or lab exercise
- Assign homework on the topic
- Adopt a new textbook
When all else fails, change the question.
Summary of Observations
- Assessment is not always a straightforward process.
- Interventions do not always achieve the desired results.
- Most students will not be successful with some course content despite intervention.
- After long periods of stability in outcomes, it's time to rethink the entire process.