Assessment, Accountability & Human Potential Data-Based Decision Making Process

Explore the process of data-based decision making in education with a focus on identifying valid data, utilizing various tools for data collection and progress monitoring, and leveraging screeners for quick diagnostic insights. Learn how to integrate data into instructional planning, groupings, and school-wide initiatives for enhanced educational outcomes.




Presentation Transcript


  1. Assessment, Accountability & Human Potential Data-Based Decision Making Process Lisa Coldiron, Jori Martinez-Woods, Kate Bailey In Collaboration with Mack Shane, Instructional Coach, ACCESS Project Lauren Irwin, Educational Associate, DDOE Susan Veenema, Educational Associate, DDOE


  3. Learner Objectives Goals for this session: Identify valid data Describe key features of data-based decision making Define the steps necessary for data-based decision making Understand overlap with existing school-wide initiatives like MTSS 2

  4. Identifying Valid Data

  5. Using Data at Sussex Montessori Screeners DELS Primary Normalization Check/Elementary Adjustment Check (NCMPS) K-Readiness Screener Phonemic Awareness Screener Formal Data Collection and Progress Monitoring Tools DIBELS (Benchmark & Progress Monitoring) Universal Tool STAR Reading and Math (Benchmark & Progress Monitoring) UT Aperture (Social Emotional Learning) 4

  6. Using Data at Sussex Montessori Informal Data Collection and Progress Monitoring Tools 3-Period Lessons Work Samples - Evidence of Progress Observations and Anecdotal Notes Transparent Classroom - Progress Transparent Classroom - Photos of Lessons 5

  7. Screeners (Beginning of Year) (DELS Screener, Primary Normalization Check/Elementary Adjustment Check, Phonemic Awareness Screener, K-Readiness Screener) These are tools we use to obtain quick diagnostic data on our new students. They give you an idea of your students' present levels: what they already know and can do. The data from these tools allow you to plan lessons and design instruction and groupings in the first six weeks of school. 6

  8. Screeners (Beginning of Year) (DELS Screener, Primary Normalization Check/Elementary Adjustment Check, Phonemic Awareness Screener, K-Readiness Screener) 7

  9. Formal Data Collection and Progress Monitoring (DIBELS, STAR Reading and STAR Math, Aperture (Social-Emotional Learning)) These tools give us an academic/SEL snapshot of where our students are in relation to benchmark standards. Students are assessed at the beginning, middle, and end of the year, and the results can be used to drive instruction and groupings as well as interventions. They also help us identify learners for child study, MTSS, social-emotional supports, etc. (Data can be synced to Panorama.) 8

  10. Formal Data Collection and Progress Monitoring STAR Reading 9

  11. Formal Data Collection and Progress Monitoring STAR Math: 10

  12. Formal Data Collection and Progress Monitoring Aperture: Social Emotional Learning 11

  13. Formal Data Collection and Progress Monitoring - DIBELS **Also includes a test of Related Early Literacy Skills, including Letter Naming Fluency, which assesses a student's ability to say the names of upper- and lowercase letters in the English alphabet. This skill is a strong predictor of future reading success in young children. 12

  14. Formal Assessments How will the results be used? A student's scores on the DIBELS, STAR, and Aperture give the school information about whether or not a student is on track for grade level in ELA, math, and social-emotional learning competencies. This allows us to quickly identify where students are in relation to grade-level benchmarks. We will use these data to make informed decisions about instruction, MTSS/child study, and differentiated lesson planning. The goal is for all of our learners to achieve their human potential. 13

  15. Informal Data Collection and Progress Monitoring (Examples: 3-period lessons, work samples, observations, anecdotal notes, Transparent Classroom photos/data, weekly progress monitoring) These are your ongoing daily data-collection tools: they monitor student progress, inform instruction, and provide daily opportunities to gather evidence of progress for both gen-ed and spec-ed students. Data should be kept in Transparent Classroom, a binder, or a Google Doc so that it can be easily referred to. 14

  16. Informal Data Collection and Progress Monitoring (3-period lessons, work samples, observations, anecdotal notes, transparent classroom photos/data, weekly progress monitoring interventions) (University of Oregon Center on Teaching and Learning, 2018) 15

  17. Delaware MTSS DE-MTSS provides a whole-child framework for all students to reach their full potential in a positive, inclusive, and equitable learning environment. Through high-quality instruction and intervention and a culture of collaboration, communication, and flexibility, parents, educators, and leaders work together to develop a responsive system of support that addresses the academic and nonacademic needs of all learners and boosts student performance. 16

  18. Leveraging MTSS Data LEAs should consider whether these academic or social/emotional/behavioral assessments are being completed during the multi-tiered systems of support (MTSS) process. Start from the beginning: what do the universal screener data (collected within the first four weeks of the school year or within four weeks of the student's entry into school) or standardized assessment data tell us about the student and any potential areas of need? 17 (14 Del.C. 508. 6.1.1)

  19. Example: Healthcare and MTSS Tertiary Level: specialists, more specialized care, surgeons. Secondary Level: medication, specific diets, specific tests. Primary Level: access to healthcare, physicals, screenings, vitamins and supplements. 18

  20. Justification Overlap with other school-wide initiatives Tiered support systems for academic and social/emotional/behavioral needs Data-driven specially designed instruction Adaptations to instructional strategies 19

  21. Using Data to Determine Areas of Need Leveraging Panorama Combining multi-tiered systems of support into a single dashboard Leverage our formal assessment data to make informed decisions about our students, plan and track MTSS/child study, etc. 20

  22. Data-Based Decision Making in a Montessori Classroom

  23. Assessment, Accountability & Human Potential Our work is grounded in the premise that human flourishing should be the goal of education. By this, we mean the capacity to thrive socially, emotionally, intellectually and economically; to participate meaningfully in family, community, and civic life; and to live a life of curiosity, agency and satisfaction. Dr. Montessori described human flourishing as a means to becoming a person of one's time and place, with the means and wherewithal not only to function within but to shape society.

  24. Montessorians believe strongly that schools must hold themselves accountable to the children and families they serve. We also believe that Montessori is among the most data-centric educational approaches ever invented. Building on that premise, we assert that the best way to serve children and families is for schools to cultivate sustainable systems for assessing their impact using a range of data sources, as named in the previous slides. For such a system to be effective, however, we must pay careful attention to both what we measure and how we do it.

  25. We are guided by two principles: first, to view assessment as a system as opposed to an event; second, to build an assessment protocol around carefully coordinated tools that enable the system to operate with both precision and coherence. The foundational goal is to understand how the learner thinks: What is going on with them? What interests them? What are they ready for? What obstacles are they confronting? Our formal and informal assessments must inform our practice, lesson planning, and the social-emotional supports needed for their individual flourishing. We must select measures that define the school's mission.

  26. Common Misconceptions Claim 1: Having a framework for data-based decision making will fix everything. While data are an important component of DBDM, decision makers must interpret the data to inform decisions about how to effectively support students. Data must be combined with pedagogical and content knowledge to translate them into a usable action plan, taking the context into consideration (Wilcox, 2021). 25

  27. Common Misconceptions Claim 2: Data-based decision making will force me to change everything I'm doing in my classroom. Educators may find it beneficial to take an incremental approach to instructional changes based upon data, which can help them parse out the potential impact of different factors on student performance (Flanagan, 2021). This allows for greater contextualization of responses to interventions, helping the educator figure out what's working and what's not. 26

  28. Common Misconceptions Claim 3: Data-based decision making is mostly about accountability. A Free Appropriate Public Education (FAPE) requires a two-prong approach, in which educators consider how the data show whether a student with a disability has made progress appropriate in light of his or her circumstances. This is primarily determined by: 1. Procedural compliance of the IEP 2. Reasonable calculation to enable the student to receive educational benefit (Endrew F. & IEP Standards, 2023) 27

  29. Re-establishing Core Beliefs Assessment results inform educators about how they can best support their students In special education and tiered intervention systems, professionals use assessment results to build tailored programming for students with disabilities Educators continuously reassess the appropriateness of that programming by analyzing and interpreting student outcomes (Browder et al., 2020a) 28

  30. Data-Based Decision-Making in the IEP

  31. 30

  32. Collect and Use data 31

  33. Initial Identification and Universal Screening Initial Indication of Need a. How is this being measured (e.g., what type of data point is this?) i. Transparent Classroom Progress - Universal Rubric for Mastery ii. Universal screeners or MTSS data iii. Behavioral or SEL data b. How can we gather multiple data points that measure similar areas of skill need? i. Horizontally and vertically gathering more points, utilization of Panorama Initial Data Point 32

  34. Comparing and Contrasting 33

  35. Initial Identification and Universal Screening Psychoeducational Evaluation (ESR) and Related Service Assessments a. Primary data point that indicates whether special education services will be provided b. Aligns to steps in DBI framework i. Step 1: Determine PLEP - Present Levels of Educational Performance (Analyze Data): What can they do right now? ii. Step 2: Analyze Impact (Use Decision Rules) iii. Step 3: Action Steps (Implement Changes to Instruction) 34

  36. Addressing Student Needs Evidence-based practices, research-based curricula, and a variety of complex interventions have been expertly designed to help students with disabilities successfully access and excel in the general education curriculum. (Cook & Odom, 2013; Spooner et al., 2012) 35

  37. Essential Steps in Data-Based Decision-Making 1. Understand present levels of performance in relation to grade-level academic standards and non-academic expectations using data from standardized and curriculum-based assessments, surveys/questionnaires, observations, progress reports, parent/student interviews, etc. At this stage, the key question to answer is: In what areas does the student require specialized instruction? These decisions should reflect the strengths, interests, and needs from the family's and student's point of view. 36

  38. Essential Steps in Data-Based Decision-Making 2. Analyze impact of the student's disability for each area of identified need. a. To do so, the IEP team should identify why the student is not meeting grade-level standards or expectations in order to specify the disability-related needs (e.g., skills) that will improve their involvement, access, and progress with the general education curriculum. 37

  39. Essential Steps in Data-Based Decision-Making 3. Develop goals for the disability-related needs that are ambitious, achievable, and designed to close the gaps in the student's progress toward grade-level academic standards and non-academic expectations. 38

  40. Essential Steps in Data-Based Decision-Making 4. Identify teaching and learning adjustments for each disability-related need that directly support the student's ability to meet IEP goals and help them to access, engage in, and make progress with the grade-level curriculum, expectations, activities, and environments. 39

  41. Essential Steps in Data-Based Decision-Making 5. Plan implementation of all program accommodations and modifications that the student will receive to enable them to meet their IEP goals and improve their involvement, access, and progress with the general education curriculum. The plan should clearly define how often (frequency), how long (duration), and where or when the student should receive their IEP services. If needed, the frequency and duration of required supports for school personnel to implement this IEP (e.g., coaching, professional learning) should also be included. 40

  42. Essential Steps in Data-Based Decision-Making 6. Monitor and evaluate the student's progress toward goals and the implementation of the IEP to identify what is working and what may need to change to close the gaps in the student's progress toward grade-level academic standards and non-academic expectations. 41

  43. Decision-Making Rules for DBDM 42

  44. Present Levels in an IEP - Using Multiple Data Sources and MTSS

  45. Developing a Comprehensive Learner Profile When developing an IEP, student teams are looking to create a comprehensive learner profile through the use of a variety of assessment measures and other sources that are sensitive to language and culture, in order to: Analyze and describe students' strengths and needs Analyze the school-based learning environments to determine potential supports and barriers to students' academic progress. (McLeskey et al., 2017) 44

  46. Comprehensive Learner Profile - Strengths and Needs Teachers should collect, aggregate, and interpret data that support strengths and needs from multiple sources, such as: informal and formal observations, work samples, curriculum-based measures, functional behavior assessment, school files, analysis of curriculum, information from families, and other data sources (McLeskey et al., 2017) 45

  47. Adapted from WestEd EBR Graphic, 2022 46

  48. Common Data Collection Methods There are several commonly used formats for measures: Observations Interviews and Questionnaires Rating Scales Assessments (within different domains) Academic SEL Communication Transition 47

  49. Data Considerations When crafting an IEP... - Teachers are funneling a large quantity of information from many sources and creating a coherent plan that provides all supports necessary for a student to be successful. 48

  50. Considerations for Assessment When crafting an IEP... The team should consider the purpose of the assessment and how the results will be used. Teams should also consider whether the data collected are an accurate representation of the student's performance, which speaks to: - Reliability of scores - Validity of interpretations made about the scores - Fairness of the assessment for all students (American Educational Research Association et al., 2014) 49
