Oklahoma Assessment System Recommendations Overview

OKLAHOMA
ASSESSMENT SYSTEM
RECOMMENDATIONS
JUAN M. D’BROT, CENTER FOR ASSESSMENT
SEPTEMBER 19, 2016
GOALS FOR TODAY
Discuss Oklahoma’s Assessment System aligned to Best Practices
House Bill 3218 Requirements
Previously named goals
Considerations for developing an assessment system
Discuss recommendations
Recommendations that emerged from the previous Task Force meeting
Address other necessary outcomes related to accountability, peer review, and technical quality
Finalize Recommendations for the Assessment System
Discuss scenarios for standards-based assessments and college entrance assessments and the
associated implications
OKLAHOMA’S ASSESSMENT SYSTEM
Aligning with Best Practices
HOUSE BILL 3218 REQUIREMENTS
Study the following aspects for the 
assessment system
Alignment to Oklahoma Academic Standards
Provide a measure of comparability among other states
Have a track record of statistical reliability and accuracy
Yield both norm-referenced and criterion-referenced scores
Provide a measure of future academic performance for assessments administered in high
school
TASK FORCE RECOMMENDED GOALS OF THE
ASSESSMENT SYSTEM
1. Provide instructionally useful information to teachers and students with appropriate grain-size and timely reporting
2. Provide clear and accurate information to parents and students regarding achievement and progress toward key outcomes using a meaningful assessment
3. Provide meaningful information to support evaluation and enhancement of curriculum and programs
4. Provide information to appropriately support federal and state accountability decisions
TASK FORCE RECOMMENDED GOALS OF THE
ASSESSMENT SYSTEM
Note the general language used as large targets for the assessment system
Are these reflective of what should be valued?
Are there ones that should be revised?
Are there pieces that are missing?
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Types of Assessments and Appropriate Uses
Standards, Instruction, Assessment
The Assessment Development Process
Cross-walking those against the study requirements
BALANCED ASSESSMENT SYSTEM
Formative Tools: based on learning theory; minute-by-minute between teacher and student; includes instructional resources to build student learning; not intended for aggregation or teacher/program evaluation
Interim Assessment: optional, district choice; diagnostic information; tracks growth and predicts summative performance; can be aggregated at the classroom or building level
Summative Assessment: end of year; can be used as a snapshot within and across schools and districts; ESSA eliminated punitive consequences; information and transparency; examine equity and resource allocation
All based on Oklahoma Standards and Goals for Students
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Alignment to Oklahoma Academic Standards
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Alignment to Oklahoma Academic Standards
Standards … Instruction … Assessment (in that order)
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Alignment to Oklahoma Academic Standards
Standards … Instruction … Assessment (in that order)
Standards drive the curriculum design
This, in turn, informs instructional delivery
The assessment supports an evaluation of student response to instruction
The assessment system's results inform adjustments to instruction (and potentially curriculum)
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Alignment to Oklahoma Academic Standards
An assessment is only as good as the standards it is intended to measure
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Alignment to Oklahoma Academic Standards
An assessment is only as good as the standards it is intended to measure
The state standards always come first
From here, eligible content for the assessment is identified (must demonstrate alignment)
A blueprint is then developed to cover the standards with sufficient alignment
ALIGNED TO STATE STANDARDS
Match: The degree to which assessment items connect to standards
Depth: The degree to which assessment items cover the cognitive complexity of the standards
Breadth: The degree to which assessment items cover the full range of the standards
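To make these three alignment criteria concrete, below is a minimal sketch in Python of how match, depth, and breadth might each be summarized from an item-to-standard review. The standards, items, depth-of-knowledge values, and counts are hypothetical placeholders, not Oklahoma's actual alignment methodology.

```python
# Minimal sketch of summarizing alignment evidence from an item review.
# Standards, items, and DOK values are hypothetical, not Oklahoma data.

# Each reviewed item records the standard it targets and the cognitive
# complexity (depth of knowledge, DOK) of both the item and the standard.
items = [
    {"id": "A1", "standard": "OAS.5.N.1",  "item_dok": 2, "standard_dok": 2},
    {"id": "A2", "standard": "OAS.5.N.1",  "item_dok": 1, "standard_dok": 2},
    {"id": "A3", "standard": "OAS.5.GM.2", "item_dok": 3, "standard_dok": 3},
    {"id": "A4", "standard": None,         "item_dok": 1, "standard_dok": None},  # no clear match
]
eligible_standards = {"OAS.5.N.1", "OAS.5.GM.2", "OAS.5.D.1"}

# Match: share of items that connect to any eligible standard.
match_rate = sum(it["standard"] is not None for it in items) / len(items)

# Depth: among matched items, share written at or above the standard's complexity.
matched = [it for it in items if it["standard"] is not None]
depth_rate = sum(it["item_dok"] >= it["standard_dok"] for it in matched) / len(matched)

# Breadth: share of eligible standards touched by at least one item.
breadth_rate = len({it["standard"] for it in matched}) / len(eligible_standards)

print(f"match={match_rate:.2f}  depth={depth_rate:.2f}  breadth={breadth_rate:.2f}")
```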
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Alignment to Oklahoma Academic Standards
An assessment is only as good as the standards it is intended to measure
The state standards always come first
From here, eligible content for the assessment is identified (must demonstrate alignment)
A blueprint is then developed to cover the standards with sufficient alignment
Items are developed based on the blueprint and the items are field tested
High quality items are used to create final forms of the assessment to make inferences about
student mastery of the standards
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Provide a measure of comparability among other states
How does the development process differ if we desire comparability? Depends on the
level of comparability.
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Provide a measure of comparability among other states
How does the development process differ if we desire comparability? Depends on the
level of comparability.
Within state (same development process)
Proficiency: same claims for every student across the state regardless of grade
Scale Score: same claims for every student across the state by grade
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Provide a measure of comparability among other states
How does the development process differ if we desire comparability? Depends on the
level of comparability.
Within state (same development process)
Proficiency: same claims for every student across the state regardless of grade
Scale Score: same claims for every student across the state by grade
Across states (added complexity)
Requires some common information from other states (e.g.,  same students take different items, or
same items are administered to different students)
Requires additional testing time or test administrations
Requires items that are available for use (e.g., national tests)
Requires administering those items under similar conditions
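As one illustration of the "same items administered to different students" route, the sketch below links two states' results through a shared block of common items using a simple mean-sigma linear transformation. The scores are hypothetical, and an operational cross-state comparison would rest on a formal equating design rather than this toy calculation.

```python
import statistics

# Hypothetical raw scores of two states' samples on the SAME common-item block.
# This is a toy mean-sigma linear linking, not an operational equating procedure.
state_a_common = [12, 15, 18, 20, 22, 25, 27]
state_b_common = [10, 14, 16, 19, 21, 23, 26]

mu_a, sd_a = statistics.mean(state_a_common), statistics.stdev(state_a_common)
mu_b, sd_b = statistics.mean(state_b_common), statistics.stdev(state_b_common)

def b_to_a(score_b: float) -> float:
    """Re-express a State B common-item score on State A's common-item scale."""
    return mu_a + sd_a * (score_b - mu_b) / sd_b

print(round(b_to_a(19), 1))  # a State B score of 19 located on State A's scale
```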
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Yield both norm-referenced and criterion-referenced scores
How does the development process differ if we desire criterion-referenced scores?
It doesn’t. Our criteria are the standards.
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Yield both norm-referenced and criterion-referenced scores
How does the development process differ if we desire criterion-referenced scores?
It doesn’t. Our criteria are the standards.
How does the development process differ if we desire norm-referenced scores?
It depends on our norm group. Remember, the inferences are normed, not the test.
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Yield both norm-referenced and criterion-referenced scores
How does the development process differ if we desire criterion-referenced scores?
It doesn’t. Our criteria are the standards.
How does the development process differ if we desire norm-referenced scores?
It depends on our norm group. Remember, the inferences are normed, not the test.
Within state (same development process)
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Yield both norm-referenced and criterion-referenced scores
How does the development process differ if we desire criterion-referenced scores?
It doesn’t. Our criteria are the standards.
How does the development process differ if we desire norm-referenced scores?
It depends on our norm group. Remember, the inferences are normed, not the test.
Within state (same development process)
Across states (added complexity). Norms are dependent on many things:
The sample who takes the test
The conditions under which the test is administered (e.g., timed test, test environment)
The types of questions that are administered (e.g., questions not prioritizing content, but instead
student score)
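A short sketch of the criterion- versus norm-referenced distinction: the same scale score yields a criterion-referenced claim when compared against fixed performance-level cut scores, and a norm-referenced claim when located within whatever norm group is chosen. All cut scores, scores, and the norm group below are hypothetical.

```python
from bisect import bisect_right

# Hypothetical performance-level cut scores (minimum scale score for each level)
# and a hypothetical norm group of scale scores; illustrative only.
CUTS = [(0, "Below Basic"), (300, "Basic"), (330, "Proficient"), (360, "Advanced")]
norm_group = sorted([298, 310, 322, 330, 341, 355, 362, 370, 383, 391])

def performance_level(scale_score: int) -> str:
    """Criterion-referenced: compare the score against fixed cut scores."""
    level = CUTS[0][1]
    for cut, label in CUTS:
        if scale_score >= cut:
            level = label
    return level

def percentile_rank(scale_score: int) -> float:
    """Norm-referenced: locate the score within the chosen norm group."""
    return 100.0 * bisect_right(norm_group, scale_score) / len(norm_group)

score = 352
print(performance_level(score))   # claim against the standards-based cuts
print(percentile_rank(score))     # claim that changes whenever the norm group changes
```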
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Provide a measure of future academic performance for assessments
administered in high school
How does the development process differ if we desire measures of future academic
performance?
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Provide a measure of future academic performance for assessments
administered in high school
How does the development process differ if we desire measures of future academic
performance?
Similar to previous points, our claims of future performance rely on data from the target time
E.g., Post-secondary readiness requires data on things like remediation rates or career placement
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
Study Point: 
Provide a measure of future academic performance for assessments
administered in high school
How does the development process differ if we desire measures of future academic
performance?
Similar to previous points, our claims of future performance rely on data from the target time
E.g., Post-secondary readiness requires data on things like remediation rates or career placement
It is often difficult to satisfy both alignment to state standards and prediction toward future
academic performance without performing additional studies
Linking to a test that has post-secondary inferences
Supplementing post-secondary readiness tests with items targeting state-specific standards
Using longitudinal data to confirm post-secondary claims on HS assessments
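As a sketch of the longitudinal-data idea, the example below attaches a remediation-risk estimate to a Grade 10 score by looking up the observed remediation rate for students with similar scores in a linked cohort. The records, score bands, and rates are hypothetical placeholders, not actual Oklahoma data.

```python
# Sketch of using linked longitudinal data to attach a post-secondary claim
# (here, remediation risk) to a high school assessment score.
from collections import defaultdict

# Linked records: (Grade 10 scale score, needed remediation in college?)
linked_records = [
    (290, True), (305, True), (318, True), (322, False), (335, False),
    (341, True), (350, False), (362, False), (371, False), (388, False),
]

def band(score: int) -> str:
    if score < 320:
        return "low"
    if score < 360:
        return "middle"
    return "high"

# Observed remediation rate within each score band of the longitudinal cohort.
counts = defaultdict(lambda: [0, 0])          # band -> [remediated, total]
for score, remediated in linked_records:
    counts[band(score)][0] += int(remediated)
    counts[band(score)][1] += 1
remediation_rate = {b: r / n for b, (r, n) in counts.items()}

# A new student's score inherits the empirical rate for their band:
print(remediation_rate[band(345)])   # estimated remediation risk for a score of 345
```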
CONSIDERATIONS FOR DEVELOPING AN
ASSESSMENT SYSTEM
These considerations must intersect with the task force recommendations to yield an
assessment of sufficient quality
OKLAHOMA’S ASSESSMENT SYSTEM
Task Force Recommendations
RECOMMENDATIONS FOR 3-8 AND HS
ASSESSMENTS
Recommendations were pulled from stated goals and discussions with the Task Force during
the August meeting
Drafted recommendations are an attempt to summarize conversation points that emerged
Recommendations will be displayed and Task Force members will be asked to make revisions
Grade 3-8
High School
General Recommendations
Additional questions will be posed. Task Force members will be asked to quickly answer these
questions in small groups and share their responses with the room
RECOMMENDATIONS FOR GRADES 3-8
 
RECOMMENDATIONS FOR GRADES 3-8
ASSESSMENTS
The Oklahoma assessment should maintain its focus on the Oklahoma State Standards
Support the ability to measure growth for students and provide a measure of predicted performance on
future OK tests
Support criterion-referenced interpretations (i.e., performance against standards) and reporting
Content-coverage: In an effort to support coverage of the OK standards, the assessment should include
an adequate assessment of writing
Individual claims should focus on how students perform relative to OK standards and should reflect scale
score, Lexiles, Quantiles,  content cluster, and growth.
Support norm-referenced interpretations that include within-state percentile comparisons and some
across-state comparison
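As a rough illustration of the growth and prediction recommendation above, the sketch below fits a simple regression of next year's score on this year's score and reports a predicted score; a student's growth can then be read as their distance from that prediction. Data are hypothetical, and operational growth models (e.g., student growth percentiles) are considerably more sophisticated.

```python
# Sketch of growth/prediction from paired scores: (year t score, year t+1 score).
# Hypothetical data; a toy least-squares fit, not an operational growth model.
pairs = [(310, 318), (325, 330), (333, 329), (340, 352), (355, 360), (368, 366)]

n = len(pairs)
mean_x = sum(x for x, _ in pairs) / n
mean_y = sum(y for _, y in pairs) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in pairs) / sum((x - mean_x) ** 2 for x, _ in pairs)
intercept = mean_y - slope * mean_x

def predicted_next_score(current: float) -> float:
    """Predicted next-year score for a student with this year's score."""
    return intercept + slope * current

# A student scoring 337 this year: growth is their actual next-year score
# minus what the state's data would have predicted for them.
print(round(predicted_next_score(337), 1))
```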
RECOMMENDATIONS FOR GRADES 3-8
ASSESSMENTS:   ADDITIONAL QUESTIONS
Given what is necessary to support across-state comparisons, how granular must the
comparison be?
Is NAEP sufficient to show state-by-state competitiveness?
Should the state attempt to support more granular comparisons (e.g., proficiency by grade,
performance level, percentile, scale score), which may require longer tests, additional test
administrations, nationally-available items, or additional costs?
What student accountability uses, if any, should be implemented and why? If none, why should the
3-8 assessments not be used for student accountability?
RECOMMENDATIONS FOR HIGH
SCHOOL
 
RECOMMENDATIONS FOR HIGH SCHOOL ASSESSMENTS
Link proficiency on the grade 10 assessment to post-secondary readiness.
Consider the use of an off-the-shelf high school assessment that measures post-secondary readiness.
Several questions at the end of the document provided to you in advance focus on potential issues
that must be addressed with this approach
Ensure that the high school assessment has applicability and value to students by
connecting criterion-based inferences to outcomes of value (e.g., readiness for post-
secondary, prediction of STEM readiness, remediation risk)
RECOMMENDATIONS FOR HIGH SCHOOL
ASSESSMENTS:   ADDITIONAL QUESTIONS
The first two recommendations are somewhat in conflict with each other. Given the potential
constraints around the use of an assessment that provides a measure of college-readiness,
should the state prioritize a standards-based assessment or a measure of college-readiness?
Why? Please address:
Common administration conditions and the possible lack of accommodations
Timed testing
Non-reported scores to post-secondary institutions for students who need accommodations
Augmentation of measures of college-readiness to cover the OK standards
Additional data from post-secondary sources to substantiate claims of post-secondary readiness for
an OK standards-based assessment (e.g., Grade 10 assessment)
RECOMMENDATIONS ACROSS
ASSESSMENTS
 
RECOMMENDATIONS ACROSS ASSESSMENTS
Provide a reporting dashboard to support timely, accessible performance information
Report scale score (and errors), performance levels, clear indicators of proficiency (i.e., a performance
level that reflects proficient or above), and relevant predictive information (e.g., next grade, potential
remediation, distance to proficiency)
Provide an indicator of content cluster performance
Provide appropriate comparison data depending on the level of reporting (e.g., student, teacher,
building/district administrator)
Maintain transparent subgroup reporting
Continue providing Lexile and Quantile reporting with additional information on how to interpret
information
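To show how these reporting elements might sit together, here is a minimal sketch of a single dashboard record. Field names, the proficiency cut, and the values are hypothetical.

```python
# Sketch of one student-level dashboard record; all values are illustrative.
PROFICIENT_CUT = 350  # hypothetical scale-score cut for "Proficient"

def report_record(scale_score: int, sem: float, level: str, percentile: float) -> dict:
    return {
        "scale_score": scale_score,
        "score_range": (scale_score - sem, scale_score + sem),   # score +/- standard error
        "performance_level": level,
        "proficient": scale_score >= PROFICIENT_CUT,             # clear proficiency indicator
        "distance_to_proficiency": PROFICIENT_CUT - scale_score, # negative once proficient
        "state_percentile": percentile,                          # within-state comparison
    }

print(report_record(342, 6.0, "Basic", 47.0))
```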
RECOMMENDATIONS ACROSS ASSESSMENTS:
ADDITIONAL QUESTIONS
Prior feedback indicated interest in earlier reporting. Please provide feedback on the
following topics to better understand Task Force recommendations
Should the testing window be moved up to allow for earlier reporting? Why or why not?
How can we support the increased awareness of preliminary assessment reports? How can
they be made more useful?
A more widespread transition to online testing can facilitate a faster turnaround of reports.
Consider the possible advantages and challenges associated with online testing. Should the
state advocate for a more aggressive transition to online testing? Why or why not?
OKLAHOMA’S ASSESSMENT SYSTEM
Finalizing Recommendations
SCENARIOS AND REACTIONS
Three possible scenarios will be presented to the Task Force members
Task Force members will be asked to consider the scenarios and some of the issues
associated with each
Please note and discuss the advantages or disadvantages of each scenario
Task Force members will then recommend one of the scenarios to be adopted by the
state department to ensure it can be implemented (or explored for deeper study given
the impending regulations being prepared)
THREE POTENTIAL SCENARIOS
1. State, standards-based summative assessments in grades 3-8, 10
2. State, standards-based summative assessments in grades 3-8, 10 with a college entrance assessment in grade 11
3. State, standards-based summative assessments in grades 3-8 with a college entrance assessment in grade 11
SCENARIO 1: STATE, STANDARDS-BASED
SUMMATIVE ASSESSMENTS IN GRADES 3-8, 10
Guaranteed alignment to Oklahoma State Standards (i.e., no augmentation to
assessment)
Supports clearer interpretations when calculating growth from Grades 3-8 to Grade 10
Requires external information to support post-secondary readiness claims in grade 10
Requires additional items to support across-state comparisons (beyond NAEP
comparisons)
All accommodations are available for students with IEPs or 504 plans
All students would receive directly comparable scores by grade
SCENARIO 3: STATE, STANDARDS-BASED SUMMATIVE
ASSESSMENTS IN GRADES 3-8; COLLEGE ENTRANCE
ASSESSMENT IN GRADE 11
Grade 11 tests may not be aligned to Oklahoma State Standards (i.e., would require
augmentation to the assessment)
Does not support clear interpretations when calculating growth from Grades 3-8 to Grade 11
Would not require external information to support post-secondary readiness claims
May require stringent administration conditions to support across-state comparisons (beyond
NAEP comparisons)
May not offer necessary accommodations to students who have an IEP or 504 plan
Students who receive accommodations may not have directly comparable scores in Grade 11
SCENARIO 2: STATE, STANDARDS-BASED SUMMATIVE
ASSESSMENTS IN GRADES 3-8, 10; COLLEGE ENTRANCE
ASSESSMENT IN GRADE 11
No augmentation would be necessary because of the Grade 10 assessment
No external information would be necessary; post-secondary claims are baked into the system
Would support clearer interpretations when calculating growth from Grades 3-8 to Grade 10.
Additional information could be provided in terms of readiness/remediation using Grade 10
to Grade 11 data
Grade 11 test may still require stringent administration conditions to support across-state
comparisons (beyond NAEP comparisons)
The Grade 11 test still may not offer necessary accommodations, or may not report scores, for
students who receive all accommodations reflected in their IEP or 504 plan
The initial use of the college entrance assessment can support early studies that, over time,
allow the Grade 10 assessment to support claims similar to those of the college entrance assessment
WHICH OF THE THREE SCENARIOS SHOULD BE
ADOPTED OR EXPLORED BY THE OSDE? WHY?
1. State, standards-based summative assessments in grades 3-8, 10
2. State, standards-based summative assessments in grades 3-8, 10 with a college entrance assessment in grade 11
3. State, standards-based summative assessments in grades 3-8 with a college entrance assessment in grade 11
ADDITIONAL QUESTIONS OR
RECOMMENDATIONS?
THANK YOU
 