Developing and Using Scoring Guides and Rubrics in Assessment

Slide 1: Michigan Assessment Consortium
Common Assessment Development Series
Module 12
Developing and Using Scoring Guides and Rubrics
 
Slide 2: Narrated by Bruce Fay, Wayne RESA
Slide 3: In This Module
Why and when you need rubrics
Different kinds of rubrics
How to develop a rubric
How to use a rubric
What scoring guides are
How to use scoring guides
Slide 4: Subjectivity in Scoring
No such thing
If it's truly subjective, it's just someone's opinion, and is of little or no value to the person being assessed
Two definitions
Slide 5: The Issue Is Bias
Use rubrics and scoring guides to establish more objective and transparent criteria
Use criteria to make quality work visible to students
Slide 6: What Is a Rubric?
"...guidelines, rules, or principles by which student responses, products, or performances are judged. They describe what to look for in student performances or products to judge quality."
Scoring Rubrics in the Classroom, Judith Arter and Jay McTighe, page 4
 
Slide 7: Items That Require a Scoring Guide
Anything that isn't multiple choice
If a human scorer needs to make a decision about a score, a rubric or scoring guide is needed
Slide 8: Where Do Rubrics Fit?
Almost everywhere!
 
Slide 9: (graphic only, no text)
Slide 10: Scoring an Assessment Item
Determine whether the student meets the criteria
Use assessment results instructionally
Inform students about the quality of their work and areas for improvement
Efficient for teachers
See patterns and trends
Slide 11: First Consider
Checklists
Performance lists
Slide 12: Checklists
Simple task
Present or not present
Students know ahead of time
Slide 13: Performance Lists
Two or more aspects
A quality rating for each aspect
Common scale, such as "Sometimes", "Always", or "Never"
Or a unique scale for each aspect
Slide 14: Rubric
Assessment task is more complex
Levels of quality need to be distinguished
Multiple scorers are involved
Slide 15: Reliability of Rubric and Scoring Guide
Reasonable
Appropriate
Aligned
Consistent and fairly applied
Slide 16: Varieties of Rubrics
Different types
Appropriate uses
Typical scoring ranges
 
Slides 17-21: Rubric Properties
A 2 x 2 grid of rubric properties, built up over five slides:

                 Holistic   Analytic
  Generic           A          B
  Task Specific     C          D
Slide 22: Rubric Types
A. Generic - Holistic
B. Generic - Analytic
C. Task Specific - Holistic
D. Task Specific - Analytic
Example: NWREL 6+1 Traits Writing Rubric
Slide 23: Holistic Rubrics - Strengths
Provide a quick, overall rating of quality
Judge the "impact" of a product or performance
Useful for summative or large-scale assessment
Slide 24: Holistic Rubrics - Limitations
May lack the diagnostic detail needed to:
  Plan instruction
  Allow students to see how to improve
Students' work may get:
  The same score for vastly different reasons
  Lower scores based on only one missing element of the work
Slide 25: Analytic Rubrics - Strengths
Judge aspects of complex work independently
Provide detailed/diagnostic data by trait that can better inform instruction and learning
Slide 26: Analytic Rubrics - Limitations
More time-consuming to learn and apply
May result in lower inter-rater agreement when multiple scorers are used (without appropriate procedures)
Slide 27: Generic Rubrics - Strengths
Suit complex skills that generalize across tasks, grades, or content areas
Help students see "the big picture" and generalize their thinking
Promote/require thinking by the student
Allow for creative or unanticipated responses
Slide 28: Generic Rubrics - Strengths (continued)
Suit situations where students are doing a similar but not identical task
Don't give away the answer, so they can be shared ahead of time
More consistency with multiple raters (only one rubric to learn, so you can learn it well)
Slide 29: Generic Rubrics - Limitations
Difficult to develop and validate
Take time and practice to learn, internalize, and apply consistently
Take time and discipline to apply correctly
Require a scoring procedure to ensure consistent scores when multiple raters are involved
Slide 30: Task-Specific Rubrics - Strengths
Suit specialized or highly structured assignments
Suit specific/detailed assessment goals
Provide focused feedback to students on their work
Suit situations requiring consistent scoring from multiple scorers with less training and/or fewer inter-rater control procedures
Slide 31: Task-Specific Rubrics - Limitations
Can't be shown to students ahead of time, as they give away the answer
Don't allow students to see what quality looks like ahead of time
Need a new rubric for each task
A rater on autopilot may miss correct answers not explicitly shown in the rubric
Slide 32: The Rubric Designer's Task
Develop rubrics that:
Allow trained scorers to consistently assign the correct score to each student's work
Provide useful, actionable information to the student and teacher
Slide 33: How Good Are Your Rubrics?
A meta-rubric for evaluating rubrics
Slide 34: A Trait-Analytic Rubric for Evaluating Rubrics (Deb Wahlstrom)
Trait 1: Content/coverage
Trait 2: Clarity/detail
Trait 3: Usability
Trait 4: Technical quality
Highest quality = the "WOW!" level
Slide 35: Meta-Rubric Trait 1 - Content/Coverage
Matching/alignment of:
  Learning targets
  Actual instruction
  Assessment task
WOW = very clear!
Slide 36: Meta-Rubric Trait 2 - Clarity & Detail
Different users are likely to interpret the rubric in the same way (correctly)
Use of the rubric supports consistent scoring across students, teachers, and time
WOW = clear, complete, concise language that is not ambiguous, vague, or contradictory
Slide 37: Meta-Rubric Trait 3 - Usability & Practicality
Can be applied in a reasonable amount of time
An assigned score can easily be explained and justified
Students can see what they are doing well and why, so they can maintain it
Students can see what to do differently next time to improve (earn a better score)
Teachers can see how to alter instruction for greater student achievement
Slide 38: Meta-Rubric Trait 4 - Technical Quality
Evidence of reliability (consistency) across students, teachers, and time
Evidence of validity (appropriateness): students and teachers agree that it supports teaching and learning when used as intended
Evidence of fairness and lack of bias: does not place any group at a disadvantage because of the way the rubric is worded or applied
Slide 39: Developing Your Own Rubrics
Form a learning team
Locate/acquire additional resources
Study existing high-quality rubrics
Modify them for your own use
Introduce them to your students (if appropriate)
 
Slide 40: Threats to Good Rubrics
(Meta-rubric scoring grid: Traits 1-4 rated on a scale of WOW / Most / Some / None)
 
Slide 41: Threat 1 - Lack of clarity about the task and its components
(Scoring grid with a "?" marking one trait's rating)
Slide 42: Threat 2 - Lack of clarity in rubric level descriptions
(Scoring grid with a "?" marking every trait's rating)
 
Slide 43: Threat 3 - Scale Twist
(Scoring grid illustrating mismatched scales: "apples" rated against "oranges", and a "really good apple" against a "really bad apple")
Slide 44: Rubric Development Process
1. Gather samples of student work
2. Sort the student work into groups and write down the reasons for how it is sorted
3. Cluster the reasons into traits
4. Write a value-neutral definition of each trait
Slide 45: Rubric Development Process (continued)
5. Find samples of student work that illustrate each quality level, for each trait or overall
6. Write value-neutral descriptions of each quality level, for each trait or overall
7. Evaluate your rubric using the meta-rubric
8. Test it out with students and revise as needed
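Once the traits and level descriptions exist, an analytic rubric is essentially a small data structure: named traits, each with a described set of score levels. The sketch below is only an illustration of that idea; the trait names and descriptions are hypothetical, not taken from this module.

```python
# A minimal sketch of an analytic rubric as a data structure.
# Trait names and level descriptions are hypothetical examples.
rubric = {
    "Ideas": {4: "Focused, well-developed ideas",
              3: "Clear ideas, uneven development",
              2: "Ideas present but underdeveloped",
              1: "Ideas unclear"},
    "Organization": {4: "Logical and easy to follow",
                     3: "Mostly logical",
                     2: "Some structure",
                     1: "No discernible structure"},
}

def score_work(ratings):
    """Validate per-trait ratings against the rubric and total them."""
    for trait, level in ratings.items():
        if trait not in rubric or level not in rubric[trait]:
            raise ValueError(f"Invalid rating: {trait}={level}")
    return sum(ratings.values())

print(score_work({"Ideas": 3, "Organization": 4}))  # 7
```

Keeping each trait's score separate (rather than only the total) is what preserves the diagnostic value of an analytic rubric noted on slide 25.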
Slide 46: Scoring Guides
Not just a rubric (a rubric plus...):
Instructions for use in scoring
Anchor papers: annotated, anonymous examples of student work at each level
May have to be developed after some initial use of the rubric
Slide 47: Scoring Process
Common assessments = multiple human scorers
Human scorers must be trained in the correct, consistent application of the rubric/scoring guide
Although more resource-intensive, it is useful to have items scored by more than one person, with a process to resolve differences of opinion
At a minimum, some student work should be scored by more than one person
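A process for resolving differences of opinion is often implemented as a simple rule. One common scheme, sketched below as an illustration (the module does not prescribe this particular rule): adjacent scores are averaged, while larger gaps are sent to a third reader.

```python
def resolve_score(score1, score2, adjudicate):
    """Resolve two raters' scores on the same piece of work.

    Adjacent scores (differing by at most 1) are averaged; larger
    discrepancies are referred to an adjudicator, e.g. an expert
    third reading. This is one common scheme, not the only one.
    """
    if abs(score1 - score2) <= 1:
        return (score1 + score2) / 2
    return adjudicate()

print(resolve_score(3, 4, lambda: None))  # 3.5
```

The adjudicator is passed in as a callable so the same resolution rule works whether the third reading comes from a lead scorer, a committee, or a re-scoring session.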
Slide 48: Inter-Rater Reliability (IRR)
The scoring of student work is unbiased, i.e., the score assigned to the work does NOT depend on who did the scoring
The technical term for this is inter-rater reliability (IRR)
Measuring it requires multiple scores for at least some student responses
Slide 49: Inter-Rater Reliability (IRR) - Requirements
Good rubrics and scoring guides
Scorers trained in the use of the scoring guides
A process for calibrating scorers
A scoring process to resolve discrepancies
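With double-scored responses in hand, IRR can be quantified. The sketch below (the rater data are invented examples) computes simple percent agreement and Cohen's kappa, a standard chance-corrected agreement statistic.

```python
from collections import Counter

def percent_agreement(a, b):
    """Fraction of items on which two raters assigned the same score."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement: (po - pe) / (1 - pe),
    where po is observed agreement and pe is the agreement
    expected by chance from each rater's score distribution."""
    n = len(a)
    po = percent_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical scores from two raters on the same eight responses.
rater1 = [4, 3, 2, 4, 1, 3, 2, 2]
rater2 = [4, 3, 3, 4, 1, 2, 2, 2]
print(percent_agreement(rater1, rater2))  # 0.75
```

Kappa is lower than raw agreement (about 0.65 here) because it discounts matches that would occur by chance; conventions vary, but values above roughly 0.6 are commonly treated as substantial agreement.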
 
Slide 50: Acknowledgments
This module is based on material adapted from:
Scoring Rubrics in the Classroom, Judith Arter and Jay McTighe, Corwin Press, Thousand Oaks, CA
and material provided by Edward Roeber, Michigan State University
Slide 51: Series Developers
Kathy Dewsbury-White, Ingham ISD
Bruce Fay, Wayne RESA
Jim Gullen, Oakland Schools
Julie McDaniel, Oakland Schools
Edward Roeber, MSU
Ellen Vorenkamp, Wayne RESA
Kim Young, Ionia County ISD/MDE
Slide 52: Development Support for the Assessment Series
The MAC Common Assessment Development Series is funded in part by the Michigan Association of Intermediate School Administrators, in cooperation with:
Michigan Department of Education
Ingham and Ionia ISDs, Oakland Schools, and Wayne RESA
Michigan State University
Slide note (source file: 12R-V Rubrics & Scoring Guides 20110630-0746.pptx): Welcome to the Michigan Assessment Consortium common assessment development series. The topic of this module is Rubrics and Scoring Guides.
