ASSESSMENT LITERACY STANDARDS - A NATIONAL IMPERATIVE - BRIEF // FALL 2017 - MICHIGAN ASSESSMENT CONSORTIUM
“Assessment literacy is the set of beliefs, knowledge and practices about assessment that lead a teacher, administrator, policymaker, or students and their families, to use assessment to improve student learning and achievement.”

Michigan Assessment Consortium. (2017). Assessment Literacy Standards. Mason, MI: Michigan Assessment Consortium.
Purpose

Student assessment has become increasingly important to educators, students, their families, and the public. Yet, despite the link between instruction and assessment—and the proliferation of large-scale and classroom assessment programs—most of those affected by student assessment (students, their families, teachers, school administrators, and local and state policymakers) may not understand the assessment tools and strategies used, their purposes, the types of assessment that can best match purposes for assessment, and the strengths and shortcomings of the various types of measures. They are also not prepared to use the results from these assessments to benefit students—to improve their learning and their achievement.

Because of these issues, the Michigan Assessment Consortium (MAC) has undertaken an effort to create “Assessment Literacy Standards” for various individuals who are affected by student assessments. Assessment literacy standards for students and their families, teachers, administrators and policymakers will serve as the foundation from which the field comes to understand what assessment literacy means and the role and purpose of comprehensive, balanced, quality assessment systems. This set of assessment literacy standards shows what assessment literate students and their families, teachers, administrators and policymakers value, know, and can do. The ultimate goal of the MAC is that the assessment literacy standards are used to inform policy and program development and decisions regarding assessment practices, teacher preparation, administrative certification, educator evaluation, and school accreditation.

The purpose of the standards is driven by both the users and uses of assessment. Assessment literacy is essential in this era where important decisions are being made about students, educators, and educational systems based on the data collected from students. Understanding the appropriate roles that student assessment can play to determine levels of student achievement and educational accomplishment, as well as to guide improved learning, is critical. Understanding what assessment can and cannot accomplish is important to ensure that such information is used in the most positive and accurate manner possible.

The standards are not intended to be technical in nature, nor should the standards be divorced from consideration of the various ways in which teachers instruct and students learn in classrooms and elsewhere. The emphasis of the standards is on assessment for learning. This includes student self-assessment and goal setting. Consequently, balance is a critical component of the standards: (a) Balance needs to exist between multiple measures, which may include formative assessment strategies, as well as interim and summative assessments, and (b) Balance of multiple uses of assessments, such as diagnostic, placement, and progress, as well as users, such as students and their families, parents, educators and policymakers, is critical to the effective use of assessment results. It is the intention of the MAC that accomplishing these standards will improve curricula, instruction, and assessment, all leading to improved student achievement.

TABLE of contents
Assessment Literacy Standards for:
·· Elementary Students and Their Families
·· Secondary Students and Their Families
·· Teachers
·· Building Administrators
·· District Administrators
·· Policymakers
·· Glossary
·· Appendices
Development

The Assessment Literacy Standards were written for five groups of individuals:
·· Students and Their Families
·· Classroom Teachers
·· Building Administrators
·· District Administrators
·· Policymakers

A number of documents were used in the development of the standards at the five levels. A list of these documents is also included. One of these documents, Assessment Literacy in Michigan Education (Roeber, 2011), provided the basic multifaceted framework of assessment literacy and was used to align the standards for these groups. A brief bibliography of resources is provided at the end of the document.

After the standards were drafted, they were sent to various authors and experts in order to solicit suggestions and support for the project. Revisions based on the feedback were completed in the spring of 2013. A protocol for review of the standards by Michigan educators was created and posted on the MAC website. Board members of the MAC and others hosted review sessions to obtain additional suggestions to the standards. Final revisions were made based on feedback received at the Michigan School Testing Conference in February 2014.

Goals

There are two primary goals for developing the Assessment Literacy Standards:
1. Create a set of standards that provides the dispositions, knowledge and skills various parties who are assessment literate need to possess and use in order to maximize the benefits of student assessments, and reduce/eliminate the negative impacts or consequences of assessment.
2. Develop and implement activities and materials that can be used to increase the knowledge and skills of assessment users: educators, parents/guardians, and policymakers.

The ultimate goal of this effort is to create a more assessment-literate population able to better use student assessments to improve student learning and achievement.

Format and Use

The standards are intended for long-term use in the field of education as opposed to being a temporal topic that fades from importance with the rise of new issues.

The standards for each group who are assessment literate are separated into Dispositions, Knowledge, and Performance.
·· Dispositions: standards address what the individuals who are assessment literate believe regarding assessment.
·· Knowledge: standards specify the particular vocabulary, processes and practices that individuals who are assessment literate comprehend.
·· Performance: standards address the skills and competencies in which individuals who are assessment literate are proficient.

A purposeful decision was made to include all the relevant standards for each of the five groups, understanding that some redundancy would result. It was considered critical to include all the standards for each group in order that the standards of any of the five groups could stand alone, and yet remain comprehensive. In addition, understanding the standards for the different groups increases the overall understanding for individuals within a group.

The Glossary and Appendices provide definitions of the terms in the Assessment Literacy Standards as a tool for greater understanding of the intent and meaning of the standards. The Appendices provide useful reference material about assessment practices.

[Sidebar graphic: Students and Their Families · Pre-Service Teachers · Teachers · Building Administrators · District Administrators · Local and State Policymakers · Administrator Certification]
Assessment Literacy Standards // Elementary Students and Their Families

Dispositions
Elementary students and their families who are assessment literate believe that students:
A. Learn best when they know the targets for their learning.
B. Learn from taking quality assessments.
C. Learn from effective feedback provided on their work from their teachers.
D. Are responsible for their own learning.
E. Need to use assessment results to learn more.

Knowledge
Elementary students and their families who are assessment literate know:
A. That there are different reasons for taking assessments.
B. That different types of assessments are used in the classroom.
C. That different types of assessments provide different types of information about what they know and can do.
D. How to use rubrics to assess their own work.
E. How to use assessment results to reflect on their learning and to set goals for future learning.

Performance
Elementary students and their families who are assessment literate are able to:
A. Use feedback to improve their learning.
B. Use rubrics to look at their work and that of their peers.
C. Use assessment results to improve their achievement.
D. Use assessments and assessment feedback to improve their attitude toward learning.
E. Explain their assessment results to their teachers and their parents/guardians.
F. Keep track of their own learning over time.
Assessment Literacy Standards // Secondary Students and Their Families

Dispositions
Secondary students and their families who are assessment literate believe that students:
A. Learn best when they know the targets for their learning.
B. Learn from taking quality assessments.
C. Learn from effective feedback provided on their work from their teachers.
D. Are responsible for their own learning.
E. Can use self-monitoring to improve their achievement.
F. Need to use their own assessment results to learn more.

Knowledge
Secondary students and their families who are assessment literate know:
A. That there are different reasons for taking assessments:
   1. Improving their achievement and learning
   2. Student accountability and grading
   3. Providing information that predicts their future performance/achievement
B. That different types of assessments are used in the classroom:
   1. Selected response: multiple-choice, true-false, matching
   2. Constructed response: short or extended written response
   3. Performance: written responses, presentations or products
   4. Personal communication: observations and interviews
C. That different types of assessments provide different types of information about what they know and can do.
D. How to use rubrics to assess their own work.
E. That feedback can be descriptive vs. evaluative.
F. How to use assessment results to reflect on their learning and to set goals for future learning.

Performance
Secondary students and their families who are assessment literate are able to:
A. Use learning targets to understand the standards and to support their learning.
B. Use feedback to decide on how to improve their achievement.
C. Use different protocols for looking at their work with peers and teachers.
D. Use assessment feedback to improve their attitudes, aspirations, mindsets and achievement.
E. Interpret and explain their assessment results to their teachers and their parents/guardians.
F. Use multiple sources of data over time to identify trends in their learning.
Assessment Literacy Standards // Teachers

Dispositions
Teachers who are assessment literate believe:
A. All educators must be proficient in their understanding and use of assessment.
B. An effective assessment system must balance different purposes for different users and use varied methods of assessment and communication.
C. When assessment is done correctly, the resulting data can be used to make sound educational decisions.
D. Multiple measures can provide a more balanced picture of a student or a school.
E. Quality assessments are a critical attribute of effective teaching and learning.
F. Assessment results should be used to make instructional decisions to improve student learning.
G. Clear learning targets, understood by students, are necessary for learning and assessment.
H. Effective feedback is critical to support learning.
I. Students should be active partners in learning how to use assessment results to improve their learning.
J. Students can use instructionally-sensitive assessment results to improve their learning.
K. Good classroom assessment and quality instruction are intricately linked to each other.
L. Grading is an exercise in professional judgment, not just a numerical, mechanical exercise.

Knowledge
Teachers who are assessment literate know:
A. A balanced assessment system consists of both of the following:
   1. Different users have different assessment purposes.
   2. Different assessment purposes may require different assessment methods.
B. There are different purposes for student assessment:
   1. Student improvement
   2. Instructional program improvement
   3. Student, teacher or system accountability
   4. Program evaluation
   5. Prediction of future performance/achievement
C. The definitions of and uses for different types of assessments:
   1. Summative assessment
   2. Interim benchmark assessment
   3. Formative-assessment practices
   4. Criterion- vs. norm-referenced assessment interpretations
D. The differences between the types of assessment tools:
   1. Achievement
   2. Aptitude
   3. Diagnostic
   4. Screening
E. The different types of assessment methods best matched to learning targets:
   1. Selected response: multiple-choice, true-false, matching
   2. Constructed response: short or extended written response
   3. Performance: written responses, presentations or products
   4. Personal communication: observations and interviews
F. Non-technical understanding of statistical concepts associated with assessment:
   1. Measures of central tendency
   2. Measures of variability
   3. Reliability
   4. Validity: a characteristic of the use of the assessment, not the assessment itself
   5. Bias/sensitivity
   6. Correlation vs. causation
G. How to develop or select high quality assessments:
   1. Determine the purpose for assessing
   2. Determine the standards or learning targets to be assessed
   3. Select the assessment methods appropriate to learning targets and assessment purpose(s)
   4. Design a test plan or blueprint that will permit confident conclusions about achievement
   5. Select or construct the necessary assessment items and scoring tools where needed
   6. Field test the items in advance or review them before reporting the results
   7. Improve the assessment through review and analysis to eliminate bias and distortion
   8. Assessments can be purchased or developed locally; each approach has advantages and challenges
H. There are different ways to report results:
   1. Normative interpretations
   2. Criterion-referenced interpretations
I. What assessment data validly reflects a teacher's effectiveness
J. How to translate standards into clear learning targets that are written in student-friendly language and used as the basis for the everyday curriculum
K. What assessment accommodations are available and when to use them with students with disabilities and English Language Learners
L. How to provide effective feedback from assessments suitable for different audiences: descriptive vs. evaluative
M. How to use and create scoring tools (guides, rubrics, checklists, scoring rules, standards)
N. Sound grading and reporting practices
O. How to engage students in using their own assessment results for reflection and goal setting

Performance
Teachers who are assessment literate are able to:
A. Self-assess their work and model this for students.
B. Select and use various assessment methods appropriate to assessment purposes and learning targets.
C. Use learning targets aligned to the standards and understood by students to guide instruction.
D. Use learning progressions to guide instruction and assessment.
E. Implement the 5-step process for assessment development:
   1. Plan the assessment
   2. Develop the assessment items
   3. Review and critique the assessment items
   4. Field test the items to see if they work
   5. Review and revise the items
F. Use assessment data within appropriate, ethical and legal guidelines.
G. Use a variety of protocols for looking at and scoring student work.
H. Accurately determine and communicate levels of proficiency.
I. Use assessment results to make appropriate instructional decisions for individual students and groups of students.
J. Provide timely, descriptive and actionable feedback to students based on assessment results.
K. Support student use of assessment feedback to improve attitudes, aspirations, mindsets and achievement.
L. Use grading practices that result in grades that are accurate, consistent, meaningful and supportive of learning.
M. Use assessment results appropriately to modify instruction to improve student achievement.
N. Collaboratively analyze data and use data to improve instruction.
O. Use multiple sources of data over time to identify trends in learning.
P. Use data management systems to access and analyze data.
Q. Communicate effectively with students, parents/guardians, other teachers, administrators and community stakeholders about student learning.
R. Seek to increase their knowledge and skills in assessment.

Teachers who are assessment literate promote the use of assessment data to improve student learning through the alignment of curriculum, instruction and assessment by:
A. Implementing district-developed learning progressions.
B. Clearly explaining how to analyze and use assessment results.
C. Using assessment results, including subgroup performance, to influence the classroom's curriculum and instructional program.
D. Using multiple sources of data over time to identify trends in learning.
E. Using assessment results to reflect on their own effectiveness.
Assessment Literacy Standards // Building Administrators

Dispositions
Building Administrators who are assessment literate believe:
A. All educators must be proficient in their understanding and use of assessment.
B. An effective assessment system must balance different purposes for different users and use appropriate assessment methods to measure different learning targets.
C. When assessment is done correctly, the resulting data can be used to make sound educational decisions.
D. Multiple measures can provide a more balanced picture of a student or a school.
E. Quality assessments are a critical attribute of effective teaching and learning.
F. Assessment results should be used to make instructional decisions that impact learning.
G. Clear learning targets, understood by students, are necessary for learning and assessment.
H. Effective feedback is critical to support learning.
I. Students should be active partners in their learning and assessment.
J. Students can use instructionally-sensitive assessment results to improve their learning.
K. Time and resources are needed to:
   1. Learn to select or develop assessments
   2. Administer assessments
   3. Use the assessment results appropriately
L. Good classroom assessment and quality instruction are intricately linked to each other.
M. Grading is an exercise in professional judgment, not just a numerical, mechanical exercise.
N. Appropriate, high-quality assessment practices should be used in all classrooms.

Knowledge
Building Administrators who are assessment literate know:
A. A balanced assessment system consists of both of the following:
   1. Different users have different assessment purposes
   2. Different assessment purposes may require different assessment methods
B. There are different purposes for student assessment:
   1. Student improvement
   2. Instructional program improvement
   3. Student, teacher or system accountability
   4. Program evaluation
   5. Prediction of future performance/achievement
C. The definitions of and uses for different types of assessments:
   1. Summative assessment
   2. Interim benchmark assessment
   3. Formative-assessment practices
   4. Criterion- vs. norm-referenced assessment interpretations
D. The differences between the types of assessment tools:
   1. Achievement
   2. Aptitude
   3. Diagnostic
   4. Screening
E. The different types of assessment methods and when teachers should use each:
   1. Selected response: multiple-choice, true-false, matching
   2. Constructed response: short or extended written response
   3. Performance: written responses, presentations or products
   4. Personal communication: observations and interviews
F. Non-technical understanding of statistical concepts associated with assessment:
   1. Measures of central tendency
   2. Measures of variability
   3. Reliability
   4. Validity: a characteristic of the use of the assessment, not the assessment itself
   5. Bias/sensitivity
   6. Correlation vs. causation
G. How to develop or select high quality assessments:
   1. Determine the purpose for assessment
   2. Determine the standards or learning targets to be assessed
   3. Select the assessment methods appropriate to learning targets and assessment purpose(s)
   4. Design a test plan or blueprint that will permit confident conclusions about achievement
   5. Select or construct the necessary assessment items with scoring guides where needed
   6. Field test the items in advance or review them before reporting the results
   7. Improve the assessment through review and analysis to eliminate bias and distortion
   8. Assessments can be purchased or developed locally; each approach has advantages and challenges
H. There are two ways to report results, and specific circumstances when each is useful:
   1. Normative interpretations
   2. Criterion-referenced interpretations
I. Assessment data that validly reflects a teacher's effectiveness.

Performance
Building Administrators who are assessment literate are able to:
A. Use assessment data within appropriate, ethical and legal guidelines.
B. Understand and communicate levels of proficiency accurately.
C. Use assessment results to make appropriate instructional decisions for groups of students.
D. Collaboratively analyze data and use data to improve instruction.
E. Use multiple sources of data over time to identify trends in learning.
F. Use data management systems to access and analyze data.
G. Communicate effectively with students, parents, other teachers, administrators and community stakeholders about student learning.
H. Seek to increase their knowledge and skills in assessment.

Building Administrators who are assessment literate promote a culture of appropriate assessment practice by:
A. Promoting assessment literacy for self and staff through:
   1. Professional learning communities
   2. Targeted and differentiated professional development
   3. Walk-throughs (data collection, goal setting)
   4. Educator evaluation practices (i.e., program, teacher, and administrator)
B. Providing time and support for staff to implement a balanced assessment system by providing opportunities to develop skills in:
   1. Using instructionally embedded formative assessment
   2. Administering assessments
   3. Scoring/analyzing results
   4. Developing instructional plans based on results
   5. Developing school improvement plans based on results
C. Assuring that each and every staff member is:
   1. A confident, competent master of the targets they are responsible for teaching
   2. Sufficiently assessment literate to assess their assigned targets productively in both formative and summative ways
D. Holding building-level staff accountable for implementing high quality assessments.

Building Administrators who are assessment literate promote the use of assessment data to improve student learning through the alignment of curriculum, instruction and assessment by:
A. Implementing district-developed learning progressions.
B. Assuring horizontally and vertically aligned curriculum, instruction and assessment in the building.
C. Clearly explaining how to analyze and use assessment results.
D. Leading dialogues with staff in interpreting results and creating goals for improvement.
E. Assisting teachers in collaboratively analyzing and using data in a professional learning community.
F. Using assessment results, including subgroup performance, to influence the school's curriculum and instructional program.
G. Using multiple data sources over time to identify learning trends.
H. Using assessment data to reflect on effectiveness of teachers' instructional strategies.
I. Incorporating assessment knowledge in evaluation practices (i.e., program, teacher, and administrator).
J. Clearly communicating results to various constituents through a coherent communication system that uses a variety of methods.
K. Using data management systems to access and analyze data.
L. Using assessment data within appropriate, ethical, and legal guidelines.
Assessment Literacy Standards // District Administrators

Dispositions
District Administrators who are assessment literate believe:
A. All educators must be proficient in their understanding and use of assessment.
B. An effective assessment system must balance different purposes for different users and use varied methods of assessment and communication.
C. When assessment is done correctly, the resulting data can be used to make sound educational decisions.
D. Multiple measures can provide a more balanced picture of a student or a school.
E. Quality assessments are a critical attribute of effective teaching and learning.
F. Assessment results should be used to make instructional decisions that impact learning.
G. Clear learning targets, understood by students, are necessary for learning and assessment.
H. Students should be active partners in their learning and assessment.
I. Students can use instructionally-sensitive assessment results to improve their learning.
J. Users of assessments require time to learn to select, develop, and administer the assessments, as well as use the assessment results appropriately, and resources are needed to carry out these activities.
K. Good classroom assessment and quality instruction are intricately linked to each other.
L. Grading is an exercise in professional judgment, not just a numerical, mechanical exercise.
M. Appropriate, high-quality assessment practices should be used in all buildings.

Knowledge
District Administrators who are assessment literate know:
A. A balanced assessment system consists of both of the following:
   1. Different users have different assessment purposes
   2. Different assessment purposes may require different assessment methods
B. There are different purposes for student assessment:
   1. Student improvement
   2. Instructional program improvement
   3. Student, teacher or system accountability
   4. Program evaluation
   5. Prediction of future performance/achievement
C. The definitions of and uses for different types of assessments:
   1. Summative assessment
   2. Interim benchmark assessment
   3. Formative-assessment practices
   4. Criterion- vs. norm-referenced assessment interpretations
D. The differences between the types of assessment tools:
   1. Achievement
   2. Aptitude
   3. Diagnostic
   4. Screening
E. The different types of assessment methods and when educators should use each:
   1. Selected response: multiple-choice, true-false, matching
   2. Constructed response: short or extended written response
   3. Performance: written responses, presentations or products
   4. Personal communication: observations and interviews
F. Non-technical understanding of statistical concepts associated with assessment:
   1. Measures of central tendency
   2. Measures of variability
   3. Reliability
   4. Validity: a characteristic of the use of the assessment, not the assessment itself
   5. Bias/sensitivity
   6. Correlation vs. causation
G. How to develop or select high quality assessments:
   1. Determine the purpose for assessment
   2. Determine the standards or learning targets to be assessed
   3. Select the assessment methods appropriate to learning targets and assessment purpose(s)
   4. Design a test plan or blueprint that will permit confident conclusions about achievement
   5. Select or construct the necessary assessment items with scoring guides where needed
   6. Field test the items in advance or review them before reporting the results
   7. Improve the assessment through review and analysis to eliminate bias and distortion
   8. Assessments can be purchased or developed locally; each approach has advantages and challenges
H. There are two ways to report results, and specific circumstances when each is useful:
   1. Normative interpretations
   2. Criterion-referenced interpretations
I. The multiple sources of assessment data that validly reflect a teacher's effectiveness.

Performance
District Administrators who are assessment literate are able to:
A. Use assessment data within appropriate, ethical and legal guidelines.
B. Understand and communicate levels of proficiency accurately.
C. Use assessment results to make appropriate instructional decisions for groups of students.
D. Collaboratively analyze data and use data to improve instruction.
E. Use multiple sources of data over time to identify trends in learning.
F. Use data management systems to access and analyze data.
G. Communicate effectively with students, parents, other teachers, administrators and community stakeholders about student learning.
H. Seek to increase their knowledge and skills in assessment.

District Administrators who are assessment literate promote a culture of appropriate assessment practice by:
A. Promoting assessment literacy for self and staff through:
   1. Professional learning communities
   2. Targeted and differentiated professional development
   3. Walk-throughs (data collection, goal setting)
   4. Educator evaluation practices (i.e., program, teacher, and administrator)
B. Providing time and support for staff to implement a balanced assessment system by providing opportunities to develop skills in:
   1. Using instructionally embedded formative assessment
   2. Selecting, creating, and developing assessments
   3. Administering assessments
   4. Scoring/analyzing results
   5. Developing instructional plans based on results
   6. Developing school improvement plans based on results
C. Instituting policies with supportive resources (time and budget) to implement a balanced system of assessment in the district.
D. Assuring that each and every staff member is:
   1. A confident, competent master of the targets that they are responsible for teaching
   2. Sufficiently assessment literate to assess their assigned targets productively in both formative and summative ways
E. Holding building-level staff accountable for implementing high quality assessments.

District Administrators who are assessment literate promote the use of assessment data to improve student learning through the alignment of curriculum, instruction and assessment by:
A. Developing learning progressions to implement the district-wide standards.
B. Assuring horizontally and vertically aligned curriculum, instruction, and assessment in the building.
C. Clearly explaining how to analyze and use assessment results.
D. Leading dialogues with staff in interpreting results and creating goals for improvement.
E. Assisting teachers in collaboratively analyzing and using data in a professional learning community.
F. Using assessment results, including subgroup performance, to influence the district's curriculum and instructional program.
G. Using multiple data sources over time to identify learning trends.
H. Using assessment data to reflect on effectiveness of principals' instructional leadership.
I. Incorporating assessment knowledge in evaluation practices (i.e., program, administrator).
J. Clearly communicating results to various constituents through a coherent system that uses a variety of methods.
K. Using data management systems to access and analyze data.
Assessment Literacy Standards // Policymakers

Dispositions
Policymakers who are assessment literate believe:
A. Teacher and administrator certification standards should include competence in assessment as a criterion for licensing.
B. A balanced assessment system is essential at the local school district level (using summative and interim assessments, as well as formative assessment practices).
C. Assessments closer to the classroom usually have a greater impact on improving student achievement.
D. Teachers and administrators need formal training in the development and use of assessments to increase student success.
E. Important decisions about schools, educators or students should be made on the basis of accurate and multiple sources of data.

Knowledge
Policymakers who are assessment literate know:
A. A balanced assessment system consists of both of the following:
   1. Different users have different assessment purposes
   2. Different assessment purposes may require different assessment methods
B. There are different purposes for student assessment:
   1. Student improvement
   2. Instructional program improvement
   3. Student, teacher or system accountability
   4. Program evaluation
   5. Prediction of future performance/achievement
C. The differences between the types of assessments in a balanced system of assessment:
   1. Summative assessments
   2. Interim benchmark assessments
   3. Formative assessments
D. There are different ways to measure student achievement; each has advantages and challenges.
E. There are two ways to report results, and specific circumstances when each is useful:
   1. Norm-referenced interpretations
   2. Criterion-referenced interpretations
F. There are several essential technical standards for high quality assessments:
   1. Reliability: do the assessments produce replicable scores?
   2. Validity: is there evidence that supports the intended uses of the assessment?
G. Assessments can be purchased or developed locally; each approach has advantages and challenges.
H. There are a number of steps in the assessment development process to produce high quality assessments.
I. There is little evidence to suggest that local, state, national and international summative assessments, in themselves, improve education or student learning.
J. Users of assessments require time to learn to administer assessments and use the results appropriately, and resources may be needed to carry out these activities.
K. Which student measures are appropriate for teacher and administrator evaluation.

Performance
Policymakers who are assessment literate:
A. Provide the necessary authorization and resources (time, money and staff) to create and implement quality balanced assessment systems.
B. Ensure that only high-quality assessments will be selected/developed and used.
C. Strive to learn more about how assessment can be used to improve student achievement.
D. Support activities to improve their assessment literacy and that of their staff.
Acknowledgements

Developed by the Michigan Assessment Consortium Board of Directors (2011-2013) and members of the MAC Knowledge and Practices Committee (2012-2014): Judith Dorsch Backes (chair), Molly Bruzewski, Kathryn Dewsbury-White, Patricia Farrell-Cole, Patricia McNeill, Edward Roeber.

Additional reviews provided in 2016 and 2017 by the MAC Assessment Development and Review Committee: Denise Brady (co-chair), Jim Gullen, and Ed Roeber (co-chair).

With special appreciation to these national assessment experts for their thoughtful input and review: Susan Brookhart, Carol Commodore, Ken O'Connor, James Popham, and Rick Stiggins.

The following organizations contributed to the refinement of the Assessment Literacy Standards through presentations, focus groups, online surveys, and individual member and leader reviews:
·· Bay-Arenac ISD
·· Lenawee ISD
·· Council of Chief State School Officers (CCSSO) National Conference on Student Assessment
·· Michigan Association of School Administrators (MASA)
·· Michigan Association of Intermediate School District Administrators (MAISA)
·· Michigan Elementary and Middle School Principals Association (MEMSPA)
·· Michigan Association of Secondary School Principals (MASSP)
·· Michigan Association for Supervision and Curriculum Development (MIASCD)
·· Michigan School Improvement Facilitators Network (MSIFN)
·· Michigan School Testing Conference (MSTC)
·· Marquette Alger ISD
·· Reeths-Puffer School District
·· Wayne RESA
·· Wexford-Missaukee ISD

In 2017, the Michigan State Board of Education endorsed the MAC Assessment Literacy Standards for the State of Michigan.

Policymaker Audiences

State-Level
·· State Board of Education
·· Superintendent of Public Instruction
·· Legislature (House Education Committee, Senate Education Committee, Legislative Staff, House and Senate Fiscal Agencies)
·· Governor (Governor's Education Staff)
·· Department of Management and Budget

Local-Level
·· Local Board of Education
·· Local School Superintendents

Reference Documents Used for MAC Assessment Literacy Standards

American Federation of Teachers, National Council on Measurement in Education, and the National Education Association. "Standards for Teacher Competence in Educational Assessment of Students." Buros Institute of Mental Measurements. http://www.unl.edu/buros/bimm/html/article3.html

Brookhart, Susan M. "Educational Assessment Knowledge and Skills for Teachers." Educational Measurement: Issues and Practice, Spring 2011, Vol. 30, No. 1, pp. 3-12.

Chappuis, Jan. Learning Team Facilitator Handbook: A Resource for Collaborative Study of Classroom Assessment for Student Learning. Pearson, 2007, pp. 55-59.

Chappuis, S., Stiggins, R., Arter, J., and Chappuis, J. Assessment FOR Learning: An Action Guide for School Leaders. Pearson, Second Edition, 2009, p. 99.

Chappuis, Steve, Commodore, Carol, and Stiggins, Rick. Assessment Balance and Quality: An Action Guide for School Leaders. Pearson, Third Edition, 2010.

O'Connor, Ken. How to Grade for Learning K-12. Corwin, 2009.

Popham, James. Everything School Leaders Need to Know About Assessment. Corwin, 2010.

Roeber, Edward. "Assessment Literacy in Michigan Education" and "Preparing Michigan Educators in Assessment." East Lansing, MI: Michigan State University, presentation, 2011. roeber@msu.edu

"Educational Leadership Policy Standards: ISLLC 2008, Appendix 2: ISLLC 2008 at a Glance." National Policy Board for Educational Administration (NPBEA), adopted December 12, 2007.

"Model Core Teaching Standards: A Resource for Dialogue." CCSSO's Interstate Teacher Assessment and Support Consortium (InTASC), Council of Chief State School Officers, July 2010. http://www.ccsso.org/intasc
Assessment Literacy Standards // Glossary

Accountability: Holding educators or others responsible for the performance of students, educators, or school programs.

Achievement Level: The standard of performance set through a standard-setting procedure. Also called a "performance standard." Defines how well students need to do on an assessment to meet or exceed predefined targets for achievement, such as "proficient."

Achievement Test: A test used to determine the current level of knowledge and skills of an individual.

Alignment: Refers to whether an assessment item measures any part (ideally, the most important part) of a content standard. Also refers to how much of a set of content standards an assessment instrument measures. Two-way alignment refers to how much of a set of content standards is measured by an assessment instrument, as well as whether the assessment instrument covers most, if not all, of the set of content standards.

Aptitude: A term to describe the ability of an individual to carry out a task or activity. Also indicates the extent to which an individual will be successful in a future activity.

Aptitude Test: A test used to determine the ability of an individual to carry out a task or activity. Also indicates the extent to which an individual will be successful in a future activity.

Assessment Administration Procedures: The set of policies, guidelines, and/or procedures in place to help ensure that the administration of an assessment provides valid results consistent with the designed purpose of the assessment.

Assessment Methods:
·· Selected-Response Item: In this type of item, students select a correct answer from among several answer choices. This item type includes multiple-choice, true/false, and matching items. The multiple-choice item format is the selected-response format most used in large-scale assessment programs.
·· Constructed-Response Item: This item type requires the individual to create their own answer(s) rather than select from prewritten options. There are usually several ways in which these items can be answered correctly. These items are scored using a standardized scoring rubric that is objective and clearly defined.
·· Performance Assessment: Assessments that require the student to perform some activity. There are two types, distinguished by their complexity and the length of time students have to respond to them.
·· Performance Task: In this assessment, students have days, weeks, or months to compose a response. Thus, these assessments may involve multiple responses of different types to multiple prompts. The resultant work may be lengthy and comprised of multiple parts. Embedded in the task may be written-response items, presentations, papers, student self-reflections, and so forth.
·· Performance Event: An on-demand performance assessment on which students are given little or no time to rehearse their performance and limited opportunities to improve their initial performance. Such assessments may take a class period or less to administer.
·· Personal Communication: An assessment conducted one-on-one between an adult and a student, sometimes an observation or interview.

Assessment Purposes:
·· Student Improvement: Using test results to review past instruction or to alter future instruction provided to the student, due to performance on the test.
·· Accountability: Using test results to hold educators or others responsible for the performance of students, educators, or school programs.
·· Program Evaluation: Using results to determine the success of a program and perhaps to suggest improvements.
·· Prediction: Using test results to determine the likelihood of success of an individual in some future activity.

Balance of Representation: The match between the relative emphasis of concept areas in a set of content standards and the assessment that measures those standards. The key question is: does the balance of representation in the assessment match that of the content standards?

Balanced Assessment System: The use of different types of assessment for different purposes. Can also mean the use of assessments for learning (to guide it as it is occurring) and of learning (to measure how much students have learned at the end of instruction).

Bias: The manner in which a test question is posed that disadvantages some students (due to factors other than their knowledge of the topic being assessed).

Causation: A demonstration that one variable has a direct and predictable impact on another variable.

Cognitive Complexity: The type(s) of mental processing (i.e., thinking skills) required by an item or set of items. This may refer to the Depth of Knowledge (Webb), Bloom's Taxonomy, or another definition of thinking skills.

Constructed- or Written-Response Items: Test items that require students to write out their responses. Often, responses take the form of short- or extended-response essays, although other items might ask students to draw a picture, construct a table, show a flow chart, and so forth. A traditional "fill in the blank" question is also a written-response item. Constructed- or written-response items typically require a checklist or rubric for scoring.

Correlation: A demonstration that two variables move in the same or opposite manner, although there is no proof that one causes the other.

Criteria: A basis for making a judgment.

Criterion-Referenced (interpretation): Relating a test score to a pre-established absolute standard of performance.

Data Management System: A computer software system used to store educational data and to permit these data to be retrieved and analyzed.
Depth of Knowledge (DOK): A taxonomy of four levels, developed by Norm Webb, that can be used to classify the cognitive complexity of test items, content standards, and learning objectives.

Diagnostic Test: A test used to determine the areas of strength and weakness of an individual.

Dispositions: Attitudes or beliefs about something.

Distortion: A factor in the assessment process that does not permit the accurate determination of student performance or that of a school or district.

Essential Learnings: A set of prioritized outcomes, derived from state standards, that helps focus on the most needed aspects of the curriculum for instructional planning purposes.

Feedback: Information about performance provided by another person or an instrument.

Field Test: Trying out newly-created items in a formal manner on a representative sample of students.

Formative Assessment: Information collected and used by teachers and students during instruction to improve teaching and learning as it is occurring.

Grading: Rating an individual or program on the basis of external standards.

High Quality Assessment: An assessment externally judged to be of superior quality.

Horizontally Aligned: The alignment of instruction provided by multiple teachers teaching the same content at the same grade or in the same course.

Instructional Decisions: The choices made by educators as they teach.

Instructional Objective: A statement that specifies what a learner will know and be able to do as a result of instruction. Most often found in curriculum framework documents.

Instructional Program Improvement: The use of test results to determine areas of the instructional program that need to be modified and improved.

Instructionally Embedded: Assessments or activities that occur while instruction is taking place.

Interim: An assessment program that is administered periodically to students, such as at the conclusion of each marking period.

Interviews: In this type of assessment, a teacher typically works with an individual student, asks a series of planned and/or unplanned questions, and records the student's responses to the questions.

Item: An assessment question, problem, or exercise; the individual measures used in a test.

Learning Progressions: The sequence of learning topics that students may go through to learn an important topic.

Learning Targets: The individual learning skills for teaching and/or testing.

Levels of Proficiency: The different levels of performance on an assessment.

Measures of Central Tendency (Mean, Mode and Median):
·· Mean: The arithmetic average of a set of data, calculated by adding up all the scores and dividing by the number of scores.
·· Mode: The most frequently occurring score in a set of scores.
·· Median: The score at the middle point in a set of scores.

Measures of Variability (Variance and Standard Deviation):
·· Variance: The average of the squared deviations of the scores in a set from the mean of the set.
·· Standard Deviation: The square root of the variance of a set of scores.

Multiple Measures: The use of different types of measures to assess students or programs from somewhat different perspectives in order to obtain a broader picture of students or a program.

Norm-Referenced (interpretation): The comparison of a student or school score to a representative sample of students or schools (the norm group). Scores are interpreted as above or below the average (mean score) of the norm group.

Performance Assessments: Assessments where students are asked to perform in some way, such as completing an experiment in science, conducting an investigation in science, singing, acting out a character in a theatrical production, or completing a painting in an arts class. The products of performance assessment can be of many types. Performance assessments typically require a checklist or a rubric for scoring.

Performance Event: An on-demand performance assessment on which students are given little or no time to rehearse their performance and limited opportunities to improve their initial performance. Such assessments may take a class period or less to administer. One of two types of assessments that require the student to perform some activity; the two types are distinguished by their complexity and the length of time students have to respond to them.

Performance Task: On this assessment, students have days, weeks, or months to compose a response. Thus, these assessments may involve multiple responses of different types to multiple prompts. The resultant work may be lengthy and comprised of multiple parts. Embedded in the task may be written-response items, presentations, papers, student self-reflections, and so forth. One of two types of assessments that require the student to perform some activity; the two types are distinguished by their complexity and the length of time students have to respond to them.

Personal Communication: An assessment conducted one-on-one between an adult and a student, sometimes an observation or interview.

Pilot Testing: A preliminary use of assessment items to see if they work. If they don't, they may be discarded or revised. If they do work, the next step is to field test them.

Placement Test: A test used to determine the best program or treatment for an individual.

Prediction: The use of test results to determine the likelihood of success of an individual in some future activity.

Professional Development (Targeted and Differentiated): The learning programs and experiences provided to in-service educators to improve their knowledge and skills, and thus, their performance on the job.
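As a brief worked illustration of the Measures of Central Tendency and Measures of Variability entries above, take the five scores 70, 80, 80, 90, and 100, and use the population form of the variance:

\[
\bar{x} = \frac{70 + 80 + 80 + 90 + 100}{5} = 84, \qquad \text{median} = 80, \qquad \text{mode} = 80
\]
\[
\sigma^2 = \frac{(70-84)^2 + (80-84)^2 + (80-84)^2 + (90-84)^2 + (100-84)^2}{5} = \frac{520}{5} = 104, \qquad \sigma = \sqrt{104} \approx 10.2
\]

A sample-based calculation would divide the sum of squared deviations by n - 1 = 4 instead, giving a variance of 130 and a standard deviation of about 11.4.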
Professional Learning Communities: Small groups of educators who work on a common issue or program over a period of time for the purposes of increasing educator effectiveness and student results.

Program Evaluation: The use of test results to determine the success of a program and perhaps to suggest improvements to it.

Progress Monitoring Test: A test used to gauge the improvement in performance of an individual or a program.

Protocols: An agreed-upon set of guidelines for conversation; a code of behavior for groups to use when exploring ideas.

Quality Assessment: A judgment that an assessment is of high quality.

Reliability: A determination of the internal consistency, comparability or stability of an assessment. A necessary but not sufficient condition for an assessment to be useful.

Reporting: Describing the performance of a student on an assessment in written or verbal terms.

Rigor: The level of knowledge necessary to achieve a content standard or to correctly respond to an assessment item. Typically measured in the Depth of Knowledge category, one of four dimensions of the Webb Alignment Tool, developed by Norm Webb, Wisconsin Center for Education Research.

Scoring: The process of determining how well a student did on an assessment.

Scoring Checklists: A series of steps used to remind students about a complete performance or used to score the responses of students.

Scoring Guide: A scoring guide is composed of a rationale for the correct or preferred responses to the assessment. A guide includes one or more scoring rubrics, examples of student responses for each score level of each rubric, and sets of pre-scored student papers used to train, certify, and monitor the scorers.

Scoring Rubrics: Often used to score constructed-response items, performance tasks, and performance events. A rubric establishes the expectations for performance and delineates what a response must include. Performance levels are described for each dimension or criterion of the performance task, performance event, or constructed-response item. Sample student work drawn from actual responses, used to illustrate performance levels for each dimension or criterion, is sometimes attached to a rubric.

Screening Test: A test used to determine eligibility of an individual for a program or activity.

Selected-Response Items: Test items that require students to pick a response from among two or more answer choices provided for each item. Multiple-choice, true-false, and matching items are all examples of selected-response items. Multiple-choice items are the most frequently used type of selected-response item.

Selection Test: A test used to determine which individuals will most likely be successful in a program.

Sensitivity: The use of a topic in an assessment item that some students may find troubling or offensive.

Standard: The larger expectations we express in association with knowledge, skills and dispositions, comprising entire disciplines (mathematics, science, etc.).

Student-Friendly Language: The writing of educational language in a jargon-free manner understandable by students.

Student Improvement: The use of test results to review past instruction or to alter future instruction provided to the student, due to performance on the test.

Subgroup Performance: The performance of a subset of the students in a larger group, examined to assure that all groups of students in a school are doing well academically.

Success Criteria: Statements that tell students what they should know, understand and be able to do at the end of a lesson. These criteria identify elements of quality that will be present in student work and become the measures teachers use to determine proficiency.

Summative Assessment: An assessment of performance conducted at the conclusion of a course or program.

Test Blueprint: A document that describes the key attributes of a new assessment, such as the standards to be assessed, the types and numbers of items to be written, and how the results of the assessment will be reported to different audiences.

Types of Assessments: Different ways of assessing students or programs.

Unpacking Standards: To determine the key attributes and aspects of a content standard.

Validity: The collection of evidence to support the intended uses of an assessment. Note: the test itself is not "valid" or "not valid"; it is the uses of the assessment that are or are not valid.

Vertically Aligned: The alignment of instruction provided by multiple teachers teaching in the same content area across two or more grades.

Walk-Through: A dry run of a process or a procedure. Also can mean a school administrator periodically observing teachers in their classrooms.
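To make the Test Blueprint entry above concrete, a hypothetical blueprint for a short classroom unit test might look like the sketch below; the learning targets, item counts, and point values are invented for illustration only.

Learning target                             Selected response   Constructed response   Performance task   Points
Compare and order fractions                 4 items             1 item                 0                  8
Add fractions with like denominators        4 items             2 items                0                  10
Solve word problems involving fractions     0                   1 item                 1 task             12
Total                                       8 items             4 items                1 task             30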
The mission of the Michigan Assessment Consortium is to improve student learning and achievement through a coherent system of curriculum, balanced assessment and effective instruction. We do this by collaboratively promoting assessment knowledge and practice; providing professional learning opportunities; and providing and sharing assessment tools, products and resources.
Michigan Assessment Consortium is a professional association of educators who believe quality education depends on accurate, balanced, meaningful assessment. MAC members work together with educators to advance assessment literacy and advocate for assessment education and excellence.

1980 North College Road, Mason, MI 48854 · 517-816-4520

To learn more about the Michigan Assessment Consortium and the resources it has created: MichiganAssessmentConsortium.org

©2015, 2017 Michigan Assessment Consortium