Try not to shy away from the potential contained within all of the student data that exist in abundance in our classrooms and schools. Incorporate them into the wide variety of decisions you make every day. Embrace this as an opportunity to bring a little “science” of teaching into your “art of teaching.”
~Craig A. Mertler, ASCD author
If you are like most educators, you spend a great deal of time wondering whether you are making the right instructional decisions for students. You do this because you know that teaching and learning are interconnected. That knowledge leads you to think about the curriculum, instructional strategies, and what you know about your students as you plan for instruction.

Part of what you understand about students comes from assessment data. Assessments are vital to effective teaching and learning (Heritage, Kim, Vendlinski, & Herman, 2009) because they provide teachers with valuable data about student proficiency. There are three basic types of assessment: preassessment, formative assessment, and summative assessment. Preassessments reveal students’ background knowledge before instruction begins. Formative assessment occurs during instruction; teachers continuously collect information to provide feedback, check for understanding, and determine what to teach next to improve student outcomes (Black & Wiliam, 1998). Summative assessments are given at the end of a unit of study, grade, or course; examples include unit tests, exams, and standardized tests. All three forms are important because the data they yield guide planning, instructional delivery, tasks, and differentiation.

Do you feel that you have had enough professional development on how to interpret data, how to avoid the three types of bias when deciphering data, and the role content knowledge plays in data-driven decision making? The reality is that interpreting data systems is an intricate process, and few teachers would describe themselves as proficient when asked about their data literacy (Hamilton et al., 2009).
This entry is intended to help you strengthen your knowledge of data, the biases to avoid when interpreting data, and questions to consider as you incorporate what you learned from data into your instructional practices. Let’s get started…
Psychologists have discovered that certain biases impede people’s cognitive processing when they make decisions that involve interpreting data. Tversky and Kahneman (1982) identified three such biases: representativeness bias, availability bias, and anchoring and adjustment bias.
- Representativeness bias occurs when the similarity between objects or events distorts people’s judgment of the probability of an outcome. Under this bias, people frequently assume that two similar things or events are more closely related than they actually are. In a school, this might look like assuming that students who always score proficient on weekly tests will automatically achieve proficiency on a summative or comprehensive end-of-year assessment.
- Availability bias is the human tendency to judge the likelihood of an event, or the frequency of its occurrence, by the ease with which examples come to mind. This phenomenon impedes how people process complicated information. In a school, it might look like assuming that a particular student caused a classroom disruption instead of investigating what actually occurred, or predetermining which students will not master a skill and grouping them before gathering any data during instruction, on the grounds that those students usually do not master skills.
- Anchoring and adjustment bias involves people seeing new information through an essentially distorted lens. Under this bias, people tend to anchor their decisions on initial calculations without actually working through the process. Research indicates that the more complex a number is, the more likely people are to settle for an estimate rather than working mentally toward a solution. In education, this might look like a teacher over- or underestimating the effect of an instructional practice without implementing it and gathering enough data to evaluate it.
In 2011, the U.S. Department of Education outlined five areas in which teachers need to be proficient to use data accurately to inform instruction (U.S. Department of Education, 2011). These areas include:
- Pinpointing significant pieces of information within a data system: Teachers and schools collect data from a variety of sources: quizzes and tests; formative, common, interim, and summative assessments; exit tickets; teacher observation; student work; progress monitoring; and benchmarks.
- Comprehending what the data indicates: Every report that you encounter contains different information. Have you had training on the layout and specific information that is included in every type of data report that you use? Does your school or team use a specific protocol to analyze student work samples?
- Presuming what the data signifies: Do you know the purpose of each data set: what it is supposed to collect, whether it is norm-referenced, and whether it is intended to measure students’ current instructional level, their level of proficiency, or how they scored in comparison to other students who took that particular test? It’s important that you understand the type of data each assessment collects so that you can figure out what the data signifies.
- Choosing an instructional method that focuses on the conditions discovered through the examination of data: It will take a lot of practice for teachers to become skillful in identifying the instructional strategies to integrate into instruction in order to move students from their current academic level towards mastery of grade level standards.
- Outlining instructionally applicable questions that can be tackled by the data in the system: Here are a few questions that you should consider asking yourself when disaggregating data:
- How will I determine which data points get priority?
- Do I notice any trends in the data?
- What do I notice in terms of student strengths?
- What areas of growth can be identified within the data?
- How will I address student misconceptions?
- Can I pinpoint any skills/standards that may have impacted students’ mastery of standards?
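If you work with exported score data, the questions above can be explored with even a very simple summary. The sketch below (hypothetical data; the column layout and the 70% proficiency cutoff are illustrative assumptions, not a standard) averages results by skill to surface class strengths and areas of growth:

```python
# Minimal sketch with hypothetical data: summarize per-skill results from one
# assessment to spot trends, strengths, and growth areas. The 70% cutoff is
# an illustrative assumption, not an official proficiency threshold.

# Each record: (student, skill, score out of 100).
scores = [
    ("Ava",   "fractions", 85), ("Ava",   "decimals", 60),
    ("Ben",   "fractions", 78), ("Ben",   "decimals", 55),
    ("Carla", "fractions", 92), ("Carla", "decimals", 74),
]

def skill_averages(records):
    """Average score per skill -- a first pass at noticing trends."""
    totals = {}
    for _student, skill, score in records:
        running_total, count = totals.get(skill, (0, 0))
        totals[skill] = (running_total + score, count + 1)
    return {skill: t / n for skill, (t, n) in totals.items()}

averages = skill_averages(scores)
strengths = [s for s, avg in averages.items() if avg >= 70]
growth_areas = [s for s, avg in averages.items() if avg < 70]

print(averages)      # {'fractions': 85.0, 'decimals': 63.0}
print(strengths)     # ['fractions']
print(growth_areas)  # ['decimals']
```

A summary like this only starts the conversation; deciding how to address the misconceptions behind a low-scoring skill still requires looking at the student work itself.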
Thanks for visiting the Digital PD 4 You Blog! If you are interested in listening to weekly tips as you get ready for work or drive home from school, visit https://anchor.fm/jami-white or any podcast platform and search for the Digital PD 4 You podcast. I will see you next week for another reflection tip. Have a great week of teaching and learning!
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy and Practice, 5(1), 7–73.
Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J. (2009). Using student achievement data to support instructional decision making (NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://ies.ed.gov/ncee/wwc/publications/practiceguides/.
Heritage, M., Kim, J., Vendlinski, T., & Herman, J. (2009). From evidence to action: A seamless process in formative assessment? Educational Measurement: Issues and Practice, 28(3), 24–31.
Tversky, A., & Kahneman, D. (1982). Judgments of and by representativeness. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 84–98). Cambridge University Press.
U.S. Department of Education, Office of Planning, Evaluation and Policy Development. (2011). Teachers’ ability to use data to inform instruction: Challenges and supports. Washington, DC.