When you look at your student achievement data and development data, how do you determine “significance”
and how do you develop a plan of action to address deficiencies?
Student achievement data can be the single most powerful indicator of instructional success. However, what does that achievement data tell us? The process of data analysis and developing plans to address what it points out is the power of the continuous improvement cycle within a classroom, a building and a district. Determining which data to use for this cycle is critical.
Digging into the RIGHT data
We talk about “student data” all the time. Phrases like “digging into the data,” and “using the data,” have become clichés in our educator conversations. What kind of digging actually happens? What instructional changes happen as a result of using that data?
When the rubber meets the road, the data analysis conversation all too often drifts to qualitative factors like a student’s home life or attitude, or to hunches about what we expect to see.
Without the RIGHT kind of data, we can see neither specific student needs nor what is working (and not working) instructionally. As we determine the significance of any student achievement data, we need to be sure it is the RIGHT data.
Common Roadblocks to Getting the RIGHT Data:
- Analyzing the wrong assessments (tests that don’t tell us much about instructional effectiveness and don’t give useful data)
- Not collecting data in a usable format (not easy to interpret or analyze)
- Failing to assess often enough for fear of “over-assessing” (a legitimate concern)
Standardized tests, while providing some valuable information about student learning, do not measure deep or creative thinking in any field, and measure no thinking at all in most fields (e.g., the arts, social sciences, PE). The RIGHT assessments are often local measures that capture many kinds of learning, objectives and goals in a valid and reliable way. The RIGHT data is always significant.
Planning to Address Data: The Action Plan
Once the right tools have been identified to measure and monitor achievement needs, the results need to be collected, organized and interpreted. This is a process of looking at results and determining what they mean. From here, it is time to set goals and expectations.
Data interpretation in schools finds a variety of proficiency levels, from data-phobic to data-geek to statistician. You don’t need to be an educational statistician to really understand your data. The goal is to highlight current achievement as compared to expected or desired achievement. Then, we determine how we can move from where we are to where we want to be. This can be done with various measures, from a rubric measuring students’ sight reading of a piece of music, to a rubric measuring students’ argumentative essays, to a series of reading achievement tests. Making the data easy to interpret is an important move for teachers.
Simple Strategies Improving Interpretation
- Look for Trends: Identify overall patterns rather than focusing on individual points. This is helpful for looking at data as a complete picture. This is not just looking at the average score; averages can be misleading depending on your distribution of scores. Rather, look at your range and look at the trend line.
- Examine Outliers: Look for data points far above or far below the pattern of distribution. This work will highlight students who are improving at a rate faster than typical or less than typical. Outliers will also inform your trend data as it may be appropriate to remove certain outlier data points when looking for patterns.
- Zoom Out: Look for natural breaks in the data or clusters of scores. Are some students scoring similarly? Why? Often choosing a method of color-coding data (Ex: red, yellow, green) will help speed this process and make data more user friendly.
- Zoom In: Grab outliers, clusters of scores, or anything that strikes you about data and go deeper. What other information do you have about these students? What skills was this group of students struggling with?
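The trend, outlier and color-coding strategies above can be sketched in a short script. This is a minimal illustration with hypothetical scores and hypothetical cut points (65 and 80), not a prescribed tool; the 1.5 × IQR rule stands in for whatever outlier definition a team agrees on.

```python
# Hypothetical weekly reading-assessment scores for one class (0-100 scale).
scores = [62, 64, 63, 67, 70, 69, 95, 72, 74, 40, 77]

# Trend: fit a least-squares line through (week, score) rather than
# relying on the average, which extreme scores can distort.
n = len(scores)
weeks = list(range(n))
mean_w = sum(weeks) / n
mean_s = sum(scores) / n
slope = (sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
         / sum((w - mean_w) ** 2 for w in weeks))

# Outliers: flag scores far outside the middle of the distribution
# using the common 1.5 x IQR (interquartile range) rule.
ordered = sorted(scores)
q1, q3 = ordered[n // 4], ordered[(3 * n) // 4]
iqr = q3 - q1
outliers = [s for s in scores if s < q1 - 1.5 * iqr or s > q3 + 1.5 * iqr]

# Color-coding: bucket scores into red/yellow/green bands
# using example cut points a team might choose.
def band(score):
    return "green" if score >= 80 else "yellow" if score >= 65 else "red"

print(f"trend: {slope:+.2f} points/week")
print("outliers:", outliers)
print("bands:", [band(s) for s in scores])
```

A teacher scanning the printed bands can spot clusters at a glance, while the flagged outliers (here, one unusually high and one unusually low score) are candidates to examine separately before reading the trend.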
The Improvement Cycle
Have you ever written a goal that feels more like throwing a dart at a target while blindfolded? Good interpretation of data makes the next step of developing action plans much easier. With the surge in SMART goal writing, our goals have become better, but we still struggle with questions such as “what IS realistic?” and “is this too much or too little?” Looking at historical data and using it to set realistic performance goals is the best tool we have to address this challenge.
Past performance is a predictor of future performance. Changes in personnel, curriculum, or instructional strategies will cause variance from past performance. What has improvement looked like in the past? Do we want to continue at this rate, or are we sparking change and working toward a new type of improvement and growth?
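One simple way to ground a goal in historical data is to project the typical year-over-year gain forward. The sketch below uses hypothetical proficiency rates and an arbitrary 1.5× multiplier for a “sparking change” stretch target; the numbers are illustrative, not a recommended formula.

```python
# Hypothetical end-of-year proficiency rates for the past four years (%).
history = [58, 61, 63, 66]

# Average year-over-year gain gives a "continue at this rate" baseline.
gains = [b - a for a, b in zip(history, history[1:])]
typical_gain = sum(gains) / len(gains)

baseline_target = history[-1] + typical_gain       # realistic continuation
stretch_target = history[-1] + 1.5 * typical_gain  # deliberate acceleration

print(f"typical gain: {typical_gain:.1f} points/year")
print(f"baseline target: {baseline_target:.1f}%  stretch target: {stretch_target:.1f}%")
```

A goal far above the stretch target is the blindfolded dart throw; a goal below the baseline asks for less than the building has already shown it can do.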
Pivot Points: The “cycle within the cycle”
An improvement cycle happens across a longitudinal period of time. We don’t look for big picture achievement and growth in a week or month, but rather over the course of several months or a school year. Nobody would get on a ship and start sailing toward a destination without ever looking at a map, compass or GPS. We have to “check in” on our goals and see if we are on track to make the goal. If we are not on track, we need to pivot and do something differently.
Within your improvement cycle, there should be built-in opportunities to look at your progress and determine next steps. Pivot points are akin to the pivot foot in a basketball game. They are places within an improvement cycle (classroom or district) where we stop, look at the data, interpret it and change what we plan to do based on those results. These smaller cycles of collecting data, interpreting data and pivoting should happen at intervals within the larger cycle so we can be responsive.
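A pivot-point check-in can be as simple as comparing the actual score at each interval against where a straight-line path from the starting point to the goal says we should be. The sketch below assumes a hypothetical 36-week cycle with a linear on-track expectation; real cycles may expect uneven growth, so the linear path is only an illustrative assumption.

```python
# Hypothetical cycle: grow the class mean from 60 to 75 over 36 weeks,
# with pivot points (check-ins) every 6 weeks.
start, goal, total_weeks = 60.0, 75.0, 36

def on_track_value(week):
    """Score we would expect at this week if growth were linear."""
    return start + (goal - start) * week / total_weeks

def check_in(week, actual):
    """Compare the actual score to the on-track expectation."""
    expected = on_track_value(week)
    status = "on track" if actual >= expected else "pivot needed"
    return expected, status

expected, status = check_in(week=12, actual=63.5)
print(f"week 12: expected {expected:.1f}, actual 63.5 -> {status}")
```

The value of the check is not the label itself but the conversation it triggers: a "pivot needed" result at week 12 leaves two-thirds of the cycle to change course.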
The more pivot points embedded within a cycle, the more times we have an opportunity to be reflective on the data and change our direction based on what we interpret. This reflective approach is at the core of good teaching and good leadership and helps ensure our action plans are implemented and successful.
Anne Weerda is an assessment and data specialist. She is a sought-after speaker and consultant helping districts with the effective design and use of student learning measures. She is the founder of Kids At The Core, which specializes in developing assessment and data literacy in schools. Anne has experience as a Director of Curriculum and Assessment, a School Improvement Assessment Specialist, a High School Administrator, a Department Chair and a classroom teacher. She can be reached via Twitter (@StudentGrowth) and at www.KidsAtTheCore.com.