
Q: I’m a middle school teacher who excels at collecting data in my classroom. However, I’m not always able to shift my instruction to support what I learn, and I’m not sure I’m interpreting the data correctly. Could you provide some tips on using checks for understanding with the data I’ve collected?

Signed, Wants to improve my data understanding

A: I’m so glad you asked this question. I am fortunate to work with many teachers, and this is a struggle that many don’t even realize they have. It’s one thing to collect data (hopefully the useful kind), and it is a whole other thing to know what to do with it. When I was in the classroom, it took me a long time to figure out first what data to collect and then how to use it successfully. After working with many teachers in many schools, I learned that this stymies many folks. As a matter of fact, it’s why Michael McDowell and I wrote our upcoming book, Actionable Assessment: A Step-by-Step Guide to Responsive Teaching and Student Growth.

Below are practical ways to (1) interpret your data with more accuracy and (2) build a repeatable system for responding to it in real time, especially in a middle school classroom where attention, confidence, and momentum can change by the minute. (And although I reference middle school, these steps are useful for all age groups.)

Recognize that data is noisy

Checks for understanding (CFUs) — exit tickets, quick quizzes, hinge questions, whiteboards, thumbs, polls — are fast by design. That speed is their superpower, but it also means the data can be messy: Students rush. They copy. They misunderstand directions. They know it but can’t yet explain it. They guess correctly without understanding. These and other challenges make such quick, powerful tools a little less accurate than we need them to be.

Sometimes you wonder, “Is this data even telling me the truth?” You’re not wrong to ask. The goal isn’t to make CFU data perfect; it’s to make it useful enough to act on.

Tighten the question

A lot of “I don’t know what to do with the data” actually starts upstream: The CFU question may not be aligned tightly enough to the learning target. Before using a CFU, ask yourself: What decision am I going to make based on student responses? If you can’t name the decision, the question probably needs adjustment.

Try this quick filter:

  • CFU questions should test one specific skill (not the whole lesson).
  • They should match the exact thinking you want (not just vocabulary recognition).
  • They should be quick to assess (you should be able to sort answers in under two minutes).

A strong middle school CFU is less like “Did you get it?” and more like: “Can you do the one step that proves you’re ready for the next step?” Remember, language and precision matter.

Use a three-bucket sort to interpret faster (and more accurately)

When you look at CFU data, don’t start by calculating averages. Start by sorting into three buckets (Table 1). This goes for a lot of data: A single response may not make sense on its own, but once you observe trends, your instructional move can have a bigger impact.

Table 1. Three-bucket sort

This approach keeps you from overreacting to simple errors. It also helps you avoid reteaching the whole lesson when only a third of students need a tune-up. Reteaching the whole class when only a few students don’t really understand wastes the time of learners who already get it.

Look for error patterns

The fastest way to become confident in interpreting data is to stop treating each wrong answer as a separate issue and start treating wrong answers as symptoms. When I was teaching my students to write, patterns often emerged. Instead of marking up each incorrect idea or repetitive word or phrase, I started to read the whole piece first and then respond to the pattern by asking clarifying questions and having students take a second look before I made corrective suggestions. Here are some questions you can ask to identify patterns:

  • Are students making the same mistake in the same place? That’s likely an instructional gap or a misconception worth addressing with the whole class. I saw this a lot when I asked students to do something I wasn’t really clear about.
  • Are mistakes scattered and random? That might indicate issues with attention, stamina, unclear directions, or a need for more practice rather than reteaching.
  • Are high-performers missing it too? That’s a sign that the task may be confusing, the question may be flawed, or the concept wasn’t clearly modeled.

A helpful rule of thumb: If 60% to 70% of students miss it in the same way, reteach the whole class. If 20% to 30% miss it, regroup those students. If everyone misses it, rewrite the question and/or reteach differently.

Build a menu of moves

You don’t need a new intervention for each result. You need six to ten go-to responses you can apply repeatedly and that students can learn just as readily. The routine makes it easier for them to pivot, too. See Table 2 for a practical menu you can keep in a notebook or a plan book.

Table 2. Menu of instructional moves

When you have a menu, data use becomes less stressful because you’re selecting a response rather than designing one from scratch every time.

Pre-plan your pivot points

A powerful habit is to plan conditional instruction — basically, “If the data says X, I’ll do Y.” It’s always good to have a few ready plans and pivots to ensure you are making the most of class time. When you write your lesson plan, add one small section:

  • If most students are in Bucket 2: I’ll do a four-minute reteach with a new example + two practice problems.
  • If more than 30% are in Bucket 3: I’ll pull a small group tomorrow during warm-up while others do review/application.
  • If most are in Bucket 1: I’ll skip the extra practice and move to the application, offering a couple of ways to apply it so students have some choice.

This keeps you from feeling like you’re abandoning your plan if students need additional instruction. You’re not. You’re executing the plan you prepared for the data you expected to see. I will say that there is no harm in abandoning a plan, though. If your plan isn’t working, you should abandon it and evaluate what didn’t work during your reflection time.

Confirm you’re interpreting the data right

If you’re not sure whether your CFU results are “real,” use quick verification methods:

  • Ask two students to explain their reasoning verbally (one correct, one incorrect response). You’ll learn in 30 seconds whether it’s a misconception or a misread. It’s also good to ask students to share their best wrong answer and explain why, as this lowers barriers.
  • Use one follow-up hinge question the next day with the same target, but a different format. If results change drastically, the first task may have been unclear.
  • Check for direction-reading errors (especially in middle school). If the mistake is procedural or formatting, you may not need reteaching — just clarification.

Think of CFUs as a signal, not a final verdict. Sometimes you take a second reading before you change the whole route.

Make feedback the bridge between data and instruction

Data only changes learning when students do something with it. This is why it’s essential not just to give feedback but to give students time to act on it.

After a CFU, build a short routine:

  1. Name the pattern: “A lot of us mixed up ___ and ___. That’s normal.”
  2. Show an example: Correct vs. common incorrect.
  3. Have students revise. Fix one problem, rewrite one sentence, add one justification.
  4. Collect a second, smaller check: One question to confirm understanding.

Even five minutes of structured revision can turn CFU data into actual growth — without needing an entire reteach day.

Your strength is the hard part — now let’s systematize the rest

Many teachers struggle to consistently collect meaningful data. You already have that muscle. The next step is building a repeatable “interpret → respond → recheck” routine that fits your time and your teaching style.

What are your favorite go-to checks for understanding?

If you have an issue that you would like me to address, please email me at ssackstein@educatorsrising.org. You will be kept anonymous.


ABOUT THE AUTHOR

Starr Sackstein

Starr Sackstein is the Massachusetts state coordinator for PDK’s Educators Rising program, COO of Mastery Portfolio, an education consultant, instructional coach, and author. She was a high school English and journalism teacher and school district curriculum leader. She is the author of more than 15 educational books, including Hacking Assessment (Times 10, 2015), Making an Impact Outside of the Classroom (Routledge, 2024), and Actionable Assessment (Routledge, 2026).

Visit her website at: https://www.mssackstein.com/
