| Student Survey Item | Year-over-Year Change |
| --- | --- |
| Most kids at school follow the rules. | +10.1% |
| The homework and projects I’m assigned help me learn and are more than just busy work. | +9.2% |
| Students treat me with respect. | +7.6% |
The only one that went down? “I can go online or use a device at school when I need it.”
What is this telling us? Schools are making substantial improvements in student discipline. Kids experience fewer disruptions, and they feel more respected. In that context, the one declining item is probably good news, and it very likely reflects cell phones: the less access students have to cell phones, the easier it is to teach. In other words, a decline in that one item may actually be a positive sign. (Something similar happened last year, too.)
Discipline in stereo from the staff.
What we’re learning from students is similar to what we’re learning from staff.
| Staff Survey Item | Year-over-Year Change |
| --- | --- |
| Our school’s student discipline practices and policies are effective. | +7.6% |
| Our staff handles student discipline in a consistent manner. | +7.5% |
| The school board is doing what it takes to make our district successful. | +4.2% |
Thus, it’s not just students reflecting growth in discipline. Here you see it stated explicitly by staff.
We just can’t emphasize enough how important this is. For years coming out of the pandemic, the lowest scoring items for any group were related to discipline. You looked at the data, you learned from it, you fixed it, and it’s working.
By the way, if you’re curious, not only are staff more satisfied with leadership at the board level, but staff are also more satisfied with district administration – it just wasn’t one of the biggest year-over-year changes.
Unlike with students, a handful of staff items decreased. Staff feel their workload is increasing and that their input is slightly less valued. It’s only fair to mention these because we want to honor those teacher and staff voices. At the same time, the declines on those two questions were not, by any means, substantial: four-hundredths of a point and three-hundredths of a point, respectively. Staff members’ feeling of being recognized when they do a good job is down 0.6%, but that’s very likely statistical noise (i.e., not statistically significant). I point that out because I’m going to briefly touch on it later.
Leadership and expectations from parents.
This leaves parents, and their scores are mixed.
| Parent Survey Item | Year-over-Year Change |
| --- | --- |
| The school has high expectations for my child. | +2.0% |
| The school board is doing what it takes to make our district successful. | +1.7% |
| District administration is doing what it takes to make our district successful. | +1.6% |
As you can see, parents are more satisfied with their child’s leaders in both the boardroom and the district office. They also feel like teachers and staff are pushing their children academically, and that’s great. Not for nothing, it’s also a little easier to do when there are fewer discipline issues.
Parents did have a few items decline. I didn’t make a table out of those questions on purpose. As with staff, I want to honor the data and be transparent about where things are lagging. However, these items are still scoring very high. It’s kind of like going from an A+ to an A. Are we keeping an eye on it? Absolutely. We want to avoid a trend. We’re going to withhold focus for now but, again, more on this below.
Why all this matters.
Last summer, our bow-tie-wearing nerds—err… “the research team”—asked themselves, “What questions matter most?” If you want to get the biggest bang for your buck and improve satisfaction quickly and efficiently for staff, students, and parents, what should you fix first?
For staff, the answer is tailored and relevant professional development opportunities, appropriate recognition, and a voice in the decision-making process. If you can improve in these three areas, you will move the needle the most. Put differently, it will be difficult to make progress in other areas if you don’t address these issues first. Thus, when we saw recognition and the value of input ever-so-slightly decline for staff, it caught our eye.
For parents, there’s one question that’s far and away the most important: My child feels safe at school. An interesting corollary: satisfaction with the school board seems to matter more than satisfaction with administration. As we said above, safety is down a touch year-over-year, but the score remains high. However, if this score continues its decline throughout the spring, note the impact: you will need to fix this first if you want to make progress elsewhere.
For students, the biggest impact items vary by age. All students need to feel safe, need to feel like they can be themselves, and need to know teachers care about them. As students age into high school, they, unsurprisingly, want more agency. They need to be able to relate to what they’re learning, and they need to know their classwork is meaningful, not just busy work. This final item is up. Students do, in fact, feel like what they’re learning is meaningful. What this means is that, if you’ve cleared that hurdle, you’re ready to improve in other areas.
You could stop reading here. However, our bow-tie-wearing nerds are insisting we briefly touch on how we know these things.
To figure out which questions matter most, we used LASSO (get it? the bull in the picture? please keep reading…) variable selection, then refit an OLS regression on the questions that survived the selection step.
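For the curious, here is a minimal, illustrative sketch of that two-step idea: LASSO shrinks the weights of unimportant questions to exactly zero, and then ordinary least squares is refit on only the surviving questions. The data, penalty value, and feature names below are invented for the example; real analyses would use actual survey responses and a tuned penalty.

```python
# Toy sketch of "LASSO selection, then OLS refit" (two invented features).
# Everything here is synthetic and for illustration only.
import random

random.seed(0)

def lasso_coordinate_descent(X, y, lam, n_iter=200):
    """Fit LASSO weights by cyclic coordinate descent with soft-thresholding."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # Correlation of feature j with the partial residual
            # (the residual after removing feature j's own contribution).
            rho = sum(
                X[i][j] * (y[i]
                           - sum(w[k] * X[i][k] for k in range(p))
                           + w[j] * X[i][j])
                for i in range(n)
            )
            z = sum(X[i][j] ** 2 for i in range(n))
            # Soft-threshold: weak effects are shrunk exactly to zero.
            if rho > lam:
                w[j] = (rho - lam) / z
            elif rho < -lam:
                w[j] = (rho + lam) / z
            else:
                w[j] = 0.0
    return w

def ols_refit(X, y, keep):
    """Unpenalized OLS on the selected columns (single-feature case shown)."""
    j = keep[0]
    num = sum(X[i][j] * y[i] for i in range(len(y)))
    den = sum(X[i][j] ** 2 for i in range(len(y)))
    return num / den

# Synthetic data: a "safety" item drives satisfaction; a "noise" item does not.
n = 200
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(n)]
y = [2.0 * x_safety + random.gauss(0, 0.5) for x_safety, _ in X]

w = lasso_coordinate_descent(X, y, lam=50.0)
selected = [j for j, wj in enumerate(w) if abs(wj) > 1e-9]
coef = ols_refit(X, y, selected)
print(selected, round(coef, 2))
```

The refit step matters because LASSO’s penalty biases surviving coefficients toward zero; refitting OLS on the selected questions removes that shrinkage so the estimated effect sizes are easier to interpret.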
But if not every question matters to the same extent, then why not just ask the questions that matter most? Two reasons. One, people don’t believe it. Respondents don’t believe you can almost perfectly predict their satisfaction with just a few questions, even though the stats tools allow us to do just that. Two, additional questions can help you figure out how to fix something. If you get low satisfaction around communication, we can ask “back-up” questions about how people want to be communicated with, when, and how frequently.
How this relates to Response to Data.
Your own survey data may mirror these aggregate results. They may not. Implementing staff, student, and parent surveys at home is the only way to understand whether you’re matching what we’re seeing in totality (ecological fallacy, anyone?).
If you need help fixing the things that you need to fix in order to see progress in other areas (i.e., those most impactful questions), we can help. These data points are exactly why we created Response to Data (R2D).
We will help you pinpoint your school-level or grade-level results, help you connect the what (the scores) to the why (why the scores are the way they are), develop solutions to fix what’s not going well and maintain what is, determine who is responsible for fixing what, and produce a communication plan to help your survey-takers understand clearly how and why you’re using their data meaningfully.
Even if it is your first rodeo.
The School Perceptions Resource Center features the voices of our team members. This post was written by Rob DeMeuse, Vice President of Research. And a one-time bow-tie wearer.