Dealing with a narrow view of scholastic data
From attendance records, test scores, and disciplinary actions to demographic information, the Granite School District had a wealth of information. But turning that data into actionable insight was a challenge. School administrators wanted to see more information about the district, measure multiple performance metrics, and diagnose issues at a higher level. Instead, they received a variety of narrow reports and were left using gut instinct to decide what the reports meant in aggregate. Eventually, administrators became so frustrated waiting for timely reporting that they stopped asking for the data entirely.

“If you've got to go to multiple sources to get the data, it can be difficult to get a coherent feel for what's going on,” says Kim Brunnenmeyer, System Administrator for the Granite School District.

In addition, says Craig Schow, Business Consultant for the Granite School District, “One of the big frustrations our teachers had was spending all this time testing students. We’d make them do SAGE tests, and DIBELS tests, and SRI and more, and then the teachers wouldn't always get the data from that. They’d ask, ‘Why am I spending all of this educational time giving them an assessment and then I don't get timely results to change how I teach?’”

And when teachers did get testing data back, the information lacked important context. Teachers struggled to compare one year’s results to the next. They also couldn’t view scores alongside proficiency expectations that would put a student’s score into perspective. “They could see a score was 248, for example, but they couldn’t see what that meant in terms of approaching standard,” Craig says.

To solve these problems, Granite School District first used the reporting functionality within its single sign-on solution, which offered visual reports with drill-down capabilities. Unfortunately, this wasn’t enough to meet the district’s needs.
“You could look at a pie chart of the ethnic breakdown of the district, for example, then go look at a pie chart or a bar graph of something else. But it could only do one thing at a time. So it didn't present a very good overall picture of anything,” Kim explains.

This made important efforts like identifying at-risk students slow and difficult. The At-Risk Reports that the original system could deliver were static and inflexible. Worse, teachers weren’t able to dig down into the underlying issues. Kim continues, “We ran some reports that tried to identify at risk students. But it wasn't in the same place as we could see other stuff. You could run an At-Risk Report and see who was at risk, but then you would have to go elsewhere to find more detail as to why they were at risk.”

This lack of visibility into students’ academic careers even affected the athletic programs. Coaches couldn’t pull a list of who was eligible to play, and had to rely on word of mouth from teachers. The school district needed a tool where it could see all of its data in one place, with richer analytical depth.