Alexander Kurz is quick to describe himself as a data person. As the data and research manager for Thriving Together in Phoenix, Kurz is always on the search for quantitative insights into how more students can succeed in school en route to their careers.
But even when Kurz discovered data insights that could help boost student performance, there was no guarantee of action. When he’d share his findings, many school and district leaders were wary and defensive, even though the data wasn’t critical of their individual work.
“I had to learn a lot around relationship-building,” says Kurz, a skill he admits he’s had to hone. Over time, he refined his approach to starting conversations with teachers and administrators about student-performance data.
Kurz is not alone, according to the discussion at a recent Tableau Data Fellows training. Selected from StriveTogether’s nationwide network of 65 Cradle-to-Career programs, the 15 Data Fellows, Kurz included, are each working alongside educators to build a continuum of services and keep at-risk kids on course. Yet all of them had encountered educators reluctant to analyze student- and school-performance data.
But why? Wouldn’t the insights help struggling or at-risk students succeed?
Not everyone shares the enthusiasm over data, says Candace Simon, senior data analyst at Bridging Richmond in Richmond, Virginia.
“Schools are not that receptive when we initially say that we are working on their data, or we're working to support their data plans. Data culture, especially in the education setting, is usually met with a lot of tension,” Simon said.
‘A Finger-Pointing Mentality’
As for a source of this tension, several point to the 2002 implementation of the No Child Left Behind Act and the use of standardized testing to measure adequate yearly progress toward achievement targets. At the time, NCLB was touted as a cutting-edge use of data in the education system. The law would reshape the education culture by holding teachers accountable for student outcomes, advocates believed.
But 14 years later, there are indications that NCLB has actually weakened the data culture in many school districts.
NCLB did help establish a more comprehensive and inclusive data infrastructure for districts, but few educators and administrators were trained to use data in their own work. Since NCLB passed, the primary focus has been the end-of-year assessment of student performance.
That type of testing does little to inform ongoing instruction; it primarily affects teacher evaluation. And in the absence of data to monitor student progress throughout the year, post-school year evaluation was akin to turning the crank on a Jack-in-the-Box—a loud surprise with little forewarning.
Tensions began rising almost immediately, in part due to negative evaluations resulting in pay freezes, skipped promotions, or even dismissals.
Geoff Zimmerman was there in the early days, working with educators and administrators in the greater Cincinnati area.
Now the interim executive director of StrivePartnership (the local StriveTogether member organization), he is supporting the StriveTogether team running the Data Fellows program with a sense for what went wrong—and what might still be possible.
“When I started this work 10 years ago, one of the first projects was a data report card on outcomes across Cincinnati and Northern Kentucky,” said Zimmerman. “[A]t the time, there was more of a finger-pointing mentality and not a lot of talk at all about data being used to improve.”
While the original intent was accountability, he’s quick to point out that the data from this period went a long way in scoping the problem.
For the first time researchers could disaggregate the data by different subgroups—race, gender, poverty level—but “beyond admiring the problem, we didn’t do much at the time to build understanding of how using data can improve decision-making,” Zimmerman said.
By most estimates, this accountability-only approach did not provide the intended boost to student performance, but it did show that no single role owned the task of improving student performance.
“What we’ve since tried to do with the StrivePartnership is create this idea of shared accountability and differentiated responsibility,” Zimmerman said.
“We’re trying to build this mentality that we are all in this together, that these are our shared outcomes, and asking what we can collectively do to improve these outcomes.”
Still, the ripples of the imbalanced rollout are palpable to the Data Fellows.
“In many, many cases, educators have been called out by virtue of data,” said Kurz. “So, usually, the association with data is negative, and over time, this has built an aversion to data.”
Sarah Weppner of the Treasure Valley Education Partnership echoed her colleagues. Data has been used “more for accountability than improvement,” said Weppner, director of continuous improvement at the Boise, Idaho organization. But she was also quick to point out the value in rebuilding trust by showing data “as a flashlight rather than a hammer.”
The flashlight-hammer analogy comes up often when you talk to the Data Fellows. They’ve developed a support network of sorts for dealing with people who distrust data. And their efforts to rebuild a culture of data deliberately start at the top.
Building a culture isn’t something that happens overnight or can be adopted out of the box. It requires continued focus and principled decision-making to overcome varying degrees of defensiveness at every level.
For many, earning buy-in from senior leadership is a critical—and often challenging—first step.
A recent McKinsey Global Survey found that organizations with high-performing analytics programs are nearly three times more likely than their low-performing peers to have executive sponsorship for their analytics program.
Comparing notes across the cohort, the fellows say the same is true in their school districts.
In his workshops, Adams incorporates data visualizations into an environment that gives the superintendents a chance to explore the information just as he would, drilling down into certain issues without judgment.
“We let the principals go through their day, function by function, just to see where they stand,” said Adams. “A lot of them have been through the reports that the state provides, but this kind of gives a clearer picture of where they are and how much they've gained from the previous year.”
The process is not designed to lead to specific conclusions, but rather to unearth other questions. The data can show what happened, but it is still up to the people in the room to ask why.
“The data gives us a chance to have powerful conversations about groups of children who might not be doing as well as they should,” said Adams.
Other Data Fellows described variations on the workshop concept. Many build “data walks” to allow educators to be equal participants in the analysis process rather than just the object of the findings.
While the style of a data walk can vary, the core concept is to get people up, walking, and engaged around a set of questions. Facilitators are not telling participants what to think, but taking them through the process and helping them come to conclusions together.
“When we were able to present the data first and show them some of the bright spots, we got more buy-in,” Simon said. “People can really see how they can use data to help their work.”
Allowing Data to Drive Conversations
Collaboration is a crucial piece of this culture-building, allowing people to quickly and easily share and build on each other’s findings. The Data Fellows say it is important to create a safe space where educators can see themselves in the data and talk about their experiences. In that space, they’re more open about changes and where those changes could be seen in the metrics.
In doing so, the review and analysis of student-performance data becomes something that happens through teachers, not just to them during evaluations. Treasure Valley’s Weppner has even seen a physical change in people as they go through the process.
“We will have meetings where people are quite reserved. But then they get up for the data walks, and that's when the conversation really starts to generate,” she said.
By exploring data alongside educators in a trusted and secure environment, administrators can empower teachers to do what is necessary to help their students be successful.
“Ultimately, I believe these conversations help to set up a strong factor analysis,” said Zimmerman. “Once everyone has a shared idea of what we know, we can start looking into the strategies that are helping and that are hindering progress.”
In return, teachers have confidence that the district understands their reality as they make difficult decisions or advocate for increased resources.
These types of feedback loops are critical to rebuilding data culture and the trust needed for schools to not just survive but to thrive.
Kurz is optimistic that change is possible. While working alongside teachers and administrators to dive deeper into the data, he’s seen attitudes quickly change.
“Once we start digging, they realize they have a lot of data. Initially, we help them with that, but in the long term, we want to build capacity in the local sites so they can do it themselves. My hope is that I'm working myself out of a job,” he said.