This column was originally published on the Ball State University Center for Business and Economic Research Weekly Commentary blog.
By Michael J. Hicks
April 5, 2026
For families with kids, the most important part of choosing a neighborhood is usually the quality of local schools. In a new Ball State University study, co-author Dagney Faulk and I asked an important question: How much of each school district’s success can be attributed to staff and leadership?
What we found may change how you think about your local school: some Indiana districts are dramatically outperforming what their demographics would predict, while others are leaving potential on the table.
Like it or not, American public education is central to most communities. We name our streets after high school mascots, we follow athletic teams on Facebook, and we celebrate conference, sectional and state athletics. It’s provincial to admit this, but if you are a local plumber or landscaper who is a Yorktown graduate, I’m far more likely to do business with you.
Yet, we measure schools in Indiana based on methods better suited to 1926 than 2026. Simple measures, such as differences in standardized test scores, account for as much as one-third of the differences in home values between communities. So, measuring school quality should be a central goal of state policymakers.
One key challenge in evaluating school quality is that households choose their schools. Family characteristics play a huge role in the performance of local schools on standardized tests. So, if you wish to understand how well a school performs, you have to account for the kinds of families it serves — because schools don’t get to choose their students.
Fortunately, there’s abundant data on each school corporation. We know the size, race and ethnicity of students in every Indiana school. We know how many are labeled as English language learners because they speak another language at home. These data provide us with a way to isolate schools’ contributions to outcomes from families’ contributions.
We can also add measures of poverty, such as the share of students eligible for the free and reduced-price lunch program, as well as per-student spending, which is adjusted upward based on three measures of poverty or family hardship.
In this model of Indiana’s 290 public school corporations, only two factors correlated with test scores: our poverty measures and the share of Asian students in the school (but only on tests at 8th grade and above). This held across the 3rd- and 8th-grade ILEARN tests and the share of 10th-graders meeting the SAT college-ready benchmark.
School size didn’t matter. The share of White or Black students didn’t matter. And the English language learner share didn’t matter — which is sure to disappoint anti-immigrant folks in the state.
So, what affects school performance is poverty, plain and simple. That is among the most studied and clearest findings of the last half-century of social science research.
We wanted to understand what the numbers behind the numbers tell us — namely, how good the teachers are, how strong the curriculum is, how rigorous the grading standards are, how effective the superintendent and board are, and how other characteristics, like school buildings, affect performance.
This process is often referred to as the value-added method of grading school performance. Perhaps the most useful thing about this method is that it offers families another way to judge their local school: how much of the value added to student learning can be attributed to the things a school can control, namely its staff, standards and curriculum. As a parent, that’s what I am interested in knowing, not just how well-heeled my neighbors are.
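The core logic of a value-added measure can be sketched in a few lines of code: predict each district’s pass rate from its demographics, then treat the gap between actual and predicted performance as the school’s contribution. The sketch below is a hypothetical illustration only; the district names and figures are invented, and it uses a single poverty variable where the actual study model controls for several demographic factors.

```python
# Hypothetical illustration of a value-added calculation.
# District names and all figures below are invented, not from the study.
districts = {
    "District A": {"poverty": 0.10, "pass_rate": 0.72},
    "District B": {"poverty": 0.35, "pass_rate": 0.55},
    "District C": {"poverty": 0.50, "pass_rate": 0.52},
    "District D": {"poverty": 0.25, "pass_rate": 0.58},
}

# Step 1: fit a simple least-squares line predicting pass rate from poverty.
xs = [d["poverty"] for d in districts.values()]
ys = [d["pass_rate"] for d in districts.values()]
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

# Step 2: value added = actual pass rate minus demographically predicted rate.
# Positive values mean the district beat what its demographics predict.
value_added = {
    name: d["pass_rate"] - (intercept + slope * d["poverty"])
    for name, d in districts.items()
}

for name, va in sorted(value_added.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {va:+.3f}")
```

In this toy version the residuals sum to zero by construction, so the measure ranks districts against one another rather than against an absolute standard, which is also why the study reports districts in ranked groups.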
To help parents, we ranked every Indiana school corporation, breaking them into five equal groups from highest to lowest. It’s worth taking a quick look at the excellent performance of the top five school corporations in our sample.
Crawford County, Speedway, Brownsburg, Avon and North Spencer racked up stunning value-added measures. The students at these schools passed their standardized tests at rates 24.2% to 31.5% higher than their community demographics predicted.
We tested these measures over several years, and there was a lot of consistency from year to year. But there were some outliers as well. The five schools with the biggest year-to-year changes were South Central, Blackford, DeKalb, Crawford and Caston, which saw increases in value-added ranging from 20.8% to 32.8%.
And that brings us to the real value of this study. Our measures allow schools to find peer institutions that are performing well and learn from them.
Our measure is not intended to replace whatever ranking system the state designs. But, as an independent assessment of the value-added of local school staff to the educational outcome of students, it should be part of every debate about school quality.
And if you lead a school corporation that isn’t doing as well as you wish, start with Crawford County. Whatever is going on there, you should bottle it and sell it to other school corporations.
Michael J. Hicks is professor of economics and the director of the Center for Business and Economic Research at Ball State University. He previously served on the faculty of the Air Force Institute of Technology’s Graduate School of Engineering and Management and at research centers at Marshall University and the University of Tennessee. His research interest is in state and local public finance and the effect of public policy on the location, composition, and size of economic activity.
The views expressed here are solely those of the author, and do not represent those of funders, associations, any entity of Ball State University, or its governing body. Also, the views and opinions expressed do not necessarily reflect the views of The Indiana Citizen or any other affiliated organization.