Why School Finance Data Lets Everyone Be “Right”

When Dale Russakoff published The Prize, her story about the Zuckerberg–Booker reform effort in Newark, it wasn’t just a postmortem on a failed experiment. It was a case study in how school finance — especially at the district level — turns money into mud.

Russakoff didn’t argue that Newark needed more funding. The city already spends over $20,000 per student, among the highest in the country. The problem, she wrote, is that less than half of that money ever reaches classrooms. The rest is absorbed by layers of central office bureaucracy, maintenance costs, and contracts that seem designed to sustain the system itself rather than the students it serves.

As Joe Nocera summarized in The New York Times:

“The KIPP charter network, which runs SPARK, gets $16,400 per SPARK pupil, of which $12,664 is devoted to the school. The district schools get $19,650 per pupil, but only $9,604 trickles down to the schools.”

The difference isn’t ideology — it’s structure. KIPP SPARK, a Newark charter, could direct funds to whatever the principal and staff believed students needed most. BRICK Avon, a Newark district school, could not. It’s a simple but devastating contrast in autonomy and accountability.
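The arithmetic behind that contrast is worth making explicit. A quick sketch, using only the per-pupil figures from Nocera's quote, computes the share of each dollar that actually reaches the school:

```python
# Per-pupil figures from Nocera's New York Times summary.
funding = {
    "KIPP SPARK (charter)": {"total": 16_400, "reaches_school": 12_664},
    "Newark district schools": {"total": 19_650, "reaches_school": 9_604},
}

for name, f in funding.items():
    share = f["reaches_school"] / f["total"]
    print(f"{name}: {share:.0%} of per-pupil dollars reach the school")
# KIPP SPARK: 77%; Newark district schools: 49% -- the "less than half"
# Russakoff describes.
```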

Yet not everyone buys Russakoff’s story.

New Jersey teacher and Rutgers PhD candidate Mark Weber wrote a 22-page “debunking” of The Prize, using state education data to challenge her conclusions. The data is right. The interpretation is wrong — and that’s the whole point. The way we categorize and report school finance data makes it possible for everyone to be “right,” while avoiding the real issue: whether dollars actually serve students.

Here’s what Weber argues — and what it reveals about the deeper problem.

Weber: Newark’s district has a lower per-pupil administrative cost than charters.

Of course it does. Scale guarantees that outcome. If Newark tripled its superintendent’s salary tomorrow, it would still report lower per-pupil admin costs than any charter network in the city. Dividing one giant number by tens of thousands of students produces a smaller ratio — that’s math, not efficiency.
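The scale effect is easy to verify with toy numbers. The enrollments and salaries below are illustrative assumptions, not Newark's actual figures; the point is that a large denominator shrinks the per-pupil ratio regardless of how lean the central office really is:

```python
# Illustrative figures only -- enrollments and salaries are assumed,
# not drawn from Newark's books.
district_students = 35_000
charter_students = 1_000

district_admin = 900_000  # central-office pay, even after a big raise
charter_admin = 200_000   # one network leader

print(district_admin / district_students)  # roughly $26 per pupil
print(charter_admin / charter_students)    # $200 per pupil
```

The district spends more in absolute terms yet reports the lower per-pupil number, which is the "math, not efficiency" point above.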

Russakoff’s argument isn’t about aggregate ratios. It’s about allocation. Who controls the money, and how easily can they adjust it to meet the needs of real students? Newark’s central office might technically spend “less” on administration per pupil, but that doesn’t mean it’s leaner or smarter. It just means the waste is harder to see.

Weber: District schools have more instructional and support staff per pupil than charters.

True again — and still irrelevant. Counting bodies tells us nothing about impact.

Russakoff’s “Tale of Two Schools” highlights that clearly. At BRICK Avon, one kindergarten teacher waited eight months for district approval to get behavioral support for a student who routinely threw chairs. At SPARK, a similar student got help from a school social worker within days. Both schools had “support staff.” Only one had flexibility to deploy them effectively.

That’s the difference between compliance-driven staffing and mission-driven spending. One fills positions to satisfy formulas; the other shifts resources to meet real needs.

When Weber argues that the district “supports students better” because it employs more people under that label, he’s assuming all students and schools need the same prescription. They don’t — and that’s exactly why decentralization matters.

Weber: The district spends more per pupil on student support services.

That may be true on paper, but the paper is the problem.

Weber acknowledges that many charters (including KIPP) report zero spending in that category — even though they obviously employ social workers and intervention specialists. Why? Because accounting codes are ambiguous. One school might record a counselor’s salary under “instruction,” another under “administration,” and another under “support.” The result: numbers that can’t be meaningfully compared.
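A hypothetical sketch of that coding problem: three schools each employ one counselor at the same salary, but each books it under a different accounting category. The school names, categories, and salary are invented for illustration:

```python
# Same counselor, same salary, three different accounting codes.
# All labels and amounts are hypothetical.
ledgers = {
    "School A": {"instruction": 70_000, "administration": 0, "support": 0},
    "School B": {"instruction": 0, "administration": 70_000, "support": 0},
    "School C": {"instruction": 0, "administration": 0, "support": 70_000},
}

for school, ledger in ledgers.items():
    print(school, "reported support spending:", ledger["support"])
# Identical staffing yields three different "support services" totals --
# the comparison problem the state data inherits.
```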

This is how big districts win debates. They can point to data showing higher “support service” spending and claim they’re investing in students — even if that money pays for attendance clerks, consultants, or professional development sessions that have no measurable effect.

As the New Jersey Taxpayers’ Guide to Education Spending shows, “Classroom Instruction” includes professional development, and “Educational Support Services” can include salaries for staff who record attendance. A district could spend $1 million on useless PD and still boast about its commitment to “instruction.”

The Real Lesson: Data That Defends Dysfunction

Aggregate data lets everyone protect their own narrative. Districts claim efficiency. Charters claim agility. Researchers trade charts while classroom teachers wait months for resources.

That’s not an argument for more categories or stricter reporting. It’s an argument for transparency that matters — data that helps leaders understand student needs and whether dollars are meeting them.

Transparency doesn’t mean more numbers. It means being able to trace a dollar from its source to its impact.

Until we build systems that make that visible, debates like Russakoff vs. Weber will keep looping in circles. Everyone will have data. No one will have clarity.

That’s what we’re changing at bookreport — replacing abstract financial data with real-time insight schools can actually use. Not to prove who’s right, but to make sure the money works for kids and that our spending reflects our values.