By Aaron Benz, Founder, CampusIQ
Over the last few days, no fewer than five people have asked me what I thought about EAB’s recent article, “Space utilization reports aren’t actionable enough for higher ed leaders.”
Since the piece is making the rounds—and since EAB clearly succeeded in getting the attention they wanted—I figured I’d share my thoughts. And, in the spirit of the article’s own tone, I’m going to have a little feisty fun with this.
Before I begin: I respect EAB. They’re smart, their brand is strong, and the marketing move here is clever. But this particular critique is, ironically, an example of the very problem it claims to diagnose: a partial, incomplete analysis that gestures toward expertise while oversimplifying the real story.
Acknowledgement: I’ve been directly involved in several space utilization studies of the kind being critiqued. As far as I’m aware, the ones I’ve participated in are not yet posted on public websites, so I don’t believe they were part of EAB’s analysis. We are often brought in to supply large-scale quantitative data for these reports, and we typically stay on to operationalize the recommendations and help universities actually deliver the results they’re seeking post-report.
So, let’s break it down.
This is one of the strangest critiques in the article.
EAB lists the kinds of policy recommendations they keep seeing in space utilization studies and implies it’s a sign of lazy, copy-paste consulting. But let’s be honest: these “common” recommendations show up repeatedly for a simple reason—
Standardizing room assignment rules, clarifying scheduling windows, writing policies that can actually be enforced… these aren’t examples of uncreative consultants. They’re the basic building blocks of any functional space management ecosystem. If ten institutions lack the same core structures, ten consultants will—accurately—recommend addressing them.
Calling that laziness is like telling a cardiologist they lack creativity because they keep recommending reducing sodium intake.
Sometimes the patterns show up because the patterns are real.
This is the section where I genuinely agree—and genuinely disagree.
Benchmarking is wildly inconsistent across higher ed. It’s a huge problem. Different methodologies, different assumptions, different interpretations of room demand or station occupancy… measuring space is far from standardized.
At CampusIQ, we’re tackling this directly. We’re preparing to release one of the most comprehensive and diverse benchmarking datasets in the world: a quarter-billion gross square feet of measured utilization from real higher-ed environments. As far as I know, it’s the largest empirically measured space utilization dataset in higher education.
EAB is right: higher ed deserves better benchmarking.
The article takes shots at consultants for pointing out well-known issues like “classes are concentrated from 10am to 2pm.”
Sure—people in our field know this. But the people who need to understand it in order to drive change are executive leaders, boards, and strategic decision-makers.
And guess what?
Those folks don’t always know it.
Pointing out peak patterns is not filler—it’s political capital. You cannot ask faculty to shift pedagogical patterns or teach outside preferred windows without organizational authority and stakeholder alignment. Reports call out these patterns to create that momentum, not because the analyst discovered some hidden mystery of the universe.
These aren’t “generic” insights—they’re essential narrative framing for change management. Alyson Goff’s blog, “Not Every Solution is Physical,” discusses this very idea.
Here the critique sounds good… until you consider reality.
I’ve sat in plenty of space rationalization and utilization studies. You won’t find all the actionable details in the publicly posted documents—and you shouldn’t.
Why?
Because hammering specific departments, individuals, or political fault lines in a glossy, publicly shared PDF is a great way to burn trust and stall the very changes you’re trying to make.
The detailed recommendations—the real, meaningful, sometimes painful ones—happen in conversations, presentations, working sessions, committees, and governance discussions.
The published reports are scaffolding.
The real work happens off the page.
This “lack of specificity” isn’t evidence of weak analysis—it’s evidence of responsible consulting.
This is my biggest critique of the EAB post, and the reason the article feels more like a marketing funnel than a true analysis: it never asks whether these studies actually produced results.
That’s the whole ballgame.
Take the University of Central Florida—a study from a couple of years ago that must have been included in EAB’s meta-analysis. On the surface, the published report may look generic. But the outcomes? Massive.
The study worked. And its work is only accelerating. (Check out our discussion with UCF and the University of Texas San Antonio here.)
But EAB’s article makes no attempt to evaluate impact—only to critique style.
If the argument is “these plans don’t change anything,” then show me the receipts.
EAB says space utilization reports are often generic, obvious, and light on actionable specifics.
But ironically—and I say this with affection—their blog post shares those same qualities.
It’s a superficial pass designed to grab attention and funnel readers toward EAB’s own solutions.
Again: smart marketing. I respect it.
But let’s call it what it is.
In all seriousness, I appreciate EAB for stirring the pot. If their goal was to spark conversation, they succeeded—I’m writing this because people keep asking me about it.
And in fairness: their critique isn’t entirely wrong. Higher ed needs better benchmarks, better clarity, and better integration between policy and reality. That’s the work we’re doing at CampusIQ, and I’m glad the industry is finally giving the topic the attention it deserves.
But let’s not pretend that space utilization studies don’t matter or don’t drive real change. They do. The evidence is there—you just have to look beyond the PDFs.
If EAB wants to push the field forward, I’m all for it.
But let’s do it with full context, real outcomes, and a complete picture—not just a catchy headline.
For real case studies, real stories, and real results, consider attending APPA webinars, or rewatching past sessions like “University of Kentucky: A Living System for a Changing Campus.” Consider going to Tradeline or SCUP conferences, where we showcase institutions like UTSA, or APPA T3s, where you can learn more about UCF’s journey. Many organizations offer education programs on space management and planning, such as this Tradeline event, which features a fundamentals course co-led by Alyson Goff.