Speckle AEC voices

Apr 27, 2026

The feedback loop AEC never built

Every project generates data, and the best firms are starting to use it to learn from past wins and mistakes. Here's what's standing in the way of broader use of historical design data, and what it will take to change it.

There's a question worth asking at your next leadership meeting: if a client asked you to show them, with data, how your last twenty projects informed the one you're pitching them now, could you?

Not anecdotally. Not with a case study PDF assembled the week before the interview. With actual evidence, comparable projects, real benchmarks, and measurable outcomes.

Most firms can't do this. Not because the evidence doesn't exist, but because the data that would prove it is locked in project folders no one opens after handover, archived in formats that can't be queried, or sitting in software versions that are no longer in production. The knowledge is there, it just isn't easily accessible.

The gap between the data firms generate and the intelligence they can actually use is beginning to define the competitive landscape in AEC right now. And it's one the industry has barely begun to solve.

The building industry’s structural problem no one is fixing

Here's what makes this hard: it's not a tooling problem. AEC has plenty of tools.

Over the last decade, AEC has seen a wave of software investment aimed at improving project delivery. Better authoring tools. More integrated common data environments. Construction management platforms that connect the field to the office. Each generation of tooling has made the individual project more efficient, more coordinated, more data-rich.

And yet almost none of it has addressed what happens to that data after a project wraps up.

The prevailing model, from virtually every major software vendor, is still built around the project as the unit of value. Data lives in project containers. Models are stored as point-in-time artifacts. The CDE you're paying for is, by design, a walled garden. It protects and governs your project data, but it doesn't help you learn from it.


When one global design studio with 500+ employees and studios across Australia, Asia, the UK, and the US set out to build a portfolio intelligence capability, they started with 744 completed projects. They ended up with 38 they could actually analyze, not because the rest of the data didn't exist, but because it was inaccessible: either inconsistently formatted or trapped in incompatible software versions. That means the firm loses 95% of its most valuable knowledge.

And it gets harder before it gets easier. Even within a single firm, the same spatial element is named differently across teams, offices, and project types. Their study found 299 distinct names for what was essentially a meeting room. If designers can't agree on terminology, cross-project analysis requires enormous manual effort before it even begins. That’s work nobody has time for, and most firms never start.
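To make the terminology problem concrete, here is a minimal Python sketch of the kind of name normalization such a study implies. The room labels and the synonym map are invented for illustration; a real taxonomy would be far larger and maintained as part of the firm's data layer:

```python
import re

# Hypothetical sample of the naming chaos: many labels, one space type.
# The study described above found 299 variants in a real portfolio.
RAW_ROOM_NAMES = ["Meeting Rm 1", "MTG-02", "Conference Room A", "meeting_room", "Boardroom 3"]

# A toy synonym map; in practice this grows into a full space taxonomy.
SYNONYMS = {
    "mtg": "meeting room",
    "meeting rm": "meeting room",
    "meeting room": "meeting room",
    "conference room": "meeting room",
    "boardroom": "meeting room",
}

def normalize(name: str) -> str:
    """Map a raw room label to a canonical space type, or 'unclassified'."""
    cleaned = re.sub(r"[_\-]", " ", name.lower())
    cleaned = re.sub(r"\s*\d+\s*$", "", cleaned).strip()  # drop trailing numbering
    for key, canonical in SYNONYMS.items():
        if key in cleaned:
            return canonical
    return "unclassified"
```

Even a toy version shows why this is cheap to enforce going forward and expensive to retrofit: every label that falls through to "unclassified" is manual cleanup someone has to do before cross-project analysis can start.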

The deeper issue isn't naming conventions, but rather the current mental model. AEC has spent thirty years building better tools for creating design data, and almost no time building the infrastructure to make that data compound in value over time. Every project still starts from scratch. Benchmarks are assembled manually. Intuition substitutes for evidence. And the accumulated intelligence of decades of built work sits in a landfill of archived project folders, generating no return for the firm.

The opportunity the AEC market is missing

All of this matters more now than it did five years ago, for a specific reason: AI.

The firms and asset owner/operators that will get the most out of AI-powered design tools are those with clean, structured, and queryable data. AI doesn't improve your intuition. It scales the evidence; if you don't have evidence, you just get faster guesses.

The vendors building AI features into their platforms understand this, but most are still orienting around the same project-centric model. AI tools are being layered on top of walled gardens. The copilot helps you work faster inside a single project. The intelligence stays local. The learning doesn't transfer beyond the wall.

What's being missed, almost universally, is the cross-project layer. The question isn't "How do I design this building faster?" It's "What do my last fifty buildings tell me about how to design this one better?" Those are different questions, and the second one needs fundamentally different infrastructure.


Portfolio intelligence isn't about aggregating data. It's about asking meaningful questions across a body of work:

What space mix performs best for law firms in high-rise towers?

Which structural approaches deliver the best constructability outcomes at this scale?

What meeting-to-workstation ratio do our most satisfied clients have in common?

How do the layouts in our last ten hospitals correlate with clinical outcomes?

The global design studio behind the study describes a designer facing a client brief with a 7m²-per-person density target, who can instantly surface the three most comparable projects in the firm's portfolio. Not by asking a colleague or searching the drive, but through a live query against structured project data. That's not a futuristic scenario. It's achievable with current tooling, provided your firm has invested in the data infrastructure needed to support it.
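As an illustration, that kind of comparables query fits in a few lines of Python once the data is structured. The project records and field names here are hypothetical; in practice the rows would come from the firm's portfolio database rather than being hard-coded:

```python
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    sector: str
    density_m2_per_person: float

# Invented portfolio records standing in for a structured project database.
PORTFOLIO = [
    Project("Tower A", "workplace", 8.5),
    Project("Campus B", "workplace", 7.2),
    Project("HQ C", "workplace", 10.1),
    Project("Fitout D", "workplace", 6.5),
    Project("Studio E", "workplace", 12.0),
]

def most_comparable(target_density: float, sector: str, n: int = 3):
    """Return the n projects in a sector closest to a density target."""
    candidates = [p for p in PORTFOLIO if p.sector == sector]
    return sorted(
        candidates,
        key=lambda p: abs(p.density_m2_per_person - target_density),
    )[:n]
```

The hard part was never the query. It's getting 744 projects into a shape where a query like this returns something trustworthy.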

Sophisticated owners in industries like healthcare, infrastructure, and commercial real estate are beginning to ask their design partners: What are we actually getting from our design data? The answer today is usually "not much". That gap is a competitive opportunity for firms willing to close it, and a competitive risk for those who aren't.

What has to change: The data infrastructure investment

Three things stand between where the industry is now and the point at which portfolio intelligence becomes a routine capability.

The first is data accessibility. Getting design data out of BIM models at scale has historically required manual intervention. Someone has to open each file, run the extraction scripts, and clean the output. The study we mentioned involved manually opening 172 models. That's not a workflow. And it's why most firms, even those that understand the value, never build the capability at scale.

The second is normalization. Even when data is extracted, it's rarely in a state that supports cross-project comparison without significant cleaning. Different naming conventions, inconsistent levels of model detail, and software version drift all create friction that compounds as scale increases. Portfolio intelligence requires a common language, enforced systematically through the data layer, so that what gets called a meeting room on project one is what gets called a meeting room on project fifty.

The third is data infrastructure investment. Storing and querying structured project data at scale requires a data pipeline that most AEC firms have never needed to build. No procurement process optimizes for it. No client contract has historically demanded it. But this is changing, and the firms that treat it as an afterthought will find themselves rebuilding from scratch when the pressure becomes unavoidable.
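As a toy end-to-end illustration of these three steps, extraction, normalization, and a queryable store, the sketch below stands in extracted model data with plain tuples and uses SQLite for storage. Everything here (the field names, the lowercase-only normalization) is illustrative, not a reference architecture:

```python
import sqlite3

def run_pipeline(raw_records, db_path=":memory:"):
    """Minimal extract -> normalize -> load sketch.

    raw_records: iterable of (project, raw_room_name, area_m2) tuples,
    standing in for data pulled out of BIM models. Normalization here is
    a toy strip/lowercase pass; SQLite plays the queryable data layer.
    """
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS spaces (project TEXT, space_type TEXT, area_m2 REAL)"
    )
    for project, raw_name, area in raw_records:
        space_type = raw_name.strip().lower()  # normalize before loading
        conn.execute("INSERT INTO spaces VALUES (?, ?, ?)", (project, space_type, area))
    conn.commit()
    return conn

# Once loaded, a cross-project question becomes a one-line query.
conn = run_pipeline([("P1", "Meeting Room", 24.0), ("P2", "meeting room ", 30.0)])
total = conn.execute(
    "SELECT SUM(area_m2) FROM spaces WHERE space_type = 'meeting room'"
).fetchone()[0]
```

The point of the sketch is the shape, not the stack: extraction feeds normalization, normalization feeds a store you can query across projects, and none of the three is useful without the other two.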

The extraction, normalization, and infrastructure problems are all solvable. The convergence of always-on model connectivity, automated data pipelines, and AI-assisted classification is removing these barriers faster than most firms realize. The tooling is there, or nearly there, for firms that are ready to use it.

What isn't automated is organizational intention. Firms that decide at the leadership level that their design data is a strategic asset will be in a fundamentally different position than those that continue treating it as a project byproduct. That decision requires someone at the top to say: "This is infrastructure. We're investing in it the same way we invest in any long-term capability."

Why portfolio intelligence compounds in AEC

There's an asymmetry worth understanding here: a firm that starts building a normalized, queryable portfolio data layer today will have a multi-year advantage that's nearly impossible to replicate quickly.

Unlike point-in-time capabilities, such as hiring a BIM manager or deploying a new authoring tool, portfolio intelligence compounds. Each new project makes the dataset richer, the benchmarks more reliable, the queries more precise. The firm that has twenty years of structured project data can answer questions that a firm with two years simply cannot.


That compounding dynamic is why the window matters. The firms that move now will be making evidence-based recommendations in ten years that their competitors can only aspire to. And the clients they serve, particularly sophisticated owners building across large portfolios, will gravitate toward the partners who can answer questions with data rather than memory.

The design firm that can say "based on your last twelve projects, here's what correlates with cost overruns" is a fundamentally different kind of partner than one who hands over a PDF at the end of schematic design.

The contractor that maintains data continuity between design and construction doesn't lose intelligence in translation.

The owner who treats their buildings as a data asset, not just a capital asset, gets better outcomes on every future project.

At a recent Suffolk Technologies event, the idea of data rights associated with a built asset was raised. Not just water or air rights, but the idea that you might sell the design and operating data associated with the building to the next owner so they could make better decisions during their time of stewardship.

This is not a technology story. The technology is ready, or nearly so. It's a strategy story. And it starts with a simple question: Are you capturing the data you're generating today in a way that will be queryable in five years?

The intelligence layer doesn’t build itself

The real unlock isn't a new platform, it's a new mental model.

AEC has spent a generation optimizing the project. The next competitive frontier is optimizing across projects. Building the feedback loop the industry never built, so that every building you deliver makes the next one better.

That starts with accessible, normalized design data. Not trapped in project folders. Not locked in proprietary containers. Not dependent on someone manually opening 172 models. Available, structured, and connected, so the intelligence that exists in your body of work can actually do the work it's capable of.

The firms that get there first won't just be more efficient. They'll be better. And they'll be able to prove it.

Virginia Senf

Growth Lead