Why we still scan with Matterport in 2026
Photogrammetry got cheaper. Gaussian splats got everywhere. We kept reaching for the same Pro2 we've used since 2019 — here's the unromantic reason.
The 3D capture space has exploded since 2023. Polycam and Luma made photogrammetry phone-pocketable. Gaussian splats produced renders that read more like memory than geometry. Apple's RoomPlan landed in iOS. The whole field was supposed to make Matterport feel like a fax machine.
It didn't. Not for the work we do. And the reason is unromantic.
Determinism beats wow
A virtual tour for a Porsche dealer doesn't need to wow the engineer who scanned it. It needs to load on a Samsung from 2021, embed inside an iframe nobody chose, get indexed by Google, and read the same in three years as it does today.
Matterport's stack does that. The output is a known quantity. The hosting is a known quantity. The embed is a known quantity. The model is owned, indexed, citable, and won't disappear when a startup pivots.
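That "known quantity" embed really is just a plain iframe pointing at Matterport's hosted viewer. A minimal sketch for orientation, not a spec: the model ID is a placeholder, and the attributes shown are the commonly used ones, not an exhaustive list.

```html
<!-- Minimal Matterport embed. "YOUR_MODEL_ID" is a placeholder. -->
<iframe
  src="https://my.matterport.com/show/?m=YOUR_MODEL_ID"
  width="100%"
  height="480"
  frameborder="0"
  allow="xr-spatial-tracking"
  allowfullscreen>
</iframe>
```

The page owns the layout; the tour itself stays on Matterport's hosting, which is exactly the deterministic part of the stack the argument above is about.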
The competitive workflow
Yes, photogrammetry can produce higher-resolution geometry. Yes, splats can render lighting that Matterport flatly cannot. We use both. But not for the same brief.
For a museum hall that needs to be archive-grade and citable, Matterport wins. For a single hero object that needs to be viewed at film-set fidelity, splats win. For a marketing render of a space that doesn't exist yet, photogrammetry of mood + Unreal beats both.
The mistake is choosing a tool by what's new instead of what the brief actually needs.
What we're watching
Matterport's Genesis camera is a step-change in geometry quality. Niantic's Scaniverse is making smartphone splats actually usable. The first product to combine deterministic hosting + splat-quality rendering will be a quiet revolution. We'll switch the day someone ships that.
Until then: the unromantic answer is that boring tools that ship the brief beat exciting tools that don't. We're paid to ship the brief.