The Homestead has essentially been a three-year R&D project that began as a simple trial-and-error passion project: a love of encapsulating environments through a single-pass photogrammetry philosophy.
Single-pass photogrammetry is what we’ve coined as the art of capturing and processing an entire environment in one full sweep. This brings a number of challenges due to the sheer size of the data acquisition, and the scale of point-cloud data that has to be processed, baked, and repaired. The Homestead itself, not the artwork, was approximately 4Bn points of color / normal / displacement map detail. Not large by our standards of today, though back in 2017 this was an extensive challenge. How do you bake this many points down in a single go without running out of memory? We fragment the high-to-low baking task across multiple system nodes via a distributed network, then recombine on completion. Simple in theory, but in practice we discovered that dealing with this much data over hundreds of 8k tiles took much trial and error; consistency was key.
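The fragment-and-recombine idea can be sketched as follows. This is a minimal illustration, not the actual pipeline: a thread pool stands in for the distributed network nodes, `bake_tile` stands in for a real high-to-low bake of one 8k tile, and the tile/file names are made up.

```python
# Sketch: split the bake into independent tiles, fan them out to workers,
# recombine by tile index on completion. A thread pool stands in here for
# the distributed system nodes described in the text.
from concurrent.futures import ThreadPoolExecutor

def bake_tile(tile_index):
    """Stand-in for a high-to-low bake of one 8k tile on a node.

    A real bake would ray-cast the high-poly point cloud against the
    low-poly UV layout and write color/normal/displacement maps.
    """
    return tile_index, f"tile_{tile_index:04d}_8k.exr"

def bake_environment(tile_count, workers=4):
    """Fan the tiles out across workers, then recombine in tile order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = dict(pool.map(bake_tile, range(tile_count)))
    # Recombination: reassemble the tile set in a deterministic order so
    # the hundreds of tiles line up consistently across runs.
    return [results[i] for i in range(tile_count)]
```

The key property is that each tile bake is independent, so the only shared step is the final deterministic recombine.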
The imperfections of photogrammetry
Doing it in post is a thing. Once we had our bakes, approximately 100 8k tiles of texture data, we needed to remove unwanted artifacts. More importantly, we had to remove the artwork from the walls, as the work that sits in the current experience is not part of the original scene. Technically speaking, we worked out how to apply something not too dissimilar to Photoshop’s Content-Aware Fill, though across multiple 8k textures with the sweep of a Wacom lasso. Quite thrilling, removing gigabytes of data and replacing it with contextually aware content in a single click.
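A toy version of the masked-fill plumbing looks like this. To be clear, this is not content-aware fill (which is patch-based and far smarter) and not the method used on the project; it only shows the general shape of the task, filling a masked hole inwards from its border using known neighbouring pixels.

```python
# Toy mask-driven fill: pixels under the mask (the removed artwork) are
# repeatedly replaced with the average of their already-known neighbours
# until the hole closes. Crude diffusion, not Content-Aware Fill.

def fill_masked(image, mask):
    """image: 2D list of floats; mask: 2D list of bools (True = hole)."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    hole = {(y, x) for y in range(h) for x in range(w) if mask[y][x]}
    while hole:
        filled = set()
        for y, x in hole:
            # Known neighbours: in bounds and not part of the hole.
            neigh = [out[ny][nx]
                     for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                     if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in hole]
            if neigh:  # fill from the hole's border inwards
                out[y][x] = sum(neigh) / len(neigh)
                filled.add((y, x))
        if not filled:  # nothing known to fill from; give up
            break
        hole -= filled
    return out
```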
Each painting in the scene is approximately 1.5Bn points of detail. Even more important for the paintings, though, was retaining roughness variation faithful to the original works. This technique, at the time, used a cross-polarization / ring-flash setup to obtain accurate diffuse capture in addition to specular capture, which was later converted to a moderately accurate form of roughness via internal methods, giving a close-to-true PBR workflow.
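The core of the cross-polarization trick can be shown per pixel: the cross-polarized shot suppresses specular reflection and leaves diffuse, so subtracting it from an unpolarized shot of the same point isolates the specular contribution. The roughness mapping below is a deliberately simplistic placeholder; the article’s actual conversion is an internal method it does not describe.

```python
# Per-pixel sketch of cross-polarized capture separation. The
# specular-to-roughness mapping is a placeholder assumption, NOT the
# project's internal method.

def separate_specular(unpolarized, cross_polarized):
    """Specular intensity = what the polarizer removed (clamped at 0)."""
    return max(unpolarized - cross_polarized, 0.0)

def roughness_estimate(specular):
    """Placeholder mapping: strong speculars read as smooth (low
    roughness), weak speculars as rough. Clamped to [0, 1]."""
    return min(max(1.0 - specular, 0.0), 1.0)
```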
The detail was the least of our issues. Thankfully, Granite by Graphine catered for this via megatextures / texture streaming. Our self-inflicted issue was the extensive use of dynamic lighting, planar reflections, volumetric fog, advanced water shaders, and heavy draw-calls for geometry.
LOD techniques were our friend, though for a single-pass environment they were extremely challenging to apply. Tackling the seams of a scene divided into hundreds of sections took some out-of-the-box thinking. At any one time, we’re only rendering around 400k polys on screen. It’s a bit like adaptive tessellation with on-the-fly filling of seams.
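A budget-driven LOD pass like the one described can be sketched as a greedy refine: every tile starts at its coarsest level, then the tiles nearest the camera are refined one level at a time while the polygon budget holds. The per-LOD poly counts here are illustrative, and the seam filling between neighbouring LOD levels is a separate pass not shown.

```python
# Sketch: distance-driven LOD selection under a fixed on-screen polygon
# budget. LOD_POLYS values are illustrative, not the project's numbers.

LOD_POLYS = [1000, 4000, 16000]  # coarse -> fine, polys per tile

def assign_lods(tile_distances, budget=400_000):
    """Return (LOD index per tile, total polys), refining nearest first."""
    lods = [0] * len(tile_distances)              # everyone starts coarse
    total = LOD_POLYS[0] * len(tile_distances)
    order = sorted(range(len(tile_distances)),
                   key=lambda i: tile_distances[i])
    for lod in range(1, len(LOD_POLYS)):          # refine one level at a time
        for i in order:
            step = LOD_POLYS[lod] - LOD_POLYS[lod - 1]
            if lods[i] == lod - 1 and total + step <= budget:
                lods[i] = lod
                total += step
    return lods, total
```

With a 100k budget and ten tiles, the nearest tiles end up fully refined and the farthest stay one level down, with the total never exceeding the cap.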
It’s a bit of a thing for me, this mad obsession with wanting all the bells and whistles. To this day we’ve seen very few other photogrammetry environments that insist on this level of shine and shimmer; everything else just looks flat. It really is the attention to detail. Simple things like making the lights hum at 60Hz to simulate electrical light flicker. The use of 3D spatial audio with impulse responses for secondary bounce reflections. Dynamic weather introducing subtle environmental changes. Clever reflection capture to simulate raycast reflections.
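The 60Hz light hum is the simplest of these to show: a light’s intensity oscillating subtly around its base level at mains frequency. The amplitude here is an assumed parameter, not the project’s actual shader value.

```python
# Toy 60 Hz light flicker: base intensity plus a small sinusoid at mains
# frequency. Amplitude is an illustrative assumption.
import math

def light_intensity(t, base=1.0, amplitude=0.05, hz=60.0):
    """Light intensity at time t (seconds) with subtle sinusoidal flicker."""
    return base + amplitude * math.sin(2.0 * math.pi * hz * t)
```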
The project has a huge amount of legacy techniques in there. As stated, this has been in varying stages of POC for well over two years now. If we were to do it all again, it would be many times more detailed and likely half the file size. Our original scan of the environment was flawed; we simply didn’t take enough photos. We could use deep-learning super-sampling to increase the resolution, as we did with our internal build of Nefertari. Though to be honest, maybe it’s best if we start fresh. As they say: garbage in, garbage out.
Living, breathing art gallery
One of the reasons we had to work out how to remove the artwork from the original scan was so we could simply add more artwork over time. This means we can push lightweight updates containing only the content that changes, plus a revised compressed lightmass bake. In addition, the video streams will be actively hot-swapped with revised content. Eventually we’d like to expand the environment itself, as The Homestead is an extensive and beautiful building, worthy of a full revised capture.
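The "push only what changed" idea boils down to a content-hash diff between the shipped manifest and the new build: only assets whose hashes differ (or that are new) go into the update. The manifest structure and asset names below are hypothetical.

```python
# Sketch of a lightweight delta update: hash each asset's content and
# ship only the ones whose hash changed since the last push. Asset names
# and manifest shape are hypothetical.
import hashlib

def content_hash(data):
    return hashlib.sha256(data).hexdigest()

def delta_update(old_manifest, new_assets):
    """old_manifest: {name: hash}; new_assets: {name: bytes}.
    Returns only the assets that changed or are new."""
    delta = {}
    for name, data in new_assets.items():
        if old_manifest.get(name) != content_hash(data):
            delta[name] = data  # changed or newly added asset
    return delta
```

An unchanged asset hashes identically and is skipped, so a new painting plus its revised lightmap travels without re-shipping the whole environment.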