Not so long ago we had a little chat with Simon Barle from DICE, who was a bit frustrated about the possibility of photogrammetry taking over 3D content production for games in the upcoming years. While that's obviously not the case, it's still interesting to learn what photogrammetry does, how it helps, and how it impacts game development. We've talked with Simon Che de Boer from realityvirtual.co and discussed the way you can use this technology.
We are realityvirtual.co, an Auckland, New Zealand based creative technologies research & development collective with an enthusiasm for the visual realm. We emphasize implementing off-the-shelf components & prosumer-grade equipment in our workflow. This, alongside open-source tool-sets, crowd-sourcing principles & modern fabrication techniques, allows us to achieve rapid, cost-effective development. Once a project has reached a satisfactory level of practicality & reliability, we move on to in-the-field applications and, eventually, commercialization. This keeps us fresh!
In a nutshell, we are streamlining VFX & capture technologies, allowing us to deliver presence in many forms, a 'being there' sensation. Truly an absolute godsend for foreign clients, innovative viral video marketing strategies & end-user experiences alike! Our current focus is ultra-realism: capturing a real-world environment with as much realism and presence as is currently achievable via immersive technologies.
Photogrammetry is a huge deal right now; I'd argue it should have been a huge deal a year ago. When I first started toying with it a few years back, it seemed quite simply like magic. I remember seeing a TED Talk about Photosynth back in 2008, not even realising its potential back then. The way I stumbled across it was unique in itself, but that is a very personal story, for another time.
The applications for photogrammetry cannot be overlooked, as the technology allows for 'user experience' development an order of magnitude faster than traditional game asset & environment creation methodologies. There's also the holy grail that artists have attempted to achieve for many years: leaping over the uncanny valley! The uncanny valley has been difficult to cross; it boils down to the fact that the 'perfection of imperfection' is hard to achieve via traditional means. An artist can be the most skilled artist ever; however, when experiencing VR, the mind is not so easily tricked. Our conscious mind may not even be aware of certain details, but our unconscious just knows when something is simply not right. The way rust has formed on a faucet, how leaves, dust and rubbish have accumulated over time, the degradation of brickwork & masonry: these are all subtle cues of a prolonged process of deterioration, something not so easily faked.
Scanning a Scene for VR
For all its apparent complexity, this 'Photogrammetry to VR' teaser is actually a remarkably simple, rushed and crude demonstration of the process (we never intended the fanfare; far better is to come). The whole process for this particular scene, from the initial photography to the VR-ready experience, took less than a day (only an hour if you remove the processing dead-time), semi-automated.
A number of different applications are used, in order: importing / image processing / point-cloud alignment / dense point cloud creation / point cloud noise, particle & outlier removal / decimation / retopology / UV unwrapping / a 1B+ point dense cloud to low-poly bake (albedo, normal, AO, roughness, displacement maps) and eventual importation of assets into UE4 or Unity. We can't list all our specific packages; the process is constantly evolving and took me a good six months of consistent research and adaptation. The learning process was complex, as I honestly didn't come from a game design / 3D / photography background whatsoever. Prior to this adventure I was really just editing videos in Adobe Premiere.
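The noise & outlier removal step in that pipeline is typically a statistical filter: points whose average distance to their nearest neighbours is unusually large are treated as reconstruction noise and culled. The article doesn't name the tool used, so here is a minimal, brute-force sketch of the idea in plain Python (real packages do the same thing with spatial indexing over millions of points):

```python
import math
import statistics

def remove_outliers(points, k=3, std_ratio=1.0):
    """Statistical outlier removal: drop points whose mean distance to
    their k nearest neighbours is more than std_ratio standard deviations
    above the cloud-wide average (brute force, O(n^2) -- a sketch only)."""
    mean_dists = []
    for p in points:
        # Distances to the k nearest other points in the cloud.
        nearest = sorted(math.dist(p, q) for q in points if q is not p)[:k]
        mean_dists.append(sum(nearest) / len(nearest))

    mu = statistics.mean(mean_dists)
    sigma = statistics.pstdev(mean_dists)
    threshold = mu + std_ratio * sigma
    return [p for p, d in zip(points, mean_dists) if d <= threshold]

# A tight cluster plus one stray noise point: the stray point is culled.
cloud = [(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0), (0.1, 0.1, 0), (10, 10, 10)]
clean = remove_outliers(cloud, k=2)
```

The `k` and `std_ratio` knobs trade off how aggressively thin structures (railings, foliage) get eaten along with the noise, which is why this step usually needs per-scene tuning.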
The equipment used was hardly high level: a simple prosumer Canon D600 & a Phantom 2 with a GoPro. In this demo we didn't even use the Phantom. The lighting was very poor in the late afternoon and I was just on my way home; I took a few quick photos, about 200 I believe. Our far more ambitious projects contain about 2000 photos and we are using slightly better equipment, but in all honesty, if you know a thing or two about signal processing, you can get a hell of a lot out of prosumer equipment. The pipeline took a long time to learn, and aspects of this pipeline / workflow were not even discovered or even doable a few months ago. New software introduced by friends of mine in the industry, and early beta access to many of these packages, have allowed us to do what we do. The scene is just one big pile of quads, no segmentation whatsoever.
Lighting is interesting; with this scene in particular, I'd say it is a hybrid. Basically we flatten the highlights / lift the shadows and reintroduce these by mixing emissive baked lighting with dynamic lighting placed in the same position; this dynamic lighting then reintroduces the highlights & shadows. We always try to shoot with overcast, cloudy days in mind. In saying that, however, we have been experimenting with full-blown daylight and its elements to capture scenes with genuine lighting and embrace absolute realism, soon to experiment with HDR.
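The article doesn't give a formula for that flatten-highlights / lift-shadows step, but in its simplest form it is just a levels remap that compresses the captured luminance into a narrower band, leaving a low-contrast base texture whose contrast the engine's dynamic lights later restore. A minimal sketch on 8-bit values (the lift/cut numbers here are illustrative, not the studio's actual settings):

```python
def flatten_pixel(v, shadow_lift=40, highlight_cut=215):
    """Remap an 8-bit luminance value from [0, 255] into
    [shadow_lift, highlight_cut]: pure black is lifted to shadow_lift,
    pure white is pulled down to highlight_cut, midtones scale linearly."""
    return shadow_lift + v * (highlight_cut - shadow_lift) // 255

# Extremes are compressed toward the middle; midtones barely move.
lifted_black = flatten_pixel(0)    # 40
cut_white = flatten_pixel(255)     # 215
midtone = flatten_pixel(128)       # 127
```

Applied per-channel across the baked textures, this keeps the captured lighting as a subtle emissive base while dynamic lights placed at the original sources re-add directional shadows and speculars.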
Basically there are many ways to skin a cat. This scene was overcast; some future scenes will not be, and others are completely de-lit using proprietary techniques, which is easy to do with assets and near impossible to do with environments, but we are finding ways around this.
Using Unreal Engine
UE4 is by far the best engine to work with for photogrammetry, simply because of its cinematic roots, Matinee, its amazing lighting engine and, from personal experience as someone who had prior training, its ease of use. I would love to talk about the number of other packages we used; however, I can mention very few, as many of the applications are currently not on the market, obscure, or under alpha development. Others, however, such as MeshLab, xNormal and 3ds Max, were all used in the pipeline.
Photogrammetry in Game Development
Photogrammetry for game development is awesome, depending on how you want to go about it. We see it as a great tool for immersive environment capture; however, when doing these large-scale environments, you must keep in mind that the objects in any given environment are not going to have much interactivity. For asset capture of smaller objects, mixed with de-lighting and automated retopology / UV mapping, photogrammetry is priceless.