SnapTank: Democratizing 3D Scans

We’ve talked with the team at SnapTank about modern 3D scanning and how it can be used in your game production.

James Hanline: At SnapTank, we’re a diverse team made up of people from all over the world, from the UK and Europe to New Zealand. We have backgrounds in 3D scanning, film production, and the game industry. We’re a passionate bunch who believe in making photorealistic 3D models accessible and affordable for the community, so we have created an online platform just for realistic 3D-scanned assets.

SnapTank is an online marketplace dedicated to the exceptional 3D scans out there and to giving exposure to the capture artists who create them. Anybody with the ability to scan a real-world subject and turn it into an accurate 3D model can upload it to the website. We have created a space dedicated to what they do, so 3D scans of anything and everything can be found easily.

The uses for these scans are endless: artistic reference, base meshes, background 3D assets, architectural visualization, virtual reality spaces, rendered scenes, and realistic game assets. With most of our scans commercially licensed, anybody can use them.


Working on Zombie Scans

James Hanline: We wanted these zombie scans to be usable as 3D reference as well as photorealistic models. With zombies, the costumes and makeup had to be bang on, and we needed talented actors to play the part. Fortunately, we had the help of Robbie Drake, a special effects makeup artist with over 15 years of experience. He designed the makeup around the anatomy of each actor, so each zombie’s appearance, from clothes to sores, was unique.

The concept of scanning the zombies was pretty straightforward, but the steps we have to take once that information has been collected are another matter.

The key feature of the scanning process was working with the photogrammetry experts at Clear Angle Studios. They operate a rig made up of 150 DSLR cameras, each aligned to a particular section of the subject and programmed to trigger at the same moment. The separate images are then aligned in highly efficient software such as Agisoft PhotoScan, RealityCapture, or Autodesk ReMake. Like a puzzle, each piece is fitted together in 3D space, creating the model you recognize as a terrifying zombie.



James Hanline: Scans generally come out as a triangulated mesh in either FBX or OBJ format, with a UV map automatically generated by the scanning software.

Depending on the quality and detail captured in the scanning process, not much cleanup is needed. However, optimized topology may be required for rigging and animation, in which case retopologizing is necessary. This is where tools like TopoGun or Wrap3 can really speed up the process; ZRemesher and DynaMesh in ZBrush are also useful. It all depends on the quality level you are trying to achieve.

If you have a scan in RAW form, more cleanup will be needed to achieve a polished result: correcting floating artifacts, filling holes and issues in the mesh created by occlusions, and fixing blurred textures.

Unwrapping the UVs is almost always needed.
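Before starting that cleanup, it helps to know what the scanning software actually exported. As a quick sanity check, you can count an OBJ scan's vertices and faces and confirm that UV coordinates were written. This is a minimal stdlib sketch, not part of SnapTank's pipeline; the sample mesh is a toy example:

```python
# Minimal OBJ inspector: counts vertices (v), texture coordinates (vt),
# and faces (f), and reports whether the scan shipped with UVs at all.
def inspect_obj(lines):
    counts = {"v": 0, "vt": 0, "f": 0}
    for line in lines:
        tag = line.split(maxsplit=1)[0] if line.strip() else ""
        if tag in counts:
            counts[tag] += 1
    counts["has_uvs"] = counts["vt"] > 0
    return counts

# Tiny example: a single UV-mapped triangle (a real scan has millions of faces).
sample = """v 0 0 0
v 1 0 0
v 0 1 0
vt 0 0
vt 1 0
vt 0 1
f 1/1 2/2 3/3
""".splitlines()

print(inspect_obj(sample))  # {'v': 3, 'vt': 3, 'f': 1, 'has_uvs': True}
```

A scan with `has_uvs` set to `False`, or with a face count far beyond your engine budget, is a candidate for the retopology and unwrapping steps described above.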


This is partly our aim. We want to save time for artists who want the benefits of the realism 3D scans provide but don’t want to go through this process.


James Hanline: With a photogrammetry scan, you need to get the lighting perfectly right at the point of capture. Otherwise, you might as well not have the texture at all and create it from scratch in something like Substance Painter or Mari. This means that if you're shooting outside, shoot on a flat-lit, overcast day with a polarizing filter to hand. This may not work in all cases, but it minimizes the retexturing needed later in the pipeline.

Shooting in a studio, you can use cross-polarization to remove all traces of specular reflection, giving you the best textures for later use. This means no de-lighting is needed and you get a clean diffuse/albedo map that you can feed straight into your renderer. This is one of the benefits you can see in the zombie scans we produced.
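The idea behind cross-polarization can be sketched numerically: the cross-polarized capture is (approximately) pure diffuse, so subtracting it from a parallel-polarized capture isolates the specular contribution. A toy per-pixel sketch with made-up intensity values, not actual capture data:

```python
# Toy model of cross-polarization: the cross-polarized capture is roughly
# the diffuse/albedo signal, so specular can be estimated as its difference
# from a parallel-polarized (or unpolarized) capture of the same subject.
def estimate_specular(parallel, cross):
    # Clamp at 0: sensor noise can make the difference slightly negative.
    return [round(max(p - c, 0.0), 3) for p, c in zip(parallel, cross)]

# Hypothetical pixel intensities in [0, 1] for a short image row.
parallel_px = [0.80, 0.95, 0.40, 0.33]  # diffuse + specular highlights
cross_px    = [0.60, 0.58, 0.40, 0.35]  # diffuse only (usable as albedo)

specular_px = estimate_specular(parallel_px, cross_px)
print(specular_px)  # [0.2, 0.37, 0.0, 0.0]
```

In practice the cross-polarized image itself is what feeds the renderer as the albedo map; the specular residual mainly confirms how much reflection the polarizers removed.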


Miriam-Sandra Sarbu: For efficient optimization of materials and shaders, one would generally use the same software used when initially creating materials from scratch, namely Substance Painter and Substance Designer. But usually, most game studios want to create their own materials and shaders according to their already established stylistic vision.

How time-consuming is the whole process?

James Hanline: That is a little like asking how long a piece of string is. It depends on quite a few things: the size, shape, and location of the subject being scanned, as well as camera settings, lenses, lighting, image quality, and the software used.

We generally work with the more professional photogrammetry software; for the zombie scans, Agisoft PhotoScan Pro was used. If you’re 3D scanning a rock and processing it in PhotoScan, it could take anywhere between 1 and 10 hours, depending on the number of photos taken, the quality setting you choose, the actual quality of the images, the lighting and focus, and the number of images the software manages to align.

Then there’s whether you want it UV’d and retopologized, with specular, displacement, normal, albedo, or cavity maps. It all comes down to each person’s workflow and how badly they need a realistic 3D rock!


But if you’re looking to create a highly detailed scan of a sculpture the size of a person, then including the scanning and processing time to create a RAW scan, you could be looking at 16 hours on a standard workstation, about 4 to 6 hours of which would be man-hours.

There are many different scenarios. However, we can count on the fact that the capture artists we have signed on work hard, so you don’t have to.

Scanning Environments

James Hanline: Well, environments can also be scanned using photogrammetry. With a drone and a camera on a flat-lit, overcast day, you can capture almost any structure or environment. It comes down to getting the right images to piece together in the software later and create that 3D model.

Other methods, like LiDAR systems, have traditionally been used to capture large environments. LiDAR is fast and efficient and doesn’t take as long as shooting 500 images of a church, for example. But photogrammetry drone users are getting better and better, optimizing their workflows to create some incredible scans of architecture.

The main difference with scanning a structure or environment is that it’s not likely to move. Scanning a person, you need a photogrammetry rig because people move. If the subject moves in 3D space while you’re walking around taking images, the software will have trouble aligning the cameras when you come to process them. A static object or structure doesn’t have this problem, so you can take as many photos as you like (within reason!).

Scans For Games

Miriam-Sandra Sarbu: All scans potentially work in games, as all scans can be made game-ready. The most important things to remember are polycount, optimized topology flow, and efficient UV mapping.

James Hanline: You can get basically any scan into a game engine these days. Poly budgets have been getting much higher, allowing for better quality, which is perfectly suited to 3D scans. But the old method of low-poly topology plus normal maps still plays a part.
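To make the polycount point concrete, here is a hypothetical sketch of planning decimation targets for a raw scan: each level of detail (LOD) halves the triangle count until an in-game budget is met. All numbers are illustrative, not SnapTank's or any engine's actual figures:

```python
# Hypothetical LOD ladder: halve the scanned mesh's triangle count per
# level until it fits a target in-game budget (numbers are illustrative).
def lod_ladder(scan_tris, budget_tris):
    levels = [scan_tris]
    while levels[-1] > budget_tris:
        levels.append(levels[-1] // 2)
    return levels

# A raw scan might land around 2M triangles; say the in-game budget is 60k.
print(lod_ladder(2_000_000, 60_000))
# [2000000, 1000000, 500000, 250000, 125000, 62500, 31250]
```

The high-poly scan at the top of the ladder is what the normal map gets baked from, so the detail lost in decimation still reads on the final low-poly asset.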

A key thing is what you want the scan to do in your game. Is it a static background object or a moving part? Scans work great as static objects: they’re from the real world and look realistic with little effort from the 3D artist.

But they’re also great as moving, animated models, particularly when married with motion-capture data sets, as long as the scan has been retopologized and rigged, which, as mentioned earlier, can be done pretty quickly these days with tools like TopoGun and Wrap3.

3D scans are increasingly being used by AAA studios. DICE’s use of them in the Frostbite engine for Star Wars Battlefront and Battlefield 1 shows how effective they can be.

Our goal is to make this accessible to all studios, powered by a community of capture artists passionate about and driven by what they do, creating the best 3D scans out there.

James Hanline, Managing Director at SnapTank Ltd

Miriam-Sandra Sarbu, 3D Character Modeler

Interview conducted by Kirill Tokarev
