We’ve talked with Billy Lundevall about his experience creating photogrammetry materials for Megascans.
Me after finding an enormous dandelion plant!
Could you give us a little intro on how you got to work on Megascans? How did you get this gig and when did you become interested in photogrammetry?
I started working on Megascans after about two years of various other work for Quixel, which consisted mostly of usable content for the Quixel Suite and promotional art. I have always wanted to do anything and everything as accurately as possible; it’s just my nature as an artist. I have never been able to just drag some sliders around and call it a day after proclaiming “nah, as long as it looks good enough” — it has to be correct. This is where my interest in real-world measurements of material properties comes in. There simply is no better way to make sure a material is accurately represented in 3D than using a setup that can extract the necessary information from the actual object you are trying to recreate. I have been interested in photogrammetry since I first heard of it, but didn’t do any in practice until I started working on Megascans.
How did you work on vegetation scans for Megascans? How did you approach the photo production? How did you clean up the scans, and how did you make sure they were left with no captured lighting? It seems like a very laborious process.
The bulk of the work consists of physically getting the materials to the scanner and setting them up. My scanner is stationary, unlike some of the bigger scanners we use for other things. These machines are proprietary technology, purpose-built by Quixel specifically for Megascans. The asset production more or less takes care of itself through automated processing pipelines for different types of scans. The cleaning, preparation, and tiling of the scans are completely automated and part of the scan data compiler pipeline Teddy (CEO of Quixel) has been building for the past 6 years. Unlike most photogrammetry, these machines capture albedo, alpha, gloss/roughness, displacement, AO, specular, cavity, normal, and translucency all within the same sequence.
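To give a feel for how a multi-map scan output like this might be organized downstream, here is a minimal Python sketch that groups one scan’s exported map files into a material description by filename suffix. The suffix naming convention and the helper itself are assumptions for illustration, not Quixel’s actual pipeline.

```python
# Hypothetical sketch: sort one scan's exported map files into a
# material dictionary keyed by map type. The "<name>_<maptype>.<ext>"
# filename convention is an assumption, not Quixel's real scheme.

MAP_TYPES = ("albedo", "alpha", "gloss", "displacement", "ao",
             "specular", "cavity", "normal", "translucency")

def group_scan_maps(filenames):
    """Return {map_type: filename} for one scanned surface."""
    material = {}
    for name in filenames:
        stem = name.rsplit(".", 1)[0]               # drop the extension
        map_type = stem.rsplit("_", 1)[-1].lower()  # "moss_01_albedo" -> "albedo"
        if map_type in MAP_TYPES:
            material[map_type] = name
    return material

maps = group_scan_maps([
    "moss_01_albedo.exr", "moss_01_normal.exr",
    "moss_01_gloss.exr", "moss_01_notes.txt",
])
```

Unrecognized files (like the notes file above) are simply ignored, so the same helper could run over a whole export folder.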
The initial scan setup varies depending on what it is I’m going to scan, and there are a lot of solutions that have to be built on the spot from whatever I happen to have around. For surfaces like forest floors, gravel, moss, and so on, I have constructed a few trays out of flower boxes; I wasn’t able to find any large enough, so I chopped two of them up and made larger ones through the magic of duct tape. I also raised the bottom to avoid shadows being cast from the sides. I then simply put whatever I need to scan inside this tray, insert it into the scanner, tweak the settings, and press the button.
The essential Megascans toolkit.
Things requiring alphas are a little more elaborate: the objects have to be evenly spaced, with enough room between them to be usable, while still not wasting too much space, and they have to be well placed and centered. In the case of plants, this also has to be done fairly quickly, as most of them die very fast once unrooted or otherwise dissected. I have developed several tricks along the way to do this efficiently, and the quality of the atlases has improved a lot since I first got my hands on the scanner, simply through trial and error.
The scanning process has a tendency to create problems I could never have imagined becoming problems in the first place, and with them, interesting stories.
For example, how does one get rid of all that previously scanned wood? My first idea was to burn it, which I did in my grill. I have now learned that burned, half-rotten pine wood smells very, very bad and produces a thick black smoke the nearby kindergarten teachers weren’t that fond of. Luckily they just thought somebody was cooking some really smelly steaks, and I could scan the ash afterwards!
Disposing of a previous scan subject by way of fire.
Scanning moss was also an adventure. First, I had to walk out to the woods with big bags and physically dig the moss up from the ground, in chunks as it sits, without disturbing it too much, then carry it home to prepare and scan it.
I am fairly certain my neighbours think I’m completely insane and wonder what I could possibly do with that tenth bag of moss I carried home that day. Another hidden aspect of this is how much space it requires and how much filth it carries in with it, not to mention the quantity of spiders.
Classic Swedish fika, Megascans style (I ran out of dedicated moss space inside).
Some dandelion plants I threw in the garbage after scanning decided this wasn’t the day to die and started flowering a day or so later, most likely feeding on the other dead plants in there. Or maybe that ice cream? Who knows.
Decorative gravel was also an interesting story. It comes in these 20 kg bags, and the rocks are covered with a super fine dust, making them look dull and boring. The bag told me that this dust would simply be washed away by the rain, but as far as I know it won’t rain much inside my apartment any time soon. I also had no access to a hose or any durable surface I could temporarily spread the gravel out on to rinse it off. The solution I came up with was to carry the bags to the laundry room, as it had a sink big enough to fit an entire bag: place the bag in the sink, poke it full of holes in the bottom, fill it up with water and let the water escape through the bottom, then repeat that a couple of times.
The next immediate problem was drying time. The surface layer of the gravel dried quite quickly, but the rocks underneath didn’t. And as I need to mix it up to create multiple scans, this was simply way too slow to just wait out. Luckily, I had a hair dryer nearby.
Stories like these are practically endless, but sadly I don’t document things as often as I perhaps should.
Making sure those pebbles don’t leave me without a stunning hair job.
Could you talk about what kinds of content you helped get into the library? What content does the user get (like floral elements, leaves, branches), and how can this content be used? What freedom does the user have with the scans they receive?
I have scanned a variety of different things, mostly focusing on Scandinavian vegetation (4,000 plants and counting, all available in the library). When it comes to plants, I try to make one atlas including every component the plant is made out of, as well as individual scans for larger things like leaves.
Here’s a shot of some of the plants I curated that were included in the last update we released this week. We generally try to release this amount of new scan content every week (yes, it’s a lot of work :D).
Depending on the scan subject, I also do a scan fully zoomed in on one or a couple of leaves to maximize resolution, in case somebody needs a high resolution asset for a close-up render.
Here is one of the leaves, rendered in Toolbag 2. The scanned maps I plugged in for this render were albedo, gloss, specular, normal, displacement, translucency and transparency. This single leaf is 4K.
We also provide 3D-scanned branches and debris, among other things, captured with the same machine. These scans prove very useful for adding a highly realistic level of detail, especially in close-up shots.
What kinds of use cases for Megascans do you see? Where and when can artists benefit the most from the materials they get from the Megascans Library? Can you give some examples?
The initial wave of Megascans content included thousands of natural surfaces, plants, rocks, wooden debris and so on which is perfect for any type of natural environment work, and next up we are unlocking more man made surfaces and archviz scans. The included software package called Megascans Studio is a very versatile way to make any surface scan unique and catered to your specific purpose. Right now people have mostly used it to make a wide range of ground textures, but I personally can’t wait to get my hands on some metals to throw in there.
The service has already been used extensively by companies like ILM, MPC Film, Capcom, Ubisoft, 343 Industries, Bungie and From Software just to name a few. The end user is practically anyone working with 3D.
What do you think are the benefits of producing materials with a photogrammetry approach? How does this help you build better, more realistic-looking materials? How does it help achieve the desired effect faster? I’d really like to feature more of the benefits of using photogrammetry.
The process itself removes much, if not all, of the guesswork associated with material creation. You no longer have to ballpark things like specular or diffusion values — unless you want to, of course, which should be done within reason. The ability to quickly mix things together in Megascans Studio also lets the user create more complex materials from the source data, or turn scans into something completely different if they’re knowledgeable enough. The first time I saw an early beta version of Studio, I knew it would be an environment artist’s dream, even when it had nothing but the core features. The program will also be expanded upon to elevate what it can do even further.
At the end of the day, being outside for a change isn’t bad at all. I grew up in the countryside, and after starting with digital art my exposure to fresh air drastically decreased. I feel a little bit back at my roots when I’m running around in the woods, chopping things down like a madman with an oversized blade, mostly accompanied only by my music and sometimes my dog, on the rare days she’s not too grumpy and/or lazy.
Billy Lundevall, 2D and 3D artist at Quixel and freelancer
Interview conducted by Kirill Tokarev.