Generating The Universe in Elite: Dangerous

Doc Ross from Frontier Developments talked about the way he helped to create the amazing universe of Elite.

Introduction 

Hello! I’m Doc Ross, current lead render programmer on Elite: Dangerous via a rather convoluted route. I used to be a particle physics researcher at Lancaster University (UK), attached to the DZero Experiment at Fermilab in the United States. My focus was looking at differences between matter and anti-matter in sub-atomic interactions. Five years ago I made the jump to games programming with Frontier Developments, where I cut my teeth on some gameplay functionality for Elite: Dangerous before helping create the StellarForge system, which simulates the galactic data forming the environments you play within. This crossed over with graphics programming to display all this data on the screen, which eventually led to my current role.

Generating Planets

Hierarchical data is useful here: the smallest details on planets are informed by the results of planet-scale information generation, which is informed by star-system-scale information, which is informed by galactic information. Everything is simulated top-down, so parent data is always available when generating sub-objects – for example, having the average star-system values on hand before we create planets within the system. A body only stores information relevant to itself, but it keeps a link to its parent and access to its siblings if required.
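
As an illustration, a minimal C++ sketch of this kind of parent-linked hierarchy might look like the following; the type and field names are hypothetical, not StellarForge’s actual code:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical body node: stores only its own data, plus a link to the
// parent that was generated before it.
struct BodyNode {
    std::uint64_t id = 0;            // unique identifier (see later section)
    BodyNode* parent = nullptr;      // generated top-down, so set before children
    std::vector<BodyNode*> children; // sub-objects (planets, moons, ...)

    double massKg = 0.0;             // example of body-local data
    double ageYears = 0.0;

    // Siblings are reached through the parent when needed (non-root bodies only).
    const std::vector<BodyNode*>& siblings() const { return parent->children; }
};
```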

We have galaxy-scale material and age distribution functions and the mass of the Milky Way, which together tell us how much “stuff” is in a given volume of space (a sector) and approximately how long it has been there. The top-down view of the distribution is one of the only non-programmatic resources used in StellarForge, and it ensures the spiral arms and galactic bulge form the correct shape.

This allows us to simulate star systems whose primary star has the correct type and age. The leftover materials designated to that star system are used to generate the parameters of a proto-planetary disc, providing distributions of elements that can be accumulated over the epochs the simulation steps through during generation. Things like solar winds blowing away materials, catastrophic events, tidal locking, and gravitational heating can all be tracked on a per-proto-planet basis.
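
As a rough, hedged illustration of stepping per-proto-planet values through epochs, consider the following sketch; the ProtoPlanet fields and the toy decay model are assumptions made purely for demonstration:

```cpp
#include <vector>

// Illustrative per-proto-planet record accumulated over epochs.
struct ProtoPlanet {
    double mass;         // accumulated material
    double gasFraction;  // trapped gases / proto-atmosphere
    double tidalHeating; // heating from tidal effects
};

void simulateDisc(std::vector<ProtoPlanet>& disc, int epochs,
                  double solarWindStrength) {
    for (int e = 0; e < epochs; ++e) {
        for (ProtoPlanet& p : disc) {
            // Solar wind strips gases more easily from low-mass bodies
            // (toy model, not the real simulation).
            p.gasFraction *= 1.0 - solarWindStrength / (1.0 + p.mass);
            // Tidal heating slowly accumulates over time.
            p.tidalHeating += 0.001 * p.mass;
        }
    }
}
```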

This produces a simulated planet which knows its rough classification based on how much mass it has, the types of materials it is made from, its volcanic parameters, the temperature differential of its crust, mantle, and core, its trapped gases/atmosphere, and the heating from tidal effects and direct radiation from the parent stars.

These parameters can be used to infer ranges for the parameters which control geological feature formation, such as current and past tectonic activity and volcanism. These, in turn, influence the average properties of things like basins, undulating terrain, mountains, fissures, craters, perma-ice, and snow from nearby ring systems.

These average properties, which are planet-specific, are packed into a buffer which is sent to the GPU when calculating the surface of the planet. They modulate noise functions which are combined to form geological shapes.
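
Conceptually, such a buffer might resemble the following sketch; the field names are invented for illustration, and the real layout is Frontier’s own:

```cpp
// Hypothetical per-planet parameter block, packed for GPU upload as a
// constant/uniform buffer. The shader uses these values to modulate the
// amplitudes and frequencies of its noise functions.
struct alignas(16) PlanetSurfaceParams {
    float craterDensity;
    float mountainHeightScale;
    float fissureFrequency;
    float basinDepthScale;
    float tectonicActivity;
    float volcanism;
    float permaIceCoverage;
    float padding; // keep the struct a multiple of 16 bytes
};
```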

All of these values are simulated according to calculations of their plausible ranges. The exceptions come at a star-system level, where information from the Hipparcos and Gliese stellar catalogs are used to seed our generated Milky Way with real stars.

Details

We currently have two types of planets rendering in Elite: the ones which shipped with the initial release and the ones with “landable” surfaces which came as part of the Horizons update. The geometry of the former planets is more straightforward as they are perfect spheres; their sense of height difference is provided by the normal mapping of their surfaces. 

The “landable” surfaces start as a cube whose square faces sub-divide and behave as quadtrees. These faces are uniform tri-meshes which, as you get closer to one of them, further sub-divide into four sub-patches with more closely spaced points. The points are used to form triangle sheets, and the patches continue to subdivide depending on your distance from the surface, ensuring that the vertex density is higher where it is most useful, down to a target final resolution.
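
A minimal sketch of this distance-driven subdivision for one cube face might look like this; the split threshold and structure are illustrative, not the shipping implementation:

```cpp
#include <array>
#include <cmath>
#include <memory>

struct Patch {
    double centre[3]; // patch centre on the cube face
    double size;      // edge length of the patch
    int depth;        // subdivision level
    std::array<std::unique_ptr<Patch>, 4> children;
};

void update(Patch& p, const double camera[3], int maxDepth) {
    const double dx = camera[0] - p.centre[0];
    const double dy = camera[1] - p.centre[1];
    const double dz = camera[2] - p.centre[2];
    const double dist = std::sqrt(dx * dx + dy * dy + dz * dz);

    // Split while the viewer is close relative to the patch size.
    const bool wantChildren = dist < p.size * 2.0 && p.depth < maxDepth;
    if (wantChildren && !p.children[0]) {
        for (int i = 0; i < 4; ++i) {
            auto c = std::make_unique<Patch>();
            c->size = p.size * 0.5;
            c->depth = p.depth + 1;
            // Offset each child into its quadrant (the face's local axes
            // would be used in practice).
            c->centre[0] = p.centre[0] + ((i & 1) ? 0.25 : -0.25) * p.size;
            c->centre[1] = p.centre[1] + ((i & 2) ? 0.25 : -0.25) * p.size;
            c->centre[2] = p.centre[2];
            p.children[i] = std::move(c);
        }
    } else if (!wantChildren) {
        for (auto& c : p.children) c.reset(); // merge back when far away
    }
    if (p.children[0])
        for (auto& c : p.children) update(*c, camera, maxDepth);
}
```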

The vertices in these patches are uniformly spaced because we not only have to generate patches to render, but potentially separate patches to form physics meshes. We have a very efficient pipeline for requesting and generating these uniform patches to ensure that there is always a physics patch available for any player or non-player entity that requires it. 

Planets tend to be a little curvier than cubes, though. Using compute shaders on your GPU, the patches undergo spherification via a mathematical function and are subjected to noise graphs. These are collections of noise equations which take the point’s position and the planet’s unique ID as inputs, along with the astronomical data mentioned above, to form geological feature shapes which can span the entire planet surface if needed.
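
The spherification step itself can be as simple as normalizing the cube-space point and displacing along the resulting normal. A CPU-side sketch, with noiseGraph standing in for the composed noise functions:

```cpp
#include <cmath>
#include <cstdint>

struct Vec3 { double x, y, z; };

Vec3 spherify(Vec3 cubePoint, double radius,
              double (*noiseGraph)(Vec3, std::uint64_t),
              std::uint64_t planetId) {
    // Project the cube point onto the unit sphere.
    const double len = std::sqrt(cubePoint.x * cubePoint.x +
                                 cubePoint.y * cubePoint.y +
                                 cubePoint.z * cubePoint.z);
    const Vec3 n = { cubePoint.x / len, cubePoint.y / len, cubePoint.z / len };

    // Displace along the normal by the geological height from the noise graph.
    const double r = radius + noiseGraph(n, planetId);
    return { n.x * r, n.y * r, n.z * r };
}
```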

In terms of scope and scale, though, planet-spanning noise across accurately sized planets is a challenge. To ensure consistent visuals and gameplay on the screen and between users, we need millimeter precision, but the input values for the noise functions depend on the point on the surface of the world relative to the planet’s center, which can be on the scale of tens of billions of millimeters. We could use 32-bit (i.e. single-float) values if the scales involved were hundreds of km instead of tens of thousands of km. To tackle this, we have written alternate libraries to create the functions in 64-bit (i.e. double) precision and in dual-float precision. The former uses native 64-bit floating-point numbers, and the latter emulates 64-bit functionality using two 32-bit floats. The reason for both is that some GPUs handle one better than the other, or not at all, and we need good graphics card coverage. This comfortably gives us the number-space required for millimeter precision.
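
Dual-float (sometimes called double-float or df64) arithmetic is a well-documented technique: a value is stored as the unevaluated sum of two 32-bit floats. A minimal sketch of its addition operation, built on Knuth’s two-sum, is shown below; this illustrates the general approach, not Frontier’s specific library:

```cpp
// Requires strict IEEE float semantics (i.e. compile without fast-math),
// otherwise the error terms get optimized away.
struct df64 { float hi, lo; };

// Exact sum of two floats: s + err == a + b (Knuth's TwoSum).
static void twoSum(float a, float b, float& s, float& err) {
    s = a + b;
    const float bv = s - a;
    err = (a - (s - bv)) + (b - bv);
}

df64 add(df64 a, df64 b) {
    float s, e;
    twoSum(a.hi, b.hi, s, e); // add the high parts, capturing the error
    e += a.lo + b.lo;         // fold in the low parts
    df64 r;
    twoSum(s, e, r.hi, r.lo); // renormalize into (hi, lo)
    return r;
}
```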

Painting Planets

When Horizons originally launched, we used entirely computer-simulated planets: rock and dust surfaces were described mathematically using pseudo-random number generation. In a later update, artist-generated textures were introduced for rock, sand, and scree per material on a planet, blended depending on the gradient of the terrain. Wang tiling was used to break up the repetition of the textures, and tri-planar blending is utilized to make sure the textures do not stretch across the curved 3D surface of the planet. For most hardware, using blended artist textures driven by the simulated masks based on the terrain performed fastest and looked the best.
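
Tri-planar blending is a standard technique: the texture is sampled along each world axis and the three samples are blended by the surface normal. A simplified sketch, with sampleTexture standing in for the engine’s actual texture fetch:

```cpp
#include <cmath>

struct Color3 { float r, g, b; };
struct Vec3   { float x, y, z; };

Color3 triplanar(Vec3 worldPos, Vec3 normal,
                 Color3 (*sampleTexture)(float u, float v)) {
    // Blend weights from the absolute normal, normalized to sum to one.
    float wx = std::fabs(normal.x);
    float wy = std::fabs(normal.y);
    float wz = std::fabs(normal.z);
    const float sum = wx + wy + wz;
    wx /= sum; wy /= sum; wz /= sum;

    // One planar projection per axis; each looks undistorted on surfaces
    // facing that axis, which is what prevents stretching.
    const Color3 cx = sampleTexture(worldPos.y, worldPos.z);
    const Color3 cy = sampleTexture(worldPos.x, worldPos.z);
    const Color3 cz = sampleTexture(worldPos.x, worldPos.y);

    return { cx.r * wx + cy.r * wy + cz.r * wz,
             cx.g * wx + cy.g * wy + cz.g * wz,
             cx.b * wx + cy.b * wy + cz.b * wz };
}
```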

A collection of materials is assigned to each planet, chosen depending on the planet’s physical properties. The materials blend into each other, modulated by the type of geology for the planet being rendered. These material collections and their blending masks have undergone an overhaul for rocky planets in our Beyond – Chapter One update.

Above a certain LOD level of the planet surface patches, flat geometry is used with generated textures for the look, normals, and height of the terrain. This provides additional surface detail from orbit; however, it can leave the surfaces looking flatter than they should. To tackle this, each patch also generates lighting information used by it and its neighbors: the shapes of the shadows that would be cast across the surface, and the regions that would appear brighter due to sub-surface scattering, are provided through a bespoke patch generation system. Once the player is close enough to the surface, traditional shadowing techniques take over.

Managing the Size 

The secret is a development team with a lot of dedication to a game series they love and a strong sense of collaborative spirit. I’ve been working on Elite: Dangerous since April 2013, others have been on board since the even earlier days of the Kickstarter campaign, and we’re still here expanding what StellarForge and the game as a whole can offer.

Hierarchies of data and unique identifiers help keep things organized under the hood. Due to the procedural nature of the game, every object needs a unique identifier so that each client and the server know they are talking about the exact same object. A single 64-bit integer can store the x, y, z coordinates of a sector of space, the sector layer (sectors come as part of an eight-layer octree), the ID of the star system within the sector, and the ID of the body within the star system. This is the result of careful tuning of the sizes of the sectors and the number of things each one can generate.
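
One possible packing is sketched below; the exact bit budget here is a guess for illustration, since the article only states that all of these fields fit into 64 bits after careful tuning:

```cpp
#include <cstdint>

// Hypothetical layout: 13 bits per sector axis, 3 bits for the eight
// octree layers, 16 bits for the system ID, 6 bits for the body ID.
std::uint64_t packId(std::uint64_t sectorX, std::uint64_t sectorY,
                     std::uint64_t sectorZ, std::uint64_t layer,
                     std::uint64_t systemId, std::uint64_t bodyId) {
    return  (sectorX  & 0x1FFF)
         | ((sectorY  & 0x1FFF) << 13)
         | ((sectorZ  & 0x1FFF) << 26)
         | ((layer    & 0x7)    << 39)
         | ((systemId & 0xFFFF) << 42)
         | ((bodyId   & 0x3F)   << 58);
}
```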

This helps server and client know which astronomical location the player finds themselves in, what may be nearby, and which other players share this space.

Prioritised resource streaming systems mean that assets such as models and textures are only loaded in when the scene requires them, when there is memory available, and when they are more important to the user than other potential items.
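
A minimal sketch of such a prioritised loader, under the assumption of a simple importance score and memory budget (both invented for illustration):

```cpp
#include <cstddef>
#include <queue>

struct AssetRequest {
    int assetId;
    float importance; // e.g. driven by distance to the player and relevance
    std::size_t sizeBytes;
    bool operator<(const AssetRequest& o) const {
        return importance < o.importance; // max-heap on importance
    }
};

void streamAssets(std::priority_queue<AssetRequest>& pending,
                  std::size_t& memoryUsed, std::size_t memoryBudget) {
    while (!pending.empty()) {
        const AssetRequest next = pending.top();
        if (memoryUsed + next.sizeBytes > memoryBudget)
            break;                   // wait until memory frees up
        pending.pop();
        memoryUsed += next.sizeBytes;
        // loadAsset(next.assetId); // actual asynchronous load kicked off here
    }
}
```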

We have a suite of useful development and debugging tools as part of our in-house Cobra engine, which helps artists, designers, and programmers organize many disparate elements into a larger whole. 

I’d like to emphasize, however, that our job is not done yet. There is still a lot of galactic content, and there are many mysteries, being brought to life in Elite: Dangerous!

Doc Ross, Lead Render Programmer on Elite: Dangerous

Interview conducted by Kirill Tokarev

