@Tristan: I studied computer graphics for five years. I've been making 3D art full-time for about half a year now, but I had some experience before that. It's hard to focus on one thing; it took me half a year to understand most of the vegetation creation pipelines. To speed up your workflow, maybe spend some time with the Megascans library. Making 3D vegetation spans everything from going outside for photoscans to profiling your assets. Start with one thing and master it.

@Maxime: My technique is pretty much the same as Z-passing on distant objects (minus the higher vertex count). I would start using it at about 10-15m+. Within that inner radius you are using (mostly high-resolution) cascaded shadows, and the lower the shader complexity in those areas, the fewer the shader instructions. When I started this project, the polycount was a bit too high. Now I've found the best balance between a "low-poly" mesh and the least possible overdraw. The idea behind the technique is simply to spend a slightly higher vertex count on the mesh to reduce quad overdraw and shader complexity. In terms of visual quality, a "high-poly" plant will always look better than a blade of grass on a plane.
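The vertex-count-versus-overdraw tradeoff described above can be sketched with a back-of-envelope cost model. This is purely illustrative (the function, numbers, and unit costs are assumptions, not measurements from the project): an alpha-tested grass card rasterizes every pixel of its quad, including the transparent ones, while a modeled blade cluster spends more vertices but shades only the pixels it actually covers.

```python
# Illustrative cost model, not a profiler: total GPU work is approximated
# as vertex work plus fragment (pixel shading) work. All numbers below
# are hypothetical.

def shading_cost(vertices, shaded_pixels, vertex_cost=1.0, pixel_cost=1.0):
    """Relative cost of one grass asset: vertex work + fragment work."""
    return vertices * vertex_cost + shaded_pixels * pixel_cost

# A 4-vertex alpha-tested card covering a 10,000-pixel quad still shades
# the ~70% of pixels that the alpha mask discards.
card = shading_cost(vertices=4, shaded_pixels=10_000)

# A 200-vertex modeled blade cluster rasterizes only the ~3,000 pixels
# it actually covers, so it pays for no transparent area.
blades = shading_cost(vertices=200, shaded_pixels=3_000)

print(card, blades)  # the denser mesh wins once fragment work dominates
```

With these (made-up) weights the denser mesh comes out cheaper, which matches the comment's point: spending a few hundred extra vertices can be cheaper than shading thousands of discarded transparent pixels, and the balance shifts with distance, screen coverage, and per-pixel shader cost.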
Is this not like Gear VR or anything else?
The Weather Channel used tech well known to the game dev community in its Thursday broadcast, a first-ever live simulation showing Hurricane Florence's severity before the storm hit land.
The Atlanta-based television network has recently found a way to harness graphics processing to build immersive mixed reality visuals that accompany meteorologists' talks in live broadcasts.
The Weather Channel started working with mixed reality back in 2016 as an initiative to display the severity of conditions with graphically intense simulations using high-performance computing. In June, The Weather Channel began using such experiences for live broadcasts with the help of The Future Group and its own teams of meteorologists and designers. Michael Potts, vice president of design at The Weather Channel, says the team experiments with this tech to find better ways of conveying weather severity.
“Our larger vision is to evolve and transform how The Weather Channel puts on its presentation, to leverage this immersive technology,” he stated.
The Weather Channel uses a simple green screen to tell people about wind speed, direction, rainfall, and a number of other data points, fed in via a tremendous amount of real-time processing enabled by NVIDIA GPUs. The whole project is a collaboration with Norway-based The Future Group, a mixed reality company with U.S. offices. The Future Group's UE4-based Frontier graphics platform allows the team to deliver photorealistic immersive mixed reality backdrops.
“The NVIDIA GPUs are allowing us to really push the boundaries. We’re rendering 4.7 million polygons in real time,” stated Lawrence Jones, executive vice president of the Americas at The Future Group. “The pixels that are being drawn are actually changing lives.”
You can find more details here.