The team at The Future Group shared some information about their augmented reality technology and how it can be used by different companies.
About the Company
The team at The Future Group is currently 45 people, and we’re continuing to grow. Our team is located in Norway, the UK, the US, China, Croatia, Poland, and anywhere else we find great people. At its core, Pixotope is formed by professional creatives with years of experience working in visual effects and production. It’s this background that fuels our saying ‘For Creatives by Creatives’ – we really want to emphasize how connected we are to the creative process and how much it continues to directly inspire our work.
It was this extensive experience that drove the founding of The Future Group in 2013, with the aim of breaking new ground and reimagining broadcast content. Initially, our focus was to build a completely new type of game show that combined virtual studios, gaming, and e-commerce. In 2017, we premiered the world’s first interactive Mixed Reality game show, Lost In Time, on Discovery Networks’ TV Norway – a formative experience.
For Lost In Time, we built unique studio graphics software, mobile front-end and back-end technology, and a world-class service department for mixed reality production, as none of this was readily available to purchase.
We realized that the expertise and technology we had developed for Lost in Time was a game-changer, and could enhance the ingenuity of video and film creators the world over. In the summer of 2018, we expanded into real-time visual effects, before creating and releasing the first iteration of our flagship Pixotope® solution in February 2019. We combined our knowledge and expertise in Mixed Reality production and made it available to everyone through a turnkey software solution.
Since then, we’ve become the leading solution for Mixed Reality events in sports, esports, and entertainment, working with The Weather Channel, Riot Games (the studio behind League of Legends), Accenture, China Central Television (CCTV), and many more.
When we began creating Lost In Time, we quickly realized that there was no solution out there that could support a multi-camera, multi-studio, live-to-tape, VFX workflow that maintained a high-end level of production. To start with, traditional real-time graphics systems for broadcast and venue production didn’t have the graphical fidelity we were looking for. We also wanted to be able to share code and assets with our mobile games. The natural solution was found in game engines.
While there are many good game engines available, Epic Games’ Unreal Engine was the perfect choice for us at the time. Epic Games had recently opened its source code and the engine was already delivering best-in-class visuals for both studio and mobile games. The Unreal Engine had been used for many years as a visualization tool in the film and media industry, and many of the people at Epic Games had backgrounds in visual effects and media production. We had a natural affinity with them, they understood what we were trying to achieve and have been immensely supportive and helpful.
Initially, we were building this just for our own production and out of necessity, but as the technology progressed, we realized that it could change the whole industry. One of the key lessons we learned from making Lost in Time was how crucial workflow and reliability are when creating a live production. Decisions are made in split seconds, and everything needs to work at all times. Virtual production can be very complex, integrating many different technologies and systems, and it often requires highly skilled engineers and artists just to get started. In addition, we experienced firsthand how important turnaround time is on large productions with hundreds of people working. Any delay costs the production a lot of money.
Being mindful of this, we set out to make a turnkey software solution that could take away some of the complexity of configuration and operation, ensuring that the graphics are always live and responsive, and providing individuals that might be less-adept using Unreal Engine the ability to create high-end real-time visual effects.
Pixotope® is a software-based virtual production platform that leverages the full power of the Unreal Engine to combine live film- and broadcast-quality video with high-end 3D graphics, all in real time. Users can merge real characters and elements with a virtual set (VS), and combine on-air graphics with real scenes using augmented reality to create next-generation motion and on-air graphics. Together, these features are integrated into an accessible, turnkey solution built to fit into modern live event and broadcast workflows.
Pixotope also supports industry-standard video and synchronization formats, including DMX, HDR, and UHD, alongside content from all DCC software via industry standards such as FBX, Alembic, and Unreal’s Datasmith system.
We’ve built Pixotope as a modular production platform that includes a range of features designed to work seamlessly together.
Pixotope Director is our central user interface, allowing easy and synchronized configuration and control of multiple Pixotope systems across multiple computers in real time. The driving force behind the software, the Pixotope Engine, is our own version of Epic’s Unreal Engine. It can render and composite real-time photorealistic CGI characters, VFX, and environments at full speed, without degrading the video quality. We’ve built on the incredible technology found within the Unreal Engine and made something that isn’t possible with UE4 alone.
To create something that can easily fit into any production’s pipeline, we have tweaked the Unreal editor and added broadcast-specific functionality, such as our unique WYSIWYG functionality that allows artists and graphics operators to have live tracking and video I/O while editing the graphics in the editor.
Alongside this, Pixotope Control is a graphical and user-friendly system for creating custom control panels and the Pixotope Datahub is our proprietary, low-latency data bus, with an API that makes it easy to integrate multi-camera workflows and automated workflows.
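Conceptually, a low-latency data bus like the one described above is a publish/subscribe system: control panels and automation publish values on named topics, and any number of render nodes react to them. The following is a minimal, hypothetical in-process sketch of that idea in Python; it is not the Pixotope Datahub API, and the class and topic names are invented for illustration.

```python
from collections import defaultdict

class DataBus:
    """Minimal in-process publish/subscribe bus.

    A conceptual stand-in for a production data bus: real systems add
    networking, latency guarantees, and persistence on top of this pattern.
    """

    def __init__(self):
        # topic name -> list of subscriber callbacks
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to be invoked for every message on `topic`."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, payload):
        """Deliver `payload` to every subscriber of `topic`."""
        for callback in self._subscribers[topic]:
            callback(payload)

# Hypothetical usage: a control panel drives a graphic on a render node.
bus = DataBus()
received = []
bus.subscribe("camera/1/zoom", received.append)   # render node listens
bus.publish("camera/1/zoom", 0.5)                 # control panel publishes
```

The same pattern scales to multi-camera workflows: each machine subscribes only to the topics it needs, so adding an automated data source is just another publisher.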
One of Pixotope’s key features is its standalone, high-performance Tracking Server, which connects Pixotope with all of the major real-time camera and object tracking systems. This isn’t just signal pass-through: it lets users adjust and transform tracking signals in a robust way and fine-tune the alignment of graphics and video. We also provide tools for error correction and filtering when there are problems with the tracking data, such as dropped frames. There are Unreal plugins that attempt some of these features, but none of them achieve the holistic, complete control of our Tracking Server.
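To make the dropped-frame problem concrete, here is a simplified, hypothetical sketch of one such correction: filling gaps in a stream of camera tracking samples by linear interpolation. Real tracking servers do considerably more (jitter smoothing, re-timing against genlock, prediction), and none of these names come from Pixotope itself.

```python
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """One frame of (simplified) camera tracking data."""
    frame: int
    pan: float   # degrees
    tilt: float  # degrees

def fill_dropped_frames(samples):
    """Linearly interpolate across gaps in the frame numbers.

    If the tracker skips from frame N to frame N+3, two synthetic samples
    are generated so the renderer receives data for every frame.
    """
    if not samples:
        return []
    out = [samples[0]]
    for prev, cur in zip(samples, samples[1:]):
        gap = cur.frame - prev.frame
        for i in range(1, gap):  # synthesize the missing frames
            t = i / gap
            out.append(TrackingSample(
                frame=prev.frame + i,
                pan=prev.pan + t * (cur.pan - prev.pan),
                tilt=prev.tilt + t * (cur.tilt - prev.tilt),
            ))
        out.append(cur)
    return out
```

Interpolation like this keeps virtual graphics locked to the live image even when a few tracking packets are lost, instead of letting the graphics visibly freeze or jump.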
The Pixotope Pipeline is our video processing application. It’s a standalone service that processes incoming and outgoing video, with all processing performed on fast GPUs. It then exchanges textures with the Unreal Engine. The advantage of this is that we have very fine control and can easily build video processing pipelines without having to involve Unreal. From Unreal’s point of view, the incoming video is just a texture, so it’s easy for Unreal artists to understand and use. The Pixotope Keyer, a real-time chroma keyer, is one such process that runs in the Pixotope Pipeline.
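The core idea behind a chroma keyer is simple to sketch: wherever the key color (usually green) dominates a pixel, the alpha channel is driven toward zero so the virtual background shows through. The naive per-pixel version below is an illustration only, assuming normalized 0–1 RGB values; production keyers such as the one described above run on the GPU and add edge softening and spill suppression.

```python
def chroma_key_alpha(pixel, threshold=0.3):
    """Naive green-screen key for one pixel.

    `pixel` is an (r, g, b) tuple with components in [0, 1].
    Returns an alpha in [0, 1]: 1.0 = fully foreground (keep),
    0.0 = fully background (replace with the virtual scene).
    """
    r, g, b = pixel
    # How strongly green dominates the other two channels.
    dominance = g - max(r, b)
    if dominance <= 0:
        return 1.0  # not green-dominant: keep the pixel
    # Fade alpha out as green dominance approaches the threshold.
    return max(0.0, 1.0 - dominance / threshold)

# Hypothetical usage on three sample pixels:
green_screen = chroma_key_alpha((0.1, 0.9, 0.1))  # strongly green -> 0.0
skin_tone    = chroma_key_alpha((0.9, 0.5, 0.4))  # foreground     -> 1.0
edge_pixel   = chroma_key_alpha((0.2, 0.35, 0.2)) # partial alpha
```

The soft ramp between "keep" and "replace" is what makes keyed edges (hair, motion blur) blend believably instead of producing a hard cut-out.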
Finally, the Pixotope Cloud is our online license and user management system that allows customers a large amount of control and flexibility in how they use Pixotope. We support traditional subscription and continuous licenses, as well as short-term event licenses. We try to keep our plans as flexible as possible for the different types of users. They can easily move licenses between machines. We have offline activation too for studios that don’t have their machines connected to the internet, which is sometimes the case for larger broadcasters.
Examples of Pixotope's Usage
Virtual production and mixed reality are a huge part of Pixotope’s offering. One of the most notable early examples was for The Weather Channel: an augmented reality tornado experience featuring crashing telephone poles, smashed cars, flying debris, and the destruction of the studio. For live broadcast programming such as The Weather Channel, interactive experiences like this dramatically heighten the impact. The reporter was able to interact with the graphics on-air and directly illustrate the dangers of the approaching tornado.
An area where Pixotope comes into its own is live, broadcast entertainment, especially during big sporting events. In 2020, The Famous Group used Pixotope for Super Bowl LIV, creating an on-screen mixed reality segment that combined live action from multiple moving cameras with augmented virtual stages and graphics. The seven-minute live production won an Emmy for Outstanding Studio Show. Similarly, The Famous Group used Pixotope to enhance the Baltimore Ravens’ live broadcasts with mixed reality: a giant CG raven circled the stadium, even landing on one of the goalposts like a branch. All of the color grading and post-processing was completed live using Pixotope.
A pertinent example is the generation of virtual fans for FOX’s MLB broadcasts while social distancing measures were in place in the US. Instead of an empty baseball stadium, Silver Spoon Animation used Pixotope to populate a CG crowd live, with individual characters seated and animated – some even dressed in the team colors. This is an incredibly inventive way of using mixed reality technology to shape live sports viewing. It also links directly back to our aim: to enhance storytelling with technology. Users can adapt their productions using Pixotope in exciting ways.
E-sports is now an established global phenomenon enjoyed by millions. It’s in many ways the perfect match for mixed reality and virtual production due to its direct link with gaming technology. It makes perfect sense to generate CG characters using the Unreal Engine, and it’s amazing to see Pixotope leading the charge in this sector too. All the way back in 2018, Pixotope technology was used to create augmented versions of characters from League of Legends, as they joined real K-Pop stars on-stage at the Worlds 2018 opening ceremony. We also worked on the LPL Pro League finals in Shanghai with Riot Games, recreating League of Legends game characters in stunning augmented reality. This project also included the live integration of body motion capture and facial capture with ray-tracing enabled – one of the first showcases of this integration in live event production. Later, during the WePlay! Esports event, we generated augmented characters on-screen, alongside vast CG cityscapes.
Lastly, I’d like to mention Pixotope’s use in general broadcast news the world over. In 2019, TIMES NOW India created an augmented reality edition of its politics show, MANDATE 2019, which included a virtual recreation of the Lok Sabha, with the on-air graphics generated in Pixotope. During recent live municipal election coverage in France, the channel TF1 used Pixotope to create a range of mixed reality depictions of different candidates live and on-air, placing the images in the TV studio and in other real-life locations. It’s a great example of the expansion of mixed reality techniques in everyday television.
Future of Augmented Reality
We believe that virtual production techniques and technology – such as mixed reality, real-time visual effects, and augmented reality – will become a staple of video-based media and advertising production over the next 10 years.
Taking a birds-eye view, we can see that the landscape of media use and advertising has changed dramatically over the last 20 years, driven by the onset of online media, gaming, extended reality media, and programmatic online advertising. On the content side, there’s a clear need for content owners to be able to deliver new and existing content that matches the next generations of audiences’ expectations when it comes to relevance, interactivity, shareability, personalization, and multi-platform presence.
The overall market for advertising continues to grow, but television, formerly the king of advertising, is losing the fight against online advertising due to the dramatic loss of young viewers and its slow adoption of technologies that allow for programmatic advertising.
Ad and banner blindness, which is starting to lessen the effectiveness of traditional and online advertising, has created space in the market for embedded advertising, product placement, and sponsorships to grow; but like traditional television, these are hampered by a lack of inherent programmability. The rapid growth of gaming and the expected dominance of virtual production-based media are propelling companies like Unity and Epic Games to the forefront of the media industry, and they give us a hint of what the advertising and content business might look like some years down the line.
Within advertising media, there are now two main focuses moving forward:
- how to convert existing inventory so that it can be produced, sold, and distributed automatically through programmatic and targeted advertising techniques, and
- how to create new ad inventory that follows viewership trends and combats ad blindness while remaining highly programmatic and targetable.
We believe Virtual Production is the perfect platform to answer these challenges in the short and mid-term, and the right foundation to build a new form of next-generation video content and advertising in the long term.