
Creating Immersive Experiences from Point Data

RubenFro talked about the Future Cities project, an immersive art experience based on photogrammetry, and his own real-time photogrammetry experience Dissolving Realities, running in Unity.

Introduction

Hi, I’m Ruben, a Designer & Developer based in Japan since 2006, working mainly on app development and web design for local clients (Honda, ANA, LoFT). Since last year, I’ve increasingly been doing VFX and interactive experiences in Unity using photogrammetry.

I started back in 2001 as a Flash ActionScript developer for websites and games. That allowed me to explore both coding and visual design, and I’ve been moving between those directions ever since.

I’ve also always been very interested in visual arts, interactive installations, and computer graphics. In particular, the minimal and essential nature of shader programming is extremely fascinating to me.

Future Cities Project

I started exploring and combining my interests in design, interactive art, and technology in 2018 with the Future Cities installations, created together with photographer Cody Ellingham and musician SJF.

We were experimenting with different ideas and came up with the concept of an experience that would take people on dream-like journeys through deconstructed, fragmented cities. 

I had experience with Unity and shaders, so I started right away building the engine to run the installation and writing the code to render the point clouds, and after a few months (and quite a few long nights) I had a first version ready.

Different parts of the city were scanned using photogrammetry and recombined in unusual ways - streets mirrored, turned upside down, or leading to completely different areas - creating a world that still looked real yet unsettlingly different.

The audience could interact with the installation using a controller in the middle of the room, while we were performing live, dynamically altering the 3D world projected on the walls. 

In 2019, we had shows in Tokyo and Taiwan, and with the support of Sony New Zealand, we brought the experience to Wellington for a 1-hour live performance at the National War Memorial.

Here's our last show, Future Cities: Memories of Tsukiji:

Volumetric Capture

I see volumetric capture as a new medium to record reality in the era of VR/AR. We have been using photos and videos to tell our stories for quite a long time, but I believe that in a few years it will be natural to take 3D snapshots of our lives or to document historic events that way.

In my work, I follow that direction, trying to capture real-life street scenes as closely as possible using 3D data.

To be able to scan as quickly as possible, I use a couple of 360 cameras and sometimes drones and small laser levels to acquire the footage. 

I then process the data with a series of scripts for image analysis and optimization, before feeding it to the photogrammetry pipeline.  
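Those scripts aren't shown in the article, but a minimal sketch of that kind of pre-processing step might look like the following: a purely hypothetical frame filter that discards blurry images before they reach the photogrammetry software. The sharpness metric (average luminance gradient) and the threshold value are illustrative assumptions, not part of the actual pipeline.

```csharp
// Hypothetical pre-processing sketch: reject blurry frames before photogrammetry.
// Requires System.Drawing (System.Drawing.Common on .NET Core). The metric and
// threshold are illustrative assumptions.
using System;
using System.Drawing;
using System.IO;

class FrameFilter
{
    // Rough sharpness proxy: average horizontal luminance difference,
    // sampled on a coarse grid for speed.
    static double Sharpness(Bitmap bmp)
    {
        double sum = 0;
        int count = 0;
        for (int y = 0; y < bmp.Height; y += 4)
        {
            for (int x = 1; x < bmp.Width; x += 4)
            {
                Color a = bmp.GetPixel(x - 1, y);
                Color b = bmp.GetPixel(x, y);
                double la = 0.299 * a.R + 0.587 * a.G + 0.114 * a.B;
                double lb = 0.299 * b.R + 0.587 * b.G + 0.114 * b.B;
                sum += Math.Abs(la - lb);
                count++;
            }
        }
        return count > 0 ? sum / count : 0;
    }

    static void Main(string[] args)
    {
        string inputDir = args.Length > 0 ? args[0] : "frames";
        string outputDir = args.Length > 1 ? args[1] : "frames_filtered";
        double threshold = 6.0; // assumed cutoff, tune per dataset
        Directory.CreateDirectory(outputDir);

        foreach (string path in Directory.GetFiles(inputDir, "*.jpg"))
        {
            using (var bmp = new Bitmap(path))
            {
                // Only sufficiently sharp frames are copied on to the next stage.
                if (Sharpness(bmp) >= threshold)
                    File.Copy(path, Path.Combine(outputDir, Path.GetFileName(path)), true);
            }
        }
    }
}
```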

This technique allows for extremely fast scans of reasonable quality. That means I can basically “freeze” the hectic nature of a morning market in Asia in less than a minute, capturing people in their daily life. 

And here's the raw point cloud data:


Working on Dissolving Realities

While photogrammetry is often used to create photo-realistic models, I’m more interested in embracing the fuzzy nature of point data and working on the real-time shader to build a more cohesive and organic 3D image.

In Dissolving Realities, I use point clouds of around 50 million points. I load the point data into Unity in batches of different sizes, depending on their distance and direction from the camera, and pass that data to a geometry shader I developed to “draw” the points on screen, which also takes care of all the effects and animations.
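The engine itself isn't public, but a minimal Unity-side sketch of this approach could look like the code below: one batch of points is uploaded to a ComputeBuffer and drawn procedurally, so a custom point/geometry shader can expand each point on screen. The struct layout, shader property name, and batching strategy here are assumptions, and it assumes a recent Unity version.

```csharp
// Minimal sketch (not the actual engine): upload a batch of points to the GPU
// and draw them procedurally; the bound shader reads _Points and expands each
// point into screen-space geometry.
using UnityEngine;

public class PointBatchRenderer : MonoBehaviour
{
    public Material pointMaterial;   // material using the custom point shader
    ComputeBuffer pointBuffer;
    int pointCount;

    struct CloudPoint
    {
        public Vector3 position;     // 12 bytes
        public Color32 color;        // 4 bytes
    }

    public void UploadBatch(CloudPoint[] batch)
    {
        // One batch at a time; in practice batches would be chosen by
        // distance and direction from the camera before upload.
        pointBuffer?.Release();
        pointCount = batch.Length;
        pointBuffer = new ComputeBuffer(pointCount, sizeof(float) * 3 + 4);
        pointBuffer.SetData(batch);
        pointMaterial.SetBuffer("_Points", pointBuffer);
    }

    void OnRenderObject()
    {
        if (pointBuffer == null) return;
        pointMaterial.SetPass(0);
        // One vertex per point; the shader turns each into a visible primitive.
        Graphics.DrawProceduralNow(MeshTopology.Points, pointCount, 1);
    }

    void OnDestroy()
    {
        pointBuffer?.Release();
    }
}
```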

There's a lot of optimization in making sure I only draw what I need to draw, with specific levels of detail.
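As a rough illustration of that kind of culling and level-of-detail selection (the distance thresholds and stride-based thinning are invented, not the project's actual logic):

```csharp
// Illustrative LOD selection: skip batches outside the view frustum and thin
// far batches by drawing only every Nth point. Thresholds are assumptions.
using UnityEngine;

public static class BatchCulling
{
    // Returns 0 to skip the batch, otherwise a point stride (1 = full detail).
    public static int SelectStride(Camera cam, Bounds batchBounds)
    {
        Plane[] planes = GeometryUtility.CalculateFrustumPlanes(cam);
        if (!GeometryUtility.TestPlanesAABB(planes, batchBounds))
            return 0;                               // outside the view: don't draw

        float distance = Vector3.Distance(cam.transform.position, batchBounds.center);
        if (distance < 10f) return 1;               // near: every point
        if (distance < 30f) return 4;               // mid: every 4th point
        return 16;                                  // far: every 16th point
    }
}
```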

I’m using a function in my shader to move the points organically in specific directions, taking into account their angle and luminosity, as well as a rough density calculation, to simulate a cascading effect. Besides making them fall, I'm also changing the way they're rendered, pushing up their brightness to create the "burning" effect.
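In the project this logic lives in the geometry shader; as a simplified CPU-side sketch of the kind of per-point function described (omitting the angle term, with invented constants and noise):

```csharp
// CPU-side sketch of the per-point animation idea described above. In the
// actual project this runs in the shader; parameter names, the noise source,
// and all constants here are purely illustrative.
using UnityEngine;

public static class PointFall
{
    // 'luminosity' is the point's brightness (0..1), 'density' a rough
    // neighborhood density estimate (0..1), 'progress' the effect time (0..1).
    public static void Evaluate(
        Vector3 position, float luminosity, float density, float progress,
        out Vector3 displaced, out float brightnessBoost)
    {
        // Brighter, sparser points start falling earlier, giving a cascade.
        float delay = Mathf.Clamp01(density * 0.6f + (1f - luminosity) * 0.4f);
        float t = Mathf.Clamp01((progress - delay) / Mathf.Max(1f - delay, 0.001f));

        // Organic sideways drift from a noise field, plus an accelerating fall.
        float drift = (Mathf.PerlinNoise(position.x * 0.5f, position.z * 0.5f) - 0.5f) * 2f;
        Vector3 offset = new Vector3(drift, -t * t * 4f, drift * 0.5f);
        displaced = position + offset * t;

        // Points "burn" as they fall: brightness ramps up with the animation.
        brightnessBoost = 1f + t * 3f;
    }
}
```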

I see the effect as closely resembling lost memories. Clear images, almost in reach, but suddenly collapsing, dissolving and fading when you try to get a closer look. 

In my latest video (see below), I've also experimented with other effects of "fragmentation of reality" and with a way to transform points into splines. I like experimenting with the idea of deconstructing worlds and then reassembling them into their original state.

Here are several screenshots:


Until a few months ago, I was mainly using a MacBook, so I had to keep everything as simple and optimized as possible. Another thing that helps is that every single aspect of the visualization, from drawing the points to vertex lighting and animation, is managed by the shader. This way, I can easily move and animate tens of millions of points on modest hardware.

Below is a small demo of the engine using a Kinect to control two lights via hand tracking and the camera rotation via head movement:
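The demo itself is shown in the video, but a rough sketch of that kind of mapping, with the Kinect tracking source abstracted behind a method receiving normalized joint positions and with all scale values invented, could look like this:

```csharp
// Rough sketch: tracked hand positions drive two lights, head movement drives
// camera rotation. Any body-tracking wrapper that yields joint positions would
// feed UpdateFromTracking; ranges and scales are made-up values.
using UnityEngine;

public class BodyDrivenScene : MonoBehaviour
{
    public Light leftLight;
    public Light rightLight;
    public Transform cameraRig;
    public float lightRange = 4f;   // how far the lights travel with the hands
    public float lookScale = 30f;   // degrees of yaw/pitch per unit of head offset

    // Called each frame with normalized joint positions (-1..1) from the tracker.
    public void UpdateFromTracking(Vector3 leftHand, Vector3 rightHand, Vector3 head)
    {
        leftLight.transform.localPosition = leftHand * lightRange;
        rightLight.transform.localPosition = rightHand * lightRange;

        // Head offset from center rotates the camera, giving a simple parallax feel.
        cameraRig.localRotation = Quaternion.Euler(-head.y * lookScale, head.x * lookScale, 0f);
    }
}
```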

RubenFro, Web Developer & Designer

Interview conducted by Kirill Tokarev


Comments

  • Anonymous user (4 years ago):

    Always worth mentioning some sources of inspiration, like Benjamin Bardou and pretty much all of his work since last year: https://vimeo.com/275615469

  • Nyhlén Valdemar (4 years ago):

    Love the projects, Ruben! I'm curious how you made the shader. Are the points created straight in the shader?

  • Anonymous user (4 years ago):

    Yes, Benjamin is a great artist, and probably the first person I've seen using point clouds in a creative way. We are also friends and had a beer together last year in Paris :)
    (By the way @viadurieux, you and Ben work in the same studio, right? We should all meet next time I'm in Paris!)

    When I started working on this, though, it was mainly a real-time experience in Unity, while he was using Houdini (though he has recently switched to Unity as well).
    It's worth saying that I'm pretty sure we'll see many other artists working with point clouds as a medium, in the same way a photographer or a video artist does. It will be really interesting to see how different styles emerge.
