Jake Adams, artist and creator of the first digital holographic comic, walked us through the production process behind his imaginative holographic comics, discussed his software of choice, the challenges of such projects, and the Looking Glass holographic display.
Most times, I don’t feel real, but, at times, I can be too real. I was given the name Jake Adams (aka Valholo, pronounced "Vail-ha-low"). I am an artist, professor, and developer with a BFA in Painting from the Maryland Institute College of Art and an MSc in Design and Digital Media from the University of Edinburgh.
Currently, I work out of a creepy dark attic in Rochester, NY. I call it Valholo Studios. A lot of artwork and interactive things happen there.
Alongside my studio practice, I am a Lecturer at the Rochester Institute of Technology, in the Interactive Games and Media and New Media Interactive Development departments. Recently, I also started a private course through the International School of Holography.
Previously, I created assets for The Sandbox through Animoca Brands on the Polygon blockchain. Before that, I was heavily involved in the NYC and Hudson Valley art scene, showing my work as much as I could. Additionally, I worked for the Dia Art Foundation and Ethan Cohen Fine Arts, and was an assistant to the artist Michael Zelehoski.
Valholo started in Scotland through a CodeBase Scottish Government grant. This helped me develop the world’s first digital holographic comic book (holo-comic) in 2019 before I moved back to the United States.
At one point, Valholo operations took place inside a tipi in my father’s backyard; before that, it was a laundry room somewhere. We have recently upgraded to the aforementioned attic dwelling, equipped with a triangle-shaped bathroom. Strange and beautiful.
For the most part, Valholo Studios is a one-person show (me). Sometimes I’ll get sound design assistance from my friends in Europe and the UK: Simon Howard, Dillon Robinson, and Niovi Kitsiou. Much of the work is commission/contract based. Thus far, creating for the Looking Glass Factory has taken up the bulk of my studio practice, along with practice-based research.
Occasionally my studio receives random fun opportunities. Most recently, the International School of Holography invited me to teach a workshop on using the Looking Glass Display and developing for it via 3D software. My class, Art Through the Looking Glass, starts soon, and we're still taking students. The main emphasis of the class involves intermediate lessons on setting up the Looking Glass plugin in Unity, optimizing assets, and creating content that lives in our spatial-temporal existence.
The Looking Glass Display
Imagine living in a tiny, incessantly dark and damp flat in Edinburgh, Scotland, shaped like a hallway, with a bathroom so small they had to put the toilet in the shower. During the height of Brexit, you wake up to drunk men singing loudly every night at 4 AM, and just as you are about to fall asleep again, a seagull begins tapping on your window around 4:30 AM, begging for a snack (yes, I did feed it every morning; its name was Marge). Your attempt to doze off once more is cut short by bagpipes around 6:30 AM. When you finally give up, you see an ad for the old standard Looking Glass display glistening through the ether of the interwebs on your computer during a bout of frustrating research.
Like a crystal ball, but square, it spoke to me and felt like the future (if the future were an object). The chance to have my digital creations live in my space and time had come. I created content with it for my master’s dissertation in 2018-19, and it solved all of my research struggles. The magical crystal square has lived in my heart ever since and always will.
Creating Holographic Comics
I wanted to make the intangible more tangible and play with the idea of diegetic and extradiegetic space in context. I made the first iteration of holo-comics in 2019, Maldacena: MFR. It was a crucial part of my practice-based research for my dissertation. The idea came about in late 2018, after reading Scott McCloud's Understanding Comics. To me, comics always seemed to play with space in the same way; I wanted to change that.
I yearned to do what Chris Ware did with Building Stories in 2012, or what Nick Sousanis did with Unflattening… but wrap it up in a futuristic block of spatial experimentation and interactivity.
I thought to myself, how would these same techniques in scripted and visual storytelling apply to a comic book that was interactive via Leap Motion/buttons and manifested as a semi-free-floating object in my tiny dank apartment? By the spring of 2019, I had begun the process and quickly realized the Looking Glass display was the perfect medium for my endeavors.
During the process, my best friend Alex passed away, which drastically changed the story of Maldacena: MFR halfway through production. The world’s first holo-comic had transmuted alongside me. Not only was it a critique of itself as a comic book living in a digital holographic state, but now it was also a story of loss and the perseverance of reshaping who we are, represented by the holo-comic transforming alongside the narrative from two dimensions to three.
Aphid Through the Looking Glass is my most recent holo-comic, completed a month ago. This holo-comic iteration utilizes the Looking Glass Portrait display. My hopes are that it packs a chromatic, psychedelic, socio-emotional punch. It's now available for purchase/donations here until August 5th (you may also get the latest news on Instagram).
Tools for me are always just a different version of a brush. Even programming feels like a brush. So, I start with an actual brush. All of the 2D work that you see is done in the real world with paints, charcoal, graphite powder, colored pencils, you name it. Even some of the 3D materials/textures are created by hand. They all end up going through some form of pre-production before entering the Unity game engine.
For the 2D work, it's Adobe Photoshop, Adobe Illustrator, Adobe Animate, then Piskel. For the 3D work, my workflow looks a little bit like this: Autodesk Maya, Autodesk Mudbox (to get Bump Maps/high poly Normal Maps to place on low poly models), occasionally Substance 3D Designer, then Substance 3D Painter, and into the Unity game engine.
For sound, I use recorded audio of my voice and the sounds around me, free sounds, and sounds created by my friends. These go into Audacity for editing until final import into the game engine. For programming, it is C# all the way, using Visual Studio with Unity.
The Production Process
We start by making more problems for ourselves. Half kidding. I began by asking myself questions, "How can I make a comic book that is better than the last while using the discoveries I made from it?" Then, I asked myself, "Who am I making this for? If it is more of an outlet for myself, how can I make it more inclusive? How might I adapt a story around a selfish need to tell my own story through a character while keeping it open enough for others to relate? How does the portrait display change the way I create for interactivity, the composition of scenes, etc.? Am I passionate about the chosen theme?"
Suffice it to say, I went through and deciphered what production issues needed to be addressed, and what questions needed to be asked before I dove into my sketchbook.
There is a weird balance out there. All things are connected or connecting but some connections have to wait longer for something to come by with the right parts. Like driving a car or walking down the street, if one suddenly finds a solution to a nagging problem, it smacks them in the face. "Why didn't I see that before!?" This only works when you stop working and start living.
This fact traverses all creative industries and is a crucial part of my methodology and process. Like true romance, it comes to you when you least expect it.
Mixing 2D and 3D Elements
The most important part of my 2D assets is to layer them in pop-up book form to create a 2.5D appearance in the display. Depth is controlled via the plugin itself; however, I have my own pre-production techniques, which I’ll dive into later. For now, I’ll explain how I find the balance between 2D and 3D assets.
Maldacena: MFR was different from Aphid Through the Looking Glass, in that it was a critique of itself as a comic book living in a digital holographic space and spoke of transformation of self. Therefore, mixing the 2D and 3D elements was more of a gradation from 2D to 2.5D to 3D. I even put 3D stuff behind a UI 2D element emulating a window and put 3D stuff in front of that… But it was sequential. Aphid Through the Looking Glass was somewhat sequential as well but it was a rougher gradation, keeping that visual theme wasn’t as important for the story.
With Aphid Through the Looking Glass, I just had to see what fit best for each section/scene. What was needed for the mood or motion of a scene? What needed a push or a pull? In a way, I see the 2D and 3D elements like a sandwich. The bread (2D) is a sort of bookend for the juicy stuff (3D) in between. Eventually, the way I mixed them became a little bit more complex.
There was potential for more, I thought, why am I relying on the display's ability to give depth? What if I put more depth on top of that depth? What kind of sandwich would that make? Double cheeseburger big daddy “baconator” moment.
Thinking back to my roots in oil painting, I quickly realized that, aside from perspective, sfumato (a Renaissance painting technique) was our first real attempt at mimicking how the human eye sees and perceives depth. So why wasn't I making my assets in that manner? This revelation solved many problems, including eye fatigue. More importantly, it did what I thought it would: it enhanced clarity and depth.
It also assisted with text clarity, which I found intriguing. Who would have thought that a 550-year-old painting technique would find its place in a digital holographic display one day? A deeper dive is detailed in my course this summer.
Preparing the Content for Looking Glass
The process can be easy or hard; it depends on what you are doing. If you are simply trying to get one of your 3D objects to show via the Unity plugin, it is quite straightforward, provided you know Unity and how the plugin functions.
The process will be more of a challenge for Unity beginners, but the Looking Glass Factory has awesome documentation. Using Animator, programming in C#, and getting the buttons to control the changing of pages or animations is more advanced. Not to mention the various pre-production and post-process techniques I implement. On top of that, import settings can be crucial for optimization, which can add another layer of difficulty, especially when importing custom materials. When it comes to preparing my own content, these are the general steps:
For 2D assets, I first draw or paint them. Then they are captured, edited in Adobe Photoshop, cleaned up, and sometimes layered on top of one another after being cut out from another drawing or painting. When I cut things out, I give special attention to the edges based on where the 2D object will sit in relation to what we call the frustum target (the place with the most clarity in the display): the further from the frustum target, the blurrier the edges; the closer, the sharper. I essentially mimic what the display does while mimicking how the human eye sees (remember: depth on depth, sfumato). Afterward, they are exported and converted into a vector or .png. Sometimes Illustrator is used to parse out artifacts in the images (jagged stepping, noise, etc.); this is mainly for text.
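As a rough illustration, the edge-softening rule described above boils down to a simple mapping from depth distance to blur strength: zero at the frustum target, growing with distance, then clamped. This is my own sketch of that falloff, not code from the project; the names and numbers are assumptions.

```csharp
// Hypothetical sketch of the "sfumato" edge treatment: blur strength
// grows with distance from the frustum target (the depth plane of
// maximum clarity), linearly up to fullBlurDistance, then clamps.
public static class SfumatoBlur
{
    // distanceFromTarget: signed distance of the 2D layer from the
    //                     frustum target (in front of or behind it).
    // fullBlurDistance:   distance at which blur reaches its maximum.
    // maxBlur:            blur strength (e.g. radius in pixels) at or
    //                     beyond fullBlurDistance.
    public static float StrengthFor(float distanceFromTarget,
                                    float fullBlurDistance,
                                    float maxBlur)
    {
        float t = System.Math.Abs(distanceFromTarget) / fullBlurDistance;
        if (t > 1f) t = 1f; // clamp: edges can only get so soft
        return t * maxBlur;
    }
}
```

In practice I bake this softening into the exported .png edges by hand in Photoshop; the function just makes the "further = blurrier" relationship explicit.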
Some 2D assets are animated before import. I use Adobe Animate or Photoshop, then export as GIFs. The sprite sheets for those animations are generated with the help of Piskel to expedite the process. Unity uses sprite sheets to help optimize 2D animation (think of each frame as a separate .png file, but put together like a film strip). Once I have all my 2D assets for a scene in my Unity project, I begin to compose. Like a pop-up book, they are arranged into a 2.5D composition within the Unity game engine and animated further via the Animation and Animator windows.
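To make the film-strip idea concrete: playing a sprite sheet is just picking which frame to show at a given moment. Unity's Animator does this for you; the sketch below only shows the underlying indexing, with made-up parameter names.

```csharp
// Illustrative only: how a looping sprite-sheet animation maps elapsed
// time to a frame index. Each index selects one sub-rectangle (frame)
// of the single sheet texture generated by a tool like Piskel.
public static class SpriteSheetPlayback
{
    // seconds:         elapsed playback time
    // framesPerSecond: animation rate baked into the sheet
    // frameCount:      total frames on the sheet
    public static int FrameAt(float seconds, float framesPerSecond, int frameCount)
    {
        int frame = (int)(seconds * framesPerSecond); // frames elapsed
        return frame % frameCount;                    // loop back around
    }
}
```

The win over per-frame .png files is that the GPU keeps one texture bound and only the UV rectangle changes per frame.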
For 3D assets, without going into too many optimization techniques: I model a low-poly version in Autodesk Maya using hand-drawn references; if animation is necessary, I do that in Maya as well. I then export the object as a .fbx file and import it into Mudbox, where I bump up the subdivisions and sculpt crazy details into the model. After that, I generate a Normal Map from the high-poly version to be placed on the low-poly one. This gives me highly detailed textures without putting too much strain on the system. Afterward, I bring the model into Substance 3D Painter and customize a material for it. Then I export the textures in a Unity-friendly way, using a custom template I’ve made, and bring everything into the game engine. Once the import settings are set and all of my materials have textures attached and placed on the object, it is ready to be part of the story.
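For readers curious what a baked normal map actually stores: each texel encodes a surface direction from the high-poly sculpt as a color, with every XYZ component remapped from [-1, 1] into the [0, 1] RGB range. A minimal sketch of that encoding (the class name is mine, not part of any tool's API):

```csharp
// The normal map baked from the high-poly sculpt stores per-texel
// surface directions as colors: each component of a unit normal in
// [-1, 1] is remapped to [0, 1] for storage in an RGB texture.
public static class NormalMapEncoding
{
    public static (float r, float g, float b) Encode(float x, float y, float z)
    {
        return (x * 0.5f + 0.5f,
                y * 0.5f + 0.5f,
                z * 0.5f + 0.5f);
    }
}
```

A flat tangent-space normal (0, 0, 1) encodes to (0.5, 0.5, 1.0), which is why normal maps have that characteristic lavender-blue look.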
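Returning to the button-driven page changes mentioned earlier: underneath the Animator transitions, turning pages is just index cycling. This is a hedged sketch of that logic, deliberately kept as plain C# outside any MonoBehaviour; the class and method names are hypothetical, not from the actual project.

```csharp
// Hypothetical page-navigation logic for a holo-comic: buttons on the
// display advance or rewind the current page, wrapping at either end.
// In Unity, button callbacks would call these and then trigger the
// Animator state for the resulting page.
public static class HoloComicPager
{
    // Advance one page, wrapping back to the first after the last.
    public static int Next(int current, int pageCount)
    {
        return (current + 1) % pageCount;
    }

    // Step back one page, wrapping to the last page from the first.
    public static int Previous(int current, int pageCount)
    {
        return (current - 1 + pageCount) % pageCount;
    }
}
```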
I would describe my two-year experience as a rabbit hole in a rabbit hole. Constantly feeling like I was falling, trying to grab things that I could assemble while I fell. Occasionally, I would stumble upon a happy moment.
Ultimately, the story of Aphid Through the Looking Glass came about during a sort of life shift or more accurately a life shift of the masses. Enter 2021, the pandemic is still on. I developed a fungus on my face from wearing a mask at that point. Along with many other health problems that followed later in 2022 where I lost feeling in the left part of my body. Currently still recovering from that.
General details aside, the story for the holo-comic must have been rewritten about a hundred times, no exaggeration. At one point I had to completely flip the sequence of events, it was a nightmare, but I needed to get it right. If this was to be anything good, it deserved a certain amount of attention, and I was determined to give it my all.
One thing that struck me throughout those two years was the need to feel at home. In many ways, I never really knew what that feeling was. Nor do I really know why. The pandemic isolation amplified that feeling tenfold, and with the polarization of beliefs, decisions, politics, etc. Like most people, I just wasn’t mentally well.
The most interesting part about all of this is that I was making a story about finding a home while also searching for a home to buy for myself, my now ex, and her seven-year-old daughter. I didn’t know what I had signed up for, house purchasing is quite the task. While I was racing to get a house before our lease was up I was also racing through the creative vortex of what is now Aphid Through the Looking Glass. I was analyzing how to tell a story about home, not knowing what home really was while also trying to purchase a physical embodiment of home. By the time I had finished purchasing a home, our relationship had fallen apart. Now I was alone in a home that didn’t feel like home creating a holo-comic that had a character trying to find home. At that point, I was completely lost. In multiple ways. To make matters worse a thunderstorm cut my power out and I had somehow lost half of my project.
Then one night I dreamt of a tiny island with a creepy cute tardigrade taking up the entire surface. I was floating down above it, so its pink skin didn’t look like skin at first. More like a pale red dot sitting on waves of blue. When I was close enough, its head popped up. Holy macaroni it had bunny ears! The bunny bug mouth opened. I went in. The esophagus resembled a kaleidoscope of comic book panels all separated in blue and red, gyrating. If you looked far enough down it had all turned violet.
Somehow in the moment, that specific shade of violet gave me a sense of home. When I woke up I realized that blue and red were chromostereopsis, hot and cold, positive and negative, left and right, etc. So I found a way to make it into an opposing force and a metaphor for the balance in the main character's (Violet) journey to find a home. Unfortunately, I also had to change the color theme and remake a bunch of assets, but I had lost half my work anyhow… A blessing in disguise I suppose.
All in all, I needed an outlet, and the Looking Glass Factory gave me one. For that, I am eternally grateful. I am especially thankful for Missy, Nikki, Arturo, Bryan, Alex, and anyone else from Looking Glass that helped out.
What am I most proud of? I am most proud of finding my own feeling of home by the end of the project. It didn’t come from the house I bought or a geographic location, it came from within and the people and creatures I chose to be in my life. I started the commission in the early summer of 2021 and just finished, mid-summer 2023.
I advise you to remain curious and to make many mistakes. Don’t be afraid to let the old inform the new, and the new inform the old.
One thing you can do is take my summer course. We will go over the basics of Unity, the Looking Glass plugin, intermediate topics in optimization, and how to create work for the Looking Glass Display. Currently, we are accepting late arrivals due to the optional async nature of the course.
Big thanks as well to everyone who gave me hope along the way.