Developers of TARTARUS showed how you can create an interactive terminal in your Unreal Engine 4 game. This article is also available in German and in Turkish.
Our team consists of two members:
My name is Sertaç, I am a Gameplay Programmer. I have been developing projects with Unreal Engine for about 3.5 years. Although my job title is Gameplay Programmer, I enjoy UI (interface) programming. Over this time, I have worked, and still work, on different projects that have been published on Steam.
I also organize the Unreal Engine meet-ups in Turkey; I organized numerous events in the past year and continue to organize them. Alongside these events, I am thinking about organizing game-development seminars with universities and high schools. There are many people who want to develop games in Turkey, and I enjoy passing on the experience I have gained. Because information grows as it is shared!
I am also among the community contributors to Unreal Engine version 4.18. Of course, being even a tiny part of such an amazing engine is a source of happiness and pride.
My name is Kemal, I am a 3D Artist. I have worked in the film and game industries in different areas for almost 10 years, generally in modeling and lighting, and I have concentrated on lighting for the past 4 years. I have been using Unreal Engine for 3.5 years. Throughout this time I have had the opportunity to work on different game projects and on many short and feature-length films. I try to combine the things I learned from cinema with what I have learned while developing games. When possible, I share what I have learned as tutorials on my ArtStation and YouTube pages and try to help other developers.
I also try to answer questions by participating in seminars, webinars, and events organized at universities.
About the Project
In this article we are trying to explain how the terminal and puzzle systems, the core mechanics we used in TARTARUS, work. We think it could be useful both in helping other developers and in showing what Unreal Engine 4 and the Blueprint system can do. We will try to talk as much as possible about the problems we came across and their solutions. But before we come to the terminal and puzzle systems, we want to briefly mention the project.
TARTARUS is fundamentally a text-based, first-person game. Your task is to save both yourself and the ship, which is crashing into the planet Neptune and whose systems are malfunctioning. While doing this, you use the Terminal, which we will describe later, as well as some mechanical tools. We released the game, developed with Unreal Engine 4, on November 22.
We designed and implemented the Terminal System shown below based on our own needs, but it is fundamentally applicable to any project. The rest depends on how creative you are.
Programming the Interface
Before everything else, I want to explain how the processes are executed logically. In its simplest form, the system is built around taking input from the user and processing it. We created a Widget Blueprint specifically for this. The most important part here is that we have to make some adjustments to be able to receive user input; otherwise, the OnKeyDown function wouldn't be able to capture the keystrokes.
After making the necessary adjustments and giving full focus to the interface, we can now capture the input coming from users. We will use the OnKeyDown function for this. This function is triggered the moment the user presses a key.
The most important point here is that we obtain the names of the keys the user pressed thanks to the Key Get Display Name function. The subsequent steps operate on these names.
We can now get input from the user and record which keys were pressed. Dividing the problem into parts in this manner is genuinely important; we have successfully completed the first part of our problem.
We are going to process the inputs that we got in later steps and create a decision mechanism based on this.
With that, we have integrated the input we get from the user. But we have a little problem at this point. Because the Key Get Display Name function returns the names of the keys, it gives us values like Space, Comma, Period, and Hyphen. We therefore have to run keys like these through a conversion and turn them into the values we actually want.
We created a function for this that returns, in place of the name of each pressed key, the symbol that corresponds to that key. We then integrated it into the function we had already written.
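The conversion described above can be sketched in plain C++ (the game itself does this in Blueprints; the function and table below are illustrative stand-ins, not actual engine API):

```cpp
#include <map>
#include <string>

// Hypothetical stand-in for the Blueprint function described above:
// map the display names returned by Key Get Display Name to the
// characters we actually want to print on the terminal.
std::string KeyNameToSymbol(const std::string& DisplayName)
{
    static const std::map<std::string, std::string> Special = {
        {"Space",  " "},
        {"Comma",  ","},
        {"Period", "."},
        {"Hyphen", "-"},
    };
    const auto It = Special.find(DisplayName);
    if (It != Special.end())
        return It->second;
    return DisplayName; // single letters and digits pass through unchanged
}
```

A lookup table like this keeps the mapping in one place, so adding new special keys later doesn't touch the input-handling code.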
The next step is to process the commands we received according to the standards we created and determine the flow accordingly. Because these standards will differ depending on the system you want to build, I am going to briefly describe the standards we used for TARTARUS.
In the terminals we developed for TARTARUS, we had commands defined in the system, along with parameters for those commands. We identified these and carried out the necessary processes based on the returned commands and parameters.
We carry out these processes the moment the user presses the Enter key, that is, the moment they confirm the command they wrote. We split the confirmed command according to the standards we specified.
Some commands can be used without parameters. We determined this mainly by whether a space was used in the entered command, and then built a decision mechanism around those two cases.
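The space-based split described above can be sketched as follows (a plain C++ sketch of the Blueprint logic; the function name and the "first space separates command from parameter" convention are our reading of the standard described in the text):

```cpp
#include <string>
#include <utility>

// Illustrative sketch of the command standard described above:
// a command is either bare ("scan") or "command parameter",
// split on the first space ("open door_01").
std::pair<std::string, std::string> ParseCommand(const std::string& Line)
{
    const std::size_t Space = Line.find(' ');
    if (Space == std::string::npos)
        return { Line, "" };           // parameterless command
    return { Line.substr(0, Space),    // the command itself
             Line.substr(Space + 1) }; // everything after the space
}
```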
Like I said at the start of the article, the section I explained above can vary based on the system you want to create. You should therefore specify your own standards.
We will design the interface in the next step and make the system functional. But I first want to explain the working logic of the system.
The system is formed of two Widget Blueprints: the interface itself and the command lines. The first is the MainWidget, the widget containing the background and the design side. The second is the ItemWidget, in which the commands are written. The command lines (the ItemWidgets) sit in a ScrollBox on the MainWidget, and a new one is added the moment Enter is pressed.
What we need to do after finishing the design phase is process the commands after pressing Enter and then add a new ItemWidget to the ScrollBox.
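The widget structure can be modeled roughly like this in plain C++ (in the game this is two Blueprint widgets; the struct and names below are purely illustrative):

```cpp
#include <string>
#include <vector>

// Plain C++ model of the structure described above: the ScrollBox on
// the MainWidget is a list of lines, and pressing Enter appends a new
// "ItemWidget" holding the confirmed command.
struct TerminalHistory
{
    std::vector<std::string> Lines; // one entry per ItemWidget

    void OnEnterPressed(const std::string& CurrentInput)
    {
        Lines.push_back(CurrentInput); // add a new line to the ScrollBox
        // ...command processing would be triggered here...
    }
};
```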
The system works logically in this way. We are going to reflect our two-dimensional interface screen on a model in the scene after this phase.
We wanted the system we designed to be changeable as we wished. That way we wouldn't have to rethink each Terminal, we would be more comfortable in the creative process, and we would run into far fewer issues. First, we needed a special material, and before that we needed to identify a style suitable for our universe. Lo-fi sci-fi became the key for us. We wanted the CRT screens, the Terminal screens, to look bright, lively, and nostalgic. We are going to try to explain how we created this in a few steps.
First, we need to know the visibility limits of our Terminals. The Capture Camera we placed in the scene constantly watches a surface and must show the user's input, where the user is within the game's folder hierarchy, and what they are looking at, with minimum delay and maximum performance. To enable this, we created a Canvas Render Target and assigned it to the Texture Target slot of the camera. We then began creating our Terminal materials using this texture.
With this quite simple but effective approach, we started to answer many questions: where the visible limits of the screen should be, what our character limits are, how much resolution we need, and how we should improve performance. We then finished our materials by adding details.
The scratches on the glass screens of the terminals were provided with a “Roughness Map” as in the example. (Image source Google)
Another important parameter is the texture needed for the Scanline effect, which creates the illusion of scanning on the screen. Thanks to this texture, besides acquiring a more aesthetic image that breaks the monotony, we think we captured the retro mood.
You can comfortably prepare your design drafts in whatever 2D program you use. The brighter area at the top is what actually creates the designed scan effect. We will look at how it works a little later.
As you can see, we use 3 basic textures. Let's look now at how and why we control these materials. We especially felt the need to control the on-screen sizes of the Roughness and Scanline textures: they should be neither very big nor very small. We needed to overcome the fake "ghosting" effect that appears especially where many design lines are concentrated, such as text disappearing late or becoming jagged. At the same time, we didn't want to touch the texture connected to the Roughness value, so we went the way of dividing it into two sections.
The section you see on the upper left is where we control how big the scratches on the Terminal screens will be. We tile our texture with the help of the Multiply node, using the Texture Coordinate node and a Scalar Parameter, and get the effect we want. On the upper right, we tile our Scanline texture with the same nodes to increase or decrease the number of lines, and with the Panner node we control on which axis, how far, and how fast it moves. Another important point is that the Terminal screens are made of a glass material. But instead of the Translucent or Masked blend modes that give the classic see-through glass effect, we wanted to use Opaque and provide the glossiness via the Roughness value. Its preparation is as you see below.
Lastly, we used the Emissive value for screen brightness, again controlled with a Scalar Parameter, to obtain the desired glow. The result somewhat resembles the example you can see in the Material Editor; if we look closely, we can now see the scratches.
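Numerically, the Texture Coordinate / Multiply / Panner chain for the scanlines boils down to a tiled, scrolling UV. A small sketch of that math (the function name and parameters are illustrative, not engine API):

```cpp
#include <cmath>

// Numeric sketch of the material graph described above: the texture
// coordinate is multiplied by a tiling factor (more scanlines) and a
// Panner-style offset scrolls it along one axis over time.
float ScanlineV(float V, float Tiling, float PanSpeed, float TimeSeconds)
{
    const float Panned = V * Tiling + PanSpeed * TimeSeconds;
    return Panned - std::floor(Panned); // wrap into [0, 1) like a tiled UV
}
```

Raising `Tiling` packs more scanlines onto the screen; `PanSpeed` sets how fast the bright scan band travels down it.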
Finally, we produced a “Material Instance” from this Material that we created, so that when we want to make changes, we won’t be forced to go back again and again and can work faster and more efficiently.
Because all the terminal screens we use in the game have a standard size, we weren't forced to make any later adjustments or changes. After the materials are prepared, all that remains is to specify which screen appears on which Terminal, to preserve the folder hierarchy, and to design the Puzzle interface and bring it to the screen while keeping the same effect during certain events.
We came across many problems while designing the puzzle interface. It had to match the narrative of the game, remind you each time that you are on a spaceship, and look pleasant. Roughly, the steps we followed while doing all this can be listed as below; I will try to explain each a little further afterwards.
1) Identify the action/problem to be done/solved.
2) Reduce the action/problem to its simplest form and add the steps to be done/solved.
3) Design the interface as a concept for the action/problem.
4) Improve it (design, programming).
1) This is the point at which we decide whether the activity/problem in the game will be overcome through a physical action or by using the Terminal. This section, which varies constantly with the story, can be passed very quickly when it is a physical action (e.g. opening valves in order), because a modularly designed interactive system is the same everywhere in situations like this: press [E]. But when something had to be solved on the Terminal screen, our job was generally a lot more difficult, and the same system wasn't usable there.
2) For example, let's say we are going to connect to another section and open a door through the Terminal. Having the player wander around the folder hierarchy, which changes for each terminal, is relatively easy, once they understand which commands to use, how, and where. But things change when it comes to the Puzzle screen. Making a simple thing like opening a door difficult yet understandable, and turning it from one step into several, became the part we struggled with the most. Continuing with the door example, the things the player has to do could be as listed below.
A) Obtain the special program that opens the door, or the required information, from within the Terminal or from a physical object in the level.
B) Find the "special section" where you can use this information, or connect to it using the command system.
C) After gaining access to the special section, use the proper information, acquired earlier, that works only for this section. (Piston Puzzle Gif)
3) When we want a puzzle to be solved using the Terminal screen, we prefer to start by designing a concept for the interface. This way we both save time and can foresee problems that might arise, and at the same time we can tell whether it suits the theme of the game. The biggest difficulty here was fitting a screen inside the screen and keeping everything visible. It had to be functional and, where appropriate, provide visual or auditory feedback. We also wanted some parts to be animated, so all of the puzzle elements were individual parts, sometimes designed in blocks. You can see below some of the concept designs, the versions used in the game, and some puzzle parts.
4) Lastly, there is the improvement step. Here we looked at how we could improve the programming and the interface we had finalized while keeping them understandable. Even when we didn't want to, it was sometimes hard to remove or change parts we found beautiful, but it was something we were forced to do.
Colors & Fonts
The TARTARUS terminals are generally monochrome; in other words, they consist of a single color or of values of a single color. We made them this way primarily because we wanted simple, understandable designs, and also because it was very harmonious with the general atmosphere and theme of the game. The fairs we participated in and the first responses we received overwhelmingly told us that we were on the right path, so we didn't have to make many changes later. In the sections that depart from the general design (interactive areas), we preferred to use red and green, trying in this way to make access more comfortable and the meaning clearer, both consciously and subconsciously.
Another color-related feature of the terminals is that the user can change the Terminal text color with a specific command, "Color". Even though we had thought at first that nobody would use it, we saw in the gameplay videos we watched that many players played the game in different colors. There were two reasons we wanted to add, and keep, the option for players to change the colors. The first was that it is something a DOS user, or anyone with command-line experience, is familiar with, and we wanted to revive their memories. The second was that it would make the experience more personal for those without prior command-line experience. To summarize briefly, we thought it wouldn't kill anybody to liven up the otherwise monotone terminal screens with a little color and animation. We think we made the right decision.
Another important issue was the typeface that would be seen on the terminal screens. It shouldn't be too thick or too thin, because the terminal screens have a curved surface like a CRT monitor, and we didn't want the characters near the borders of the screen to be lost or become unintelligible. It also shouldn't be illegible or hard to read, and it should fit the general theme of the game and the terminals. In the end, we chose a font that suited us: legible, strong, and inspired by DOS and the Commodore.
Another problem we experienced was FPS loss in the game. After finishing the prototype stage of the terminals and integrating them into the game, we saw an average 30% FPS loss. Using the profiling tools, we determined that the problem originated from the Capture Camera. TARTARUS is a game centered on solving puzzles, mostly by using terminals, but at the same time you solve all the other problems on the ship by controlling the main character. So there was no reason for the Capture Camera to capture every frame while you are in character control. We disabled the settings that made it capture every frame by default: with the Capture Camera selected, we turned off Capture Every Frame and Capture on Movement in the Details panel.
That solved the FPS problem we experienced. If your game is controlled only through terminals, there is no reason for you to worry about this.
Apart from this, we made sure not to carry out unnecessary processes in Event Tick. Because Event Tick runs every frame, unnecessary work there leads directly to performance loss.
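The policy behind those settings can be sketched in plain C++ as a capture-on-demand pattern (in the engine this corresponds to disabling per-frame capture and re-rendering the target only when the terminal content changes; the struct and names below are illustrative):

```cpp
// Sketch of the capture policy described above: instead of capturing
// on every tick, capture only when the terminal text actually changes.
struct TerminalCapture
{
    bool bCaptureEveryFrame = false; // the expensive default, now disabled
    int  CaptureCount = 0;           // stands in for render-target updates

    void Tick()                   { if (bCaptureEveryFrame) Capture(); }
    void OnTerminalTextChanged()  { Capture(); } // capture only on demand
    void Capture()                { ++CaptureCount; }
};
```

With `bCaptureEveryFrame` off, hundreds of idle frames cost nothing; the render target is refreshed only when a keystroke changes what the screen shows.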
Another of our problems was that the terminals lose focus in situations like Alt+Tab. This isn't a problem if we are in a menu and the user is interacting with the mouse, but we didn't want the user to have to reach for the mouse in a setting where they only use the keyboard. Besides, some people might even think it was a glitch. This is why we set things up as laid out below: the user keeps focus on the terminal unless they get up from it.
The On Focus Lost event is triggered the moment focus is lost, and we return focus to the terminal. The important point here is that if the user wants to leave the terminal willingly, we set the Lost Focus variable to true; in that case, even if the event fires, focus doesn't automatically pass back to the terminal.
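The focus rule above can be summarized as a tiny state machine (a plain C++ sketch of the Blueprint logic; all names are illustrative):

```cpp
// Sketch of the focus rule described above: when focus is lost, give
// it straight back to the terminal, unless the player deliberately
// left the terminal (the Lost Focus flag).
struct TerminalFocus
{
    bool bLostFocusOnPurpose = false; // set when the player stands up
    bool bHasFocus = true;

    void OnFocusLost()
    {
        bHasFocus = false;
        if (!bLostFocusOnPurpose)
            bHasFocus = true; // e.g. after Alt+Tab, refocus the terminal
    }

    void LeaveTerminal()
    {
        bLostFocusOnPurpose = true; // the one case where focus may go
        OnFocusLost();
    }
};
```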
The things we shared above come from our own experience and from learning by doing. We think it's a wonderful example of what you can do with just the Blueprint system and a basic material setup. We hope it was helpful, even if just a little. With the wish to do better.
Lastly, we thank Kirill Tokarev for the support he gave for the article.