A cloud-based platform for interactive archviz applications. It originated in my Bachelor's dissertation, became the focus of my Master's degree project, and eventually led to the founding of a startup supported by two government grants.
As an Unreal Engine Generalist, I was responsible for crafting all the 3D environments used in the project prototypes. This involved 3D modeling, creating material shaders, implementing gameplay mechanics, and configuring lighting and graphics quality in the engine to meet the product's expected standards.
As a Tech Lead, I directed a team of six developers in the construction, assembly, and configuration of essential software components. I played a key role in designing the architecture of the cloud platform, shaping technology choices, and contributing directly to the development of various components.

Early experiments with an interactive Unreal project being rendered remotely and transmitted to a mobile device over the network.

Proof of concept created for UI Testing with end-users. The 3D models composing the level were acquired from online libraries.

Phase 1 - Custom UI
The project began in 2012 with the goal of creating real estate marketing experiences that could seamlessly blend three traditionally mutually exclusive characteristics: interactivity, high visual quality, and availability on a wide range of devices. This combination was made possible by emerging Cloud Gaming technology, which is based on remote 3D graphics processing.
My initial step in turning this vision into reality was delving into the world of game engines. With a background in traditional 3D modeling for architectural visualization, my objective was to establish a workflow that would let archviz artists like me transition efficiently into real-time rendering.
With an interactive archviz scene of good visual quality in hand, the next step was producing a proof of concept showing that it could run remotely and stream the visuals smoothly to the user's device. Tests showed that rendering the application remotely and sending the frames over the Internet introduces latency and inconsistencies that can frustrate users. To address these issues, I proposed a series of User Interface guidelines and mechanisms aimed at mitigating these problems. This phase was the focus of my Master's degree project.
The key requirement for a suitable UI in this scenario was minimizing interactions that depend on continuous visual feedback, such as dragging the mouse to rotate the camera or requiring users to hold buttons pressed while the character moves. Instead, the controls were primarily transformed into discrete alternatives, such as "click and wait".
Rather than being built in-engine with UMG, where they would be susceptible to latency effects, the menus were developed in the web layer, allowing instant visual responses to user interactions.
All of these concepts and functionalities had to be developed from scratch or with external libraries, such as Nvidia NVENC for video encoding and RabbitMQ for communication with the web layer, as Pixel Streaming would only become available many years later.
Phase 2 - Streaming approach
Having successfully developed a proof of concept, the logical progression was to transform the idea into an actual product. The prototype earned me a government award under the FAPESP Innovative Research in Small Business (PIPE) funding program, which is designed to support scientific and technological research carried out by startups.
As co-founders of the startup Sureale, my friends and I then aimed to create a prototype of a functional platform ready for commercial deployment. In addition to advancing the technology required for remote application processing, we also had to build various components to meet the market's specific demands.
Instead of shipping monolithic, self-contained Unreal Engine applications, our goal was to connect the content to external databases. This kept the assets synchronized with the client's stores and webpages, allowing end-users to engage with the products in a more meaningful way.

Unreal Engine project created with prefab assets after being processed through the Sureale Platform's guidelines.

Content creation guidelines
As an Unreal Engine Generalist, I was responsible both for crafting a UE project for use in the prototype and for creating guidelines that would enable any 3D artist to produce projects compatible with the Sureale Platform. The aim was to maintain high visual standards while minimizing deviations from common archviz workflows.
I selected assets from online libraries that would represent the typical architectural visualization projects our potential clients were interested in. I then studied methods to optimize and adapt these assets to meet the platform constraints.
For house structures, it was essential to have each wall face as a separate mesh to enable individual selection via raycasts in the engine. Additionally, these meshes needed to have properly unwrapped UVs for lightmaps.
Furniture assets required optimization, consolidating them into a single mesh with multiple material IDs to reduce draw calls. In the engine, LODs were generated to maintain consistent performance across levels with multiple objects.
The most challenging part of the guidelines was balancing striking visuals with an acceptable level of interactivity. At that time, in the early iterations of UE4, we had to choose between static lighting with Lightmass for photorealistic but non-moving assets, or dynamic lighting with Light Propagation Volumes (LPV) for movable assets at the expense of lower-quality Global Illumination.
In the end I adopted a hybrid approach that combined static and dynamic elements. The primary house components, like ceilings, walls, and floors, which were unlikely to require movement interactions, were designated as static meshes with lightmaps. This served as a stable foundation for the lighting system and allowed for the propagation of Volumetric Lightmaps throughout the level.
The volumetric lightmaps effectively illuminated movable furniture, enabling them to seamlessly blend with the high-quality static meshes in the scene. Furthermore, Distance Field Ambient Occlusion and Distance Field Indirect Shadows were selectively applied to specific dynamic objects, providing more realistic indirect shadows that grounded these objects in the environment.
However, movable assets with intricate details still lacked the level of detail achievable with static lightmaps. For these, I added a step to bake Ambient Occlusion in the 3D software and emulate AO through material nodes, as detailed in the Modern Barn project.
Countless hours of study and testing were necessary to establish a well-optimized setup for Lightmass and Volumetric Lightmaps. This effort was aimed at attaining striking visuals that would align with the quality standards of potential clients accustomed to traditional offline renders or fully static lightmap setups in Unreal.
Lightmass studies on house static structures.
"Light only" visualization of static lighting.
Stationary artificial lights in the interior enable high-quality GI for the static elements, and acceptable GI for dynamic objects (volumetric lightmaps).

Demonstration of Sureale Platform working with game being processed remotely and menu UI in the web layer. The experience starts with a video sequence that can be paused and interacted with. Later the user takes control of character movement.

Gameplay mechanics
We employ a first-person control mechanism, similar to conventional games, but with adjustments to make it compatible with cloud streaming technology. When the Internet connection suffers from high latency, the controls switch to a "discrete" mode, where clicks and taps take precedence over dragging.
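The latency-driven mode switch can be sketched as follows. This is a minimal, hypothetical illustration (the class and threshold are mine, not the platform's actual code): the client keeps a rolling window of recent round-trip times and falls back to discrete "click and wait" controls when the average exceeds a cutoff.

```python
# Hypothetical sketch of latency-based control mode switching.
from collections import deque

LATENCY_THRESHOLD_MS = 120  # assumed cutoff, not the production value

class ControlModeSelector:
    def __init__(self, window=10):
        # Keep only the most recent RTT samples (in milliseconds).
        self.samples = deque(maxlen=window)

    def record_rtt(self, rtt_ms):
        self.samples.append(rtt_ms)

    def mode(self):
        if not self.samples:
            return "continuous"
        avg = sum(self.samples) / len(self.samples)
        # High latency: prefer discrete clicks/taps over drag-based input.
        return "discrete" if avg > LATENCY_THRESHOLD_MS else "continuous"
```

Averaging over a window rather than reacting to a single sample avoids flip-flopping between modes on momentary network spikes.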
The implementation proceeds much like a traditional game. We use a straightforward "Move To Location" node to relocate the character to specific positions. All objects within the scene carry accurate collision configurations, enabling them to be selected via raycasts and to contribute to Navmesh generation.
All selectable objects in the scene are based on common blueprints with essential functions for platform operation. For instance, when a sofa is chosen, the game sends an ID to the web client to fetch relevant database data for customization menus. If users want to change the sofa material, the web client sends a "change material" request to the game, specifying the material ID and the target object part. The blueprint is prepared to handle this command effectively.
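The request flow between the web client and the game can be sketched as a small JSON protocol. This is an illustrative sketch, assuming hypothetical message fields and object names; the real blueprints handle the engine side.

```python
# Hypothetical sketch of the web-client <-> game message exchange.
import json

class CustomizableObject:
    """Minimal stand-in for a blueprint-backed scene object."""
    def __init__(self, object_id):
        self.object_id = object_id
        self.materials = {}  # part name -> material ID

    def set_material(self, part, material_id):
        self.materials[part] = material_id

def make_change_material_request(object_id, part, material_id):
    """Web-client side: build a 'change material' message for the game."""
    return json.dumps({
        "type": "change_material",
        "object_id": object_id,
        "part": part,
        "material_id": material_id,
    })

def handle_message(raw, registry):
    """Game side: dispatch an incoming request to the target object."""
    msg = json.loads(raw)
    if msg["type"] == "change_material":
        registry[msg["object_id"]].set_material(msg["part"], msg["material_id"])
        return True
    return False
```

Keeping the protocol object-ID-based means the web layer never needs scene geometry: it only exchanges identifiers that both the database and the blueprints understand.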
The base blueprints also function as tools to register data in the database. These blueprints store essential information, including name, category, compatible materials, and more. When 3D artists create a new furniture piece, they create a child blueprint of the appropriate base blueprint, add static meshes and other components that form the object, and fill in the required variables. At the end of the workflow, a script processes all created furniture, extracts the relevant data, render thumbnails and populates the database. We employ various base blueprints tailored to different categories of interactive objects, such as furniture, house parts, and materials, aligning with our product's needs.
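The registration pass described above can be sketched like this. It is a simplified, hypothetical model (plain data classes stand in for the base blueprints, and thumbnail rendering is reduced to a path, since it depends on the engine): walk every furniture entry, extract the fields the platform needs, and populate the database.

```python
# Hypothetical sketch of the furniture registration script.
from dataclasses import dataclass, field

@dataclass
class FurnitureBlueprint:
    """Stand-in for a child blueprint filled in by a 3D artist."""
    name: str
    category: str
    compatible_materials: list = field(default_factory=list)

def register_furniture(blueprints, database):
    """Extract metadata from each blueprint and populate the database."""
    for bp in blueprints:
        database[bp.name] = {
            "category": bp.category,
            "compatible_materials": list(bp.compatible_materials),
            # In practice the engine renders this thumbnail; here we
            # only record where it would be stored.
            "thumbnail": f"thumbnails/{bp.name}.png",
        }
    return database
```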
Specific interactions, like turning lights on and off and controlling TVs, fans, radios, etc., were implemented within child blueprints, all using a common event trigger from the parent class.

Demonstration of Sureale Platform Phase 3, with hybrid solution that combines 360 panorama with real-time remote processed Unreal applications.

Phase 3 - Hybrid approach
Even with a robust platform performing exceptionally well with streaming technology, we faced a challenge in delivering quality and interaction in a more scalable and cost-effective manner. Given the high expenses and limited availability of cloud GPU resources in the subsequent years, we needed to explore ways to enhance the platform's hardware efficiency.
To address this, we adopted a hybrid approach, combining client-rendered 360 virtual tours with remote server-based 3D graphics processing in Unreal. This approach provided unmatched interaction freedom, allowing users to explore the environment without pre-rendered constraints and customize furniture and materials limitlessly.
While users engage in local client interactions with the 360 panorama, like rotating the camera, zooming, selecting objects, and navigating menus, the Unreal application on the server can concurrently handle requests from other users. This approach strikes a balance between server responsiveness and resource utilization during peak loads, accommodating several users simultaneously sharing the same application.
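The shared-server scheme can be sketched as a simple request queue with per-user state. This is a hypothetical simplification (class and field names are mine): while users interact locally with their panoramas, their server-side requests are processed one at a time by a single Unreal instance, with each user's customizations kept isolated.

```python
# Hypothetical sketch of one remote Unreal instance serving several users.
from collections import deque

class SharedRenderer:
    def __init__(self):
        self.queue = deque()    # pending (user_id, request) pairs
        self.user_states = {}   # per-user customization state

    def submit(self, user_id, request):
        """Queue a server-side request, e.g. re-render after a change."""
        self.queue.append((user_id, request))

    def process_next(self):
        """Apply the oldest request against its owner's state only."""
        if not self.queue:
            return None
        user_id, request = self.queue.popleft()
        state = self.user_states.setdefault(user_id, {})
        state.update(request)
        # In the real platform this is where a new 360 cubemap would
        # be captured and streamed back to that user.
        return user_id, dict(state)
```

Because each request is applied only against its owner's state before capture, users sharing the application never see one another's customizations.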
Naturally, this required a complete overhaul of the gameplay mechanics to manage multiple user states without data conflicts. We also had to streamline the capture of 360 cubemaps as much as possible, which meant forgoing graphical features that rely heavily on temporal stabilization and camera angle, such as TAA (Temporal Anti-Aliasing) and SSR (Screen Space Reflections). These were replaced with alternatives like FXAA and resolution supersampling, alongside optimized ray-traced reflections.
We also introduced a feature where object IDs are encoded into the stencil buffer to create masks for each cubemap face. These masks store the shapes and IDs of every object in the view, allowing users to select and highlight them in the client without requiring server-side raycast operations.
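Client-side selection against such an ID mask can be sketched in a few lines. This is an illustrative sketch under assumed data layout (a row-major 2D array of integer IDs per cubemap face, with 0 as background); the real masks come from the engine's stencil buffer.

```python
# Hypothetical sketch: resolve a click against a per-face object ID mask,
# avoiding any server-side raycast.
def object_at(mask, x, y):
    """Return the object ID at pixel (x, y), or None for background.

    `mask` is a row-major 2D list of integer IDs (0 = no object),
    matching the resolution of the cubemap face.
    """
    object_id = mask[y][x]
    return object_id if object_id != 0 else None
```

The same mask doubles as a highlight source: all pixels sharing the selected ID form the outline the client draws around the object.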
Credits:
Sureale Team (sureale.com)
