Article

Innovating Industrial Training with Immersive Metaverses: A Method for Developing Cross-Platform Virtual Reality Environments

by Lucas G. G. Almeida 1, Nalini V. de Vasconcelos 2, Ingrid Winkler 2,3,* and Márcio F. Catapan 1

1 PPGEM—Graduate Program in Manufacturing Engineering, Federal University of Paraná, Curitiba 81530-000, Brazil
2 Department of Management and Industrial Technology, SENAI CIMATEC University Center, Salvador 41650-010, Brazil
3 Institute for Science, Innovation and Technology in Industry 4.0/INCITE INDUSTRIA 4.0, Salvador 41650-010, Brazil
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(15), 8915; https://doi.org/10.3390/app13158915
Submission received: 22 May 2023 / Revised: 27 July 2023 / Accepted: 31 July 2023 / Published: 2 August 2023
(This article belongs to the Section Applied Industrial Technologies)

Abstract:
The metaverse has garnered significant attention for its potential to provide engaging and social experiences in virtual reality. Despite substantial investment and interest from industry, there remains a lack of academic research on the development and implementation of metaverses for industrial training. Notably, research indicates that virtual reality training is, on average, four times faster than classroom-based training. This study proposes a method for developing immersive metaverses for industrial training, leveraging specialized tools like Epic Games’ Unreal Engine software version 4.27.2. To assess the efficacy of this method, a cross-platform metaverse was developed, and a questionnaire was administered to game developers. The results indicate that even junior developers and those with limited experience can comprehend the method, suggesting that it is possible to develop immersive virtual worlds with an emphasis on professional training even without prior experience with 3D modeling or third-party licensing.

Graphical Abstract

1. Introduction

During the early 2020s, the COVID-19 pandemic caused significant disruption to commercial enterprises in all industries worldwide. To slow the spread of the disease, many countries implemented physical distancing measures, including travel restrictions and bans on public events, which in turn led to an increased demand for remote work [1]. The resulting need for innovative solutions, such as virtual reality (VR) technology and online platforms for real-time communication, has intensified.
Immersive virtual reality (IVR) technology, for example, allows people to engage in shared virtual worlds simultaneously, even when physically distant [2]. Advancements in head-mounted display technology have enabled a growing number of applications for immersive virtual reality, including those in healthcare, education, fashion, and tourism [3].
Figure 1 provides an illustration of metaverses that employ immersive virtual reality by combining three-dimensional spaces with a virtual representation of the user. These technologies enable efficient collaboration on remote projects and allow essential activities and meetings to take place even when pandemic constraints make in-person gatherings impossible.
The utilization of immersive technology within a multiuser platform, such as the metaverse, has the potential to enhance training and information retention in specific applications. However, the development of such an application requires careful consideration of various factors, including network performance challenges, user synchronization, and system security [4].
Although significant investment has driven explosive growth in the metaverse industry, several problems must still be addressed [5]. Research in this area has the responsibility to study these problems and provide guidelines for successful metaverse development. The lack of academic discussion on the topic of the metaverse is a major concern.
In manufacturing environments, it is important to question solutions to problems, direct the search for best practices, and verify that they are being implemented correctly. These environments can be disorganized, and participant observation can help identify the need for methods or tools to guide results more effectively [6].
Despite the technology's various applications, to the authors' knowledge no previous work in the literature specifically addresses the technical issues involved in developing VR applications for standalone devices such as the Meta Quest 2 (from Meta Platforms, currently the world's best-selling virtual reality headset thanks to its high quality, competitive price, and cordless operation, which gives users freedom of mobility) [7]. Recent works have researched the use of immersive virtual reality to enhance industrial training; however, they lack in-depth exploration of the software development process, particularly in terms of creating a framework for implementing such training programs [8,9]. Several studies have explored the application of virtual reality headsets, but they have not justified the use of older equipment such as the Oculus Rift S, which demands a high-performance computer [8,9]. Recent research has highlighted the significance of optimizing 3D models to enhance performance in virtual reality applications; nonetheless, there is a noticeable gap in the literature concerning comprehensive investigations into the optimization of 3D models tailored for industrial applications on standalone headsets such as the Meta Quest 2 [8,9,10]. As a result, there is still a notable deficiency in the existing knowledge on this subject.
Since these standalone devices run the Android operating system, the development process involves multiple stages of optimization to ensure smooth performance above the accepted framerate at high quality, thereby preventing motion sickness [10]. To scale the development of VR applications for industrial training and skill development, addressing such technical challenges is essential, enabling the scientific community to create applications that cater to a broader audience of developers.
With these considerations in mind, our study aims to propose a method for developing immersive metaverses for industrial training, leveraging specialized tools like Epic Games’ Unreal Engine software version 4.27.2.
Our approach draws from common methods and approaches used in multiuser VR game development but adapts them specifically for industrial training purposes. What sets our method apart is its emphasis on creating cross-platform virtual reality environments that can be readily utilized by professionals, including junior developers without prior experience in 3D modeling or third-party licensing. Furthermore, our study highlights the potential benefits of using immersive metaverses in industrial training, including improved information processing and memory retention. As such, our research provides a solid foundation for further exploration and development of immersive metaverses to meet industrial training needs, contributing to the ongoing discussion about the potential applications of this technology in various industries.
The result of this research is expected to make two significant contributions (one academic and one for the industry). From an academic standpoint, the work could serve as a reference source for future research in the field of immersive metaverses, which is particularly important given the lack of academic discussion on the topic of the metaverse. From an industry standpoint, the hope is that the resulting technological product can help improve problem-solving in various industrial sectors.
This document is structured as follows: Section 2 describes the materials and methods used, Section 3 presents and analyzes the results, and Section 4 provides our conclusions and suggestions for further research.

2. Methods

The study is exploratory: little research has been done on intuitive virtual reality authoring tools, so the concept must first be explored and comprehended. Qualitative research is particularly useful when the researcher does not know the important variables to examine [11].
Following the approach of Ref. [12], an expert in virtual-reality-based industrial training defined the initial research strategy, which was then assessed individually by one senior researcher, while two other senior researchers peer-debriefed the strategy to validate the findings. Qualitative research is interpretative research [13]; therefore, it is relevant to the outcomes of this study that the authors have a strong background and experience with 3D visualization tools, graphic design, creating experiences for virtual reality devices, and game engines such as Unity and Unreal Engine 4, among others.
This study adopted the Design Science Research paradigm. In addition to a knowledge contribution, effective DSR should make clear contributions to the real-world application environment from which the research problem or opportunity is drawn [14]. Our method parallels that described by Chamusca et al. [12], which includes six steps: (1) Identify problem; (2) Define solution objectives; (3) Design and development; (4) Demonstration; (5) Evaluation; (6) Communication.
In Steps 1 and 2, we carried out a systematic literature review to examine the problem situation and define the solution objectives. The review adopted a qualitative approach to identify the central issues in the field (i.e., to summarize the literature by pointing out its central issues) [11]. Because little research has been done on the metaverse and multiuser virtual reality settings for industrial training, these notions must be studied and understood, and qualitative research is especially effective when the researcher is unsure of the essential factors to investigate [11]. The first search strategy was established by an expert in virtual-reality-based industrial training and then evaluated individually by a senior researcher. Because qualitative research is interpretive research [11], the authors' extensive knowledge of and experience with 3D modeling is significant to the study's findings. The review was carried out in four stages: the review protocol, data collection, data filtering, and data synthesis, as explained in the sections that follow.
The first stage involved the development of the review protocol, which began with an analysis of the syntax of the research topic to generate a search string in various databases. As the topic is relatively recent, the search string was simplified to include a greater number of articles related to training, education, and problem solving.
Scopus and Web of Science were chosen as the scientific knowledge databases for data collection. These databases were chosen because they are trustworthy, multidisciplinary scientific databases of international scale with extensive citation indexing coverage, providing high-quality data from scientific publications. Scopus presently holds 87 million vetted documents [15], whereas Web of Science contains over 82 million items [16]. Table 1 depicts the outcomes of the search conducted in these databases, which retrieved 1509 articles in Scopus and 603 articles in Web of Science.
Table 2 shows the preliminary inclusion and exclusion criteria established in the data filtering stage. The filtering process consisted of four filters and included the exclusion of duplicate references, reading of the title, abstract, and keywords, reading of the introduction and conclusion, and full reading of the articles. After the filtering process, 26 articles remained.
In Step 3 of our Design Science Research approach, we designed an innovative method for developing an immersive metaverse for industrial training.
In Step 4 we demonstrated a proof-of-concept of our proposed method and tested it.
DSR includes gathering evidence that the artifact is useful, meaning that the artifact works and does what it is intended to do [14]. Thus, in Step 5 we evaluated the proposed method against validity criteria by carrying out a simulation with five metaverse application developers with different levels of knowledge in the field.
Step 6 entails communicating our findings from this work.

3. Results and Discussion

In the subsequent sections, we present and analyze the outcomes of the literature review, the proposition of a method of developing an immersive metaverse for industrial training, the implementation and testing of the proposed method, and, finally, the application of the proposed method with metaverse developers to verify its applicability.

3.1. Current Knowledge on Virtual Reality and Metaverse

To ground the propositions of our method, the selected documents were examined for the advantages, usability, and development difficulties (such as connectivity) of multiuser environments in immersive virtual reality. We delve into the concepts of the metaverse and virtual immersion and address the benefits of employing virtual reality as a learning tool, with a particular emphasis on industrial training. Furthermore, we discuss the subject of multiuser environments, with a focus on user representation, their social interactions, and the crucial role of a voice system in augmenting immersion.
Virtual reality is a technology enabling users to interact with computer-simulated environments [17]. Due to its highly visual and immersive 3D environments, virtual reality is extensively employed for education and training purposes. As an innovative teaching medium, virtual reality offers students practical and interactive experiences, significantly enhancing academic performance. The following paragraphs examine industrial training comprehensively, with an emphasis on its relationship with the metaverse and learning.
The metaverse concept was initially introduced by Neal Stephenson in his science fiction novel, Snow Crash [3]. The novel features characters embodying avatars and functioning within a three-dimensional (3D) virtual reality called metaverse. The term metaverse signifies a virtual reality transcending physical reality, originating from the combination of meta, meaning transcendence, and universe. The metaverse represents a digitized world, accessible via internet-connected devices, where users can experience an alternate reality and collaborate in real-time. This study adopts this metaverse definition as it proceeds with the research.
In terms of immersive industrial learning and training, incorporating learners’ skills into real-world professional settings presents a complex challenge due to safety, personnel costs, liability issues, repetitive work, and insufficient feedback [17]. Nevertheless, immersion and fidelity are crucial in promoting active learning among participants, with increased immersion resulting in improved learning outcomes [17,18]. Virtual environments hold the potential to serve as educational platforms, facilitating real-world simulations, professional training, synchronous interaction, and global collaboration. This, in turn, creates immersive and interactive learning experiences and enhances student engagement [17,19].
A literature review on the efficacy of immersive metaverse industrial training [20] revealed that virtual reality improves knowledge retention by 80% [21]; VR-based training is, on average, four times faster than classroom training [22]; students utilizing virtual reality in training are 400% more focused compared to those employing traditional methods [22]; and individuals trained via virtual reality demonstrated reduced performance errors and increased precision compared to those trained through conventional approaches [21]. Consequently, immersion, as the most subjective characteristic, is directly proportional to learning. In the context of industrial training, employing virtual reality technology can add value due to its potential to augment information retention.
Regarding multiuser environments, the growing popularity of virtual reality will drive the demand for more diverse multiplayer experiences, necessitating multiuser experiences [23]. Social interactions among participants form the foundation for collaborative interaction and can be amplified through specific virtual reality features, including shared vision, contribution, and communication [24].
On the other hand, user representation within the virtual environment is a crucial consideration. Different representation levels entail varying implications and benefits in terms of immersion, virtual awareness, and computational cost [23]. For single-player experiences, increased avatar complexity does not significantly enhance immersion but incurs performance costs, which are vital to virtual immersion.
Regarding social interactions in multiuser environments, facilitating seamless interactions among users is crucial. This necessitates accurate replication of the environment, well-defined interaction schemes, and appropriate feedback mechanisms to ensure that users are aware of their engagement with others. An effective multiuser virtual reality framework should aim to achieve these objectives while also providing ease of integration for various types of social interactions [23].
Another crucial aspect relates to voice communication systems in multiuser environments since verbal communication is a highly desirable feature within multiuser virtual environments [23]. Contemporary virtual reality devices are equipped with integrated microphones to facilitate this functionality. For a truly immersive metaverse experience, it is imperative to incorporate key elements such as multiuser environments, user representation, immersive interactions, and a robust multiuser voice communication system.
The theoretical foundation underscores the merits of integrating immersive virtual reality technology with the metaverse concept. It is essential that a metaverse provides multiuser environments, user representation, immersive interactions, and a multiuser voice communication system to achieve its intended impact. The potential application in industrial training highlights the necessity and benefits of employing an immersive metaverse for learning. The development of a virtual reality environment that fosters social interaction, visual representation, and dialogue is crucial for enhancing industrial training outcomes.
The method proposed in this study will leverage these concepts to optimize the impact of the technology; the subsequent section delineates the methodological approach.

3.2. Proposition of a Method of Developing Immersive Metaverses for Industrial Training

Figure 2 presents the method proposal for developing an immersive metaverse. To provide a clearer understanding of the subject, the method will be described in topics.

3.2.1. Game Engine

The first step of the proposed method is to choose the game engine for developing the immersive metaverse. Game engines are structures or frameworks used in creating digital games [25]. These tools promote software reuse and facilitate the implementation of recurring tasks in the game, allowing developers to focus on content creation. To create immersive experiences that are believable to the human mind, virtual reality requires complex scenes rendered at high frame rates [26].
With the evolution of the electronic gaming industry, it is possible to develop immersive virtual reality projects with popular game engines such as Unity and Unreal Engine 4, the latter known for the quality of its graphic rendering. Both engines are extensively used by professional studios, are flexible, are free for personal use, and offer a large number of free online lessons and resources. For this project, Unreal Engine 4 (UE4) was chosen over Unity because it offers two key advantages: better graphics [27,28,29] and simpler programming. Whereas Unity uses C# scripts edited through Visual Studio, UE4 supports both C++ coding and visual scripting through Blueprints [28]. Moreover, UE4's publicly available source code further adds to its appeal, giving users the freedom to customize and tailor their experiences [27]. In UE4, a Blueprint is a visual representation of code that enables input addition, component spawning, and variable configuration without traditional coding. Additionally, compared to Unity's capabilities, UE4's platform navigation, asset usage, and placement are considered more beginner-friendly and intuitive.
In addition to these benefits, the Unreal Engine 4 also boasts a comprehensive network stack specifically designed for facilitating communication between servers and clients, primarily tailored for multiplayer gaming scenarios [27]. This built-in capability enables seamless collaboration among various users, whether they are located in the same physical space or working remotely. Consequently, it empowers the creation of immersive metaverses that foster cooperative interactions and shared experiences for industrial training purposes, as shown in Figure 3.
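As an engine-agnostic illustration of this server-client model (a plain Python sketch, not Unreal's actual replication API; every class and field name here is hypothetical), an authoritative server can hold the shared world state and replicate snapshots to every connected client:

```python
class Server:
    """Authoritative holder of the shared world state (conceptual sketch)."""
    def __init__(self):
        self.state = {}          # e.g., {"press_lever": 0.0}
        self.clients = []

    def connect(self, client):
        self.clients.append(client)
        client.state = dict(self.state)   # initial sync on join

    def apply_input(self, key, value):
        """Clients send inputs; only the server mutates the state."""
        self.state[key] = value
        self.broadcast()

    def broadcast(self):
        for c in self.clients:
            c.state = dict(self.state)    # replicate a state snapshot

class Client:
    def __init__(self, name):
        self.name = name
        self.state = {}

server = Server()
a, b = Client("trainee_a"), Client("trainee_b")
server.connect(a)
server.connect(b)
server.apply_input("press_lever", 0.75)   # trainee A moves a lever
print(b.state["press_lever"])             # trainee B sees the same value: 0.75
```

The key design point, which UE4's networking shares, is that clients never write directly to each other: all changes funnel through the server, which keeps every participant's view consistent.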

3.2.2. 3D Modeling

Understanding the project's goal is crucial before carrying out an industrial training project, since it helps estimate the degree of precision and detail needed for the 3D modeling. 3D scanning and hard-surface modeling are two modeling methods frequently utilized in industrial training.
3D scanning may be achieved using photogrammetry or lasers, allowing for the reconstruction of 3D models, although the resulting meshes typically contain a high number of polygons, which degrades performance. To address this issue, 3D object retopology is a good choice for simplifying the model and making it suitable for more demanding game engines [30].
3D modeling can be divided into two main categories: organic modeling and hard-surface modeling [31]. Organic modeling relates to characters, animals, trees, plants, and other living objects, whereas hard-surface modeling covers human-made or constructed items [13,32] with smooth, static surfaces, such as machines, vehicles, robots, weapons, and other non-living objects [31]. Hard-surface modeling may be accomplished using programs such as 3ds Max, SketchUp, Maya, Modo, and Blender, among others; if mechanical engineering is the focus, software such as SolidWorks, Inventor, and Catia can also be used.
3D objects are fundamental in the development of the virtual reality environment, and their quality can influence the user’s perception of the virtual world [33]. Although engaging with the virtual world offers many opportunities for viewers, the performance of the current technology is severely constrained. The Meta Quest aims for 300,000–500,000 triangles and 150–175 draw calls per frame, compared to the Oculus Rift S’s 1–2 million triangles and 500–1000 draw calls per frame [7]. Simply put, this implies that in order to maximize speed, the number of polygons in the 3D models had to be drastically reduced, and the quantity of materials, texture sizes, and lighting had to be optimized. The intricate drawings used to create the high-polygon-count 3D models were converted into a more common type of mesh, which was then remeshed and retopologized to lower the polygon count [7]. So, for an industrial training project, hard-surface modeling is the most appropriate choice. 3D scanning can be used to accelerate modeling, providing a point cloud as a reference. Using software like 3ds Max 2022 or Blender 3.5 allows developers to optimize the model for mobile applications like Android and the Meta Quest 2 virtual reality device. It is important to note that we chose Meta Quest 2 due to its popularity and affordability, making it accessible to a wider audience. Additionally, Meta Quest 2 is compatible with the Unreal Engine software version 4.27.2, allowing us to test the proposed methodology in a real-world scenario.
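The budget figures above translate directly into how aggressively a scanned model must be decimated. The sketch below (illustrative Python; the budget table reproduces the per-frame figures cited in the text [7], taking the upper bound of each range, and the helper name is an assumption) computes the fraction of triangles to keep so that a scene fits a given headset:

```python
# Per-frame budgets cited in the text for each headset [7];
# these are guidance values, not hard limits.
BUDGETS = {
    "Meta Quest 2":  {"triangles": 500_000,   "draw_calls": 175},
    "Oculus Rift S": {"triangles": 2_000_000, "draw_calls": 1000},
}

def decimation_ratio(scene_triangles, headset):
    """Fraction of triangles to KEEP so the scene fits the headset budget."""
    budget = BUDGETS[headset]["triangles"]
    return min(1.0, budget / scene_triangles)

# A scanned hydraulic press with 4 million triangles must be reduced
# to 12.5% of its polygons to fit the Meta Quest 2 budget:
print(decimation_ratio(4_000_000, "Meta Quest 2"))   # 0.125
```

A scene already under budget returns 1.0 (no reduction needed), which is why targeting the standalone Meta Quest 2 demands far more retopology effort than targeting the PC-tethered Rift S.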
Hard-surface modeling was chosen and will be developed using Autodesk’s 3ds Max 2022 software. Figure 4 and Figure 5 illustrate a modeled hydraulic press.

3.2.3. Data Transfer Method

The next step involves deciding how the 3D modeling will be exported to Unreal Engine 4. Currently, there are two ways to import files directly into UE4: the FBX transfer method, which is standard in most game engines, and the Datasmith method, which is exclusive to UE4 and allows for easier and faster transfer of 3D modeling to UE4.
Regarding the FBX transfer method, before importing the 3D model into UE4, some modifications need to be made in the modeling software, such as changing the object’s pivot and manually mapping the light (lightmap). The 3D model is saved in FBX format and transferred to the game engine. Additional configurations, such as positioning each object, applying lighting, and directly creating and applying materials, must be made within the engine.
The Datasmith transfer method was created to address the issues faced by those outside the gaming industry who wish to utilize UE4 for real-time rendering and visualization in fields such as architecture, engineering, construction, manufacturing, and live training, to name a few [35]. Datasmith supports the widest range of 3D design applications and file formats, already working with sources including Autodesk 3ds Max, Trimble SketchUp, and Dassault Systèmes SolidWorks, among others, and adds support for more with each update [36].
The Datasmith transfer method enables easy and simple import of 3D models into Unreal Engine 4. It handles the majority of the difficult work to ensure that the scene appears as close as possible in UE4 to how it appears in the 3D tool. It also supports a variety of CAD file types. A plugin from the Epic Games website must be installed before using Datasmith (Figure 6).
For virtual reality projects, Unreal Engine offers a specific template that comes with pre-configured resources, such as means of locomotion, hand animations, and other extras. By importing the modeling using Datasmith, the project already has a ready-made foundation to start development, with materials applied and lightmaps generated automatically (Figure 7 and Figure 8).

3.2.4. Application Programming

UE4 has a marketplace where various types of tools, 3D models, and plugins that can help with development can be purchased. The Advanced Framework (AF) tool was developed in 2019 by the German company Human Codeable to facilitate the development of applications in Unreal Engine 4, with a focus on immersive virtual reality, as shown in Figure 9.
The Advanced Framework is a virtual reality framework for developing virtual reality, mobile, and desktop (i.e., cross-platform) applications such as games, experiences, architectural visualization, training, or product presentations, and implementing a variety of functionalities such as level management, menus, interaction objects, panels, maps, and catalogs, as shown in Figure 10.
This proposed solution has the significant benefit of making it feasible to export the program to numerous platforms, including Windows, Mac, Android, and nearly all current virtual reality devices, through a single project [38].
The entire project can be controlled using UE4's visual programming language, Blueprints. Blueprints allows the entire project to be programmed in an easier and more practical way; it also helps integrate teams, since programmers can demonstrate to colleagues in a practical way how the logic works and how to modify it within Unreal. It is also possible to program in the engine's native language, C++, using Visual Studio 2017, and to mix it with Blueprints; it is all a matter of knowledge and practice.
As the AF can be programmed entirely using the visual script Blueprints, all interactivities, the voice system, avatar system, and connection system were made without the need to directly modify code lines.

3.2.5. Immersive Metaverse

This section presents how to transform the project into an immersive metaverse: creating a multiuser environment in which it is possible to communicate through voice, interact, and create a virtual avatar.
In terms of the connection system, one must be chosen for the environment to become multiuser, consequently transforming it into a metaverse. For the example to be developed, Epic Online Services (EOS) was chosen: a free set of multiplatform services created to make launching, operating, and scaling gamified applications easier and faster [37].
The Advanced Framework development team also developed the EOSLink tool, which will be used as the basis and is demonstrated in Figure 11.
EOSLink works in conjunction with the Advanced Framework to connect to EOS and enable solutions that require multiplayer in a simple way [38]. The tool connects to the Advanced Framework through the AFUEOSLink code. While Epic Online Services provides a wide variety of multiplayer features, EOSLink exposes only what is needed to streamline the multiplayer experience to just two clicks, whether from within virtual reality or on a desktop or mobile environment such as a cell phone.
Since it will be necessary for users to log in and create their avatars in the metaverse, we created the entire interface of the project using Figma software. Screens were created so that the user can choose his or her name, host an online session, and connect to other online sessions, as shown in Figure 12.
The interface screens must then be exported to the game engine; common formats such as JPG and PNG can be used. After importing, the images are assembled within the Unreal Motion Graphics (UMG) UI Designer, a visual UI tool for creating menus and other user interface elements [39] (Figure 13).
The Blueprints visual scripting language is used to program all menu interactions after the appropriate interfaces have been developed, making it feasible to host or join an online session.
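Conceptually, the host/join flow programmed into these menus reduces to three operations against a session backend: host a named session, list available sessions, and join one. The toy Python sketch below illustrates that flow only; it is not the real Epic Online Services API, and all names in it are hypothetical:

```python
class SessionService:
    """Toy stand-in for an online-services session backend (not the EOS API)."""
    def __init__(self):
        self.sessions = {}                      # name -> list of players

    def host(self, session_name, player):
        """Create a named session with the hosting player as first member."""
        self.sessions[session_name] = [player]
        return session_name

    def find(self):
        """List the names of joinable sessions."""
        return list(self.sessions)

    def join(self, session_name, player):
        """Add a player to an existing session and return its members."""
        self.sessions[session_name].append(player)
        return self.sessions[session_name]

svc = SessionService()
svc.host("training_room", "instructor")
svc.join("training_room", "trainee")
print(svc.find())                        # ['training_room']
print(svc.sessions["training_room"])     # ['instructor', 'trainee']
```

The menu screens built in UMG simply bind their buttons to the equivalents of `host`, `find`, and `join`, so the user never sees the underlying connection logic.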
The literature review indicated that it is relevant to have a voice system in a multiuser application. Including this adds another layer of immersion and facilitates communication between players [40]. Vivox, a communication service provider for online games, will be used for the multiuser voice system [41]. Vivox was chosen for its low cost, the ease of applying the platform to UE4 projects, its spatial (3D) sound, and its support for the Advanced Framework in conjunction with EOSLink [38]. Figure 14 shows that the entire Vivox connection was programmed through Blueprints.
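Spatial (3D) sound means that a speaker's voice is attenuated by distance from the listener. A common scheme, sketched below in plain Python (an illustration of the general technique, not Vivox's actual algorithm or API; the function name and default distances are assumptions), is linear falloff between a minimum and maximum audible distance:

```python
import math

def spatial_gain(listener, speaker, min_dist=1.0, max_dist=20.0):
    """Linear distance attenuation for positional voice: full volume
    inside min_dist, silent beyond max_dist, linear in between."""
    d = math.dist(listener, speaker)          # Euclidean distance
    if d <= min_dist:
        return 1.0
    if d >= max_dist:
        return 0.0
    return 1.0 - (d - min_dist) / (max_dist - min_dist)

# A speaker 10.5 m away is heard at half volume:
print(spatial_gain((0.0, 0.0, 0.0), (10.5, 0.0, 0.0)))   # 0.5
```

Applying such a gain per speaker lets trainees standing near each other converse naturally while distant conversations fade out, reinforcing the sense of shared physical space.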
Concerning the interactivity and avatar systems, adding interactions to the application is relevant for maximum immersion. The purpose of the immersive metaverse must be analyzed in order to develop the interactions its application requires, such as interactions with industrial equipment or virtual simulations.
As the 3D modeling becomes ready, some components provided by the Advanced Framework will be used to add these interactions. The Drag component and the Select component are two tools provided by the AF that facilitate the programming of object interaction. The Drag component allows the programming of a 3D model to have a movement relative to other 3D models through sliders, buttons, or other complex shapes [38], as Figure 15 illustrates.
The Select component allows transfer of information to other objects in a quick and easy way, as explained in Figure 16.
Combining these two elements makes it possible to simulate industrial training on the modeled hydraulic press, such as two-hand control training: the Drag component handles interaction with the desired 3D models, while the Select component programs the feedback.
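The two-hand control logic that such a training session simulates can be sketched as below. The 0.5 s simultaneity window is typical of such safety devices but is an assumption here; the exact requirement depends on the applicable standard:

```python
# Sketch of two-hand control: the press may cycle only while both buttons
# are held AND were pressed nearly simultaneously. The 0.5 s window is an
# illustrative assumption, not a normative value.

def press_may_run(t_left, t_right, left_held, right_held, window=0.5):
    """Return True only if both hands are committed to the controls."""
    if not (left_held and right_held):
        return False  # releasing either hand must stop the cycle
    return abs(t_left - t_right) <= window
```

In the metaverse, the two press times come from Select-component feedback on the virtual buttons, and the returned flag drives the press animation.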
It is also possible to simulate training for a regulatory standard such as NR-10, where it is necessary to know how to perform maintenance on an electrical panel. Through the Grab component, 3D models can be grabbed and dropped as desired, provided they have a collision box [38], as Figure 17 illustrates.
After the collision is added and the Grab component is attached to the object, as shown in Figure 18, it can be interacted with on all target platforms, including immersive virtual reality.
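The role of the collision box can be sketched as a simple containment test: a grab succeeds only when the hand's position lies inside a model's axis-aligned bounding box. The object names and extents below are illustrative assumptions:

```python
# Sketch of grab resolution: find the first object whose collision box
# (axis-aligned bounding box) contains the hand position. Without a
# collision box, an object can never be matched and thus never grabbed.

def point_in_aabb(point, box_min, box_max):
    return all(lo <= p <= hi for p, lo, hi in zip(point, box_min, box_max))

def try_grab(hand_pos, objects):
    """Return the name of the first grabbable object containing the hand."""
    for name, (box_min, box_max) in objects.items():
        if point_in_aabb(hand_pos, box_min, box_max):
            return name
    return None

# Illustrative electrical-panel parts, each with (box_min, box_max) in cm.
panel_parts = {
    "cover_plate": ((0, 0, 0), (40, 30, 2)),
    "breaker":     ((10, 5, 2), (15, 10, 8)),
}
```

The engine performs this test with its physics overlap queries; the sketch only shows why the collision volume is a prerequisite.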
After the desired interactions are programmed, the avatar system is created. Figure 19 illustrates the additional UI graphics that offer the ability to generate 3D avatars.
Similar to how the UI menus were implemented using UMG, the avatar system was developed utilizing imported textures and Blueprints programming, as Figure 20 and Figure 21 illustrate.
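The avatar-selection logic behind these UI screens can be sketched as a simple carousel over the imported textures; the class and texture names are hypothetical:

```python
# Sketch of the avatar UI logic: "next"/"previous" buttons cycle through
# the imported avatar textures, and the confirmed index is what gets
# replicated so other players see the chosen avatar. Names are illustrative.

class AvatarSelector:
    def __init__(self, textures):
        self.textures = textures
        self.index = 0

    def next(self):
        self.index = (self.index + 1) % len(self.textures)
        return self.textures[self.index]

    def previous(self):
        self.index = (self.index - 1) % len(self.textures)
        return self.textures[self.index]

    def confirm(self):
        return self.index  # index stored on the player state

sel = AvatarSelector(["worker_a", "worker_b", "engineer"])
```

The modulo arithmetic gives the wrap-around behavior a carousel UI needs at either end of the texture list.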
Once programming is complete, the Advanced Framework makes it possible to export the same programmed interactions to multiple platforms.

3.2.6. Exportation of the Multiplatform Project

Once programming is complete, the next step is the project's packaging process, known in UE4 as Package Project, as Figure 22 illustrates. In this phase, the project is compiled into an executable file compatible with Windows and Mac operating systems or an app for Android and iOS mobile devices. With this file, the project can be run on computers, smartphones, or standalone virtual reality devices such as the Meta Quest 2 headset, provided the hardware is sufficient to ensure adequate performance.
This procedure combines all previously created assets and saves them as a single application. It must be the final step: once packaging completes, the package is sealed, and changes or corrections require repackaging the project. UE4 makes the export procedure nearly automatic; all that is required is choosing the destination platform and a location to save the final build [42].
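For reference, the same packaging that the editor's Package Project menu performs can also be driven from the command line through Unreal's automation tool (UAT). The project and output paths below are illustrative assumptions:

```shell
# Command-line equivalent of Package Project via UAT's BuildCookRun
# (run from the engine install directory on Windows; paths are examples).
Engine\Build\BatchFiles\RunUAT.bat BuildCookRun ^
  -project="C:\Projects\MetaverseTraining\MetaverseTraining.uproject" ^
  -platform=Win64 -clientconfig=Shipping ^
  -build -cook -stage -pak -archive ^
  -archivedirectory="C:\Builds\MetaverseTraining"
```

Switching `-platform` (e.g., to Android) retargets the same project at a different device family, which is how the multiplatform export is obtained.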
The following section presents the evaluation process for the prototype.

3.3. Implementation and Testing of the Proposed Method and Prototype Evaluation

DSR includes gathering evidence that the artifact is useful, meaning that it works and does what it is intended to do [19]. The prototype evaluation therefore allowed us to verify whether the planned features worked correctly, and an experiment was carried out to uncover possible problems and propose improvements.
The experiment involved a pilot project for industrial training in the metaverse. After exporting the application, it was necessary to start the executable and follow a tutorial to choose an avatar. One of the users created a room for the peer-to-peer system to work, and other users were able to connect and converse through the Vivox voice system. The user who hosted the room was also able to select other environments in the metaverse, including the hydraulic press training scenario. Figure 23 illustrates the experiment.
The experiment produced satisfying results, demonstrating that the project developed using the method worked as intended without connection or voice issues.

3.4. Application of the Proposed Method with Metaverse Developers to Verify Its Applicability and Prototype Validation

Then, a survey was developed to find out from other developers whether the proposed method is understandable. The method and the questionnaire were provided to the participants, all of whom were game engine experts. Following Ref. [43], the questionnaire covered: how long the volunteers have worked with the software; which game engine they use; their level of familiarity with that engine; whether they have already developed a multiplayer application; how well they were able to understand the proposed method; and whether they would be able to reproduce it after reading the method.
Every interviewee was required to have a basic understanding of the Unreal Engine game engine [43]. Because the volunteers were asked about their usage period, an individual analysis could be conducted, showing a connection between length of use and grasp of the technical concepts. Most volunteers were above the median in their level of understanding of the game engine being used.
In response to the question about developing multiuser applications, 85.7% of the volunteers said they had never developed such a program, which made it easier to evaluate the method suggested in this article in general terms. Most participants affirmed that the method is understandable and that they would be able to reproduce it. An individual analysis showed that even users with only one to six months of experience with game engines were able to understand the method. It is important to note that the timeline for developing immersive metaverses for industrial training varies with the complexity of the project and the experience of the development team, and may range from several days to a few weeks, especially when the 3D CAD models are already available and the user interface design is well defined.

4. Final Considerations

The use of immersive virtual reality and multiuser environment technologies may benefit industrial training by improving learning outcomes as immersion increases [17,18,44]. Our proposed method allows for the development of an immersive metaverse with a voice system and avatars, without experiencing any connectivity issues. We observed that even junior developers could understand the method, demonstrating that it is possible to develop immersive virtual worlds with an emphasis on professional training within six months of working with the Unreal Engine 4 tool, even without prior experience with 3D modeling or third-party licensing. Therefore, our proposed method has the potential to support industry in advancing projects involving immersive metaverses and other projects involving gaming engines and industrial training.
As for this study’s limitations, there are still several issues that must be addressed for successful metaverse development, which we did not detail. Future research into this topic could carry out a systematic literature review to map the current problems and the recommended guidelines to address them.
Additionally, our study focused on a method for developing immersive virtual worlds for professional training purposes using Unreal Engine version 4.27.2. While other game engines such as Unity and Godot can also be used to develop VR applications, our proposed method using Unreal Engine can provide benefits such as better graphics quality, more advanced physics simulation, and better support for VR devices. Future research could explore the use of other game engines and compare their capabilities and limitations in developing immersive virtual worlds for professional training purposes. Regarding Unreal Engine's advantages over other engines, latest-generation Unreal Engine 5 technologies such as the Lumen global illumination system, Virtual Shadow Maps, hardware ray tracing, and MetaHumans can provide significant gains in graphics quality and realism; however, the hardware of the Meta Quest 2 still lacks the capability to use these Unreal Engine 5 features, which motivated our choice of version 4.27.2. In addition, version 4.27.2 offers solid support for VR devices, making it easier to develop VR applications compatible with a wide range of HMDs. High-end HMDs such as the Varjo XR-3 or Valve Index can provide significant benefits in resolution, field of view, and overall visual quality, yet our method still provides benefits with more affordable HMDs such as the Meta Quest 2, making it accessible to a wider audience.
Future research might also validate our proposed method and explore user experience and user interface design in immersive metaverses. This could involve increasing the number of developers participating in the experiment since the number of developers using the Unreal Engine software tends to be lower than that of similar software such as Unity, and making a comparison between the two main software platforms in terms of immersive metaverse development. Another requirement is to determine in real-world industry training sessions whether using virtual reality technology in conjunction with the metaverse can improve information use and memory retention. This validation could also enable a comparison between physical industrial training and virtual immersive exercises.
This comparison may create a new need for research on user experience and user interface (UX/UI) design in immersive metaverses and how these affect usability and quality of use. An in-depth investigation of this issue may lead to better uses of the technology beyond industrial training, including any application that employs immersive virtual reality in metaverses.

Author Contributions

Conceptualization, L.G.G.A. and M.F.C.; Methodology, L.G.G.A., I.W. and M.F.C.; Software, L.G.G.A. and M.F.C.; Validation, L.G.G.A., I.W. and M.F.C.; Formal analysis, L.G.G.A. and M.F.C.; Investigation, L.G.G.A. and M.F.C.; Resources, L.G.G.A. and M.F.C.; Data curation, L.G.G.A., N.V.d.V., I.W. and M.F.C.; Writing—original draft, L.G.G.A., N.V.d.V., I.W. and M.F.C.; Writing—review & editing, L.G.G.A., N.V.d.V., I.W. and M.F.C.; Visualization, L.G.G.A., N.V.d.V., I.W. and M.F.C.; Supervision, L.G.G.A. and M.F.C.; Project administration, L.G.G.A. and M.F.C.; Funding acquisition, L.G.G.A. and M.F.C. All authors have read and agreed to the published version of the manuscript.

Funding

The authors would like to thank the financial support from the National Council for Scientific and Technological Development (CNPq). Ingrid Winkler is a CNPq technological development fellow (Proc. 308783/2020-4).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Tea, S.; Panuwatwanich, K.; Ruthankoon, R.; Kaewmoracharoen, M. Multiuser immersive virtual reality application for real-time remote collaboration to enhance design review process in the social distancing era. J. Eng. Des. Technol. 2021, 20, 281–298. [Google Scholar] [CrossRef]
  2. Born, F.; Sykownik, P.; Masuch, M. Co-Located vs. remote gameplay: The role of physical co-presence in multiplayer room-scale VR. In Proceedings of the 2019 IEEE Conference on Games, CoG 2019, London, UK, 20–23 August 2019. [Google Scholar]
  3. Kye, B.; Han, N.; Kim, E.; Park, Y.; Jo, S. Educational applications of metaverse: Possibilities and limitations. J. Educ. Evaluation Health Prof. 2021, 18, 32. [Google Scholar] [CrossRef] [PubMed]
  4. Parthasarathy, V.; Simiscuka, A.A.; O’Connor, N.; Muntean, G.-M. Performance Evaluation of a Multi-User Virtual Reality Platform. In Proceedings of the 16th IEEE International Wireless Communications and Mobile Computing Conference, IWCMC 2020, Limassol, Cyprus, 15–19 June 2020; pp. 934–939. [Google Scholar]
  5. Duan, H.; Li, J.; Fan, S.; Lin, Z.; Wu, X.; Cai, W. Metaverse for Social Good: A University Campus Prototype. In Proceedings of the MM 2021-Proceedings of the 29th ACM International Conference on Multimedia, Virtual Event China, 20–24 October 2021; pp. 153–161. [Google Scholar]
  6. Paniago, A.L. KAIZEN–Implementação na Indústria de Autopeças: Resultados na Redução das Perdas na Área Produtiva. Master’s Thesis, Departamento de Engenharia Mecânica, Escola Politécnica da Universidade de São Paulo, São Paulo, Brazil, 2008; p. 130. [Google Scholar]
  7. Liaskos, O.; Mitsigkola, S.; Arapakopoulos, A.; Papatzanakis, G.; Ginnis, A.; Papadopoulos, C.; Remoundos, G. Development of the Virtual Reality Application: “The Ships of Navarino”. Appl. Sci. 2022, 12, 3541. [Google Scholar] [CrossRef]
  8. Wolfartsberger, J.; Zimmermann, R.; Obermeier, G.; Niedermayr, D. Analyzing the potential of virtual reality-supported training for industrial assembly tasks. Comput. Ind. 2023, 147, 103838. [Google Scholar] [CrossRef]
  9. Joshi, S.; Hamilton, M.; Warren, R.; Faucett, D.; Tian, W.; Wang, Y.; Ma, J. Implementing Virtual Reality technology for safety training in the precast/prestressed concrete industry. Appl. Ergon. 2021, 90, 103286. [Google Scholar] [CrossRef] [PubMed]
  10. Kersten, T.; Drenkhan, D.; Deggim, S. Virtual Reality Application of the Fortress Al Zubarah in Qatar Including Performance Analysis of Real-Time Visualisation. KN J. Cartogr. Geogr. Inf. 2021, 71, 241–251. [Google Scholar] [CrossRef]
  11. Creswell, J.W.; Creswell, J.D. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches; Sage Publications: Thousand Oaks, CA, USA, 2017. [Google Scholar]
  12. Chamusca, I.L.; Ferreira, C.V.; Murari, T.B.; Apolinario, A.L.; Winkler, I. Towards Sustainable Virtual Reality: Gathering Design Guidelines for Intuitive Authoring Tools. Sustainability 2023, 15, 2924. [Google Scholar] [CrossRef]
  13. Levinski, R. O Que é Modelagem 3D? Available online: https://revospace.com.br/artigo/o-que-e-modelagem-3d/ (accessed on 15 November 2022).
  14. Gregor, S.; Hevner, A.R. Positioning and Presenting Design Science Research for Maximum Impact. MIS Q. 2013, 37, 337–355. [Google Scholar] [CrossRef]
  15. How ScopusWorks. Available online: https://www.elsevier.com/solutions/scopus/how-scopus-works/content (accessed on 25 April 2023).
  16. Matthews, T. LibGuides: Resources for Librarians: Web of Science Coverage Details. Available online: https://clarivate.libguides.com/librarianresources/coverage (accessed on 25 April 2023).
  17. Birt, J.; Vasilevski, N. Comparison of Single and Multiuser Immersive Mobile Virtual Reality Usability in Construction Education. Educ. Technol. Soc. 2021, 24, 93–106. [Google Scholar]
  18. Dalgarno, B.; Lee, M.J.W. What are the learning affordances of 3-D virtual environments? Br. J. Educ. Technol. 2010, 41, 10–32. [Google Scholar] [CrossRef]
  19. Huang, Y.-C.; Backman, S.J.; Backman, K.F.; McGuire, F.A.; Moore, D. An investigation of motivation and experience in virtual learning environments: A self-determination theory. Educ. Inf. Technol. 2018, 24, 591–611. [Google Scholar] [CrossRef]
  20. Catapan, M.F.; Mercado, F.A.P.; Almeida, L.G.G.; Zem Neto, D.; Martins, L.O.; Araujo, J.L.; Strobel, C.S. Acessibilidade Em Treinamentos De Cirurgias Endocópica Através da Virtualização Imersiva-O Estado Da Arte. In I Latin American Congress of Applied Technologies, 2021; I Latin American Congress of Applied Technologies; Latin American Publicações: São José dos Pinhais, Brazil, 2021; Volume 1, p. 1. [Google Scholar]
  21. Sankaranarayanan, G.; Wooley, L.; Hogg, D.; Dorozhkin, D.; Olasky, J.; Chauhan, S.; Fleshman, J.W.; De, S.; Scott, D.; Jones, D.B. Immersive virtual reality-based training improves response in a simulated operating room fire scenario. Surg. Endosc. 2018, 32, 3439–3449. [Google Scholar] [CrossRef] [PubMed]
  22. Chai, M.T.; Hafeez, U.A.; Mohamad, N.M.; Aamir, S.M. The Influences of Emotion on Learning and Memory. Front. Psychol. 2017, 8, 1454. [Google Scholar]
  23. Novotny, A.; Gudmundsson, R.; Harris, F.C. A unity framework for multi-user VR experiences. In Proceedings of the 35th International Conference on Computers and Their Applications, CATA 2020, San Francisco, CA, USA, 23–25 March 2020; Volume 69, pp. 13–21. [Google Scholar]
  24. Brenner, C.; Desportes, K.; Ochoa Hendrix, J.; Holford, M. GeoForge: Investigating integrated virtual reality and personalized websites for collaboration in middle school science. Inf. Learn. Sci. 2021, 122, 546–564. [Google Scholar] [CrossRef]
  25. Garcia, F.E. Um Motor Para Jogos Digitais Universais. Master’s Thesis, Universidade Federal de São Carlos, São Carlos, Brazil, 2014. [Google Scholar]
  26. Unreal Engine. Advanced Framework–VR, Mobile & Desktop. Available online: https://www.unrealengine.com/marketplace/en-US/product/advanced-vr-framework (accessed on 15 November 2022).
  27. Hilfert, T.; König, M. Low-cost virtual reality environment for engineering and construction. Vis. Eng. 2016, 4, 2. [Google Scholar] [CrossRef]
  28. Fonseca, D.; Cavalcanti, J.; Peña, E.; Valls, V.; Sanchez-Sepúlveda, M.; Moreira, F.; Navarro, I.; Redondo, E. Mixed Assessment of Virtual Serious Games Applied in Architectural and Urban Design Education. Sensors 2021, 21, 3102. [Google Scholar] [CrossRef] [PubMed]
  29. Michalík, D.; Jirgl, M.; Arm, J.; Fiedler, P. Developing an Unreal Engine 4-Based Vehicle Driving Simulator Applicable in Driver Behavior Analysis—A Technical Perspective. Safety 2021, 7, 25. [Google Scholar] [CrossRef]
  30. Brito, A. Tutorial Modo 401: Como Fazer Retopologia de Modelos 3D. Available online: https://www.allanbrito.com/2009/08/20/tutorial-modo-401-como-fazer-retopologia-de-modelos-3d/ (accessed on 15 November 2022).
  31. Haapala, K. Utilizing Different Design Principles for Hard-Surface 3D Art: A Study about Achieving Realistic 3D Mechanical Designs; Theseus: Cambridge, MA, USA, 2022. [Google Scholar]
  32. Kochetov, K. Modern Approach to Hard-Surface Modeling for Games. Bachelor’s Thesis, South-Eastern Finland University of Applied Sciences, Helsinki, Finland, 2018. [Google Scholar]
  33. Xu, Z.; Zheng, N. Incorporating Virtual Reality Technology in Safety Training Solution for Construction Site of Urban Cities. Sustainability 2020, 13, 243. [Google Scholar] [CrossRef]
  34. Braz, S.M.; Winkler, I.; Catapan, M.F. Analysis of the Feasibility of Immersive Virtualization in Technical Training. Int. J. Prof. Bus. Rev. 2023, 8, e01815. [Google Scholar] [CrossRef]
  35. Unreal Engine. Datasmith Overview. Available online: https://docs.unrealengine.com/5.1/en-US/datasmith-plugins-overview/ (accessed on 15 November 2022).
  36. Unreal Engine. Datasmith Export Plugins. Available online: https://www.unrealengine.com/en-US/studio/downloads (accessed on 15 November 2022).
  37. Epic Games. What Is Epic Online Services (EOS)? Available online: https://dev.epicgames.com/docs/services/en-US/GameServices/Overview/index.html (accessed on 31 May 2022).
  38. Human Codeable. Documentation. Available online: http://ansgarjahn.de/Downloads/Documentation/Documentation_AFCore_4.1.pdf (accessed on 15 November 2022).
  39. Unreal Engine. Umg ui Designer. Available online: https://docs.unrealengine.com/4.27/en-US/InteractiveExperiences/UMG/ (accessed on 15 November 2022).
  40. Hansen, A.; Andersen, K.; Sievert, B.; Kiswani, J.; Dascalu, S.M.; Frederick, C.H. Let’s VR: A multiplayer framework for virtual reality. In Proceedings of the 27th International Conference on Software Engineering and Data Engineering, SEDE 2018, New Orleans, LA, USA, 8–10 October 2018; pp. 51–56. [Google Scholar]
  41. Unity. Vivox Voice and Text Comms. Available online: https://unity.com/products/vivox (accessed on 31 May 2022).
  42. Oliveira, G.M. Realidade Virtual em Apresentação de Projeto Arquitetônico. Master’s Thesis, Universidade Federal de Santa Catarina, Santa Catarina, Brazil, 2016. [Google Scholar]
  43. Almeida, L.G.G. Método para Criação de Metaverso Imersivo com Foco em Treinamentos Industriais. Master’s Thesis, Universidade Federal do Paraná, Paraná, Brazil, 2022. [Google Scholar]
  44. Radianti, J.; Majchrzak, T.A.; Fromm, J.; Wohlgenannt, I. A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda. Comput. Educ. 2020, 147, 103778. [Google Scholar] [CrossRef]
Figure 1. Multiuser environment.
Figure 2. Flowchart of the proposed method.
Figure 3. Interactive scene developed in Unreal Engine 4.
Figure 4. Hydraulic Press digitizing process [34].
Figure 5. Hydraulic press modeled in 3ds Max 2022 software.
Figure 6. Datasmith export plugins [36].
Figure 7. Imported scenario in Unreal Engine.
Figure 8. Scenery with lighting ready inside Unreal Engine 4.
Figure 9. Human Codeable Framework on Unreal Marketplace [37].
Figure 10. Advanced Framework Presentation [38].
Figure 11. Presentation of the Advanced Framework [38].
Figure 12. User interface of the metaverse.
Figure 13. UE4 UI creation tool.
Figure 14. Programming the Vivox voice system.
Figure 15. Drag Component.
Figure 16. Select component.
Figure 17. 3D model with collision box.
Figure 18. Plate with Grab component.
Figure 19. UI developed for the avatar system.
Figure 20. Creation of the avatar system in UMG.
Figure 21. Programming the avatar system.
Figure 22. Unreal Engine 4 package project feature.
Figure 23. Metaverse pilot project for industrial training.
Table 1. Search string and the number of articles in their respective databases.

Search String | Scopus | Web of Science | Total
(virtual reality) AND (multiplayer OR multiuser OR metaverse) | 1509 | 603 | 2112
Table 2. Inclusion and exclusion criteria.

Criterion | Inclusion Criteria | Exclusion Criteria
Scope | Research that addresses the topic related to the use of immersive virtual reality in multiuser scenarios, with a focus on training, capacity building, problem solving, or the development and testing of multiuser applications. | Research that does not address the topic related to the use of immersive virtual reality in multiuser scenarios, focusing on training, capacity building, problem solving, or the development and testing of multiuser applications.
Document Type | Published journal articles, conference proceedings articles. | Interviews published in journals.
Access | Papers accessible through: CAPES Portal, Google Scholar, and publishers' open access websites. | Works whose viewing requires paid registration or works whose legality is questionable.
Idiom | Papers written in languages mastered by the author: English and Portuguese. | Papers that require paid subscription to view or papers whose legality is questionable.
Year | Papers published in the last five years. | —

