Article

Visual Gait Analysis Based on UE4

The School of Electronic Information, Qingdao University, Qingdao 266071, China
* Author to whom correspondence should be addressed.
Sensors 2023, 23(12), 5463; https://doi.org/10.3390/s23125463
Submission received: 9 May 2023 / Revised: 31 May 2023 / Accepted: 6 June 2023 / Published: 9 June 2023
(This article belongs to the Section Physical Sensors)

Abstract

With the development of artificial intelligence technology, virtual reality has been widely used in the medical, entertainment, and other fields. Supported by the 3D modeling capabilities of the UE4 platform, this study designs a 3D pose model driven by inertial sensors, implemented with the Blueprint visual scripting language and C++. The model vividly displays changes in gait, as well as the angle and displacement changes of 12 body segments, such as the thighs, calves, and arms. Combined with an inertial-sensor-based motion capture module, it can display the 3D posture of the human body in real time and analyze the motion data. Each part of the model has an independent coordinate system, so the angle and displacement changes of any part can be analyzed. All joints of the model are interrelated; the motion data are automatically calibrated and corrected, and errors in the inertial sensor measurements are compensated, so that no joint separates from the whole model and no motions that violate the structure of the human body occur, which improves the accuracy of the data. The 3D pose model designed in this study can correct motion data in real time and display the motion posture of the human body, and it has great application prospects in the field of gait analysis.

1. Introduction

Virtual reality is an emerging technology that uses the enormous computing power of modern computers to visualize complex data and create interactive scenes, and it has gone far beyond the interaction model of the traditional human–computer interface. The concept of virtual reality refers to a fully simulated reality built with computer systems using digital formats [1]. With the development of science and technology, virtual reality has advanced rapidly, and its applications cover many fields [2], such as retail [3,4], education [5], tourism [6], health care [7], entertainment [8] and research [9,10]. Virtual reality technology is now widely used; with its high precision and high fidelity, it can render object models and environmental elements almost perfectly, and, more importantly, its interaction capability is unprecedented.
Three-dimensional gait analysis collects and analyzes data on posture, center-of-gravity fluctuation and joint bending during movement. Human gait analysis is used extensively in the medical field and in professional sports [11,12,13,14]. Teufl et al. described an inertial measurement unit (IMU) system that accurately measures the range of motion (ROM) of human gait and specific characteristics of the 3D kinematics of the lower limbs, and is suitable for both patients after total hip arthroplasty (THA) and healthy subjects [15]. However, they did not use a biomechanical model to show the lower-limb movements in real time. Figueiredo et al. proposed a wearable inertial sensor system for real-time detection of 3D angular velocity and 3D acceleration for up to six lower-limb and trunk segments, and of the sagittal joint angle for up to six lower-limb joints [16], but they did not provide a 3D model to visualize human gait. Tham et al. demonstrated that, when measuring joint angles with inertial sensors, a NARX approach can avoid the influence of magnetometers, accurately estimate 3D knee joint angles and measure 3D joint angles over the long term [17]. In the same year, a full-body wireless wearable motion-sensing system was reported by Lee et al. to study the motion of the human lower limbs and arms; it can reconstruct simple 3D human body models in real time using quaternion data measured by the sensors [18]. However, their 3D mannequin is composed of simple stick structures rather than a complete human model, so it can only reflect part of the limb movement and cannot vividly show the motion of the human body. Xie et al. proposed GaitTracker, an IMU-based three-dimensional (3D) skeletal tracking system that can accurately track the lower-limb skeleton for gait analysis [19]. Nevertheless, although they built a 3D model to show lower-limb movement, there is no comprehensive analysis of upper-limb movements. In fact, most gait analysis research covers only part of the human limbs and is not visualized, and there are few reports on visual gait analysis of whole-body human movement.
In this study, a 3D pose model was created using UE4 software Version 2.24. The model can be combined with a motion capture module based on inertial sensors to obtain motion data for 12 parts of the human body and display the various postures of the human body in real time in the form of animation. Using virtual reality technology, human gait data are obtained through gait analysis of the model, which makes the analysis process clearer and the operation simpler. The experimental results show that the 3D posture model meets the requirements of gait analysis; real-time visual presentation is a core requirement of current intelligent systems, and the model can be widely used for 3D gait analysis in the medical and sports fields.

2. Research Content

Chen et al. pointed out that, in current clinical settings, gait analysis is usually performed using subjective and qualitative approaches; although some severe gait disorders can be observed by the human eye without quantitative measures, subtle changes can go unnoticed, thus affecting disease staging, severity assessment and subsequent treatment planning [20]. Consequently, analyzing the cause of a disease by observing the motion trajectory with the naked eye may be biased, and the doctor cannot see the patient's internal lesion. Meanwhile, 3D gait analysis is generally used in medicine for the objective evaluation of human walking and can supplement standard clinical evaluation in identifying and understanding gait problems [21]. A primary advantage of 3D gait analysis is the ability to quantify joint and segment movement in all three planes throughout the gait cycle (stance and swing), and this information can help clinicians identify gait deviations that are difficult to recognize and, in many cases, impossible to appreciate through observation alone [22]. Therefore, by importing inertial sensor signals and combining them with virtual reality, the patient's motion posture can be displayed in animated form, helping the doctor to diagnose the patient's condition. In this study, a human model is built and imported into the UE4 platform, and the motion logic is programmed using C++, the Blueprint language and the platform's interface technology, in order to display a 3D pose animation model for 3D gait analysis in the medical and sports fields.
Based on the analysis of human gait, this paper studies the modeling of the human model and the use of the Blueprint language and animation logic to construct a basic mesh skeleton capable of independent movement and to link the joints of the human model so that it can demonstrate actions on command. Finally, the gait algorithm is combined with the model so that the model can work with the motion capture module based on inertial sensors, and data can be transmitted to display the various postures of human motion in real time.

3. Model Design

Unreal Engine is a popular game engine for creating high-fidelity video games and one of the best choices for virtual reality development [23]. UE4 is now used very widely and has had a far-reaching impact on games, film and television, medical treatment, sports and other fields [24,25,26,27,28,29,30,31]. The real-time visualization capability of the UE4 engine is not available in traditional virtual technology. Compared with traditional modeling software, its advantage lies in real-time rendering, which enables designers to create in a WYSIWYG state. The UE4 platform supports interaction between the software and hardware of different types of virtual reality devices and can accept files in various formats as creative materials, which greatly enriches the possible operations.
UE4 can be programmed in Blueprint or C++, which makes it suitable for almost any virtual reality simulation scenario. Blueprints, also known as visual Blueprint scripts, are a compiled visual programming language in which logic is built by wiring nodes together; they provide an intuitive, node-based interface for creating new types of Actors and Level Script events. Blueprints are a special asset type that can create logic and set variable data in an intuitive, node-based way, define custom characters, events and features, and quickly iterate on gameplay. A Blueprint can also inherit from a C++ class, use variables defined in C++, and call functions or implement events declared in C++. In its basic form, a Blueprint is a visual script added to the game; it uses wires to connect nodes, events, functions and variables to create complex gameplay elements. Blueprints have the advantage of being intuitive and convenient and of compiling quickly, so they are a good choice for some complex programming tasks.
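To illustrate how such C++/Blueprint interoperation typically looks, the following is a minimal sketch (not taken from this study's source code) of a character class that exposes a joint-angle property and functions to Blueprints; the class name AGaitCharacter and its members are hypothetical examples.

```cpp
// GaitCharacter.h -- minimal sketch of exposing C++ functionality to Blueprints.
// Assumes a standard UE4 C++ project; all names are illustrative only.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "GaitCharacter.generated.h"

UCLASS()
class AGaitCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    // Current right-knee flexion angle in degrees, readable and writable from Blueprints.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Gait")
    float RightKneeAngle = 0.0f;

    // Callable from any Blueprint graph.
    UFUNCTION(BlueprintCallable, Category = "Gait")
    void SetRightKneeAngle(float AngleDeg)
    {
        RightKneeAngle = AngleDeg;
        OnJointAnglesUpdated();
    }

    // Declared in C++ but implemented in a Blueprint subclass,
    // e.g. to refresh the on-screen angle display.
    UFUNCTION(BlueprintImplementableEvent, Category = "Gait")
    void OnJointAnglesUpdated();
};
```

A Blueprint subclass of such a class could then set RightKneeAngle from the node graph and implement OnJointAnglesUpdated to update its display logic.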
The Blueprint interface can be understood as a set of functions in the UE4 platform; one or more functions can be added to an interface, and once a Blueprint implements the interface, it acquires those functions. In short, the Blueprint interface defines a function, called an interface function, that exists throughout the project and can be used anywhere, but it can play different roles in different Blueprint classes.

3.1. Skeleton Model

A skeletal mesh is a model bound to a skeleton that can be used to create animation. In this study, a hierarchical skeleton model of the 3D human body is established, and the motion data of each node are mapped onto the 3D human body model for motion visualization. Building the model requires a character skeleton, a mesh body and a physical structure. Many of these modeling resources are included in the external resources provided by the UE4 platform, such as geometry and character bodies. To add a skeleton and skin to the model, the required model resources are downloaded and placed in the Content folder. Then, the character Blueprint class is opened, the viewport window is selected, the Mesh object is chosen, the skeleton is imported in the Details panel on the right side, and the position of the character model is adjusted. With the skeleton in the creation stage, values can be entered in the fields of each limb to define the joint settings of the character, and the skeleton hierarchy or the number of joints can be changed in the skeleton settings panel. The character components are then programmed with UE4 code into a model with multiple Blueprint-like functions, including skeleton construction, mesh construction, joint chains and so on.

3.2. Research on Model Coordinate System

When studying the motion of a model, parameters such as motion angle and motion direction are inseparable from the coordinate system, as is the precise location of the model's nodes. The spatial coordinate systems involved in attitude algorithms are mainly the navigation coordinate system and the carrier coordinate system. The navigation coordinate system is used to transform and calculate the attitude changes of a moving carrier, while the carrier coordinate system is a body-fixed coordinate system established on the carrier itself.
The model studied in this paper uses the navigation coordinate system together with a carrier coordinate system for each part of the model: each part has its own independent coordinate system and rotates and translates along its own XYZ axes. Attitude algorithms generally use a right-handed coordinate system, whereas the coordinate system in UE4 is left-handed, so feeding the hardware-generated quaternions into the model directly would lead to an attitude mismatch. From the point of view of the vector product, the attitudes can be made consistent by adjusting the order and signs of the quaternion components.
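The exact sign pattern depends on which axis is mirrored between the two frames. The following is a minimal sketch, assuming the common case of a right-handed Z-up sensor frame mapped to UE4's left-handed Z-up frame (a flip of the Y axis); the struct and function names are illustrative, not from the paper's code.

```cpp
#include <cstdio>

// Quaternion stored as (X, Y, Z, W), the same component order UE4's FQuat uses.
struct Quat {
    double X, Y, Z, W;
};

// Convert a rotation expressed in a right-handed Z-up frame to a left-handed
// Z-up frame that differs by a flip of the Y axis. Mirroring one axis turns a
// rotation by +theta about (nx, ny, nz) into a rotation by -theta about
// (nx, -ny, nz), which amounts to negating the Y and W components
// (equivalently, negating X and Z, since q and -q represent the same rotation).
Quat RightHandedToLeftHandedYFlip(const Quat& q)
{
    return Quat{ q.X, -q.Y, q.Z, -q.W };
}

int main()
{
    // 90 degrees about +Z in the right-handed sensor frame.
    Quat sensor{ 0.0, 0.0, 0.7071068, 0.7071068 };
    Quat ue = RightHandedToLeftHandedYFlip(sensor);
    std::printf("UE quaternion: (%f, %f, %f, %f)\n", ue.X, ue.Y, ue.Z, ue.W);
    return 0;
}
```

If the actual axis mapping between the sensor frame and UE4 differs, the same derivation yields a different sign pattern, so the mapping should be verified against a known reference pose.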

3.3. Data Transmission

In the usual motion-recording workflow, the motion data are transmitted to software such as 3ds Max, the animation is adjusted in the motion capture software, the character's skeletal structure is retargeted, and the animation is saved in FBX format for viewing. The main function of UE4's Live Link is data transmission: it can import data directly into the engine so that the resulting motion can be viewed immediately. However, there is usually a significant problem: the displacement information for the root point is missing. The data from the external software are recorded relative to the hip as the model's motion changes, and the bone structure corresponds to the marked points on the motion capture suit. During this process, the root bone, which marks the actor's position in project space, does not move, because no such data exist in reality.
In order to record the correct displacement of the model animation, the actor's root point must correctly receive the transmitted displacement data, which then drives the skeletal animation so that it matches the captured motion. In some cases, the position also needs to be corrected through the collision of the character's capsule component.
The hierarchy of character control in UE4 is capsule, then root bone, then hip (pelvis), so pelvis data cannot be applied directly at the other two levels. Doing so would be equivalent to using a low-level child object to drive a high-level parent object, whose movement would in turn drive the child object; the cycle would never end and the character would collapse in an instant. Therefore, the displacement and rotation values of the low-level hip must be taken and then applied to the high-level capsule and root bone. Because the height of the pelvis is not equal to the height of the character, a new root point must also be calculated to keep it aligned with the capsule.
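As a rough illustration of that last step, the sketch below derives a root/capsule position from the tracked pelvis position by keeping the horizontal displacement and placing the capsule at a fixed height. This is an assumption about one reasonable way to perform the calculation, not the paper's exact method, and all names are hypothetical.

```cpp
#include <cstdio>

struct Vec3 {
    double X, Y, Z;
};

// Derive the root/capsule position from the tracked pelvis position.
// The pelvis carries the horizontal displacement of the whole body, but it sits
// at hip height, so its Z cannot be used directly for the capsule. Here the
// capsule is kept at a fixed half-height above the ground plane (Z = 0).
Vec3 RootFromPelvis(const Vec3& pelvis, double capsuleHalfHeight)
{
    // Keep the horizontal motion, replace the height with the capsule's resting height.
    return Vec3{ pelvis.X, pelvis.Y, capsuleHalfHeight };
}

int main()
{
    Vec3 pelvis{ 120.0, 35.0, 92.0 };          // tracked hip position (cm)
    Vec3 root = RootFromPelvis(pelvis, 88.0);  // capsule half-height of the character
    std::printf("root: (%.1f, %.1f, %.1f)\n", root.X, root.Y, root.Z);
    return 0;
}
```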

3.4. Model Joint Linkage

The human body model can be simplified to 19 segments and 20 joints. This paper mainly studies the changes in movement angle and displacement of 12 parts of the human body, including the left and right feet, left and right calves, left and right thighs, waist, head, left and right upper arms and left and right forearms; the movement data measured by the MPU6050-based inertial sensors control the movement of the corresponding parts of the model.
As shown in Figure 1, each relevant part of the model has an independent coordinate system; that is, while the model is in motion, each part receives three-axis gyroscope, three-axis accelerometer and three-axis magnetometer signals. The motion angle and displacement of each part of the model are therefore calculated independently and then integrated to control the motion of the model. For various reasons, the motion data may contain errors that cause the limbs to separate; as shown in Figure 2, the left calf and left thigh of the model are separated. To avoid this situation, C++ and Blueprint programs are used to implement an ergonomic linkage of the model. With the waist joint as the core, when a joint would deviate from the whole model because of the data measured by the inertial sensor, the model calculates the joint's offset in that direction and automatically subtracts it from the measurement, so that the displayed posture still reflects the movement of the human body; in this way, the motion data are calibrated and corrected, compensating for part of the inertial sensor's measurement error. For example, the movement of the left thigh is associated with the movement of the left calf, so the influence of the left thigh must be considered when calculating the displacement of the left calf. Every joint of the model is therefore linked: when an error in the movement data causes a part to separate from the model, the model corrects the data to reconnect the limb, further improving the measurement accuracy. In the meantime, restrictions are also placed on some parts of the model to prevent physically impossible movements, such as rotating the head by 180° horizontally.
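The following is a minimal sketch of this kind of parent–child correction, assuming the proximal end of a child segment (e.g., the calf) should coincide with the distal end of its parent (e.g., the thigh at the knee); any offset introduced by sensor error is then subtracted from the child's measured position. The structure and names are illustrative, not the paper's code.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 {
    double X, Y, Z;
};

Vec3 operator-(const Vec3& a, const Vec3& b) { return { a.X - b.X, a.Y - b.Y, a.Z - b.Z }; }

// Re-attach a child segment to its parent joint. parentJointEnd is where the
// parent segment ends (e.g., the knee, computed from the thigh's pose), and
// childMeasuredStart is where the sensor data place the child's proximal end.
// The returned offset is subtracted from every point of the child segment,
// so the calf snaps back onto the knee instead of drifting away from the thigh.
Vec3 LinkageCorrection(const Vec3& parentJointEnd, const Vec3& childMeasuredStart)
{
    return childMeasuredStart - parentJointEnd;
}

int main()
{
    Vec3 knee{ 10.0, 0.0, 45.0 };        // distal end of the left thigh (cm)
    Vec3 calfStart{ 11.5, -0.4, 44.2 };  // sensor-derived proximal end of the left calf
    Vec3 offset = LinkageCorrection(knee, calfStart);
    std::printf("offset to subtract from the calf: (%.2f, %.2f, %.2f)\n",
                offset.X, offset.Y, offset.Z);
    double drift = std::sqrt(offset.X * offset.X + offset.Y * offset.Y + offset.Z * offset.Z);
    std::printf("drift magnitude: %.2f cm\n", drift);
    return 0;
}
```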

3.5. Overall Introduction of the Model

The three-dimensional posture model is designed to be combined with the motion capture module based on inertial sensors so that the motion posture of each part of the human body can be displayed in real time for gait analysis. Therefore, this study uses the Blueprint language and C++ to create a virtual serial port, receive the raw motion data in real time through the serial port, and process the data with a Kalman filter and a complementary-filter attitude algorithm to reduce the errors. Finally, each set of quaternion data is transformed into Euler angles, which are applied to the 12 parts of the model to display the motion posture in real time. Part of the code is shown in Figure 3 and Figure 4.
Figure 3 shows the virtual serial port established in the Blueprint language, which transmits motion data once the corresponding serial port number and baud rate are set. Figure 4 shows the Blueprint for receiving lumbar joint data; this code converts the lumbar motion data transmitted through the virtual serial port and applies it to the model.
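As a rough, self-contained sketch of the filtering and conversion stage described above (the study implements it with Blueprints and C++ inside UE4), the code below fuses a gyroscope-integrated angle with an accelerometer-derived angle using a complementary filter and converts a quaternion to Euler angles; the filter constant of 0.98 and the function names are illustrative assumptions.

```cpp
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

struct Euler { double RollDeg, PitchDeg, YawDeg; };

// Complementary filter: trust the integrated gyroscope rate at short time scales
// and the accelerometer-derived angle at long time scales.
double ComplementaryFilter(double prevAngleDeg, double gyroRateDegPerS,
                           double accelAngleDeg, double dt, double alpha = 0.98)
{
    return alpha * (prevAngleDeg + gyroRateDegPerS * dt) + (1.0 - alpha) * accelAngleDeg;
}

// Convert a unit quaternion (x, y, z, w) to roll/pitch/yaw in degrees
// using the standard Tait-Bryan (Z-Y-X) extraction.
Euler QuatToEuler(double x, double y, double z, double w)
{
    const double rad2deg = 180.0 / kPi;
    double roll  = std::atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y));
    double sinp  = 2.0 * (w * y - z * x);
    double pitch = std::fabs(sinp) >= 1.0 ? std::copysign(kPi / 2.0, sinp) : std::asin(sinp);
    double yaw   = std::atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z));
    return { roll * rad2deg, pitch * rad2deg, yaw * rad2deg };
}

int main()
{
    // One filter step: previous angle 10 deg, gyro reads 50 deg/s, accelerometer says 12 deg, dt = 10 ms.
    double fused = ComplementaryFilter(10.0, 50.0, 12.0, 0.01);
    std::printf("fused angle: %.2f deg\n", fused);

    // A 30-degree rotation about X expressed as a quaternion should come back as roll = 30.
    Euler e = QuatToEuler(std::sin(15.0 * kPi / 180.0), 0.0, 0.0, std::cos(15.0 * kPi / 180.0));
    std::printf("roll %.1f  pitch %.1f  yaw %.1f\n", e.RollDeg, e.PitchDeg, e.YawDeg);
    return 0;
}
```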
In this study, to facilitate gait analysis, an interface was created to display the motion angles of each part of the human body in real time, as shown in Figure 5. When the model receives motion data, the joint movement angles are displayed on the interface in real time; conversely, the bone animation can be controlled by entering data.
The three-dimensional posture model can receive motion data and perform the corresponding actions. Joint motion angle variables are defined in the model, and communication takes place through these variables. Windows for the 12 joints of the human body were set up in this study, and in the sub-window of each joint the corresponding movement and rotation angles can be entered. Considering the actual range of human motion, restrictions are added to the corresponding joints so that unreasonable actions (such as rotating the foot by 180°) cannot be performed. Using the Control Rig plug-in, when an input value is greater than the maximum bending or rotation angle of a joint, it is automatically set to that maximum angle. Examples are shown in Figure 6 and Figure 7 below.
As shown in Figure 6 and Figure 7, when 200° (greater than the maximum reachable angle) is entered for the right knee, the system automatically changes the parameter to 90°, preventing physically impossible actions of the character.
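A minimal sketch of this saturation behaviour is shown below; the limits and function name are illustrative assumptions rather than the paper's Control Rig setup.

```cpp
#include <algorithm>
#include <cstdio>

// Clamp a requested joint angle to the joint's allowed range, mirroring the
// behaviour described for the right knee (an input of 200 deg becomes 90 deg).
double ClampJointAngle(double requestedDeg, double minDeg, double maxDeg)
{
    return std::min(std::max(requestedDeg, minDeg), maxDeg);
}

int main()
{
    std::printf("right knee: %.0f deg\n", ClampJointAngle(200.0, 0.0, 90.0));  // prints 90
    std::printf("right knee: %.0f deg\n", ClampJointAngle(45.0, 0.0, 90.0));   // prints 45
    return 0;
}
```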
During the operation of the model, it is sometimes necessary to debug different postures, and entering data again and again is cumbersome; it is more convenient to change the posture of the model directly with the mouse. Therefore, a mouse input Blueprint was added so that the limbs can be moved and rotated with the mouse in real time. The function of this Blueprint is to let the mouse rotate and swing a joint around its rotation axis. The Blueprint is shown in Figure 8.
The custom event Blueprint has an execution pin and an optional output data pin and acts as the initial switch of the model. Setting the event is equivalent to turning on the switch: the event can be called anywhere in the Blueprint sequence, giving the model a Blueprint entry point for executing the pin. Therefore, an event Blueprint was set up, and the Blueprints for limb movement and rotation were connected to the event Blueprint node, as shown in Figure 9.
The Blueprint of the whole model is divided into three parts: forward and backward movement, left and right movement, and rotation. Accordingly, it contains three groups of Delta Location X, Delta Location Y and Delta Location Z nodes, which allow the model body to move along the X, Y and Z directions and to rotate. The limb movement Blueprints are similar to the body movement Blueprint: they include the movement data for the three axes and realize limb movement and rotation. These Blueprints are assembled by connecting nodes, starting with the event Blueprints, and finally form the control Blueprint of the entire model.
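In C++ terms, the effect of these delta-location and rotation nodes corresponds roughly to applying a per-tick offset and rotation to the actor, as in the sketch below; it reuses the hypothetical AGaitCharacter class sketched earlier (assuming its header also declares a Tick override) and relies on the standard AActor calls AddActorLocalOffset and AddActorLocalRotation. The per-frame velocities are placeholder values, not the paper's data.

```cpp
// GaitCharacter.cpp -- illustrative per-tick movement, not the paper's Blueprint graph.
#include "GaitCharacter.h"

void AGaitCharacter::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Hypothetical per-frame velocities; in the model these come from the serial-port data.
    const float ForwardSpeed = 60.0f;  // cm/s along X (forward/backward)
    const float SideSpeed    = 0.0f;   // cm/s along Y (left/right)
    const float UpSpeed      = 0.0f;   // cm/s along Z
    const float YawRate      = 15.0f;  // deg/s rotation about Z

    // Apply the delta location with sweep enabled so the capsule collision corrects the position.
    AddActorLocalOffset(FVector(ForwardSpeed, SideSpeed, UpSpeed) * DeltaSeconds, /*bSweep=*/true);
    AddActorLocalRotation(FRotator(0.0f, YawRate * DeltaSeconds, 0.0f));
}
```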
The Blueprint of the left arm is shown in Figure 10; it drives the motion of the bones by applying the variables in the model to the bone skeleton. The left arm Blueprint is divided into two parts: one controls the lateral elevation, forward and backward swing and pivoting of the left upper arm, and the other controls the movement of the left forearm around the elbow joint in three directions, with the left elbow as the node.

4. Experiments and Discussions

4.1. Partial Joint Experiment

The joint experiments of the model involve its 12 parts, including the head, left and right arms, left and right elbows, left and right legs, left and right ankles, left and right knees, and the waist. To make the model meet the requirements of three-dimensional gait analysis, joint tests were carried out first to verify that each joint can complete the corresponding action. Some of the joint input tests are shown in Figure 11. After many experiments, it was concluded that every joint of the model can display the action corresponding to the motion data.

4.2. Whole Model Experiment

The main interface during model operation is shown in Figure 12. The character model is based on the default mannequin provided by UE4. Each joint of the model is independent but interrelated; that is, each part has its own small coordinate system that can independently control the movement of that part, and each joint is linked with the other joints to ensure that it does not separate from the whole model, which ensures the reliability of the data. The lower part of the main interface displays the change of each joint angle in real time while the human body moves. When no data are being transmitted, the posture of the model can be controlled by changing the joint angle values in the main interface, as shown in Figure 13.

5. Conclusions

In this study, a 3D animation model for gait analysis is designed; combined with the motion capture module based on inertial sensors, it can vividly display the human motion posture through animation. The model can accept human motion data for gait calculation and perform the corresponding action demonstrations, achieving visual gait analysis. First, the model receives the motion data measured by the inertial-sensor-based motion capture module through the virtual serial port and presents the motion of the sensor signal source and the joint motion angles on the UE4 platform. At the same time, to avoid actions that violate the structure of the human body, all joints of the model are linked to achieve an ergonomic linkage, so that the character's movements do not collapse, while part of the motion data is corrected to improve the accuracy of gait analysis.
During the experiments, the entire model was able to show the rotation and movement of 12 parts of the human body based on the motion data. At present, it is mainly used for the measurement and analysis of the main joints at low speed; details such as the hand joints and toes have not been measured, and the range of motion of each joint when the whole model moves at high speed remains to be verified. Overall, this study improves the human three-dimensional posture testing system, visualizes the human motion posture, and has great application prospects in the fields of medical rehabilitation and motion analysis.

Author Contributions

Conceptualization, E.C.; methodology, E.C., S.W., R.L., L.L., G.M., S.F., Y.M. and D.M.; software, R.L.; validation, R.L., E.C., and S.W.; formal analysis, R.L.; investigation, R.L.; data curation, R.L.; writing—original, R.L.; writing—review and editing, E.C. and S.W.; visualization, R.L.; supervision, E.C. and S.W.; project administration, E.C. and S.W.; funding acquisition, E.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The study did not report any data.

Acknowledgments

The authors would like to thank the graduate programme in Electronic Information at Qingdao University. All authors would like to acknowledge the help from Enlin Cai and Shuying Wang in the system design.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Martín-Gutiérrez, J.; Mora, C.E.; Añorbe-Díaz, B.; González-Marrero, A. Virtual Technologies Trends in Education. Eurasia J. Math. Sci. Technol. Educ. 2017, 13, 469–486.
  2. Berg, L.P.; Vance, J.M. Industry use of virtual reality in product design and manufacturing: A survey. Virtual Real. 2016, 21, 1–17.
  3. Van Kerrebroeck, H.; Brengman, M.; Willems, K. Escaping the crowd: An experimental study on the impact of a Virtual Reality experience in a shopping mall. Comput. Hum. Behav. 2017, 77, 437–450.
  4. Bonetti, F.; Warnaby, G.; Quinn, L. Augmented Reality and Virtual Reality in Physical and Online Retailing: A Review, Synthesis and Research Agenda. In Augmented Reality and Virtual Reality; Springer: Cham, Switzerland, 2018.
  5. Moro, C.; Birt, J.; Stromberga, Z.; Phelps, C.; Clark, J.; Glasziou, P.; Scott, A.M. Virtual and Augmented Reality Enhancements to Medical and Science Student Physiology and Anatomy Test Performance: A Systematic Review and Meta-Analysis. Anat. Sci. Educ. 2021, 14, 368–376.
  6. Griffin, T.; Giberson, J.; Lee, S.H.; Guttentag, D.; Kandaurova, M.; Sergueeva, K.; Dimanche, F. Virtual Reality and Implications for Destination Marketing. In Proceedings of the Travel & Tourism Research Association International Conference, Quebec City, QC, Canada, 20–22 June 2017.
  7. Freeman, D.; Reeve, S.; Robinson, A.; Ehlers, A.; Clark, D.; Spanlang, B.; Slater, M. Virtual reality in the assessment, understanding, and treatment of mental health disorders. Psychol. Med. 2017, 47, 2393–2400.
  8. Lin, J.-H.T.; Wu, D.-Y.; Tao, C.-C. So scary, yet so fun: The role of self-efficacy in enjoyment of a virtual reality horror game. New Media Soc. 2017, 20, 3223–3242.
  9. Bigné, E.; Llinares, C.; Torrecilla, C. Elapsed time on first buying triggers brand choices within a category: A virtual reality-based study. J. Bus. Res. 2016, 69, 1423–1427.
  10. Meißner, M.; Pfeiffer, J.; Pfeiffer, T.; Oppewal, H. Combining virtual reality and mobile eye tracking to provide a naturalistic experimental environment for shopper research. J. Bus. Res. 2019, 100, 445–458.
  11. Bensoussan, L.; Viton, J.M.; Barotsis, N.; Delarque, A. Evaluation of patients with gait abnormalities in physical and rehabilitation medicine settings. J. Rehabil. Med. 2008, 40, 497–507.
  12. Lee, H.; Sullivan, S.J.; Schneiders, A.G. The use of the dual-task paradigm in detecting gait performance deficits following a sports-related concussion: A systematic review and meta-analysis. J. Sci. Med. Sport 2013, 16, 2–7.
  13. Aich, S.; Pradhan, P.M.; Chakraborty, S.; Kim, H.C.; Kim, H.T.; Lee, H.G.; Kim, I.H.; Joo, M.I.; Jong Seong, S.; Park, J. Design of a Machine Learning-Assisted Wearable Accelerometer-Based Automated System for Studying the Effect of Dopaminergic Medicine on Gait Characteristics of Parkinson’s Patients. J. Heal. Eng. 2020, 2020, 1823268.
  14. Bouchrika, I. Parametric elliptic fourier descriptors for automated extraction of gait features for people identification. In Proceedings of the 2015 12th International Symposium on Programming and Systems (ISPS), Algiers, Algeria, 28–30 April 2015.
  15. Teufl, W.; Taetz, B.; Miezal, M.; Lorenz, M.; Pietschmann, J.; Jollenbeck, T.; Frohlich, M.; Bleser, G. Towards an Inertial Sensor-Based Wearable Feedback System for Patients after Total Hip Arthroplasty: Validity and Applicability for Gait Classification with Gait Kinematics-Based Features. Sensors 2019, 19, 5006.
  16. Figueiredo, J.; Carvalho, S.P.; Vilas-Boas, J.P.; Goncalves, L.M.; Moreno, J.C.; Santos, C.P. Wearable Inertial Sensor System Towards Daily Human Kinematic Gait Analysis: Benchmarking Analysis to MVN BIOMECH. Sensors 2020, 20, 2185.
  17. Tham, L.K.; Osman, N.A.A.; Kouzbary, M.A.; Aminian, K. Biomechanical Ambulatory Assessment of 3D Knee Angle Using Novel Inertial Sensor-Based Technique. IEEE Access 2021, 9, 36559–36570.
  18. Lee, K.; Tang, W. A Fully Wireless Wearable Motion Tracking System with 3D Human Model for Gait Analysis. Sensors 2021, 21, 4051.
  19. Xie, L.; Yang, P.; Wang, C.; Gu, T.; Duan, G.; Lu, X.; Lu, S. GaitTracker: 3D Skeletal Tracking for Gait Analysis Based on Inertial Measurement Units. ACM Trans. Sens. Netw. 2022, 18, 1–27.
  20. Chen, S.; Lach, J.; Lo, B.; Yang, G.Z. Toward Pervasive Gait Analysis With Wearable Sensors: A Systematic Review. IEEE J. Biomed. Health Inform. 2016, 20, 1521–1537.
  21. Fouasson-Chailloux, A.; Menu, P.; Dauty, M. Lower-Limb Arthropathies and Walking: The Use of 3D Gait Analysis as a Relevant Tool in Clinical Practice. Int. J. Environ. Res. Public Health 2022, 19, 6785.
  22. Mueske, N.M.; Ounpuu, S.; Ryan, D.D.; Healy, B.S.; Thomson, J.; Choi, P.; Wren, T.A.L. Impact of gait analysis on pathology identification and surgical recommendations in children with spina bifida. Gait Posture 2019, 67, 128–132.
  23. Natephra, W.; Motamedi, A.; Fukuda, T.; Yabuki, N. Integrating building information modeling and virtual reality development engines for building indoor lighting design. Vis. Eng. 2017, 5, 19.
  24. Cavalcanti, J.; Valls, V.; Contero, M.; Fonseca, D. Gamification and Hazard Communication in Virtual Reality: A Qualitative Study. Sensors 2021, 21, 4663.
  25. Fırat, H.B.; Maffei, L.; Masullo, M. 3D sound spatialization with game engines: The virtual acoustics performance of a game engine and a middleware for interactive audio design. Virtual Real. 2021, 26, 539–558.
  26. El-Wajeh, Y.A.M.; Hatton, P.V.; Lee, N.J. Unreal Engine 5 and immersive surgical training: Translating advances in gaming technology into extended-reality surgical simulation training programmes. Br. J. Surg. 2022, 109, 470–471.
  27. Chance, G.; Ghobrial, A.; McAreavey, K.; Lemaignan, S.; Pipe, T.; Eder, K. On Determinism of Game Engines Used for Simulation-Based Autonomous Vehicle Verification. IEEE Trans. Intell. Transp. Syst. 2022, 23, 20538–20552.
  28. Zhao, Y.; Zhong, R.; Cui, L. Intelligent recognition of spacecraft components from photorealistic images based on Unreal Engine 4. Adv. Space Res. 2023, 71, 3761–3774.
  29. Matzko, R.O.; Mierla, L.; Konur, S. Novel Ground-Up 3D Multicellular Simulators for Synthetic Biology CAD Integrating Stochastic Gillespie Simulations Benchmarked with Topologically Variable SBML Models. Genes 2023, 14, 154.
  30. Li, C.; Fahmy, A.; Sienz, J. An Augmented Reality Based Human-Robot Interaction Interface Using Kalman Filter Sensor Fusion. Sensors 2019, 19, 4586.
  31. Scorpio, M.; Laffi, R.; Masullo, M.; Ciampi, G.; Rosato, A.; Maffei, L.; Sibilio, S. Virtual Reality for Smart Urban Lighting Design: Review, Applications and Opportunities. Energies 2020, 13, 3809.
Figure 1. Human body model.
Figure 2. Model not linked.
Figure 3. Blueprint for virtual serial port setup.
Figure 4. Blueprint for receiving lumbar joint data.
Figure 5. Joint angle display interface.
Figure 6. Joint angle automatically corrected to 90°.
Figure 7. Interface after the system automatically corrects the angle to its maximum.
Figure 8. Mouse input blueprint.
Figure 9. Event blueprint.
Figure 10. Left arm and left elbow movement blueprint.
Figure 11. Head control and waist control.
Figure 12. Whole model.
Figure 13. Whole model experiment.
