
1 Introduction

Mixed reality combines real world objects and virtual objects in a single environment. Unlike virtual reality, it lets users interact with virtual content while retaining a sense of presence in the real world, which makes human-computer interaction very effective. For this reason, mixed reality is being adopted across industries to create highly interactive interfaces for their systems. Industries in engineering, marketing, entertainment, and other fields use mixed reality to simulate their work and interactions in a 3D environment, having found it a better medium for understanding information. Air traffic control and management is another service that requires interactive mixed reality interfaces for a better user experience [1]. The complexity of air traffic control is increasing day by day with the rise in air traffic [2], and it is becoming difficult to monitor and handle traffic with the conventional approaches used in the control tower; more flexible and interactive interfaces are needed for a better quality of service. The main objective of this study is to provide mixed reality interfaces that enhance the capabilities of air traffic controllers. The job of an air traffic controller is stressful because controllers must monitor a great deal of information, such as airplane take-offs, landings, and collision avoidance. Within the control tower there are multiple 2D displays presenting different information (see Fig. 1), and it is difficult for controllers to move among these screens to monitor the various kinds of air traffic information.

Fig. 1.

Inside view of air traffic control tower

Moreover, 2D displays cannot convey a full understanding of the situation on the airfield, so controllers must look out of the tower window to monitor the runways visually. Bad weather conditions sometimes obstruct this outside view (see Fig. 2), leaving controllers unable to monitor the traffic on the runways. We developed this tool to overcome such limitations faced by air traffic controllers within the control tower. The tool assists controllers in monitoring and managing traffic in real time through a 4D view (3D model + time) of the airfield in a mixed reality world, and it lets them track airplanes in an easier, more interactive way. The interaction models developed in this tool aim to increase the analytical power of air traffic controllers for better management of growing air traffic. The tool presents two main views for monitoring and controlling air traffic. The first is a 4D mixed reality view of the whole airport, including the airport building, runways, taxiways, parking, and traffic, for monitoring ground traffic. The second is a map view within mixed reality for tracking the status of airborne traffic at a global scale. These can be characterized as a detailed view and a broad view, respectively. Furthermore, this study presents a concept for giving controllers easy and fast access to the information they require. The main contributions of this study are as follows:

Fig. 2.

Outside view from the control tower obstructed by fog

  1. Mapping of real world radar flight trajectories into the 4D airfield of the mixed reality world.

  2. A 4D view of the airfield in mixed reality that provides more detailed and clearer air traffic information for better analysis.

  3. Easy access to multilevel information about the real world location of air traffic from a single see-through display in one place.

  4. A weather-independent air traffic control and management system for situations where bad weather impairs visibility.

  5. Facilitation of the air traffic management service from any remote location through an interactive 4D view of the airfield on a wireless holographic display.

Better interactivity in the work environment can increase overall productivity, and the proposed methodology could impact air traffic control systems to a large extent. One significance of this tool is better interaction between air traffic controllers and the system interfaces of control towers. It reduces controllers' workload by providing easy access to the required information on a single holographic display in mixed reality, so personnel do not have to switch between multiple screens to access different information at runtime. With its highly interactive 3D display, the tool takes the monitoring power of air traffic controllers beyond the limits of conventional 2D displays. Moreover, it helps controllers analyze the complex airfield from different angles in mixed reality for a better understanding of situations on the ground. Using this tool, controllers are not restricted to what the naked eye can see from the tower window, which enables efficient air traffic management in bad weather conditions such as fog or heavy rain; outside areas that are not easily visible through the tower window can still be analyzed well in real time. Finally, the tool allows controllers to monitor air traffic and visualize ongoing airfield situations remotely using a wireless head mounted display.

2 Related Works

Air traffic control is a highly sensitive task, and over the last few decades a great deal of research has been devoted to making it more efficient. Around 25 years ago, Lloyd Hitchcock of the Federal Aviation Administration (FAA) first proposed using augmented reality technology in the air traffic control tower [3]. No prototype was built at the time, although many researchers later recalled Hitchcock speculating on various methods that could assist controllers [4]. In 1996, Azuma and Daily presented the advanced human-computer interfaces of that time for air traffic management [1]. The concept of remotely located air traffic services has also drawn attention; it includes real time rendering of a recreated 360° tower view in virtual reality [5,6,7,8].

In 2010, Bagassi et al. presented a design of a 4D interface on a 2D screen for controllers to interpret flight data [9]. That study also presented a method to estimate and display airplanes at their future locations for conflict detection. In mid-2014, Reisman et al. carried out a flight test to measure the registration error of augmented reality technology within the control tower [10]. In 2015, Arif Abdul Rahman et al. proposed a method for 3D visualization of real time flight tracking data on a 2D display [11]. They also aimed to give controllers a better interaction mechanism through more focused visuals for a clearer interpretation of the flights around the control tower.

In June 2016, Masotti et al. presented the idea of using augmented reality in the control tower [12], proposing a rendering pipeline for multiple head-up displays that generates an overlay between an augmented reality layer and the outside view from the tower. In September 2016, Bagassi et al. investigated different augmented reality systems to assess their applicability to on-site control towers [13], focusing on the placement of information on the actual windows of the tower. In 2017, Zorzal et al. discussed the construction of a prototype that overlays real time ground radar information about an airplane onto images captured from a live IP camera [14].

All of the existing systems discussed above have contributed to improving the air traffic control service, but viewed in terms of human-computer interaction, many limitations remain. Although 3D interfaces are provided for visualizing traffic at the airport, these systems still render on 2D screens [9, 11, 14]. They have increased controllers' analytical power, yet controllers still cannot interpret situations efficiently because they remain limited to 2D screens. No sufficiently efficient and interactive interface has been provided for visualizing radar data about airborne traffic. Moreover, the proposed augmented reality based systems for overlaying the real windows with 3D information require complex apparatus [12].

Our system presents remedies to these limitations. Its three dimensional interfaces are not restricted to 2D displays: because the proposed tool is based on mixed reality, a controller can efficiently analyze the traffic situation by viewing the 4D information from multiple orientations. The possibility of viewing information from multiple angles by moving around the model makes the system more interactive and engaging for controllers. The controller is not tied to monitor screens and can move anywhere with the head mounted mixed reality display, which increases the ability to analyze information efficiently in any situation. Bad weather conditions and other adverse circumstances have very little effect on the traffic management system.

The detailed view proposed in this paper presents clear and complete information about traffic at the airport, while the map view makes interpretation of the radar data very easy. We provide these highly interactive interfaces under a single roof through a head mounted mixed reality display, together with simple management of the interfaces within the mixed reality environment: controllers can switch between them without any extra effort. The tool is also very helpful for providing the air traffic management service from a distant location. Controllers can simply wear the head mounted mixed reality display and obtain 4D information about the air traffic anywhere. There is no need to set up a complex system for remote monitoring; a single head mounted device can carry the whole load.

3 Proposed Mixed Reality Tool

We present a novel approach for 4D visualization of an airfield in holographic mixed reality. Controllers monitor various kinds of information about the airfield, including airplane landings, takeoffs, ground traffic, and airborne traffic. On close observation, the information tracked by controllers falls into two main levels: traffic on the ground and traffic in the air. Ground traffic is usually monitored by watching the outside view from the tower window, while airborne traffic is monitored from radar data on 2D screens in the control tower. Each has its own limitations depending on the circumstances: the outside view can be blocked by bad weather, while radar information on 2D screens does not provide an interactive, clear interpretation of the air traffic status. Moreover, switching among different interfaces and the outside view of the tower is time consuming and requires more manpower. Hence, we present these two levels of air traffic information in 4D through a single, highly interactive holographic mixed reality display. We propose a detailed view for ground traffic, a map view for distant airborne traffic, and fast real time switching between them on demand. In both views, 3D objects represent airplanes, and on top of each 3D object the tool shows flight information such as the flight number and the airplane's speed. A detailed discussion of the proposed interaction paradigms' design and development process follows:

3.1 Detailed View

As the name suggests, this view holds every minor detail of the airport as a 3D scenery file: the airport building, runways, taxiways, apron, and major surroundings. In this mechanism, the controller sees a 3D airport lying on a real world surface, presenting clear details of the airplanes' ongoing activities at the airport in mixed reality. The controller can clearly visualize airplanes flying around the airport, landings, takeoffs, traffic on the taxiways, and airplanes parked on the apron. The detailed view holds a complete 4D mapping of airplanes' activities on the ground and of the traffic in the air around the control tower. Figure 3a and b shows the detailed view of the airfield.

Fig. 3.

Yellow circle highlighting an airplane in the detailed view (Color figure online)

3.2 Map View

The map view gives a better, more interactive visual presentation of the air traffic around the globe. It presents an abstract level view of air traffic over the world map, on which the controller can view and monitor the traffic: 3D objects moving over the map represent the airplanes. Controllers do not have to analyze raw radar data deeply to extract useful information. Thanks to the 4D visual presentation of the radar data, they can easily see how many airplanes will approach the airport in the coming period, how far each airplane is from the airport, and which city or location it has reached so far. The placement of 3D objects over the map in mixed reality is based on the information retrieved from the radar. This view uses a 2D texture of the world map rendered over a surface in the 3D mixed reality world, as shown in Fig. 4a and b.

Fig. 4.

Map view of the air traffic

3.3 Design and Development Process

The process of mapping air traffic into the mixed reality world with respect to its real world position consists of multiple steps. A detailed description of each step follows (the first step is a one-time process):

Calculation of Scaling Factor.

We calculate the scaling factor to scale down real world distances so they can be mapped into the mixed reality world, i.e., how many units in the mixed reality world represent one unit of the real world. If ‘D1’ represents the distance you want to keep in the mixed reality world and ‘D2’ represents the corresponding actual distance in the real world, then the scaling factor is calculated as in Eq. 1.

$$ \text{Scaling Factor} = \frac{D1}{D2} $$
(1)

The way of acquiring the actual (real world) distance for the detailed view and the map view differs slightly, as discussed below:

Detailed View. Here, we consider the actual length of one of the runways of our target airport, then calculate the scaling factor by substituting the two values required by Eq. 1: the runway length we want to keep for the 3D model within the mixed reality world, and the actual length of that runway.

Map View. For this view, we took a screenshot of a map showing the location of our target airport and rendered it over a plane in the mixed reality world. We then calculated the scaling factor from the actual distance between the airport and the top right location shown in the screenshot (see Fig. 5) and the scale we wanted to keep in the mixed reality world; a minimal sketch follows Fig. 5.

Fig. 5.

Arrow head representing the actual distance
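
As an illustration, the following minimal sketch implements Eq. 1 for both views; all distances in it are hypothetical placeholders, not the values used for our target airport.

```python
# A minimal sketch of Eq. 1; the distances below are hypothetical
# placeholders, not the values used for our target airport.

def scaling_factor(d1_mixed_reality: float, d2_real_world: float) -> float:
    """How many mixed reality units represent one real world unit (Eq. 1)."""
    return d1_mixed_reality / d2_real_world

# Detailed view: e.g. keep a 3000 m runway at 3 scene units.
detailed_scale = scaling_factor(3.0, 3000.0)      # 0.001 units per metre

# Map view: e.g. keep a 250 km airport-to-corner distance at 2.5 units.
map_scale = scaling_factor(2.5, 250_000.0)        # 0.00001 units per metre
```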

Conversion of WGS-84 Geodetic Locations (GPS Readings) to Cartesian Coordinates.

Radar data gives the location of an airplane according to the World Geodetic System (GPS readings), whereas Unity's mixed reality world uses a Cartesian coordinate system. Hence, to obtain the x, y and z coordinates, the geodetic coordinates must be converted to the local tangent plane [15]. This implementation is based on the book by Farrell and Barth [16]. From the GPS information, the x, y and z coordinates can be derived using the following system of equations.

$$ x = (h + N)\cos\lambda\cos\varphi $$
(2)
$$ y = (h + N)\cos\lambda\sin\varphi $$
(3)
$$ z = \left(h + (1 - e^{2})\,N\right)\sin\lambda $$
(4)

Here, ‘h’ represents height, ‘λ’ the latitude and ‘φ’ the longitude. ‘N’ and the squared eccentricity \( e^{2} \) can be calculated as follows:

$$ e^{2} = f\,(2 - f) $$
(5)

‘f’ represents the ellipsoid flatness and can be calculated as in Eq. 6.

$$ f = \frac{a - b}{a} $$
(6)

Where ‘a’ is the WGS-84 Earth semi-major axis and ‘b’ represents the derived Earth semi-minor axis. Now ‘N’ can be calculated using \( e^{2} \) from Eq. 5 in Eq. 7.

$$ N = \frac{a}{\sqrt{1 - e^{2} s^{2}}} $$
(7)

Where ‘s’ can be calculated as follows:

$$ s = \sin\lambda $$
(8)

Hence, the Cartesian coordinates of the airport's control tower and of each airplane's location are calculated using Eqs. 2, 3 and 4 together with the supporting equations.
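
For concreteness, the sketch below implements Eqs. 2-8 in Python under the paper's notation (‘λ’ for latitude, ‘φ’ for longitude). The WGS-84 constants are the standard published values; the subsequent rotation into the local tangent plane [15, 16] is omitted here.

```python
import math

A = 6378137.0         # WGS-84 Earth semi-major axis 'a' (metres)
B = 6356752.314245    # derived Earth semi-minor axis 'b' (metres)

def geodetic_to_cartesian(lat_deg: float, lon_deg: float, h: float):
    """Convert WGS-84 geodetic coordinates to Cartesian x, y, z (Eqs. 2-8)."""
    lam = math.radians(lat_deg)                  # 'λ', latitude
    phi = math.radians(lon_deg)                  # 'φ', longitude
    f = (A - B) / A                              # ellipsoid flatness, Eq. 6
    e2 = f * (2.0 - f)                           # squared eccentricity, Eq. 5
    s = math.sin(lam)                            # Eq. 8
    n = A / math.sqrt(1.0 - e2 * s * s)          # Eq. 7
    x = (h + n) * math.cos(lam) * math.cos(phi)  # Eq. 2
    y = (h + n) * math.cos(lam) * math.sin(phi)  # Eq. 3
    z = (h + (1.0 - e2) * n) * s                 # Eq. 4
    return x, y, z
```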

Directional Vector.

After finding the Cartesian coordinates of both points/locations, we find the directional vector between them. Let T(x1, y1, z1) represent the coordinates of the control tower and A(x2, y2, z2) the coordinates of the airplane; then we can find the directional vector as follows:

$$ \overrightarrow{TA} = \langle x_{2} - x_{1},\; y_{2} - y_{1},\; z_{2} - z_{1} \rangle $$
(9)

Let \( \overrightarrow{TA} = \langle x_{i}, y_{i}, z_{i} \rangle \). Now, we calculate the magnitude of this directional vector using Eq. 10 to find the current distance between the control tower and the airplane.

$$ M = \sqrt{x_{i}^{2} + y_{i}^{2} + z_{i}^{2}} $$
(10)

After finding the magnitude, we multiply it by the scaling factor (calculated in Eq. 1) to scale the distance down for the mixed reality world.
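
A minimal sketch of this step follows, operating on the Cartesian coordinates produced in step 2:

```python
import math

def direction_and_scaled_distance(tower, airplane, scale):
    """Directional vector TA (Eq. 9), its magnitude M (Eq. 10), and the
    distance scaled down for the mixed reality world using Eq. 1."""
    ta = tuple(a - t for a, t in zip(airplane, tower))   # Eq. 9
    magnitude = math.sqrt(sum(c * c for c in ta))        # Eq. 10
    return ta, magnitude, magnitude * scale
```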

Mapping.

Afterwards, using the directional vector and the distance calculated in Step 3, we find the exact coordinates of the airplane's location in the mixed reality world. After finding the exact location, we instantiate a 3D object at the derived location to represent the airplane in the mixed reality world, as shown in Figs. 3 and 4.

Our system performs steps 2, 3 and 4 for each airplane whenever GPS information about the air traffic is received over time.
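
Putting the steps together, the sketch below chains the helpers from the previous sketches for one airplane. It assumes the control tower sits at the scene origin and that the airplane is placed along the unit direction vector at the scaled-down distance; the paper does not spell out this final placement, so this is one plausible reading rather than the exact implementation.

```python
def place_airplane(tower_gps, airplane_gps, scale):
    """Run steps 2-4 for one airplane and return its mixed reality
    position relative to a control tower at the scene origin."""
    tower = geodetic_to_cartesian(*tower_gps)        # step 2
    plane = geodetic_to_cartesian(*airplane_gps)     # step 2
    ta, magnitude, scaled = direction_and_scaled_distance(tower, plane, scale)  # step 3
    # Step 4 (assumed): unit direction vector times the scaled-down distance.
    return tuple(c / magnitude * scaled for c in ta)

# The tuples tower_gps and airplane_gps hold (latitude, longitude, altitude);
# the returned position is where the airplane's 3D object is instantiated
# each time a new GPS fix arrives.
```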

4 Flight Tests

We tested our system with real world flight trajectories recorded during departure and arrival flights at Jeju Airport, Republic of Korea. Ten flights were included in the test: three departures and seven arrivals, with flight codes 9C8625, 9C8913, FE722, KE1237, OZ8995, MU5028, ZE228, ZE230, ZE551 and ZE706. The recorded trajectories were collected from the online source ‘FlightRadar24’ [17] and retrieved from the web application as CSV files. The flight data comprised the latitude, longitude and altitude of each airplane along with a timestamp covering the whole flight.
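
As an illustration, the sketch below replays one such trajectory. The CSV column names (timestamp, latitude, longitude, altitude) and the file name are our assumptions about the exported layout, not documented fields of the FlightRadar24 export.

```python
import csv

def load_trajectory(path):
    """Yield (timestamp, latitude, longitude, altitude) fixes from a
    recorded trajectory CSV; the column names are assumptions."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield (float(row["timestamp"]), float(row["latitude"]),
                   float(row["longitude"]), float(row["altitude"]))

for t, lat, lon, alt in load_trajectory("ZE706.csv"):
    pass  # feed each fix into steps 2-4 of Sect. 3.3 at its recorded time
```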

5 Conclusion and Future Work

We have presented interaction mechanisms, built on mixed reality holographic 3D displays, for better control and management of air traffic. With these paradigms, air traffic controllers can understand the situation at the airfield in a more efficient and interactive way. The approach removes the barriers of bad weather conditions, a fixed location for the air traffic controller, and poorly interactive interfaces in managing air traffic. Although we have not yet tested the system with real air traffic controllers in the field, we have tested it with real flight trajectories: the system accurately mapped the recorded trajectories and provided strong interactivity for visualizing airport traffic in the mixed reality world.

In future work, we plan to design mixed reality interaction mechanisms for conflict management in air traffic and for assisting in situations caused by bad weather. We also plan to design mechanisms that use mixed reality technology to alert air traffic controllers to alarming situations, such as approaching bad weather, airplanes deviating from their planned paths, flight trajectories that could lead to collision, and early or late arrivals and departures.