Article

Developing Mobile Applications with Augmented Reality and 3D Photogrammetry for Visualisation of Cold-Water Coral Reefs and Deep-Water Habitats

by Larissa Macedo Cruz de Oliveira 1,*, Priscila Almeida de Oliveira 2, Aaron Lim 3, Andrew J. Wheeler 1,4 and Luis Americo Conti 2
1 School of Biological, Earth and Environmental Sciences, Environmental Research Institute, University College Cork, T23 TK30 Cork, Ireland
2 Escola de Artes Ciências e Humanidades, Universidade de São Paulo, São Paulo 03828-000, Brazil
3 Department of Geography, School of the Human Environment, Geography, Archaeology and Classics, University College Cork, T12 K8AF Cork, Ireland
4 Irish Centre for Applied Geosciences (ICRAG), Marine & Renewable Energy Institute (MaREI), University College Cork, P43 C573 Cork, Ireland
* Author to whom correspondence should be addressed.
Geosciences 2022, 12(10), 356; https://doi.org/10.3390/geosciences12100356
Submission received: 9 August 2022 / Revised: 17 September 2022 / Accepted: 20 September 2022 / Published: 26 September 2022

Abstract

Cold-water coral (CWC) reefs are considered “hotspots” of biodiversity in deep-sea environments. Like tropical coral reefs, these habitats are subject to climate and anthropogenic threats. The use of remotely operated vehicles (ROVs) in combination with three-dimensional (3D) modelling and augmented reality (AR) has enabled detailed visualisation of terrestrial and marine environments while promoting data accessibility and scientific outreach. However, remote environments such as CWC reefs still present data acquisition challenges, which limits further understanding of these environments. This study develops a mobile application that uses structure-from-motion (SfM) 3D photogrammetric data and AR for the visualisation of CWC reefs. The mobile application was developed to display 3D models of CWC reefs from the Piddington Mound area, southwest of Ireland. The 3D models were tested at different resolutions to analyse the visualisation experience and the trade-off between resolution and application size. The results from the higher-resolution 3D reconstructions indicate that the combination of SfM, AR, and mobile phones is a promising tool for raising awareness and literacy regarding CWC and deep-water habitats. This study is the first of its kind to make CWC habitats accessible to anyone, anywhere, with a mobile phone and internet connectivity.

1. Introduction

Cold-water coral (CWC) ecosystems are structurally complex three-dimensional (3D) deep-water habitats whose detailed analysis and monitoring rely largely on robust mapping technologies [1,2,3]. CWC ecosystems form complex reef structures on the seabed and are considered “hotspots” of biodiversity in deep-sea environments, as they baffle sediments and act as nurseries and shelter for thousands of deep-sea species [4,5,6,7]. Moreover, coral reefs can form large 3D carbonate structures [8,9] and provide important ecosystem services [10,11], thus promoting an increase in biomass relative to their surrounding areas [12,13]. However, research on the geographic distribution and condition of CWC and other deep-water habitats is scarce, mainly because of data acquisition limitations related to the accessibility and extent of these environments [11,14]. These factors make it challenging to understand their key processes and controls, and hence difficult to promote awareness of these environments among the wider community [15]. Furthermore, CWC reefs are considered vulnerable marine ecosystems [4,16,17] given their exposure to a range of direct and indirect disturbances such as climate change [18], ocean acidification [12,19,20,21], and fishing activities like bottom trawling [22,23,24]. Given these factors, regional efforts have been made to designate CWC reefs and mounds as marine protected areas (MPAs) and special areas of conservation (SACs) [13,25,26]. However, protective measures against global change threats, encouragement of sustainable practices, and actions to raise awareness are still limited [5].
Marine habitats and archaeologically important underwater sites usually require remote mapping and diving technologies, a dependency that limits the accessibility of these environments [27]. In coastal areas, recreational diving is among the most popular tourist activities. However, such activities can directly impact marine environments [28] and pose significant risks to divers [29]. In the case of coral reefs, which are sensitive to several environmental factors, the physical contact entailed when exploring these areas can cause direct and indirect impacts on the corals, such as structural damage and habitat disturbance due to sediment resuspension [30,31]. To mitigate these risks in shallow and coastal water habitats, researchers have introduced the concept of ‘dry diving’ [27]. In the context of underwater data visualisation, dry diving represents an alternative way to access study sites without the need for physical diving by utilising augmented reality (AR) [32].
The conditions are different for deep-water environments, where accessibility is limited by water depth and pressure. In this case, remotely operated vehicles (ROVs) and autonomous underwater vehicles (AUVs) have emerged as relatively new survey technologies that can aid the mapping of ocean features through the acquisition of high-resolution bathymetric data and high-definition (HD) video, physico-chemical measurements, and ecological sampling [33,34,35]. These technologies, when coupled with novel mapping methods such as 3D photogrammetry and structure-from-motion (SfM), can be used for in-depth environmental observations using 2D and 3D perceptions [36,37,38].
SfM is a relatively new photogrammetry technique that has been increasingly applied to geospatially reconstruct different environments such as seabed habitats [2,3], forests and grasslands [39,40], mangroves [41,42,43], and rock outcrops [44]. SfM can provide fine-scale 3D reconstructions with millimetric to centimetric spatial resolutions, which allows for detailed mapping of different environmental and terrain descriptors [45].
Underwater photogrammetry has become progressively more common since the introduction of ROVs [33]. It is considered a non-destructive seabed mapping technique that enables representations and measurements of marine environments [46], combining both metric and interpretative tasks. SfM photogrammetry has been widely employed as a time- and cost-effective method for high-resolution seabed mapping from AUV/ROV-derived video data [1,38,47,48,49,50]. The SfM technique generates 3D models from a sequence of 2D images by detecting multiple matching features across these images and reconstructing a 3D point cloud [45,51]. Photogrammetry-derived 3D models are of increasing importance for the mapping of such environments, as they allow the visualisation and high-resolution analysis of otherwise pristine and secluded areas [52].
In contrast with conventional photogrammetry, SfM utilises a set of algorithms, such as the scale-invariant feature transform (SIFT) algorithm [53], to identify matching features in the image sequence and calculates camera orientation and location from the differences in position of the matched features. From these calculations, a sparse 3D point cloud is derived. The sparse cloud is usually refined to a finer resolution with multi-view stereo (MVS) methods [51]. Unlike laser scanning techniques, SfM is not limited by laser pulse frequency or beam spacing, and it can offer point cloud data with accuracy comparable to point clouds generated from those sources, at lower cost. It therefore offers a wide range of opportunities to characterise surface topography at high and multi-temporal resolution to map elevation, volumetric, and positional variations, which are key to understanding earth surface processes [51].
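To make the feature-matching step concrete, the minimal sketch below pairs two extracted video frames with OpenCV's SIFT implementation and Lowe's ratio test. This is purely illustrative: the studies cited above do not necessarily use OpenCV, and the frame file names are hypothetical.

```python
# Illustrative sketch of SIFT feature matching between two video frames,
# the step that seeds SfM camera pose estimation and the sparse cloud.
# Assumes OpenCV >= 4.4 (where SIFT is included); file names are hypothetical.
import cv2

img1 = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Lowe's ratio test keeps only distinctive correspondences [53].
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
print(f"{len(good)} putative correspondences between the two frames")
```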
Likewise, AR represents a promising technique that has been increasingly discussed in different fields, such as archaeology and management of underwater cultural heritages [54,55]. AR has gained popularity with educational and touristic applications for providing the possibility to disseminate knowledge in sustainable and accessible ways without requiring the physical presence of users at the study site, thus helping with damage prevention of historical sites [56] and supporting environmental preservation [57].
Virtual reality (VR) and AR are similar but distinct and complementary technologies within the concept of “mixed reality”, or more loosely “immersive reality” (IR) [28,57,58,59,60,61]. AR and VR can be considered symmetrical (and continuous) reflections of each other in terms of what each technology seeks to accomplish and deliver to the user. While a VR environment enables the user to interact with immersive environments through multisensory interfaces, AR is characterised by projecting computational images onto physical surfaces, increasing the informational and, consequently, perceptual and cognitive level we have of the environments, objects, and people around us [56,62,63,64,65,66,67,68,69,70,71,72,73]. In this sense, VR and AR are two innovative platforms essentially focused on the production and consumption of content.
In the scientific field, the use of AR and VR has gained momentum in the past few years, especially during the global COVID-19 pandemic, when remote teaching was widely adopted. AR platforms such as Labster [74] and ClassVR [75] have been created to simulate scientific laboratory spaces for teaching experiments. Web platforms such as Sketchfab [76] have increased data sharing and visualisation for the general public. Advances have also been made in the geosciences with the development of VR platforms for field mapping, such as VRgeoscience [77]. However, these technologies are usually not open-source and rely on specific hardware, i.e., desktops and VR headsets, which can often be inaccessible. The use of mobile devices, on the other hand, provides a more accessible alternative for promoting knowledge with AR.
Studies suggest that integrating the visualisation of seafloor elements with AR techniques can increase situational awareness, especially in the case of deep-water habitats [37], where depth and pressure conditions make diving, or any sort of human access, nearly impractical. VR experiments have been developed to provide remote diving experiences as an alternative that reduces damage to marine environments and risk to divers [28,57] and to represent coral habitats reconstructed with photogrammetry techniques [57]. Similarly, other studies have used AR to represent underwater sites [58,78,79], such as the iMareCulture project [80]. Recent studies have used AR with the aid of waterproof devices to visualise objects during dives [54,55] and to explore shipwrecks [32]. However, marine habitats remain underexplored in terms of AR applications, and questions related to their use in education and training remain unanswered. For example, information regarding incurred costs, the relative efficiency of AR systems and the methods used [70,81], as well as the factors and conditions that affect the effectiveness of an AR system [82], are yet to be explored. It is therefore important to investigate the potential of AR applications to enable novel, more inclusive surveying and to optimise educational experiences [56].
The aim of this study is to integrate SfM, AR, and game design techniques to develop a visualisation platform for CWC and deep-water habitat video data and to analyse its applicability for educational and data accessibility purposes. By developing the proposed application (APP), this study aims to contribute towards (i) promoting access to 3D reconstructed datasets via mobile phones; (ii) facilitating the visualisation of deep-water environments such as CWC reefs; (iii) applying AR visualisation frameworks to CWC reefs and adjacent deep-sea habitats; and (iv) evaluating the outcomes of the APP in relation to 3D model resolution. To this end, game engines and 3D photogrammetry were combined to develop a mobile APP of 3D models at different resolutions, and user acceptance was assessed. The APP provides further understanding of resolution changes and optimal parameters for an application built for a particular environment, such as the CWC of the Piddington Mound, southwest of Ireland. To the best of our knowledge, this is the first study to combine these three techniques to enable the visualisation of deep-water habitats as 3D reconstructions.

2. Materials and Methods

The proposed framework was applied to CWC reefs of the Piddington Mound area, located in the Porcupine Seabight, NE Atlantic (Figure 1). This study was executed in three stages: (1) data acquisition; (2) 3D modelling with SfM photogrammetry; and (3) Android APP development.

2.1. Data Acquisition

2.1.1. Study Area

The Piddington Mound is a highly dynamic CWC mound [83] located approximately 300 km southwest of Ireland, in the Belgica Mound Province (BMP), at a depth of approximately 960 m (Figure 1). Given the presence of CWC mound features, including giant carbonate mounds and important deep-sea ecosystems, part of the BMP is a designated SAC under the EU Habitats Directive [84]. The BMP comprises a range of mound structures, including small CWC reefs (approximately 30 m across and 10 m tall) known as the Moira Mounds [85,86]. The Piddington Mound is located in the downslope area of these mounds, and the main scleractinian framework-forming species are Lophelia pertusa (synonymised to Desmophyllum pertusum [87]) and Madrepora oculata [9,88]. Sponge hotspots of Aphrocallistes sp. have also been documented [2]. Local currents have been estimated to reach between 34 and 40 cm s−1 [89,90]. Owing to the high sediment influx in the area, the mounds are considered to represent mound formation under stressed conditions [86,91]. The Piddington Mound was selected for this study given the extensive mapping efforts in the area [2,83,86,89] and evidence of temporal changes in benthic and sedimentological facies [2,83,89].

2.1.2. Video Survey

The HD video data used to generate the 3D models were collected using the Holland 1 ROV during research cruise CE20011 in 2020 [92]. The video surveys were performed with the ROV at approximately 2 m above the seafloor at a survey speed below 0.2 knots. For the CE20011 survey, the ROV was equipped with an HDTV video camera (HD Insite mini-Zeus with HD-SDI fibre output) and a Kongsberg OE 14-208 digital stills camera system. The Holland 1 is fitted with two deep-sea lasers spaced 10 cm apart for scaling. Positioning data were acquired with a Sonardyne Ranger 2 ultra-short baseline (USBL) beacon with an accuracy of 1.3% of slant range. Video data were acquired at 50 frames per second at 1080-pixel resolution and stored as .mov files [36]. In total, 8 h of HD video data were acquired.

2.2. 3D Models with Structure-from-Motion Photogrammetry

ROV HD video data, extracted video frames, and camera positioning information were used for the 3D photogrammetric reconstructions herein. The photogrammetry pipeline was implemented in Agisoft Metashape Professional v1.6 [93] following the work of [36]. From the SfM workflow, a 3D model and its respective texture file representing the CWC reef images were created. The SfM workflow is represented in Figure 2.
The video frames used for the photogrammetry were extracted in Blender (version 2.78) at a rate of one frame per second and a resolution of 1920 × 1080 pixels [36]. Extracted frames were imported into Agisoft Metashape Professional v1.6 together with their respective USBL positioning information. Key and tie point limits were selected empirically according to the results of each photo alignment run. Camera alignment was followed by camera optimisation to refine camera orientation parameters and triangulated tie point coordinates. The 3D model reconstruction parameters were set to arbitrary, with source data from depth maps and interpolation enabled. The resulting dense cloud was georeferenced using the frame-relative positioning data (X and Y coordinates, depth, yaw, pitch, roll, and accuracy (°)). Dense clouds were scaled using the HD camera lasers, spaced 10 cm apart, as a reference. After the dense clouds were optimised, the meshes and textures were derived from the images. The texture was created using the generic mapping mode with the mosaic blending mode. Hole filling and the ghosting filter were applied in all model reconstructions. Finally, the textured 3D models, orthomosaics, and DTMs were produced. The texture was exported in .jpg format and the 3D models in .obj format for the subsequent AR part of the study (Section 2.3). Habitat characterisation of each site was performed through visual assessment of the dense clouds and orthomosaics, considering the main seabed morphologies, scaled measurements of sedimentological features (dropstones, pebbles, and so on), and identification of the main framework-forming CWC and associated species.
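As an illustration of this workflow, the sketch below outlines the main steps using the Agisoft Metashape Python API. The call names follow the v1.6-era API, and the paths, point limits, and parameter values are illustrative assumptions rather than the authors' exact settings; laser-based scaling and USBL georeferencing are omitted for brevity.

```python
# Hedged sketch of the SfM workflow described above via the Metashape
# Python API (v1.6-era names). Paths and parameter values are assumptions.
import Metashape

doc = Metashape.Document()
chunk = doc.addChunk()
# Frames are assumed to be pre-extracted at 1 fps (e.g., with Blender, as in the study).
chunk.addPhotos(["frames/frame_%04d.png" % i for i in range(1, 2895)])

# Feature matching with empirically chosen key/tie point limits, then alignment
# and camera optimisation to refine orientations and tie point coordinates.
chunk.matchPhotos(keypoint_limit=200000, tiepoint_limit=20000)
chunk.alignCameras()
chunk.optimizeCameras()

# Dense reconstruction from depth maps; arbitrary surface with interpolation enabled.
chunk.buildDepthMaps()
chunk.buildDenseCloud()
chunk.buildModel(surface_type=Metashape.Arbitrary,
                 source_data=Metashape.DepthMapsData,
                 interpolation=Metashape.EnabledInterpolation)

# Generic UV mapping, mosaic blending, hole filling, and ghosting filter,
# with a 4096 x 4096 texture as reported in Section 3.1.
chunk.buildUV(mapping_mode=Metashape.GenericMapping)
chunk.buildTexture(blending_mode=Metashape.MosaicBlending,
                   texture_size=4096, fill_holes=True, ghosting_filter=True)

# Export for the AR stage (Section 2.3): .obj mesh plus .jpg texture.
chunk.exportModel(path="model.obj", texture_format=Metashape.ImageFormatJPEG)
doc.save("piddington_sfm.psx")
```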

2.3. Three-Dimensional Android APP Development

The AR APP was generated with the Vuforia Engine and Android Studio platforms that provide support for Android smartphone APP development in Unity (version 2018.4.30f1) [94]. A simplified workflow of the use of each platform is represented in Figure 3.
Unity is a game engine for creating interactive content and has graphic capabilities for lighting, rendering, and importing 3D models in several formats [94]. When installing Unity, it is important to enable Microsoft Visual Studio (if it is not yet installed on the desktop) and support for the Vuforia Engine [95] and Android Studio [96].
The first part of generating the AR project consisted of selecting an image to encapsulate the 3D object. The image serves as a target that the code can read to retrieve the 3D object. The image used in this study was a web-generated HD image of a QR Code in which the URL link with the project license generated by Vuforia was embedded. Although a QR Code was used for didactic purposes, any image can be used. Next, the QR Code was registered on the Vuforia Engine website [97], where a project license was obtained for the AR project in Unity 3D. Subsequently, a database of the device type was created on the same Vuforia Engine website, where the QR Code was added and downloaded in the Unity Editor format.
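For reference, a target image of this kind can also be generated locally; the minimal sketch below uses the Python qrcode package (our substitution for the web generator used in the study, with a placeholder URL).

```python
# Hedged sketch: generating a high-resolution QR Code target image locally
# with the "qrcode" package. The embedded URL is a placeholder, not the
# actual Vuforia project license link used in the study.
import qrcode

qr = qrcode.QRCode(box_size=20, border=4)  # large box_size -> high-resolution image
qr.add_data("https://example.org/coral-app-license")
qr.make(fit=True)
qr.make_image(fill_color="black", back_color="white").save("coral_target.png")
```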
The second part of the process consisted of setting up the project in Unity 3D, enabling Vuforia augmented reality support, and importing the database downloaded from the Vuforia Engine website. Then, the QR Code and the AR camera, responsible for reproducing the object in AR, were added. In the Vuforia Engine settings inside Unity 3D, the license generated together with the database was applied. The 3D object and texture representing the CWC reef were added, with their positions and resolutions in the project set relative to the QR Code.
The AR visualisation was tested in Unity 3D by presenting the QR Code image to the desktop webcam. For the mobile version of the APP, the project was exported from Unity 3D with Android Studio integration. Android Studio enables the creation of an Android Application Pack (APK) file for smartphones running Android. Finally, the APP file with the APK extension was generated, ready to be downloaded and installed onto smartphones. Further examples of the use of these platforms for AR can be found in [97,98]. User perception was evaluated based on interaction with the APP when used by an average smartphone user with either a scientific or non-scientific background.
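When several model resolutions must be packaged (as in Section 3), builds of this kind can also be automated; the sketch below drives the Unity editor in batch mode from Python using Unity's documented command line flags. The install path, project path, and C# build method name are hypothetical.

```python
# Hedged sketch: producing the Android APK for one model variant by running
# Unity headlessly. The flags (-batchmode, -quit, -projectPath, -buildTarget,
# -executeMethod, -logFile) are standard Unity CLI options; the paths and
# the build method name are hypothetical.
import subprocess

UNITY = "/opt/Unity/Editor/Unity"        # assumed install location
PROJECT = "/home/user/CoralAppModelA"    # hypothetical Unity project path

subprocess.run([
    UNITY,
    "-batchmode", "-quit",
    "-projectPath", PROJECT,
    "-buildTarget", "Android",
    "-executeMethod", "BuildScript.BuildAndroid",  # hypothetical C# method
    "-logFile", "build_model_a.log",
], check=True)
```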

3. Results

3.1. Photogrammetry

In total, four models (named A, B, C, and D) were used for the development of the study. For the reconstruction of the models from the HD video data, the tie point and key point limits for Models B and D were set to 2000 and 200,000, respectively. For Models A and C, the tie and key point limits were set to 20,000 and 200,000 points, respectively. The resulting texture resolution was 4096 × 4096 pixels across all models. The 3D models range from 3.76 m2 to 21.47 m2 in area. In total, 2894 images were used to reconstruct the sites from which Models A, B, C, and D were derived. Table 1 contains further details for each model.

Habitat Characterisation

Model A

Model A is a reconstruction of an area of approximately 3.7 m2 at a depth of 967 m. The site is characterised by coral patches and thickets varying from 0.2 m to approximately 1.2 m high, spread across the extent of the model. The patches are formed by the scleractinian coral species Lophelia pertusa and Madrepora oculata in live and dead forms and, more predominantly, coral rubble. There are occurrences of the sponge Aphrocallistes beatrix; soft corals and black corals, e.g., Leiopathes sp. and Stichopathes cf. abyssicola; and sea urchins (Echinus sp.). The seabed is composed of soft sediments overlain by dropstones, with pebbles and cobbles varying from 1 cm to 8 cm, and biogenic fragments (shells and coral fragments).

Model B

Model B is a section of a reconstructed area of 14.56 m2 at a depth of approximately 970 m. The study site is characterised by the presence of coral frameworks of Lophelia pertusa and Madrepora oculata in the form of patches and individual colonies of live and dead coral frameworks. The presence of coral rubble and coral fragments is higher relative to the other models. The section used to construct Model B is particularly dominated by coral rubble and coral fragments, with occurrences of sponges Aphrocallistes beatrix and black corals Stichopathes cf. abyssicola. Visual assessment indicates that the seabed is composed of soft sediments (mud and sand) with occurrences of sedimentary bedforms (ripples and sediment waves) and rounded to subrounded dropstones varying from 1 cm to 10 cm, which occur in abundance to the south of the transect.

Model C

Model C represents an area of 21.47 m2 at a depth of approximately 968.7 m. Similar to Model A, the area is characterised by coral patches and thickets varying from 0.2 m to approximately 1.2 m high, spread across the extent of the model. The coral species Lophelia pertusa and Madrepora oculata are the most abundant framework-forming species, occurring as patches, thickets, and individual colonies in both living and dead forms. The sponge Aphrocallistes beatrix; soft corals, e.g., Leiopathes sp.; black corals, e.g., Stichopathes cf. abyssicola; and sea urchins (Echinus sp.) were also found. The seabed is composed of soft sediments (sand and mud) and dropstones varying from 1 cm to 8 cm in size, with sub-centimetric shell and coral fragments.

Model D

Model D is a 3D reconstruction of an area of 10.71 m2 at a depth of approximately 969 m. The section is composed of small colonies < 30 cm high and individual coral colonies of Lophelia pertusa and Madrepora oculata with the presence of sponges Aphrocallistes beatrix. The occurrence of coral rubble is scarce and limited to the surroundings of the small mounds, occasionally forming thickets. The seabed is composed of soft sediments (mainly sand and mud) with the presence of bidirectional sediment waves and subangular to subrounded dropstones. The presence of shell and coral fragments is also scarce in the area.

3.2. The Coral APP

The installation of the APP from files in APK format was easily completed on an Android smartphone with a display of 1280 × 720 pixels and 1 GB of RAM. The installation of each APK took approximately 3 min, depending on the internet connection speed. The resulting APP was tested on smartphones with Android version 8.1 or higher. The visualisation of the 3D CWC reef models produced by the Coral APP can be seen in Figure 4. By starting the APP and pointing the smartphone camera at the QR Code on the map seen in the background of Figure 4, the 3D model of the CWC can be interactively visualised.
The methodology presented was tested on the four 3D models of CWC generated using the SfM technique. The 3D models A, B, C, and D were used to develop different APPs in order to understand the performance of visualisation triggered by reading a QR Code on the smartphone. Figure 5 shows the AR projects of the four 3D models generated in Unity 3D. Experiments on the Unity 3D desktop showed that models whose resolution, i.e., the ratio between the number of faces and the extent of the 3D model, exceeded 200,000 faces per m2 allowed a better understanding of the environment. In these higher-resolution models (e.g., Model A), the manipulation of the objects in terms of zoom, rotation, and interaction when viewed via a desktop was superior to the other models (Figure 5). However, they required a higher computational cost for the generation of the APP for mobile visualisation, which led to a drop in the resulting model resolution.
The results showed that the size of the APP installation file (APK) in MB increased with the number of faces of each model (Figure 6). During the mobile assessment of the APP, some difficulties were identified when interacting with higher-resolution models such as Model C, which had 6,285,480 faces and 3,154,610 vertices and was 163 MB in size. User interaction movements such as intra-axis rotation and zooming were limited owing to the model resolution (number of faces and vertices) and size. This may be because of the large size of the APP installation file (163 MB), the largest of the models tested. On the other hand, Model D, which is two orders of magnitude smaller than Model C, presented a smooth interaction. This is possibly because of the smaller number of faces and vertices of Model D, as well as the size of the APP installation file (49.3 MB), which is approximately 69% smaller than that of Model C. Similarly, Model B, with a similar APP size (56.5 MB), allowed easier interaction in AR at the expense of providing fewer CWC details (Figure 5). Model A (Figure 5), with 1,124,301 faces, 4,177,311 vertices, and a size of 106 MB, has a higher resolution than Models B and D and presented easy interaction with the APP together with a higher level of detail of objects. This evidences the trade-off between resolution and APP file size. For example, Model A has approximately 82.1% fewer faces than Model C, and while the latter has a higher number of faces, the former presents a richer interaction at the expense of representing a smaller area. Figure 6 shows that the number of faces influences the size of the APP. In contrast, the number of vertices, which reflects the resolution, i.e., the level of detail of the model, does not vary with the size of the APP.
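A practical response to this trade-off is to decimate a dense mesh toward a target face budget before packaging it into the APP. The sketch below illustrates this with the trimesh library; this is our assumption for illustration, not part of the pipeline used in the study, and the total mesh surface area is used here as a rough proxy for the model extent.

```python
# Hedged sketch: computing a faces-per-area "resolution" metric and
# decimating a mesh toward a target face budget with trimesh (an
# illustrative substitution; not the authors' workflow). Paths are hypothetical.
import trimesh

mesh = trimesh.load("model_c.obj", force="mesh")
faces_per_m2 = len(mesh.faces) / mesh.area  # surface area as a proxy for extent
print(f"{len(mesh.faces)} faces, {faces_per_m2:,.0f} faces/m^2")

# Reduce toward ~1.2 M faces (comparable to Model A) to keep AR interaction smooth.
target = 1_200_000
if len(mesh.faces) > target:
    decimated = mesh.simplify_quadric_decimation(face_count=target)
    decimated.export("model_c_decimated.obj")
```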

4. Discussion

Overall, the APP demonstrations showing Models D and B presented in Figure 4 (subfigures (a) and (b), respectively) ran smoothly owing to three factors: (i) the generated APP occupied less space in the smartphone memory, (ii) the QR Code was read quickly, and (iii) the 3D object could be easily manipulated and interacted with, satisfying the proposal of this study. However, the higher-resolution Model A (Figure 5 and Figure 7) presented a better trade-off among the three factors. Although Model A represents a small extent, it provided a higher level of detail at a reasonable APP file size that allows smoother interactions on standard smartphones (Figure 6).
Three-dimensional reconstructions, when represented at high resolution and multiple scales, allow observations beyond unaided human vision, such as the geographical distribution, seabed terrain variability (rugosity and slope measures) [50,99], and morphological variations of CWC [3]. These variations can express environmental indicators that strongly dictate mound structure, such as current dynamics, the supply of organic particles, and vertical and bottom sediments that influence the growth of coral mounds [9]. In Figure 5 and Figure 7, it was possible to observe the contrast in texture and roughness of the models and to associate it with seabed elements such as shells, sediment morphology, and corals. Although this contrast was less pronounced in the models in Figure 4, where the 3D view as well as textures and roughness were compromised, the results herein help outline a minimum threshold for the number of faces and vertices and, consequently, the resolution of the models to be used.
The developed APP can facilitate the dissemination of knowledge by raising awareness of the importance of understanding and monitoring these environments for coral health and conservation, especially considering that CWC have been affected by climate and anthropogenic threats such as rising temperatures, increased ocean acidity [20], and bottom trawling [23].
Scleractinian corals naturally form 3D reef frameworks [13,100]. With the combination of AR and 3D photogrammetry, it is possible to visualise the 3D model of coral reef formations, which is a fundamental aspect of understanding these habitats [38,47,101]. In the marine sciences, the 3D perception derived from 3D photogrammetry and SfM techniques can leverage remote mapping and exploration for scientists and decision makers looking to understand and manage these deep-sea environments. Moreover, 3D visualisation using AR can be a didactic and low-cost alternative for showing how coral structures appear in various types of environments. The use of this technology may help students and researchers in studies related not only to CWC, but also to other deep-water environments such as submarine canyons and hydrothermal vents. Interaction with the models makes it easy to observe the environment in which the corals occur, showing these marine ecosystems in a more visual way. Users can interactively rotate the 3D object and zoom by moving the camera closer to or further from the QR Code, making the learning process more engaging compared with visualisation provided by videos or images, for example.
Mobile AR can increase data accessibility, as users only need their smartphone and can learn about CWC from anywhere. In a wider scenario, mobile AR creates the potential for scientific dissemination of these environments and encourages the protection of underwater ecosystems by arousing curiosity among agencies that can begin to protect reefs. This is especially true in the case of 3D models built from SfM, which are usually computationally heavy to visualise and, therefore, end up depending on desktop environments and specialised software.
The map for the visualisation of the 3D model built from SfM in the context of the Atlantic Ocean can be seen in Figure 8, and the APP is available online at https://drive.google.com/drive/folders/1Nf36dtLtCCQKts40Kqha8D9hSCWluFzU (accessed on 9 August 2022).

5. Conclusions

The production of a digital representation of CWC topography and its constituent elements allowed the reconstruction of the CWC seascape in a virtual 3D environment. The use of AR integrated with GIS (georeferenced data) facilitates exploring large regions at high resolution, leading to field-scale experiences with varying levels of immersion, and it has been used in a wide range of fields, including geoscience research and education. CWC reefs were selected for mobile AR visualisation for two main reasons: (i) they are important deep-sea ecosystems that are often located in secluded and complex terrains and (ii) these habitats have a fundamental role in sustaining local biodiversity by acting as hotspots for different species [102]; therefore, there is an increasing need to map and monitor them. The results of AR visualisation on Android smartphones from the APP exported from the Unity 3D software showed that visualisation was satisfactory for resolutions higher than 200,000 faces/m2. However, there was a trade-off between resolution and APP size, as 3D models with many faces and vertices presented a better result in terms of the level of detail of the object but were limited by the size of the APP.
SfM photogrammetry produces 3D point clouds, orthorectified images, and accurate digital elevation models [103], and it has been used to quantify the structural complexity of coral reef habitats [3,49,103]. The integration of SfM and AR enhances the visualisation of these reefs from new perspectives. Additionally, AR can provide support to ROV mapping surveys. The interactive AR visualisation of CWC via a smartphone can increase the accessibility of data visualisation and awareness of the environmental importance of CWC.
Overall, the results show that it was possible to combine digital technologies such as ROVs, 3D modelling, and mobile AR in support of the interactive visualisation of CWC. Studies suggest that users who explore virtual spaces can form more cognitive associations [104] with scientific content and can better learn and retain information related to the causes and effects of different phenomena, such as ocean acidification. Thus, AR visualisation can reinforce the environmental importance of underwater ecosystems through educational outreach and ocean literacy actions that encourage the understanding of complex structures such as CWC, which can ultimately increase the community’s interest in protecting coral reefs [105]. Furthermore, the dissemination of information about the services associated with CWC can stimulate decision makers to take initiatives to protect CWC [11]. Future studies should involve a wider user acceptance testing (UAT) survey [106], including AR-related problems faced by teachers and students [66,73], to test the efficiency and usability of the APP in different geographical settings [107].

Author Contributions

Conceptualization, L.M.C.d.O., P.A.d.O. and L.A.C.; methodology, L.M.C.d.O., P.A.d.O. and L.A.C.; software, L.M.C.d.O. and P.A.d.O.; validation, L.M.C.d.O., P.A.d.O., L.A.C. and A.L.; formal analysis, L.M.C.d.O. and P.A.d.O.; investigation, L.M.C.d.O. and P.A.d.O.; resources, L.M.C.d.O., A.L., L.A.C. and A.J.W.; data curation, L.M.C.d.O. and P.A.d.O.; writing—original draft preparation, L.M.C.d.O. and P.A.d.O.; writing—review and editing, L.M.C.d.O., P.A.d.O., L.A.C., A.L. and A.J.W.; visualisation, L.M.C.d.O. and P.A.d.O.; supervision, L.M.C.d.O., L.A.C., A.L. and A.J.W.; project administration, L.M.C.d.O.; funding acquisition, L.M.C.d.O., L.A.C., A.L. and A.J.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Science Foundation Ireland, grant number: 16/IA/4528 and the Irish Research Council, grant number: GOIPG/2020/1659. P.A.d.O. and L.A.C. were supported by São Paulo Research Foundation (Fundação de Amparo à Pesquisa do Estado de São Paulo—FAPESP), grant number: 2017/19649-8.

Data Availability Statement

Supporting data, including the application and models, are available in the directory: https://drive.google.com/drive/folders/1Nf36dtLtCCQKts40Kqha8D9hSCWluFzU (accessed on 9 August 2022).

Acknowledgments

The authors would like to thank the Science Foundation Ireland (SFI), Marine Institute, and the Irish Research Council (IRC) for funding the initial research, software resources (SFI grant number 16/IA/4528), and L.M.C.d.O.'s Ph.D. research project ASMaT (IRC grant number: GOIPG/2020/1659). The co-authors P.A.d.O. and L.A.C. would like to thank FAPESP. Special thanks go to the Marine Institute for funding the ship time on RV Celtic Explorer under the 2019 and 2020 Ship Time Programme of the National Development scheme and to the shipboard party of RV Celtic Explorer and ROV Holland 1 for their support with data acquisition.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Robert, K.; Huvenne, V.A.I.; Georgiopoulou, A.; Jones, D.O.B.; Marsh, L.; Carter, D.O.G.; Chaumillon, L. New Approaches to High-Resolution Mapping of Marine Vertical Structures. Sci. Rep. 2017, 7, 9005. [Google Scholar] [CrossRef] [PubMed]
  2. Conti, L.A.; Lim, A.; Wheeler, A.J. High Resolution Mapping of a Cold Water Coral Mound. Sci. Rep. 2019, 9, 1016. [Google Scholar] [CrossRef]
  3. Price, D.M.; Robert, K.; Callaway, A.; Lo lacono, C.; Hall, R.A.; Huvenne, V.A.I. Using 3D Photogrammetry from ROV Video to Quantify Cold-Water Coral Reef Structural Complexity and Investigate Its Influence on Biodiversity and Community Assemblage. Coral Reefs 2019, 38, 1007–1021. [Google Scholar] [CrossRef]
  4. Hebbeln, D.; da Costa Portilho-Ramos, R.; Wienberg, C.; Titschack, J. The Fate of Cold-Water Corals in a Changing World: A Geological Perspective. Front. Mar. Sci. 2019, 6, 119. [Google Scholar] [CrossRef]
  5. Hebbeln, D.; Wienberg, C.; Dullo, W.C.; Freiwald, A.; Mienis, F.; Orejas, C.; Titschack, J. Cold-Water Coral Reefs Thriving under Hypoxia. Coral Reefs 2020, 39, 853–859. [Google Scholar] [CrossRef]
  6. Henry, L.-A.; Roberts, J.M. Global Biodiversity in Cold-Water Coral Reef Ecosystems. In Marine Animal Forests; Springer International Publishing: Cham, Switzerland, 2016; pp. 1–21. [Google Scholar]
  7. Miller, R.J.; Hocevar, J.; Stone, R.P.; Fedorov, D.V. Structure-Forming Corals and Sponges and Their Use as Fish Habitat in Bering Sea Submarine Canyons. PLoS ONE 2012, 7, e33885. [Google Scholar] [CrossRef]
  8. Roberts, J.M.; Wheeler, A.J.; Freiwald, A. Reefs of the Deep: The Biology and Geology of Cold-Water Coral Ecosystems. Science 2006, 312, 543–547. [Google Scholar] [CrossRef]
  9. Wheeler, A.J.; Beyer, A.; Freiwald, A.; De Haas, H.; Huvenne, V.A.I.; Kozachenko, M.; Roy, O.L.; Opderbecke, J. Morphology and Environment of Cold-Water Coral Carbonate Mounds on the NW European Margin. Int. J. Earth Sci. 2007, 96, 37–56. [Google Scholar] [CrossRef]
  10. Aanesen, M.; Armstrong, C.; Czajkowski, M.; Falk-Petersen, J.; Hanley, N.; Navrud, S. Willingness to Pay for Unfamiliar Public Goods: Preserving Cold-Water Coral in Norway. Ecol. Econ. 2015, 112, 53–67. [Google Scholar] [CrossRef]
  11. Armstrong, C.W.; Foley, N.S.; Kahui, V.; Grehan, A. Cold Water Coral Reef Management from an Ecosystem Service Perspective. Mar. Policy 2014, 50, 126–134. [Google Scholar] [CrossRef]
  12. Miller, G.T.; Spoolman, S. Environmental Science; Cengage Learning: Boston, MA, USA, 2015; ISBN 1305090446. [Google Scholar]
  13. Lim, A.; Wheeler, A.J.; Conti, L. Cold-Water Coral Habitat Mapping: Trends and Developments in Acquisition and Processing Methods. Geosciences 2020, 11, 9. [Google Scholar] [CrossRef]
  14. van Oevelen, D.; Duineveld, G.; Lavaleye, M.; Mienis, F.; Soetaert, K.; Heip, C.H.R. The Cold-water Coral Community as Hotspot of Carbon Cycling on Continental Margins: A Food-web Analysis from Rockall Bank (Northeast Atlantic). Limnol. Oceanogr. 2009, 54, 1829–1844. [Google Scholar] [CrossRef]
  15. Danovaro, R.; Dell’Anno, A.; Pusceddu, A. Biodiversity Response to Climate Change in a Warm Deep Sea. Ecol. Lett. 2004, 7, 821–828. [Google Scholar] [CrossRef]
  16. Auster, P.J.; Gjerde, K.; Heupel, E.; Watling, L.; Grehan, A.; Rogers, A.D. Definition and Detection of Vulnerable Marine Ecosystems on the High Seas: Problems with the “Move-on” Rule. ICES J. Mar. Sci. 2011, 68, 254–264. [Google Scholar] [CrossRef]
  17. Fabri, M.-C.; Pedel, L.; Beuck, L.; Galgani, F.; Hebbeln, D.; Freiwald, A. Megafauna of Vulnerable Marine Ecosystems in French Mediterranean Submarine Canyons: Spatial Distribution and Anthropogenic Impacts. Deep Sea Res. Part II Top. Stud. Oceanogr. 2014, 104, 184–207. [Google Scholar] [CrossRef]
  18. Cathalot, C.; Van Oevelen, D.; Cox, T.J.S.; Kutti, T.; Lavaleye, M.; Duineveld, G.; Meysman, F.J.R. Cold-Water Coral Reefs and Adjacent Sponge Grounds: Hotspots of Benthic Respiration and Organic Carbon Cycling in the Deep Sea. Front. Mar. Sci. 2015, 2, 37. [Google Scholar] [CrossRef]
  19. Frank, N.; Freiwald, A.; López Correa, M.; Wienberg, C.; Eisele, M.; Hebbeln, D.; Van Rooij, D.; Henriet, J.P.; Colin, C.; van Weering, T.; et al. Northeastern Atlantic Cold-Water Coral Reefs and Climate. Geology 2011, 39, 743–746. [Google Scholar] [CrossRef]
  20. Roberts, J.M.; Cairns, S.D. Cold-Water Corals in a Changing Ocean. Curr. Opin. Environ. Sustain. 2014, 7, 118–126. [Google Scholar] [CrossRef]
  21. Turley, C.M.; Roberts, J.M.; Guinotte, J.M. Corals in Deep-Water: Will the Unseen Hand of Ocean Acidification Destroy Cold-Water Ecosystems? Coral Reefs 2007, 26, 445–448. [Google Scholar] [CrossRef]
  22. Huvenne, V.A.I.; Bett, B.J.; Masson, D.G.; Le Bas, T.P.; Wheeler, A.J. Effectiveness of a Deep-Sea Cold-Water Coral Marine Protected Area, Following Eight Years of Fisheries Closure. Biol. Conserv. 2016, 200, 60–69. [Google Scholar] [CrossRef]
  23. Althaus, F.; Williams, A.; Schlacher, T.A.; Kloser, R.J.; Green, M.A.; Barker, B.A.; Bax, N.J.; Brodie, P.; Schlacher-Hoenlinger, M.A. Impacts of Bottom Trawling on Deep-Coral Ecosystems of Seamounts Are Long-Lasting. Mar. Ecol. Prog. Ser. 2009, 397, 279–294. [Google Scholar] [CrossRef]
  24. Coughlan, M.; Wheeler, A.J.; Dorschel, B.; Lordan, C.; Boer, W.; Van Gaever, P.; De Haas, H.; Mörz, T. Record of Anthropogenic Impact on the Western Irish Sea Mud Belt. Anthropocene 2015, 9, 56–69. [Google Scholar] [CrossRef]
  25. Clark, M.R. Seamounts, Deep-Sea Corals and Fisheries: Vulnerability of Deep-Sea Corals to Fishing on Seamounts beyond Areas of National Jurisdiction; UNEP/Earthprint: Cambridge, UK, 2006; ISBN 978928072778-4. [Google Scholar]
  26. Davies, J.S.; Guillaumont, B.; Tempera, F.; Vertino, A.; Beuck, L.; Ólafsdóttir, S.H.; Smith, C.J.; Fosså, J.H.; van den Beld, I.M.J.; Savini, A.; et al. A New Classification Scheme of European Cold-Water Coral Habitats: Implications for Ecosystem-Based Management of the Deep Sea. Deep. Res. Part II Top. Stud. Oceanogr. 2017, 145, 102–109. [Google Scholar] [CrossRef]
  27. Liestoel, G. Augmented Reality Storytelling–Narrative Design and Reconstruction of a Historical Event in Situ. Int. J. Interact. Mob. Technol. 2019, 13, 196. [Google Scholar] [CrossRef]
  28. Chen, T.C.; Ku, K.C.; Ying, T.C. A Process-Based Collaborative Model of Marine Tourism Service System-The Case of Green Island Area, Taiwan. Ocean. Coast. Manag. 2012, 64, 37–46. [Google Scholar] [CrossRef]
  29. Barker, N.H.L.; Roberts, C.M. Scuba Diver Behaviour and the Management of Diving Impacts on Coral Reefs. Biol. Conserv. 2004, 120, 481–489. [Google Scholar] [CrossRef]
  30. Jameson, S.C.; Ammar, M.S.A.; Saadalla, E.; Mostafa, H.M.; Riegl, B. A Coral Damage Index and Its Application to Diving Sites in the Egyptian Red Sea. Coral Reefs 1999, 18, 333–339. [Google Scholar] [CrossRef]
  31. Sorice, M.G.; Oh, C.-O.; Ditton, R.B. Managing Scuba Divers to Meet Ecological Goals for Coral Reef Conservation. AMBIO A J. Hum. Environ. 2007, 36, 316–322. [Google Scholar] [CrossRef]
  32. Liestøl, G.; Bendon, M.; Hadjidaki-Marder, E. Augmented Reality Storytelling Submerged. Dry Diving to a World War II Wreck at Ancient Phalasarna, Crete. Heritage 2021, 4, 4647–4664. [Google Scholar] [CrossRef]
  33. Kwasnitschka, T.; Hansteen, T.H.; Devey, C.W.; Kutterolf, S. Doing Fieldwork on the Seafloor: Photogrammetric Techniques to Yield 3D Visual Models from ROV Video. Comput. Geosci. 2013, 52, 218–226. [Google Scholar] [CrossRef]
  34. De Clippele, L.H.; Gafeira, J.; Robert, K.; Hennige, S.; Lavaleye, M.S.; Duineveld, G.C.A.; Huvenne, V.A.I.; Roberts, J.M. Using Novel Acoustic and Visual Mapping Tools to Predict the Small-Scale Spatial Distribution of Live Biogenic Reef Framework in Cold-Water Coral Habitats. Coral Reefs 2017, 36, 255–268. [Google Scholar] [CrossRef] [PubMed]
  35. Lim, A.; Wheeler, A.J.; Price, D.M.; O’Reilly, L.; Harris, K.; Conti, L. Influence of Benthic Currents on Cold-Water Coral Habitats: A Combined Benthic Monitoring and 3D Photogrammetric Investigation. Sci. Rep. 2020, 10, 19433. [Google Scholar] [CrossRef] [PubMed]
  36. de Oliveira, L.M.C.; Lim, A.; Conti, L.A.; Wheeler, A.J. 3D Classification of Cold-Water Coral Reefs: A Comparison of Classification Techniques for 3D Reconstructions of Cold-Water Coral Reefs and Seabed. Front. Mar. Sci. 2021, 8, 640713. [Google Scholar] [CrossRef]
  37. Laranjeira, M.; Arnaubec, A.; Brignone, L.; Dune, C.; Opderbecke, J. 3D Perception and Augmented Reality Developments in Underwater Robotics for Ocean Sciences. Curr. Robot. Rep. 2020, 1, 123–130. [Google Scholar] [CrossRef]
  38. House, J.E.; Brambilla, V.; Bidaut, L.M.; Christie, A.P.; Pizarro, O.; Madin, J.S.; Dornelas, M. Moving to 3D: Relationships between Coral Planar Area, Surface Area and Volume. PeerJ 2018, 2018, e4280. [Google Scholar] [CrossRef] [PubMed]
  39. Cunliffe, A.M.; Brazier, R.E.; Anderson, K. Ultra-Fine Grain Landscape-Scale Quantification of Dryland Vegetation Structure with Drone-Acquired Structure-from-Motion Photogrammetry. Remote Sens. Environ. 2016, 183, 129–143. [Google Scholar] [CrossRef]
  40. Panagiotidis, D.; Surový, P.; Kuželka, K. Accuracy of Structure from Motion Models in Comparison with Terrestrial Laser Scanner for the Analysis of DBH and Height Influence on Error Behaviour. J. For. Sci. 2016, 62, 357–365. [Google Scholar] [CrossRef]
  41. Otero, V.; Van De Kerchove, R.; Satyanarayana, B.; Martínez-Espinosa, C.; Fisol, M.A.B.; Ibrahim, M.R.B.; Sulong, I.; Mohd-Lokman, H.; Lucas, R.; Dahdouh-Guebas, F. Managing Mangrove Forests from the Sky: Forest Inventory Using Field Data and Unmanned Aerial Vehicle (UAV) Imagery in the Matang Mangrove Forest Reserve, Peninsular Malaysia. For. Ecol. Manag. 2018, 411, 35–45. [Google Scholar] [CrossRef]
  42. Navarro, A.; Young, M.; Allan, B.; Carnell, P.; Macreadie, P.; Ierodiaconou, D. The Application of Unmanned Aerial Vehicles (UAVs) to Estimate above-Ground Biomass of Mangrove Ecosystems. Remote Sens. Environ. 2020, 242, 111747. [Google Scholar] [CrossRef]
  43. Warfield, A.D.; Leon, J.X. Estimating Mangrove Forest Volume Using Terrestrial Laser Scanning and UAV-Derived Structure-from-Motion. Drones 2019, 3, 32. [Google Scholar] [CrossRef]
  44. Weidner, L.; Walton, G.; Krajnovich, A. Classifying Rock Slope Materials in Photogrammetric Point Clouds Using Robust Color and Geometric Features. ISPRS J. Photogramm. Remote Sens. 2021, 176, 15–29. [Google Scholar] [CrossRef]
  45. Smith, M.W.; Carrivick, J.L.; Quincey, D.J. Structure from Motion Photogrammetry in Physical Geography. Prog. Phys. Geogr. 2016, 40, 247–275. [Google Scholar] [CrossRef]
  46. Li, R.; Li, H.; Zou, W.; Smith, R.G.; Curran, T.A. Quantitative Photogrammetric Analysis of Digital Underwater Video Imagery. IEEE J. Ocean. Eng. 1997, 22, 364–375. [Google Scholar] [CrossRef]
  47. Burns; Delparte, D.; Gates, R.D.; Takabayashi, M. Integrating Structure-from-Motion Photogrammetry with Geospatial Software as a Novel Technique for Quantifying 3D Ecological Characteristics of Coral Reefs. PeerJ 2015, 2015, e1077. [Google Scholar] [CrossRef] [PubMed]
  48. Burns; Fukunaga, A.; Pascoe, K.H.; Runyan, A.; Craig, B.K.; Talbot, J.; Pugh, A.; Kosaki, R.K. 3D Habitat Complexity of Coral Reefs in the Northwestern Hawaiian Islands Is Driven by Coral Assemblage Structure. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 61–67. [Google Scholar] [CrossRef]
  49. Anelli, M.; Julitta, T.; Fallati, L.; Galli, P.; Rossini, M.; Colombo, R. Towards New Applications of Underwater Photogrammetry for Investigating Coral Reef Morphology and Habitat Complexity in the Myeik Archipelago, Myanmar. Geocarto Int. 2019, 34, 459–472. [Google Scholar] [CrossRef]
  50. Storlazzi, C.D.; Dartnell, P.; Hatcher, G.A.; Gibbs, A.E. End of the Chain? Rugosity and Fine-Scale Bathymetry from Existing Underwater Digital Imagery Using Structure-from-Motion (SfM) Technology. Coral Reefs 2016, 35, 889–894. [Google Scholar] [CrossRef]
  51. Carrivick, J.L.; Smith, M.W.; Quincey, D.J. Structure from Motion in the Geosciences; John Wiley & Sons, Ltd.: Chichester, UK, 2016; ISBN 9781118895818. [Google Scholar]
  52. Foucault, M.; Miskowiec, J. Of Other Spaces. Diacritics 1986, 16, 22–27. [Google Scholar] [CrossRef]
  53. Lowe, G. Sift-the Scale Invariant Feature Transform. Int. J. 2004, 2, 2. [Google Scholar]
  54. Čejka, J.; Zsíros, A.; Liarokapis, F. A Hybrid Augmented Reality Guide for Underwater Cultural Heritage Sites. Pers. Ubiquitous Comput. 2020, 24, 815–828. [Google Scholar] [CrossRef]
  55. Bruno, F.; Barbieri, L.; Mangeruga, M.; Cozza, M.; Lagudi, A.; Čejka, J.; Liarokapis, F.; Skarlatos, D. Underwater Augmented Reality for Improving the Diving Experience in Submerged Archaeological Sites. Ocean Eng. 2019, 190, 106487. [Google Scholar] [CrossRef]
  56. Loureiro, S.M.C.; Guerreiro, J.; Ali, F. 20 Years of Research on Virtual Reality and Augmented Reality in Tourism Context: A Text-Mining Approach. Tour. Manag. 2020, 77, 104028. [Google Scholar] [CrossRef]
  57. Cristobal, F.R.; Dodge, M.; Noll, B.; Rosenberg, N.; Burns, J.; Sanchez, J.; Gotshalk, D.; Pascoe, K.; Runyan, A. Exploration of Coral Reefs in Hawai‘i through Virtual Reality: Hawaiian Coral Reef Museum VR. In Practice and Experience in Advanced Research Computing; ACM: New York, NY, USA, 2020; pp. 545–546. [Google Scholar]
  58. Barrile, V.; Fotia, A.; Bernardo, E. The Submerged Heritage: A Virtual Journey in Our. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, XLII-2/W10, 17–24. [Google Scholar] [CrossRef]
  59. Han, X.; Liu, J.; Tan, B.; Duan, L. Design and Implementation of Smart Ocean Visualization System Based on Extended Reality Technology. J. Web Eng. 2021, 20, 557–574. [Google Scholar] [CrossRef]
  60. Markowitz, D.M.; Laha, R.; Perone, B.P.; Pea, R.D.; Bailenson, J.N. Immersive Virtual Reality Field Trips Facilitate Learning About Climate Change. Front. Psychol. 2018, 9, 2364. [Google Scholar] [CrossRef]
  61. Monteiro, J.H.F.; Montanha, G.K. Desenvolvimento de Aplicação Em Realidade Virtual. Tekhne Logos 2019, 10, 99–111. [Google Scholar]
  62. Bimber, O.; Raskar, R. Spatial Augmented Reality: Merging Real and Virtual Worlds; CRC Press: New York, NY, USA, 2005; ISBN 1439864942. [Google Scholar]
  63. Kaufmann, H. Collaborative Augmented Reality in Education. Inst. Softw. Technol. Interact. Syst. Vienna Univ. Technol. 2003, 2–4. [Google Scholar]
  64. Scholz, J.; Smith, A.N. Augmented Reality: Designing Immersive Experiences That Maximize Consumer Engagement. Bus. Horiz. 2016, 59, 149–161. [Google Scholar] [CrossRef]
  65. Borba, E.Z. Imersão Visual e Corporal: Paradigmas Da Percepção Em Simuladores. Narrat. Comun. Complexificadas II–A Forma. SantaCruz do Sul Edunisc 2014, 239–256. [Google Scholar]
  66. Lee, K. Augmented Reality in Education and Training. TechTrends 2012, 56, 13–21. [Google Scholar] [CrossRef]
  67. Arena, F.; Collotta, M.; Pau, G.; Termine, F. An Overview of Augmented Reality. Computers 2022, 11, 28. [Google Scholar] [CrossRef]
  68. Carmigniani, J.; Furht, B. Augmented Reality: An Overview. Handb. Augment. Real. 2011, 3–46. [Google Scholar]
  69. Billinghurst, M.; Duenser, A. Augmented Reality in the Classroom. Computer 2012, 45, 56–63. [Google Scholar] [CrossRef]
  70. Cipresso, P.; Giglioli, I.A.C.; Raya, M.A.; Riva, G. The Past, Present, and Future of Virtual and Augmented Reality Research: A Network and Cluster Analysis of the Literature. Front. Psychol. 2018, 9. [Google Scholar] [CrossRef] [PubMed]
  71. Dunleavy, M.; Dede, C.; Mitchell, R. Affordances and Limitations of Immersive Participatory Augmented Reality Simulations for Teaching and Learning. J. Sci. Educ. Technol. 2009, 18, 7–22. [Google Scholar] [CrossRef]
  72. Feiner, S.; MacIntyre, B.; Seligmann, D. Knowledge-Based Augmented Reality. Commun. ACM 1993, 36, 53–62. [Google Scholar] [CrossRef]
  73. Fernández-Batanero, J.M.; Montenegro-Rueda, M.; Fernández-Cerero, J. Use of Augmented Reality for Students with Educational Needs: A Systematic Review (2016–2021). Societies 2022, 12, 36. [Google Scholar] [CrossRef]
  74. Labster. Reimagining the Future of Education. Available online: https://www.labster.com (accessed on 15 March 2022).
  75. ClassVR. Virtual Reality for Schools. Available online: https://www.classvr.com/ (accessed on 9 August 2022).
  76. Denoyel, A.; Pinson, C.; Passet, P.A. Sketchfab-The Best 3D Viewer on the Web. Available online: https://sketchfab.com/ (accessed on 15 September 2022).
  77. VRgeoscience. Digital Outcrop Modelling. Available online: https://www.vrgeoscience.com/ (accessed on 15 March 2022).
  78. Rizvic, S.; Boskovic, D.; Okanovic, V.; Sljivo, S.; Zukic, M. Interactive Digital Storytelling: Bringing Cultural Heritage in a Classroom. J. Comput. Educ. 2019, 6, 143–166. [Google Scholar] [CrossRef]
  79. Doležal, M.; Vlachos, M.; Secci, M.; Demesticha, S.; Skarlatos, D.; Liarokapis, F. Understanding Underwater Photogrammetry for Maritime Archaeology Through Immersive Virtual Reality. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 85–91. [Google Scholar] [CrossRef]
  80. I-MareCulture–H2020 Funded EU Research and Innovation Project. Available online: https://imareculture.eu/ (accessed on 15 September 2022).
  81. Shelton, B.E.; Hedley, N.R. Exploring a Cognitive Basis for Learning Spatial Relationships with Augmented Reality. Technol. Instr. Cogn. Learn. 2004, 1, 323. [Google Scholar]
  82. Sotiriou, S.; Bogner, F.X. Visualizing the Invisible: Augmented Reality as an Innovative Science Education Scheme. Adv. Sci. Lett. 2008, 1, 114–122. [Google Scholar] [CrossRef]
  83. Boolukos, C.M.; Lim, A.; O’Riordan, R.M.; Wheeler, A.J. Cold-Water Corals in Decline–A Temporal (4 Year) Species Abundance and Biodiversity Appraisal of Complete Photomosaiced Cold-Water Coral Reef on the Irish Margin. Deep Sea Res. Part I Oceanogr. Res. Pap. 2019, 146, 44–54. [Google Scholar] [CrossRef]
  84. European Union Habitats Directive. European Union Habitats (Porcupine Bank Canyon Special Area of Conservation 003001) Regulations; Irish Statute Book: Dublin, Ireland, 2016. [Google Scholar]
  85. Wheeler, A.J.; Kozachenko, M.; Beyer, A.; Foubert, A.; Huvenne, V.A.I.; Klages, M.; Masson, D.G.; Olu-Le Roy, K.; Thiede, J. Sedimentary Processes and Carbonate Mounds in the Belgica Mound Province, Porcupine Seabight, NE Atlantic. In Cold-Water Corals and Ecosystems; Springer: Berlin/Heidelberg, Germany, 2005; pp. 571–603. [Google Scholar]
  86. Lim, A.; Wheeler, A.J.; Arnaubec, A. High-Resolution Facies Zonation within a Cold-Water Coral Mound: The Case of the Piddington Mound, Porcupine Seabight, NE Atlantic. Mar. Geol. 2017, 390, 120–130. [Google Scholar] [CrossRef]
  87. Addamo, A.M.; Vertino, A.; Stolarski, J.; García-Jiménez, R.; Taviani, M.; Machordom, A. Merging Scleractinian Genera: The Overwhelming Genetic Similarity between Solitary Desmophyllum and Colonial Lophelia. BMC Evol. Biol. 2016, 16, 1–17. [Google Scholar]
  88. De Mol, B.; Van Rensbergen, P.; Pillen, S.; Van Herreweghe, K.; Van Rooij, D.; McDonnell, A.; Huvenne, V.; Ivanov, M.; Swennen, R.; Henriet, J. Large Deep-Water Coral Banks in the Porcupine Basin, Southwest of Ireland. Mar. Geol. 2002, 188, 193–231. [Google Scholar] [CrossRef]
  89. Lim, A.; Huvenne, V.A.I.; Vertino, A.; Spezzaferri, S.; Wheeler, A.J. New Insights on Coral Mound Development from Groundtruthed High-Resolution ROV-Mounted Multibeam Imaging. Mar. Geol. 2018, 403, 225–237. [Google Scholar] [CrossRef]
  90. Dorschel, B.; Hebbeln, D.; Foubert, A.; White, M.; Wheeler, A.J. Hydrodynamics and Cold-Water Coral Facies Distribution Related to Recent Sedimentary Processes at Galway Mound West of Ireland. Mar. Geol. 2007, 244, 184–195. [Google Scholar] [CrossRef]
  91. Foubert, A.; Huvenne, V.A.I.; Wheeler, A.; Kozachenko, M.; Opderbecke, J.; Henriet, J.-P. The Moira Mounds, Small Cold-Water Coral Mounds in the Porcupine Seabight, NE Atlantic: Part B—Evaluating the Impact of Sediment Dynamics through High-Resolution ROV-Borne Bathymetric Mapping. Mar. Geol. 2011, 282, 65–78. [Google Scholar] [CrossRef]
  92. Lim, A.; O’Reilly, L.; Summer, G.; de Oliveira, L.M.C.; Strachan, R. CE20011-Systematic Monitoring Survey of the Moira Mound Chain (SyMonS_MoM); University College Cork: Cork, Ireland, 2020; Available online: https://www.ucc.ie/en/media/research/marinegeo/mgpdfs/SyMonS_MoM__CE20011_Cruise_Report.pdf (accessed on 15 August 2021).
  93. Agisoft Agisoft Metashape User Manual: Professional Edition, Version 1.7; Agisoft LLC: St. Petersburg, Russia, 2021; p. 160.
  94. Unity Technologies Real-Time Solutions. Endless Opportunities. Available online: https://unity.com/ (accessed on 20 August 2021).
  95. Vuforia Vuforia Engine. Available online: https://developer.vuforia.com/ (accessed on 20 August 2021).
  96. Microsoft Android Studio | Android Developers. Available online: https://developer.android.com/studio (accessed on 15 August 2021).
  97. Cieza, E.; Lujan, D. Educational Mobile Application of Augmented Reality Based on Markers to Improve the Learning of Vowel Usage and Numbers for Children of a Kindergarten in Trujillo. Procedia Comput. Sci. 2018, 130, 352–358. [Google Scholar] [CrossRef]
  98. Liu, X.; Sohn, Y.-H.; Park, D.-W. Application Development with Augmented Reality Technique Using Unity 3D and Vuforia. Int. J. Appl. Eng. Res. 2018, 13, 15068–15071. [Google Scholar]
  99. Leon, J.X.; Roelfsema, C.M.; Saunders, M.I.; Phinn, S.R. Measuring Coral Reef Terrain Roughness Using ‘Structure-from-Motion’ Close-Range Photogrammetry. Geomorphology 2015, 242, 21–28. [Google Scholar] [CrossRef]
  100. Mienis, F.; Bouma, T.; Witbaard, R.; van Oevelen, D.; Duineveld, G. Experimental Assessment of the Effects of Coldwater Coral Patches on Water Flow. Mar. Ecol. Prog. Ser. 2019, 609, 101–117. [Google Scholar] [CrossRef]
  101. McKinnon, D.; He, H.; Upcroft, B.; Smith, R.N. Towards Automated and In-Situ, near-Real Time 3-D Reconstruction of Coral Reef Environments. In Proceedings of the OCEANS’11 MTS/IEEE KONA, Waikoloa, HI, USA, 19–22 September 2011. [Google Scholar] [CrossRef]
  102. Roberts, S.; Hirshfield, M. Deep-Sea Corals: Out of Sight, but No Longer out of Mind. Front. Ecol. Environ. 2004, 2, 123–130. [Google Scholar] [CrossRef]
  103. Pizarro, O.; Friedman, A.; Bryson, M.; Williams, S.B.; Madin, J. A Simple, Fast, and Repeatable Survey Method for Underwater Visual 3D Benthic Mapping and Monitoring. Ecol. Evol. 2017, 7, 1770–1782. [Google Scholar] [CrossRef]
  104. Buchner, J.; Buntins, K.; Kerres, M. The Impact of Augmented Reality on Cognitive Load and Performance: A Systematic Review. J. Comput. Assist. Learn. 2022, 38, 285–303. [Google Scholar] [CrossRef]
  105. Kasinathan, V.; Al-Sharafi, A.T.A.; Zamnah, A.; Appadurai, N.K.; Thiruchelvam, V.; Mustapha, A. Augmented Reality in Ocean’s Secrets: Educational Application with Attached Book for Students. Linguist. Cult. Rev. 2021, 5, 1123–1137. [Google Scholar] [CrossRef]
  106. Cimperman, R. UAT Defined: A Guide to Practical User Acceptance Testing (Digital Short Cut); Pearson Education: London, UK, 2006; ISBN 0132702622. [Google Scholar]
  107. Jung, T.H.; Lee, H.; Chung, N.; tom Dieck, M.C. Cross-Cultural Differences in Adopting Mobile Augmented Reality at Cultural Heritage Tourism Sites. Int. J. Contemp. Hosp. Manag. 2018, 30, 1621–1645. [Google Scholar] [CrossRef]
Figure 1. Location of the study site on the Piddington Mound, Belgica Mound Province, relative to the Irish margin. Upper left: HD video frame extracted at the study site.
Figure 2. Generalised workflow of the methodology, from video acquisition to APP development.
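As a loose illustration of the first stage of this workflow, the sketch below subsamples still frames from ROV video for photogrammetric alignment. The OpenCV approach, the sampling interval, and the file names are assumptions for illustration, not the authors' exact procedure.

```python
# Illustrative sketch: sampling still frames from ROV HD video as SfM input.
# The input file name, output layout, and frame interval are assumptions.
import os
import cv2

os.makedirs("frames", exist_ok=True)
video = cv2.VideoCapture("rov_dive_transect.mp4")  # hypothetical input video

frame_idx, saved = 0, 0
step = 25  # keep roughly one frame per second of 25 fps footage (assumed rate)

while True:
    ok, frame = video.read()
    if not ok:
        break
    if frame_idx % step == 0:
        cv2.imwrite(f"frames/frame_{saved:05d}.png", frame)
        saved += 1
    frame_idx += 1

video.release()
print(f"Extracted {saved} frames for photogrammetric alignment")
```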
Figure 3. Workflow of the platforms used in the development of the Android APP.
Figure 4. Demonstration of the AR APP integration with a mobile phone and the QR Codes of two models. The background image shows the map of the study area in relation to Ireland and the QR Code used for interaction: (a) APP visualisation of Model D; (b) APP visualisation of Model B.
Figure 5. Three-dimensional models of CWC: (a) Model A, with 1,124,301 faces and 4,177,311 vertices at a resolution of 297,749.21 faces/m²; (b) Model B, with 462,589 faces and 232,453 vertices at 31,825.87 faces/m²; (c) Model C, with 6,285,480 faces and 3,154,610 vertices at 292,660.99 faces/m²; (d) Model D, with 44,000 faces and 22,122 vertices at 4108.31 faces/m². Model A was extracted from Model C.
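The face counts above span three orders of magnitude, which implies a decimation step between the dense SfM output and the mobile-ready meshes. The sketch below shows one way such a reduction could be reproduced, assuming the open-source Open3D library rather than the authors' actual toolchain; file names are hypothetical.

```python
# Illustrative sketch: decimating a dense SfM mesh to a mobile-friendly
# face budget with Open3D (an assumed stand-in, not the authors' pipeline).
import open3d as o3d

# Hypothetical input: a dense mesh exported from SfM processing.
mesh = o3d.io.read_triangle_mesh("piddington_mound_dense.obj")
print(f"Input: {len(mesh.triangles):,} faces")

# Target roughly Model D's budget from Table 1 (~44,000 faces).
decimated = mesh.simplify_quadric_decimation(target_number_of_triangles=44_000)
decimated.compute_vertex_normals()  # recompute shading normals after decimation

o3d.io.write_triangle_mesh("piddington_mound_mobile.obj", decimated)
print(f"Output: {len(decimated.triangles):,} faces")
```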
Figure 6. Number of faces (x-axis) plotted against APP size in MB (y-axis); the blue trend line shows the corresponding number of vertices.
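The trend in Figure 6 can be approximated from the four models alone. The sketch below fits an ordinary least-squares line to the (face count, APP size) pairs reported in Table 1; it is an illustration of the resolution-to-size trade-off, not an analysis performed in the paper.

```python
# Illustrative sketch: least-squares trend between face count and APP size,
# using the four (faces, APK size) pairs from Table 1.
import numpy as np

faces = np.array([1_124_301, 462_589, 6_285_480, 44_000])  # Models A-D
size_mb = np.array([106, 56.5, 163, 49.3])                 # APP sizes (MB)

slope, intercept = np.polyfit(faces, size_mb, deg=1)
print(f"APP size ~ {slope * 1e6:.1f} MB per million faces + {intercept:.1f} MB")

# Rough prediction for an intermediate model (hypothetical face count):
print(f"Predicted size at 2M faces: {slope * 2e6 + intercept:.1f} MB")
```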
Figure 7. View of Model A (297,749.21 faces/m² resolution) and Model D (4108.31 faces/m² resolution) in the Unity 3D desktop view, illustrating the resolution difference between the models.
Figure 8. Interactive map for the AR APP visualisation. After downloading the APP (APK) to their phone, the user can open the APP and scan the QR Code (left) on the map.
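The paper does not specify how the QR Codes on the interactive map were produced. As a hedged sketch only, a printable marker could be generated with the Python qrcode package; the package choice, the URL, and the assumption that the codes encode links (rather than acting purely as image targets for the AR engine) are all hypothetical.

```python
# Illustrative sketch: generating a printable QR Code marker that points
# at one hosted 3D model (URL, file name, and library are assumptions).
import qrcode

# Hypothetical target URL for Model B's AR content.
url = "https://example.org/cwc-ar/model-b"

img = qrcode.make(url)          # build the QR matrix and render it to an image
img.save("model_b_qrcode.png")  # printable marker for the interactive map
```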
Table 1. Metadata of each 3D model used in the study.

| 3D Model Name | Number of Faces | Number of Vertices | Area (m²) | Resolution (Faces/m²) | Texture Size (Pixels) | Total Number of Frames | Scale Error (m) | Android Application Size (MB) |
|---------------|-----------------|--------------------|-----------|-----------------------|-----------------------|------------------------|-----------------|-------------------------------|
| A | 1,124,301 | 4,177,311 | 3.776  | 297,749.21 | 4096 × 4096 | 613  | 0.07513  | 106  |
| B | 462,589   | 232,453   | 14.535 | 31,825.87  | 4096 × 4096 | 1122 | 0.05767  | 56.5 |
| C | 6,285,480 | 3,154,610 | 21.477 | 292,660.99 | 4096 × 4096 | 613  | 0.038187 | 163  |
| D | 44,000    | 22,122    | 10.71  | 4108.31    | 4096 × 4097 | 1159 | 0.07513  | 49.3 |
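The resolution column in Table 1 is simply the face count divided by the reconstructed area; the short check below reproduces the tabulated values from the other two columns.

```python
# Reproducing Table 1's resolution column (faces per square metre).
models = {
    "A": (1_124_301, 3.776),
    "B": (462_589, 14.535),
    "C": (6_285_480, 21.477),
    "D": (44_000, 10.71),
}

for name, (faces, area_m2) in models.items():
    resolution = faces / area_m2
    print(f"Model {name}: {resolution:,.2f} faces/m2")
# Model A: 297,749.21  Model B: 31,825.87
# Model C: 292,660.99  Model D: 4,108.31
```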
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
