J Navig Port Res, Vol. 43, No. 5, 2019
Jeon: A Study on Implementation of the Mobile Application of Aid to Navigation Using Location-based Augmented Reality

ABSTRACT

In this paper, we implemented a location-based augmented reality mobile application that combines the smartphone's own sensing technology with various safety information, taking advantage of recent technological advances in smartphones. Vessel navigation is a suitable domain for augmented reality because it requires accurate knowledge of the distance and location of destinations, danger zones, AtoN, and adjacent vessels. Current smartphone applications provide only 2D images and location information; they do not incorporate information about the surrounding environment, and therefore cannot combine their own sensing information with surrounding information into a location-based augmented reality view. When the various sensors embedded in the smartphone supply their data to ‘BadaGO’, the application implemented in this study, ‘BadaGO’ can provide safe navigation information to the user's device in real time together with the information it generates itself. For the user of a small ship, the practicality and applicability are high, since safe navigation information in a changing marine environment is supplied simply through the application on the smartphone.

1. Introduction

Advances in ubiquitous technology have enabled smart spaces in which general users intelligently receive the services they want wherever they are located. In this space, the content market for location-based services(LBS), which provide services with location information as a key factor, is expanding rapidly(Junglas, 2008; Ahn, 2006; Song, 2012). Location-based services can be considered a new paradigm that provides various services such as weather, traffic information, navigation, and games using the location information of the user. In particular, the use of smartphones, which can exploit location-based services efficiently, is becoming common(Lee, 2005; Kim, 2012).
Augmented Reality(AR) is a field of Virtual Reality(VR) that combines the real world with a virtual experience in real time by overlaying virtual objects or additional information on real space. It is also called mixed reality(MR) because the image of the real world is combined with a virtual world carrying additional information and displayed as a single image in real time(Choi, 2011). Hybrid VR systems, which combine real and virtual environments, have been researched and developed in the United States and Japan since the late 1990s.
Recently, as smart devices and networks have advanced, augmented reality has become practical to use and is attracting attention as a promising technology. Implementing augmented reality requires external input elements such as a camera, GPS, a gyro sensor, and an acceleration sensor. Smart devices in the mobile environment now provide most of these functions, so augmented reality can be usefully implemented in the mobile environment(Jang, 2011).
Therefore, an AtoN mobile application could be implemented using location-based augmented reality and the various sensors and functions embedded in a smartphone, a kind of smart device.

2. Safety Navigation Mobile Application

2.1 Safety Navigation Information provision Scenario

Figure 1 shows the requirements in diagram form, covering cases in which a context-based safety navigation service can be provided according to the user's settings and the situations a user may encounter while navigating a vessel.
Fig. 1
Safety navigation information scenario
KINPR-43-5-281_F1.jpg

2.2 Mobile Augmented Reality

As shown in Figure 2, augmented reality captures a camera image of the real world and delivers it, through the video interface, to the tracking module, which tracks the position, movement, and direction of objects. The rendering module then generates an augmented image by creating or removing virtual objects based on the object locations determined by the tracking module. Finally, the merging module measures the distance between the generated virtual objects and the distance and direction between the generated coordinate systems, checks for interference between the virtual objects, and displays the augmented reality on the screen.
Fig. 2
Augmented reality
KINPR-43-5-281_F2.jpg

2.3 The System Configuration of Augmented Reality

As shown in Figure 3, AtoN users can retrieve information such as AtoN and dangerous areas from the data server via a wireless communication network(5G, LTE, 3G, Wi-Fi).
Fig. 3
The System configuration of augmented reality
KINPR-43-5-281_F3.jpg
In addition, the user can report an abnormality of an AtoN in real time through the application, so that maintenance can be handled quickly.
Figure 4 shows how the user's desired search range is determined: the user's position is calculated using GPS, the left-right width of the view using the direction sensor, and the viewing distance using the tilt sensor. Both the provision of AtoN information and the display of related information are therefore available, and the user can set the distance and orientation manually if the smartphone sensors cannot be used because of heavy vessel motion.
Fig. 4
Camera view area through using smartphone’s sensors
KINPR-43-5-281_F4.jpg
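As a rough illustration of the view-range test described above, the following sketch (not part of the paper; the field-of-view and maximum-range defaults are assumed values) checks whether a candidate AtoN or dangerous area falls inside the camera view range of Figure 4:

    import kotlin.math.abs

    // Hypothetical helper: checks whether a target lies inside the camera view
    // sector derived from the smartphone sensors (Fig. 4). The defaults for
    // field of view and maximum range are illustrative assumptions.
    fun isInViewRange(
        bearingToTargetDeg: Float,      // bearing from the user position to the target
        distanceToTargetM: Float,       // distance from the user position to the target
        headingDeg: Float,              // azimuth from the direction sensor
        fieldOfViewDeg: Float = 60f,    // left-right width of the camera view (assumed)
        maxRangeM: Float = 5_000f       // search distance set via the tilt sensor (assumed)
    ): Boolean {
        // Signed angular difference normalized to the range [-180, 180)
        val delta = ((bearingToTargetDeg - headingDeg + 540f) % 360f) - 180f
        return abs(delta) <= fieldOfViewDeg / 2f && distanceToTargetM <= maxRangeM
    }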

2.4 GPS detection and application algorithm

In addition to Wi-Fi information, Android can also use the cellular network to determine the location of a smartphone. The mobile communication network is used in a similar way to Wi-Fi access points to determine the current location of the smartphone. The Location class encapsulates the actual data that a location provider delivers to the app.
It contains quantifiable data such as latitude, longitude, and altitude. Once the app has received a Location object, it can process the data as it needs. One important point is that, although the class defines a variety of location data properties, not every location provider populates all of them. For example, if the app uses a location provider that does not supply altitude, the Location instance does not contain altitude information. For this reason the Location class provides methods(in this case hasAltitude()) that let the app check whether an instance contains a given piece of information.
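A minimal sketch of reading such a Location object with the Android framework API, including the hasAltitude() check described above:

    import android.location.Location

    // Minimal sketch: reading data from a Location object delivered by a provider.
    fun describe(location: Location): String {
        val base = "lat=${location.latitude}, lon=${location.longitude}"
        // Not every provider populates altitude, so check before reading it.
        return if (location.hasAltitude()) "$base, alt=${location.altitude} m" else base
    }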

1) Android manifest permissions

  • · android.permission.ACCESS_FINE_LOCATION

  • · android.permission.ACCESS_COARSE_LOCATION

These permissions represent the accuracy of the location data that location services provide to the application. The android.permission.ACCESS_FINE_LOCATION permission provides more accurate location data.
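On Android 6.0 and later these manifest declarations must additionally be granted at runtime; a minimal sketch of the check, assuming the androidx.core library is available:

    import android.Manifest
    import android.content.Context
    import android.content.pm.PackageManager
    import androidx.core.content.ContextCompat

    // Sketch of a runtime check for the fine-location permission, which is
    // required in addition to the manifest declaration above.
    fun hasFineLocationPermission(context: Context): Boolean =
        ContextCompat.checkSelfPermission(
            context, Manifest.permission.ACCESS_FINE_LOCATION
        ) == PackageManager.PERMISSION_GRANTED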

2) Location Provider Decision

  • · GPS location provider

  • · Network location provider

  • · Passive location provider

The application can either explicitly register the desired location provider with the LocationManager, or declare which location provider to use by specifying properties in a Criteria object and passing that object to the LocationManager.
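A brief sketch of the Criteria-based approach with the framework LocationManager; the update interval and minimum distance values are illustrative only:

    import android.annotation.SuppressLint
    import android.content.Context
    import android.location.Criteria
    import android.location.LocationListener
    import android.location.LocationManager

    // Sketch: choose a provider via a Criteria object and register for updates.
    @SuppressLint("MissingPermission") // assumes ACCESS_FINE_LOCATION was granted
    fun requestUpdates(context: Context, listener: LocationListener) {
        val manager = context.getSystemService(Context.LOCATION_SERVICE) as LocationManager
        val criteria = Criteria().apply {
            accuracy = Criteria.ACCURACY_FINE   // prefer the GPS provider
            isAltitudeRequired = false
        }
        val provider = manager.getBestProvider(criteria, /* enabledOnly = */ true)
            ?: LocationManager.GPS_PROVIDER
        manager.requestLocationUpdates(provider, 1_000L, 1f, listener)  // 1 s / 1 m (assumed)
    }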

3. Location-based AtoN Application Design Using Augmented Reality

3.1 Configuration of Mobile Augmented Reality

The functions of the location-based safety navigation AR module using augmented reality are as follows; here, safety navigation refers to the navigational environment and the dangerous areas.
  • - The safety navigation AR module obtains the image through the Camera Module.

  • - The safety navigation AR module gets the user's location from the GPS Module.

  • - Based on the user's location from the GPS module and on the tilt and direction of the smartphone from the tracking module, it accesses the DB and receives safety navigation and dangerous area information.

  • - The rendering module obtains the image coordinates of the markers to display in the camera image.

  • - Merging Module matches images and markers and displays the matched images on the user's smartphone through AR Viewer.

To provide the above functions, the augmented reality module has a structure as shown in Figure 5.
Fig. 5
Augmented reality module configuration of safety navigation
KINPR-43-5-281_F5.jpg
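The following sketch outlines the data flow between these modules; the interface and type names are hypothetical and do not correspond to the actual BadaGO classes:

    // Highly simplified sketch of the module flow in Fig. 5 (hypothetical names).
    data class Marker(val name: String, val distanceM: Float, val screenX: Float, val screenY: Float)

    interface CameraModule { fun currentFrame(): ByteArray }
    interface GpsModule { fun currentPosition(): Pair<Double, Double> }          // lat, lon
    interface TrackingModule { fun headingDeg(): Float; fun tiltDeg(): Float }
    interface SafetyDb { fun nearby(lat: Double, lon: Double): List<Marker> }    // AtoN + danger areas
    interface RenderingModule { fun place(markers: List<Marker>): List<Marker> } // image coordinates
    interface MergingModule { fun overlay(frame: ByteArray, markers: List<Marker>): ByteArray }

    fun composeFrame(cam: CameraModule, gps: GpsModule, db: SafetyDb,
                     renderer: RenderingModule, merger: MergingModule): ByteArray {
        val (lat, lon) = gps.currentPosition()
        val markers = renderer.place(db.nearby(lat, lon))
        return merger.overlay(cam.currentFrame(), markers)   // shown through the AR viewer
    }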

3.2 AR View Module

Table 1 describes the classes of safety navigation AR modules that mark AtoN and dangerous areas.
Table 1
Safety navigation AR class description
KINPR-43-5-281_T1.jpg

3.3 Location Accuracy of Smartphone

Providing accurate distance and location is very important for securing usability by supplying reliable information.
Therefore, it is necessary to check how much error occurs by comparing the distance or location information received through the GPS sensor with the actual values.
The noise component of the Google Nexus 9 was lower than that of the Samsung Galaxy S8; both devices were used in the experiments with the “BadaGO” application(the name of the software implemented in this study). Table 2 shows the statistics of single-fix horizontal position accuracy of the Nexus 9 and Galaxy S8 with the Trimble equipment as the reference, and the root mean square(RMS) of a single fix was ±3∼5 m. Therefore, the horizontal position error of the application developed in this study was found to be less than 5 m, although there are some differences depending on the type of smartphone. Figure 6 shows the horizontal position accuracy of the smartphone observations compared with the Trimble equipment(Spatial Information Research Institute, 2017; Lim, 2011; Park, 2006).
Table 2
Statistical results of location accuracy (Unit: m)
KINPR-43-5-281_T2.jpg
Fig. 6
Location accuracy of Nexus 9 and Galaxy S8
KINPR-43-5-281_F6.jpg
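For reference, the horizontal RMS statistic reported in Table 2 is the root mean square of the horizontal offsets between the smartphone fixes and the Trimble reference positions. A small worked example, with assumed sample offsets rather than the actual measurements:

    import kotlin.math.sqrt

    // Root mean square of horizontal position errors (metres).
    fun horizontalRms(errorsMeters: List<Double>): Double =
        sqrt(errorsMeters.map { it * it }.average())

    fun main() {
        val offsets = listOf(2.8, 4.1, 3.3, 5.0, 3.6)   // assumed sample offsets in metres
        println("RMS = %.2f m".format(horizontalRms(offsets)))
    }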

3.4 Gyroscope sensing technology

The following items are necessary to determine the search range of the route marker using a smartphone.
  • - Gyroscope Sensor Values for Rotation Angle in Three Dimensional Space

  • - Roll, pitch and yaw angles for the components of X, Y and Z axes

The rotation angle of the gyroscope sensor is obtained by integrating the angular rate component:

Rotation angle [rad] = previous rotation angle + angular rate component × dt    (1)

where dt is the period at which the value is obtained from the sensor.
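A minimal sketch of Eq. (1) on Android, assuming the listener is registered for Sensor.TYPE_GYROSCOPE through SensorManager.registerListener(); only the Z(yaw) axis is accumulated for brevity:

    import android.hardware.Sensor
    import android.hardware.SensorEvent
    import android.hardware.SensorEventListener

    // Integrates the gyroscope angular rate into a rotation angle, as in Eq. (1).
    class YawTracker : SensorEventListener {
        private var lastTimestampNs = 0L
        var yawRad = 0.0
            private set

        override fun onSensorChanged(event: SensorEvent) {
            if (event.sensor.type != Sensor.TYPE_GYROSCOPE) return
            if (lastTimestampNs != 0L) {
                val dt = (event.timestamp - lastTimestampNs) * 1e-9   // ns -> s
                yawRad += event.values[2] * dt                        // values[2]: rad/s about Z
            }
            lastTimestampNs = event.timestamp
        }

        override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
    }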
The sensors that measure rotation or acceleration also vary depending on the smartphone manufacturer. The proximity sensor responds quickly, but its range is very short; as a result, there are many limitations in implementing the program with the proximity sensor. Figure 7 shows the components and rotation angles of each axis of the gyroscope sensor.
Fig. 7
Component and rotation angle of each axis of gyroscope sensor
KINPR-43-5-281_F7.jpg
Because the acceleration, proximity, and magnetic sensors are base modules, in most cases only the acceleration and magnetic sensors, the sensor functions essential for this development, can be extracted and applied. Gyroscope sensors applied to smartphones have the advantage of detecting the movement of the smartphone with great sensitivity.
However, the gyroscope sensor of an Android phone is so sensitive that, given the nature of navigation equipment, its sensitivity is deliberately reduced by rounding off the decimal places when the sensor data values are read, as shown in Figure 8.
Fig. 8
Sensor test results and gyroscope indicator
KINPR-43-5-281_F8.jpg
The GPS coordinates of the current phone position and of the target point(obstacle) that the camera is facing are used to acquire the direction data up to the target point. The acquired direction data and the in-device orientation sensor values were then applied to the route markers and the location of the danger zone.
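A brief sketch of obtaining that direction and distance with the framework Location API(bearingTo() and distanceTo()); the provider names passed to the constructors are placeholders:

    import android.location.Location

    // Returns (initial bearing in degrees 0-360, great-circle distance in metres)
    // from the current position to a target point such as an AtoN or obstacle.
    fun directionAndDistance(currentLat: Double, currentLon: Double,
                             targetLat: Double, targetLon: Double): Pair<Float, Float> {
        val here = Location("gps").apply { latitude = currentLat; longitude = currentLon }
        val target = Location("db").apply { latitude = targetLat; longitude = targetLon }
        val bearingDeg = (here.bearingTo(target) + 360f) % 360f
        val distanceM = here.distanceTo(target)
        return bearingDeg to distanceM
    }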

3.5 The schematic diagram of Location-based Augmented Reality Application

Figure 9 shows the schematic diagram of the implementation of the augmented reality application, named BadaGO. To implement augmented reality, BadaGO uses an AR view module that combines the camera view with virtual objects or markers, instead of an ENC(Electronic Navigational Chart) viewer. The AR view works in conjunction with the database and the GPS module to create, through the rendering module, markers for the danger areas and AtoN locations near the current position coordinates. All interworking with the database sends and receives data in JSON form. The created markers are matched with the camera view received from the Ship Track Viewer through the merging module, and the matched augmented reality is displayed on the user's smartphone through the augmented reality viewer.
Fig. 9
The schematic diagram of “BadaGO” application with augmented reality
KINPR-43-5-281_F9.jpg
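As an illustration of the JSON interworking, the sketch below parses a hypothetical marker message; the field names are assumptions, not the actual BadaGO format:

    import org.json.JSONObject

    // Hypothetical marker record received from the database in JSON form.
    data class AtoNMarker(val name: String, val lat: Double, val lon: Double, val danger: Boolean)

    fun parseMarker(json: String): AtoNMarker {
        val obj = JSONObject(json)
        return AtoNMarker(
            name = obj.getString("name"),
            lat = obj.getDouble("lat"),
            lon = obj.getDouble("lon"),
            danger = obj.optBoolean("danger", false)   // marks dangerous areas
        )
    }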

3.6 BadaGO Augmented Reality Application Using Image Matching

Figure 10 shows the expected screen of the location-based augmented reality AtoN mobile application based on the above design. The white markers that appear in the camera view indicate the AtoN number or name and the distance from the current location, and the red markers indicate dangerous areas. The application is designed to display the details of an AtoN on the screen when the corresponding marker or AtoN item is touched.
Fig. 10
Virtual screen realizing augmented reality
KINPR-43-5-281_F10.jpg

4. Results

The information providing system for AtoN and dangerous areas using location-based augmented reality was developed as an Android application for the jurisdiction of the Pohang Regional Maritime Affairs and Fisheries Office.
The system applies sensing technology, GPS calculation, and error-correction technology; the application menus, function details, and remarks are shown in Table 3.
Table 3
Mobile platform menu and function for mobile application of aid to navigation
KINPR-43-5-281_T3.jpg
Currently, because the database covers the AtoN and dangerous areas within the jurisdiction of the Pohang Regional Maritime Affairs and Fisheries Office, the application should be tested directly in the waters off Pohang; however, owing to constraints such as the communication environment, it was tested in the laboratory using a virtual location application.
Figure 11 shows the BadaGO location-based augmented reality application and displays the AtoN name and distance as white markers based on the virtual location.
Fig. 11
Implemented location-based augmented reality screen
KINPR-43-5-281_F11.jpg
Figure 12 shows the AtoN information when the AtoN is clicked, and the indicator indicating the direction and distance on the upper left of the augmented reality screen is shown in Figure 13.
Fig. 12
Detailed information of aid to navigation
KINPR-43-5-281_F12.jpg
Fig. 13
Indicator indicating direction and distance
KINPR-43-5-281_F13.jpg

5. Conclusion

Some components of BadaGO, a navigational safety information service for navigators and members of the public who sail small ships, have been implemented as augmented reality components. The location-based information providing application is implemented with location-based augmented reality, which resolves the difficulty of identifying the location of AtoN and dangerous-area information in poor weather and back light. The AtoN can also be managed in real time using a smartphone, so that failures of AtoN facilities such as breakdown, loss, and damage can be responded to quickly.
Although augmented reality technology has mainly been used to provide tourism or local information, the system based on location-based services and image matching technology proposed in this paper can also be applied to various forms of image retrieval, communication, entertainment, and leisure activities, and its business model will extend its reach accordingly.

REFERENCES

1. Ahn, Y. A.(2006), “Design and Application of Location Data Management System for LBS”, Journal of Korea Multimedia Society, Vol. 9, No. 4, pp. 388-400.
2. Choi, S. K.(2011), “Contents Service Case Analysis and Business Prospect of Augmented Reality”, Journal of Korean Society for Internet Information, Vol. 12, No. 1, pp. 53-54.
3. Jang, W. S. and Ji, Y. G.(2011), “Usability Evaluation for Smart Phone Augmented Reality Application User Interface”, Journal of Korean Society for e-Business Studies, Vol. 1, No. 16, pp. 35-47.
4. Junglas, I. and Watson, R.(2008), “Location-Based Services”, Communications of the ACM, Vol. 51, pp. 65-69.
5. Kim, S. H., Kim, G. U., Kim, H. J. and Park, D. G.(2012), “A Tour Information System on Smart Phone using Location Based Service”, Journal of Korea Multimedia Society, Vol. 15, No. 5, pp. 677-691.
6. Lee, S. H., Min, J. H., Kim, J. W. and Park, J. H.(2005), “Technical Trend of Location-Based Service”, Electronic Communication Trend Analysis, Vol. 93, pp. 33-42.
7. Lim, J. S. and Choi, G. H.(2011), “A Study on a Location Determination System using Infrastructure Information of a WLAN Network”, The Journal of The Korea Institute of Intelligent Transport Systems, Vol. 10, No. 6, pp. 98-107.
8. Park, Y. H. and Kim, S. M.(2006), “Next-generation location-based service location technology”, Journal of Korean Institute of Communication Sciences, Vol. 23, No. 6, pp. 83-98.
9. Song, E. J.(2012), “A Case of the Mobile Application System Development using Location Based Service”, The Journal of Content Computing Society, Vol. 13, No. 1, pp. 53-60.
10. Spatial Information Research Institute(2017), A Study on Improvement of Smartphone Positioning Accuracy, Technical Report, Land and Geospatial Informatrix Corporation, p. 49.

