Comparing the Performance of Different Methods for Estimation in Inertial Navigation Systems

There are many positioning systems available today. The most prominent of these is the Inertial Navigation System (INS), which is increasingly preferred because it operates autonomously, independent of external signals. The system determines position, orientation, and velocity information by means of accelerometer and gyroscope sensors. Using this information, various algorithms can predict the next position, orientation, and velocity. Studies to date have predominantly used Kalman Filter (KF) algorithms for this prediction. In this study, the Long Short-Term Memory (LSTM), Bidirectional Long Short-Term Memory (BLSTM), and Gated Recurrent Unit (GRU) neural network architectures, deep learning methods that have proven themselves as prediction algorithms, are examined in detail together with Kalman Filtering, and a comparative study is presented. First, the LSTM, BLSTM, and GRU networks were trained with IMU sensor data and used for velocity estimation, yielding Root Mean Squared Error (RMSE) values of 2.5547, 2.7592, and 2.5414, respectively. The same networks were then trained with GPS data; the predictions obtained through LSTM, BLSTM, and GRU yielded RMSE values of 0.42542, 1.91122, and 0.32274, respectively. The predictions based on GPS data have higher accuracy because the networks trained with GPS data were less affected by noise during the training phase.
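The comparison above rests on the Root Mean Squared Error metric, the square root of the mean squared difference between measured and predicted values. A minimal sketch of this computation is shown below; the toy velocity traces are illustrative placeholders, not data from the study.

```python
import math

def rmse(actual, predicted):
    """Root Mean Squared Error between two equal-length sequences."""
    if len(actual) != len(predicted):
        raise ValueError("sequences must have the same length")
    return math.sqrt(
        sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
    )

# Hypothetical velocity trace (m/s) and a model's estimate of it.
measured = [0.0, 1.2, 2.4, 3.1]
estimated = [0.1, 1.0, 2.6, 3.0]
print(round(rmse(measured, estimated), 4))  # prints 0.1581
```

In the study, the same metric would be evaluated on each network's velocity predictions against the reference data, so that lower RMSE directly indicates the better estimator.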
