Activity recognition using smartphones and wearable devices: Traditional approaches, new solutions

In recent years, research on activity recognition has gained momentum, especially with the development of smartphones and wearable devices. Activities can be grouped into two main categories: simple activities such as walking and running, and complex activities such as eating, sleeping, and brushing teeth. This survey examines the literature on activity recognition in detail, covering the sensors and devices used, the types of daily activities considered, application areas, the data collection process, training methods, classification algorithms, and resource consumption. The state of the art is laid out and the existing methods are compared with one another. Publicly available data sets are then presented, and studies offering innovative solutions based on recent approaches such as deep learning are introduced. Finally, open issues in this area are identified and directions for future work are proposed.
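The pipeline summarized above (collect sensor data, segment it into windows, extract features, train a classifier) can be illustrated with a minimal sketch. The window length, the mean/std feature set, the synthetic "walking" and "standing" signals, and the random-forest classifier are all illustrative assumptions, not the method of any specific surveyed work:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW = 128  # samples per window (~2.5 s at 50 Hz; an assumed, commonly used size)

def extract_features(window):
    """Mean and standard deviation per axis: a typical time-domain feature set."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def make_windows(signal, label):
    """Split a (N, 3) tri-axial accelerometer stream into fixed-size windows."""
    n = len(signal) // WINDOW
    feats = [extract_features(signal[i * WINDOW:(i + 1) * WINDOW]) for i in range(n)]
    return np.array(feats), np.full(n, label)

rng = np.random.default_rng(0)
t = np.arange(4096)
# Synthetic stand-ins for two activity classes (illustrative only):
# "walking" = noisy oscillation, "standing" = low-variance signal.
walking = np.stack([np.sin(0.3 * t) + rng.normal(0, 0.3, t.size) for _ in range(3)], axis=1)
standing = rng.normal(0, 0.05, (4096, 3))

Xw, yw = make_windows(walking, 1)
Xs, ys = make_windows(standing, 0)
X, y = np.vstack([Xw, Xs]), np.concatenate([yw, ys])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.score(X, y))  # training accuracy on this toy data
```

In practice, the surveyed studies vary every stage of this sketch: window size and overlap, feature sets (time- and frequency-domain), and the classifier itself, with deep learning approaches replacing hand-crafted features entirely.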
