Long-Term Object Tracking Using Structural Features with a Particle Filter
Although long-term object tracking is a long-standing research subject, it still actively attracts researchers and remains among the most widely studied topics in the field. In this study, object tracking is performed with a particle filter, a stochastic method that models the dynamics relevant to tracking by means of state-space variables. A new measurement model for determining the particle weights is proposed: it combines the SSIM similarity index, which exploits the structural features of the object, with adaptive histogram equalization and a weighting of the object's central region. Experimental results show that the proposed measurement model improves classical tracking performance by at least 18.59%.
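The measurement step described above can be illustrated with a minimal sketch. The exact formulation used by the authors is not given in the abstract, so everything below is an assumption: a global histogram equalization stands in for the adaptive (CLAHE) step, a single global SSIM value stands in for a windowed computation, and a Gaussian mask (`center_weight`, `sigma`, `lam` are illustrative names and parameters) stands in for the center-region weighting.

```python
import numpy as np

def hist_equalize(img):
    """Global histogram equalization of an 8-bit grayscale patch
    (a simplified stand-in for the adaptive equalization in the abstract)."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf[-1] - cdf.min() + 1e-12)
    return (cdf[img] * 255).astype(np.uint8)

def ssim(x, y, c1=(0.01 * 255) ** 2, c2=(0.03 * 255) ** 2):
    """Global SSIM between two equally sized grayscale patches."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

def center_weight(shape, sigma=0.5):
    """Gaussian mask emphasizing the central region of the patch."""
    h, w = shape
    ys = np.linspace(-1, 1, h)[:, None]
    xs = np.linspace(-1, 1, w)[None, :]
    return np.exp(-(ys ** 2 + xs ** 2) / (2 * sigma ** 2))

def particle_weight(template, candidate, lam=20.0):
    """Unnormalized likelihood of one particle: equalize both patches,
    emphasize their centers, then score structural similarity."""
    t = hist_equalize(template) * center_weight(template.shape)
    c = hist_equalize(candidate) * center_weight(candidate.shape)
    s = ssim(t, c)                  # in [-1, 1]; 1 = structurally identical
    return np.exp(lam * (s - 1.0))  # map similarity to a positive weight
```

In a full tracker, `particle_weight` would be evaluated for every candidate patch drawn by the motion model, and the resulting weights normalized before resampling; a patch identical to the template receives weight 1, and structurally dissimilar patches decay exponentially toward 0.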