Adaptive chaotic sampling particle filter to handle occlusion and fast motion in visual object tracking / Firouznia, M.; Koupaei, J.A.; Faez, K.; Trunfio, G.A.; Amindavar, H. - In: DIGITAL SIGNAL PROCESSING. - ISSN 1051-2004. - 134:(2023), p. 103933. [10.1016/j.dsp.2023.103933]
Adaptive chaotic sampling particle filter to handle occlusion and fast motion in visual object tracking
Trunfio, G.A.
2023-01-01
Abstract
We present a new particle filter method for visual object tracking that effectively handles occlusion and fast motion. The proposed approach uses a chaotic local search to model irregular motion and, compared to ordinary particle filter approaches, requires fewer particles. Furthermore, a new chaotic sampling procedure forces particles toward areas with high values of the likelihood function while preserving maximum diversity, and a histogram of dynamical information, based on state space reconstruction, is introduced to represent the motion over successive frames. In addition, a new criterion is proposed to distinguish occlusion from out-of-view events for appearance updating. We present numerical experiments demonstrating that the developed framework outperforms other state-of-the-art approaches in dealing with irregular motions and uncertainties. According to the results on BOBOT, OTB100, OTB2013, and VOT2018, compared with traditional approaches, including methods based on deep and reinforcement learning, correlation filters, and Siamese neural networks, the proposed strategy converges much more closely to the true target state, increasing the tracking accuracy. Finally, we prove analytically the convergence of the proposed method. (c) 2023 Elsevier Inc. All rights reserved.
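The abstract describes a particle filter whose resampling step is augmented by a chaotic local search that pushes particles toward high-likelihood regions while keeping diversity. The paper's exact procedure is not reproduced here; the following is a minimal illustrative sketch of that general idea, assuming a logistic map as the chaotic generator, a 1D state, and a user-supplied likelihood function (all of these are assumptions, not details taken from the paper):

```python
import numpy as np

def logistic_map(x, n):
    """Generate n chaotic values in (0, 1) via the logistic map x <- 4x(1-x)."""
    seq = np.empty(n)
    for i in range(n):
        x = 4.0 * x * (1.0 - x)
        seq[i] = x
    return seq

def chaotic_sampling_pf_step(particles, weights, likelihood,
                             search_radius=1.0, seed_x=0.7):
    """One particle-filter update with a chaotic local search (illustrative).

    Particles are resampled by weight, then perturbed with logistic-map
    offsets; a perturbed candidate is kept only if it improves the
    likelihood, concentrating particles in high-likelihood regions while
    the chaotic sequence maintains diversity of the perturbations.
    """
    n = len(particles)
    # Standard multinomial resampling proportional to the current weights.
    idx = np.random.choice(n, size=n, p=weights)
    resampled = particles[idx]
    # Chaotic local search: map logistic values in (0,1) to offsets in [-r, r].
    offsets = (logistic_map(seed_x, n) - 0.5) * 2.0 * search_radius
    candidates = resampled + offsets
    # Greedy acceptance: keep a candidate only where it raises the likelihood.
    better = likelihood(candidates) > likelihood(resampled)
    new_particles = np.where(better, candidates, resampled)
    # Re-weight and normalize.
    new_weights = likelihood(new_particles)
    new_weights /= new_weights.sum()
    return new_particles, new_weights
```

As a usage example, with a Gaussian likelihood centered on a true state of 5.0 and particles initialized uniformly on [0, 10], a few iterations drive the weighted mean of the particle set close to 5.0. The greedy accept-if-better rule here is one simple way to bias particles toward the likelihood peaks; the actual paper's sampling and convergence analysis are more elaborate.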