Multi-view Human Pose Estimation Based on Progressive Gaussian Filtering Fusion
doi: 10.16383/j.aas.c230316
1. College of Information Engineering, Zhejiang University of Technology, Hangzhou 310023
2. Zhejiang Provincial United Key Laboratory of Embedded Systems, Hangzhou 310023
Abstract: A human pose estimation (HPE) method based on progressive Gaussian filtering (PGF) fusion is proposed to address the performance degradation caused by visual occlusion. First, a hierarchical performance evaluation method is designed to classify and process the multi-view visual measurements, so as to cope with the measurement uncertainty induced by visual occlusion. Second, a distributed progressive Bayesian filtering fusion framework is constructed, and a hierarchical classification fusion estimation method is designed to improve the robustness and accuracy of fusing complex measurements. In particular, to handle variations in the statistical characteristics of the measurements, the interactive information among local estimates is used to guide the progressive measurement update, thereby implicitly compensating for the measurement uncertainty. Finally, simulation and experimental results show that the proposed HPE method achieves higher accuracy and robustness than existing methods.

Keywords: Progressive Gaussian filtering / Adaptive filtering / Distributed fusion / Human pose estimation
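The core operation behind the PGF fusion described in the abstract is the progressive (pseudo-time) measurement update, in which the likelihood is folded into the estimate gradually rather than in a single step. The sketch below is a minimal linear-Gaussian illustration of that generic idea, not the fusion algorithm proposed in the paper: the function name, the uniform step schedule `deltas`, and the toy model are assumptions for illustration only. Each fractional step is a Kalman update with the measurement noise covariance inflated to R/Δλ, which corresponds to raising the Gaussian likelihood to the power Δλ.

```python
import numpy as np

def progressive_gaussian_update(x, P, z, H, R, deltas):
    """Fold the measurement z = H x + v, v ~ N(0, R), into the Gaussian
    prior N(x, P) gradually over pseudo-time steps `deltas` (summing to 1).
    Each partial step is a Kalman update with noise inflated to R/delta."""
    assert np.isclose(sum(deltas), 1.0), "step sizes must sum to one"
    for delta in deltas:
        R_d = R / delta                    # inflated noise for a partial update
        S = H @ P @ H.T + R_d              # innovation covariance of this step
        K = P @ H.T @ np.linalg.inv(S)     # partial-step Kalman gain
        x = x + K @ (z - H @ x)            # progressive state correction
        P = (np.eye(len(x)) - K @ H) @ P   # progressive covariance shrinkage
    return x, P

# Toy example: correcting a 2-D joint-position prior with one noisy measurement.
x0, P0 = np.zeros(2), np.eye(2)
z, H, R = np.array([1.0, 0.5]), np.eye(2), 0.25 * np.eye(2)
x_post, P_post = progressive_gaussian_update(x0, P0, z, H, R, deltas=[0.1] * 10)
```

In this linear-Gaussian toy case the schedule reproduces the single-shot Kalman update exactly, since the fractional likelihoods multiply back to the full likelihood; the gradual schedule pays off when the update is nonlinear or the measurement statistics are uncertain, which is the regime the paper targets by letting the interaction among local estimates guide the progressive steps.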
Table 1 Cumulative error mean statistics (mm)

Method                Wrist     Elbow     Shoulder
Measurement fusion    166.44    124.44    96.56
CF                    157.55    118.00    95.00
AMFKF                 147.81    113.85    93.08
CI                    127.63    117.85    99.62
IWCF                  153.12    113.21    92.53
PGFFwoC               151.77    114.12    92.83
PGFFwC                119.47    108.98    84.11