Citation: Chuxi Li, Zifan Xiao, Yerong Li, Zhinan Chen, Xun Ji, Yiqun Liu, Shufei Feng, Zhen Zhang, Kaiming Zhang, Jianfeng Feng, Trevor W. Robbins, Shisheng Xiong, Yongchang Chen, Xiao Xiao. 2023. Deep learning-based activity recognition and fine motor identification using 2D skeletons of cynomolgus monkeys. Zoological Research, 44(5): 967-980. DOI: 10.24272/j.issn.2095-8137.2022.449

Deep learning-based activity recognition and fine motor identification using 2D skeletons of cynomolgus monkeys

  • Abstract (Chinese version): In neuroscience and clinical research, video-based action recognition is becoming an important tool for detecting and predicting neurological disorders. However, action recognition in non-human primate research still relies on labor-intensive manual work and lacks standardized assessment methods, which greatly limits research efficiency and recognition accuracy. This study therefore established two benchmark datasets of non-human primates in a laboratory environment: the MonkeyinLab (MiL) dataset (13 categories of actions and postures) and the MiL2D dataset (15 two-dimensional skeleton keypoints); together, these standard datasets cover the daily phenotypic behaviors of cynomolgus monkeys. The study also proposed a deep learning-based toolbox, MonkeyMonitorKit (MonKit), which uses a TSSA network to recognize monkey actions (98.99% accuracy) and an HRNet network to detect monkey skeleton keypoints (98.8% accuracy), and combines action and posture estimation to build an assessment model for fine motor activity and a behavioral analysis method capable of evaluating disease-related behaviors such as head-down behavior (a depression-like phenotype) and stereotyped behavior (an autism-like phenotype). Using MonKit, the study quantitatively compared daily behavior classification and fine motor assessment between MECP2 gene-mutant cynomolgus monkeys, a disease model of Rett syndrome, and wild-type cynomolgus monkeys, and benchmarked the results against manual identification, confirming that MonKit is automated, efficient, standardized, highly accurate, highly sensitive, and low in error, thus providing a novel and comprehensive approach for classifying and assessing phenotypic behavior in non-human primates.
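The abstract describes a two-stage pipeline: a pose estimator (HRNet) produces per-frame 2D skeleton keypoints, and a temporal network (TSSA) classifies keypoint sequences into action categories. The sketch below only illustrates the general shape of such a skeleton-based classifier; SkeletonActionClassifier, SEQ_LEN, and the GRU encoder are illustrative placeholders, not the MonKit or TSSA implementation.

```python
# Minimal sketch of a skeleton-based action recognition stage: sequences of
# 2D keypoints are classified into one of 13 action/posture categories.
# All class and variable names here are hypothetical, not the MonKit API.
import torch
import torch.nn as nn

NUM_JOINTS = 15      # MiL2D defines 15 two-dimensional skeleton keypoints
NUM_CLASSES = 13     # MiL defines 13 categories of actions and postures
SEQ_LEN = 64         # assumed number of frames per clip

class SkeletonActionClassifier(nn.Module):
    """Toy temporal model over flattened (x, y) keypoint coordinates."""
    def __init__(self):
        super().__init__()
        in_dim = NUM_JOINTS * 2                      # 15 joints x (x, y)
        self.encoder = nn.GRU(in_dim, 128, batch_first=True)
        self.head = nn.Linear(128, NUM_CLASSES)

    def forward(self, keypoints):                    # (B, T, 15, 2)
        b, t, j, c = keypoints.shape
        x = keypoints.reshape(b, t, j * c)           # flatten joints per frame
        _, h = self.encoder(x)                       # final hidden state (1, B, 128)
        return self.head(h.squeeze(0))               # (B, 13) class logits

if __name__ == "__main__":
    # Stage 1 (not shown): an HRNet-style pose estimator yields per-frame keypoints.
    # Random coordinates stand in here just to exercise the classifier.
    clip = torch.rand(2, SEQ_LEN, NUM_JOINTS, 2)     # batch of 2 clips
    logits = SkeletonActionClassifier()(clip)
    print(logits.argmax(dim=1))                      # predicted action index per clip
```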


    Abstract: Video-based action recognition is becoming a vital tool in clinical research and neuroscientific study for disorder detection and prediction. However, action recognition currently used in non-human primate (NHP) research relies heavily on intense manual labor and lacks standardized assessment. In this work, we established two standard benchmark datasets of NHPs in the laboratory: MonkeyinLab (MiL), which includes 13 categories of actions and postures, and MiL2D, which includes sequences of two-dimensional (2D) skeleton features. Furthermore, based on recent methodological advances in deep learning and skeleton visualization, we introduced the MonkeyMonitorKit (MonKit) toolbox for automatic action recognition, posture estimation, and identification of fine motor activity in monkeys. Using the datasets and MonKit, we evaluated the daily behaviors of wild-type cynomolgus monkeys within their home cages and experimental environments and compared these observations with the behaviors exhibited by cynomolgus monkeys possessing mutations in the MECP2 gene as a disease model of Rett syndrome (RTT). MonKit was used to assess motor function, stereotyped behaviors, and depressive phenotypes, with the outcomes compared with human manual detection. MonKit established consistent criteria for identifying behavior in NHPs with high accuracy and efficiency, thus providing a novel and comprehensive tool for assessing phenotypic behavior in monkeys.
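As a concrete illustration of the kind of fine-motor/posture measure that can be derived from 2D skeletons (e.g., the head-down, depression-like phenotype assessed above), the sketch below computes the angle of a neck-to-head vector against the vertical image axis and the fraction of frames exceeding a threshold. The keypoint names ("head", "neck") and the 60-degree threshold are assumptions for illustration, not the published MiL2D keypoint definition or MonKit's scoring rule.

```python
# Hypothetical posture metric from 2D keypoints: how often the head tilts
# downward beyond an assumed angular threshold.
import numpy as np

def head_down_angle(head_xy, neck_xy):
    """Angle (degrees) of the neck->head vector relative to straight up.
    Image coordinates: y grows downward, so 'up' is (0, -1)."""
    v = np.asarray(head_xy, dtype=float) - np.asarray(neck_xy, dtype=float)
    up = np.array([0.0, -1.0])
    cos = np.dot(v, up) / (np.linalg.norm(v) * np.linalg.norm(up) + 1e-8)
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def fraction_head_down(frames, threshold_deg=60.0):
    """Share of frames in which the head-down angle exceeds the threshold."""
    angles = [head_down_angle(f["head"], f["neck"]) for f in frames]
    return float(np.mean([a > threshold_deg for a in angles]))

if __name__ == "__main__":
    # Two toy frames: head above the neck (upright) vs. head below the neck (head-down).
    frames = [{"head": (100, 80), "neck": (100, 120)},
              {"head": (100, 160), "neck": (100, 120)}]
    print(fraction_head_down(frames))   # 0.5 with the assumed 60-degree threshold
```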
