In addition, this paper investigates the trade-off between
computational complexity (FLOPs) and accuracy. On Kinetics, TSN [3] has
the highest computational complexity and the lowest recognition accuracy
among the four methods. On the temporally sensitive SSV1, the
computational complexity of TSN is half that of the other three methods,
but its recognition accuracy is also half that of the other three, which
demonstrates the importance of explicitly incorporating temporal
information into the features.
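As a rough illustration of how such per-clip FLOPs figures are commonly obtained, the following sketch profiles a generic 2D backbone with the fvcore counter; the backbone, input size, and segment count are illustrative assumptions, not the measurement pipeline used in this paper.

    import torch
    from torchvision.models import resnet50
    from fvcore.nn import FlopCountAnalysis

    # Generic 2D backbone as a stand-in; segment-based models such as
    # TSN apply a 2D CNN to each sampled frame independently.
    model = resnet50(weights=None).eval()
    frame = torch.randn(1, 3, 224, 224)  # one RGB frame

    per_frame = FlopCountAnalysis(model, frame).total()
    num_segments = 8  # hypothetical number of sampled frames per clip
    print(f"clip GFLOPs ~ {per_frame * num_segments / 1e9:.1f}")

Because the per-frame cost is multiplied by the number of sampled segments, reducing either the backbone cost or the segment count lowers the clip-level FLOPs reported in such comparisons.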
Conclusion: This paper discusses a lightweight and efficient
algorithmic framework for human action recognition, in which a
coarse-to-fine time-granularity algorithm based on motion salience and
multi-dimensional excitation, and a method of action feature extraction
based on motion salience and spatio-temporal difference, are developed.
Meanwhile, a method of enhancing action features through deformable
convolution and spatio-temporal channel excitation, guided by temporal
information, is also employed. Experiments on the most popular
benchmarks verify the advantages of the presented algorithms in both
recognition accuracy and computational efficiency. Our future work will
focus on exploring further strategies to improve the robustness of the
algorithm.
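To make the channel-excitation idea concrete, below is a minimal, hypothetical PyTorch sketch of an SE-style gate whose channel weights are informed by temporal context; the module name, reduction ratio, and segment layout are illustrative assumptions, not the exact design developed in this paper.

    import torch
    import torch.nn as nn

    class TemporalChannelExcitation(nn.Module):
        # SE-style channel gate informed by temporal context (illustrative).
        def __init__(self, channels, reduction=16):
            super().__init__()
            self.pool = nn.AdaptiveAvgPool2d(1)                  # spatial squeeze
            self.fc1 = nn.Conv2d(channels, channels // reduction, 1)
            self.temporal = nn.Conv1d(channels // reduction,
                                      channels // reduction,
                                      kernel_size=3, padding=1)  # mix across time
            self.fc2 = nn.Conv2d(channels // reduction, channels, 1)
            self.act = nn.ReLU(inplace=True)
            self.gate = nn.Sigmoid()

        def forward(self, x, num_segments):
            # x: (N*T, C, H, W), the frame-stacked layout of segment-based models
            nt, c, h, w = x.shape
            n = nt // num_segments
            s = self.act(self.fc1(self.pool(x)))             # (N*T, C/r, 1, 1)
            s = s.view(n, num_segments, -1).transpose(1, 2)  # (N, C/r, T)
            s = self.temporal(s)                             # temporal mixing
            s = s.transpose(1, 2).reshape(nt, -1, 1, 1)      # (N*T, C/r, 1, 1)
            s = self.gate(self.fc2(s))                       # per-channel gate in (0, 1)
            return x * s                                     # excite informative channels

    # Usage: 2 clips x 8 segments, 64 channels, 56x56 feature maps
    feats = torch.randn(2 * 8, 64, 56, 56)
    out = TemporalChannelExcitation(64)(feats, num_segments=8)

The gate rescales each channel using statistics pooled over space and mixed across the temporal axis, which is one common way to make channel excitation sensitive to time-related information.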
Author contributions: Kaishi Xu: Conceptualization; supervision;
writing – review and editing; data curation; software; investigation.
Hanhua Cao: Writing – review and editing; software; investigation;
methodology. Yang YI: Writing – original draft; software; funding
acquisition; investigation; methodology.
Acknowledgments: This work was partly supported by the Key Discipline
Project of Guangzhou Xinhua University (No. 2020XZD02) and the Guangdong
Key Discipline Scientific Research Capability Improvement Project
(No. 2021ZDJS144 and No. 2022ZDJS151).
Conflict of interest statement: The authors declare no conflicts
of interest.
Data availability statement: The data that support the findings
of this study are available from the corresponding author upon
reasonable request.