
Human Activity Recognition by Analysis of Skeleton Joint Position in Internet of Things (IOT) Environment


  • Bilaspur University, Bilaspur - 495001, Chhattisgarh, India
  • National Institute of Technology, Raipur - 492001, Chhattisgarh, India


Objective: To automatically analyze and detect human activities in order to provide better support in sectors such as healthcare and security. Method: We used the UT Kinect-Action 3D dataset, which contains the positions of 20 body joints captured by a Kinect sensor. We selected two joint sets, J1 and J2, formed rules for activity classification, and then applied an SVM classifier, a KNN classifier using Euclidean distance, and a KNN classifier using Minkowski distance. Findings: With joint set J1 we obtained 97.8% accuracy with the SVM classifier, 98.8% with the KNN classifier using Euclidean distance, and 98.9% with the KNN classifier using Minkowski distance; with joint set J2 we obtained 97.7%, 98.6%, and 98.7% accuracy, respectively. Application/Improvement: We classified four activities: hand waving, standing, sitting, and picking. More activities can be included in future work, and IoT combined with this activity-recognition method can be used to reduce overheads.
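The KNN classification step described above can be sketched in plain Python. This is a minimal illustration, not the authors' implementation: the one-dimensional feature (a normalized head-to-hip vertical distance) and the training values are hypothetical stand-ins for the joint-set features J1/J2; the Minkowski distance with p = 2 reduces to the Euclidean case used in the paper.

```python
import math

def minkowski(a, b, p=2):
    """Minkowski distance between two feature vectors; p=2 is Euclidean."""
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1.0 / p)

def knn_predict(train, query, k=3, p=2):
    """Majority vote over the k training samples nearest to `query`.

    train: list of (feature_vector, label) pairs.
    """
    neighbors = sorted(train, key=lambda s: minkowski(s[0], query, p))[:k]
    labels = [label for _, label in neighbors]
    return max(set(labels), key=labels.count)

# Hypothetical 1-D skeleton features: normalized vertical distance between
# the head and hip-centre joints (large when standing, small when sitting).
train = [([0.90], "standing"), ([0.88], "standing"),
         ([0.45], "sitting"),  ([0.50], "sitting"),
         ([0.30], "picking"),  ([0.28], "picking")]

print(knn_predict(train, [0.47], k=3, p=1))  # Minkowski with p=1 (Manhattan)
print(knn_predict(train, [0.89], k=3, p=2))  # Euclidean special case
```

In practice the feature vectors would hold many joint coordinates rather than one scalar, but the distance computation and majority vote are unchanged; only the choice of p distinguishes the Euclidean and Minkowski variants compared in the paper.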


Activity Recognition, IoT, Joint Set, Kinect, Skeleton.






This work is licensed under a Creative Commons Attribution 3.0 License.