Zunash Zaki, Muhammad Arif Shah, Karzan Wakil, Falak Sher




UCI-HAR dataset, HAPT dataset, Smartphones, Accelerometer and Gyroscope Sensors, Classifiers, HAR


Human activity recognition (HAR) through smartphones now helps people keep track of their daily activities. Many approaches to activity recognition have been proposed, but classifier performance often remains low because of problems with the data or with the classifiers themselves. This study offers a method for identifying the best-performing classifier. A comparative analysis is carried out between supervised and ensemble learning classifiers, and a system based on the best-performing classifier is also introduced. We evaluate the method on two publicly available human activity recognition datasets acquired from the UCI Machine Learning Repository: UCI Human Activity Recognition (UCI-HAR) and Smartphone-Based Recognition of Human Activities and Postural Transitions (HAPT). The activities selected for this study are Walking, Standing, Sitting, Laying, Walking Downstairs, and Walking Upstairs. The input signals are three-dimensional raw sensor data that are difficult to handle directly, so the Principal Component Analysis (PCA) technique is used to reduce the dimensionality of the feature space and extract the most informative features for classifying human activities. A comparison is then performed between different supervised and ensemble machine learning classifiers on the selected datasets. The supervised learning classifiers used are Gaussian Naïve Bayes, K-Nearest Neighbor, and Logistic Regression, while the ensemble learning classifiers are Random Forest and Gradient Boosting. The results show that Logistic Regression is the most accurate of the classifiers compared in this study for human activity recognition, achieving 96.1% accuracy on the UCI-HAR dataset and 94.5% on the HAPT dataset.
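The pipeline described above (PCA for dimensionality reduction followed by a supervised classifier such as Logistic Regression) can be sketched as follows. This is a minimal illustration, not the authors' exact implementation: the dataset is stubbed with random data shaped like the 561-feature UCI-HAR vectors with 6 activity classes, and the number of retained components and other parameter values are assumptions for demonstration only.

```python
# Sketch: PCA feature reduction + Logistic Regression classification,
# as in the study's pipeline. Real UCI-HAR data would replace the
# synthetic arrays below; shapes mimic the dataset (561 features, 6 classes).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 561))   # stand-in for 561 extracted sensor features
y = rng.integers(0, 6, size=600)  # stand-in for the 6 activity labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# PCA keeps the principal components that explain most of the variance;
# Logistic Regression then classifies in the reduced feature space.
model = make_pipeline(PCA(n_components=50),
                      LogisticRegression(max_iter=1000))
model.fit(X_tr, y_tr)
accuracy = model.score(X_te, y_te)
print(f"held-out accuracy: {accuracy:.3f}")
```

With the real datasets, the same pipeline would be fitted on the provided training split and scored on the test split; on the synthetic random data above, the accuracy is naturally near chance level.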

