UMP Institutional Repository

ANN-based Performance Analysis on Human Activity Recognition

Elzein, Nahla Mohammed and Fakherldin, Mohammed and Abaker, Ibrahim and Mazlina, Abdul Majid (2020) ANN-based Performance Analysis on Human Activity Recognition. In: IEEE 4th International Conference and Workshops on Recent Advances and Innovations in Engineering (ICRAIE2020), 27-29 November 2019, Kedah, Malaysia. pp. 1-6. ISBN 978-1-7281-2610-4


Abstract

In the Big Data era, where various devices can connect to one another through networks and cloud services, a smartphone carries numerous sensors that can collect data about everything around it. This makes activity recognition (AR) applications context- and behaviour-aware. In this paper, we used an artificial neural network (ANN) to predict a person's activity from the collected sensor data. Principal Component Analysis (PCA) was applied to the 561 features of the dataset, reducing its dimensionality from 561 to 50 and thereby decreasing the complexity of the data; in this way, the most important of the 561 features are identified. In our experiments, the neural network outperforms the HF-SVM. The HF-SVM was chosen as the baseline because it requires less memory, computational power, and battery consumption. This study suggests minimizing the resources used by the neural network, or exploring another classification algorithm, to achieve comparable results with fewer resources.
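The PCA step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the 561-feature matrix is replaced here by synthetic random data of the same width, and scikit-learn's `PCA` is assumed as the implementation.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for the 561-feature sensor dataset described above.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 561))  # 200 samples, 561 features

# Reduce the dimensionality from 561 to 50, as reported in the paper.
pca = PCA(n_components=50)
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)  # 50 principal components per sample
```

The reduced matrix would then be fed to the classifier (the ANN or HF-SVM) in place of the original 561 features.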

Item Type: Conference or Workshop Item (Lecture)
Uncontrolled Keywords: ANN, Human Activity Recognition, Machine Learning, Analysis
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Faculty/Division: Institute of Postgraduate Studies; Faculty of Computing
Depositing User: Noorul Farina Arifin
Date Deposited: 02 Jul 2020 08:31
Last Modified: 02 Jul 2020 08:31
URI: http://umpir.ump.edu.my/id/eprint/28646
