Implementation of Physiological Signals Using the K-Nearest Neighbor Classifier to Recognize Emotions in People with Specific Needs

Authors

  • Sukenda, S.T., M.T., Erlangga Bimatara, Dimas Fajar K., Donny Ramdani, Jajang M. Mimbar

Abstract

Technological developments have begun to be applied in the field of psychology; one example is applications for emotion recognition. Emotions can be recognized in several ways: through writing (text), physiological signals, facial expressions, voice intonation, and gestures. However, facial expressions, handwriting, voice intonation, and gestures can be manipulated, which makes emotion recognition based on them less valid. Recognizing emotions through physiological signals is more representative and can provide more objective results, because physiological signals cannot be consciously controlled by the user. Physiological signals that can be used to recognize emotions include heart rate and skin conductance response. To recognize emotions from physiological signals, a system was built consisting of two main units: a hardware unit and a software unit. The hardware unit consists of two sensors, a pulse sensor and a GSR sensor, integrated with an Arduino microcontroller to measure signals from the body. The software unit handles data processing and consists of a user interface application, a database system, and machine learning. Data received from the sensors are stored in a database for data pre-processing, feature scaling, and classification. The pre-processing stage consists of two steps: filtering data and filtering attributes. The feature scaling method used is normalization. After these two processes, the data are classified using the K-Nearest Neighbor (KNN) algorithm, which then predicts the emotion. The results show that the system was able to measure physiological signals and classify emotions with average accuracy, precision, and recall of 76%.
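
The following is a minimal sketch of the classification pipeline described in the abstract: min-max normalization of heart-rate and skin-conductance features followed by a KNN classifier, with accuracy, precision, and recall reported. The column names, the value of k, the CSV file, and the use of scikit-learn are illustrative assumptions, not details taken from the paper.

    # Sketch of the pipeline: normalization (feature scaling) + KNN classification.
    # Dataset layout and library choice are assumptions for illustration only.
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import MinMaxScaler
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.metrics import accuracy_score, precision_score, recall_score

    # Hypothetical table of pre-processed sensor readings and emotion labels.
    data = pd.read_csv("physiological_signals.csv")
    X = data[["heart_rate", "skin_conductance"]]   # features from the two sensors
    y = data["emotion"]                            # emotion label per sample

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    # Feature scaling: normalization (min-max), as stated in the abstract.
    scaler = MinMaxScaler()
    X_train_norm = scaler.fit_transform(X_train)
    X_test_norm = scaler.transform(X_test)

    # Classification with the KNN algorithm; k = 5 is an arbitrary example value.
    knn = KNeighborsClassifier(n_neighbors=5)
    knn.fit(X_train_norm, y_train)
    y_pred = knn.predict(X_test_norm)

    print("Accuracy :", accuracy_score(y_test, y_pred))
    print("Precision:", precision_score(y_test, y_pred, average="macro"))
    print("Recall   :", recall_score(y_test, y_pred, average="macro"))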

Published

2020-12-01

Issue

Section

Articles