
Princeton Journal of Interdisciplinary Research, Volume 1, Issue 2: Frontiers of Inquiry (December 2025). ISSN 3069-8200
Leveraging Machine Learning and Feature Extraction from Physiological Signals for Multimodal Emotion Detection and Mental Health Support
Author: Ashvik Raina
Affiliation: Cambridge Center of International Research
Abstract:
Mental health conditions such as anxiety and depression often involve emotional dysregulation, yet most methods for measuring emotion rely on subjective self-report. The goal of this study is to apply signal processing and machine learning to build a system that objectively classifies emotional states from physiological signals, specifically electroencephalography (EEG) and electrocardiography (ECG), in support of real-time mental health monitoring and intervention.
This research uses the DREAMER dataset, drawing on EEG and ECG recordings from 23 subjects. Preprocessing consisted of bandpass filtering to remove noise and z-score normalization to standardize signal amplitudes across subjects. Band power and entropy features were extracted from the 14 EEG channels, and heart rate variability (HRV) features from the 2 ECG channels. To distinguish baseline from stimulus conditions, multiple classification models were evaluated: Gradient Boosting, Random Forest, AdaBoost, Logistic Regression, and K-Nearest Neighbors. Subject-exclusive train-test splits were used to assess generalizability to unseen subjects. The Gradient Boosting model achieved the highest performance of all tested algorithms, with an F1 score of 76.74%, an accuracy of 77.78%, a precision of 80.49%, and a recall of 73.33%, suggesting that EEG and ECG signals offer a promising foundation for objective, physiology-based emotion recognition. The multimodal approach also lends itself to future work incorporating additional bio-signals, such as skin temperature from smartwatches, to improve emotion detection accuracy in clinical and personal health applications.
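To make the described pipeline concrete, the sketch below illustrates the preprocessing, EEG feature extraction, subject-exclusive splitting, and Gradient Boosting steps in Python. It is a minimal illustration under stated assumptions, not the authors' implementation: the 128 Hz sampling rate, band definitions, trial counts, and the random placeholder arrays standing in for DREAMER recordings are all assumptions, and the ECG/HRV features are omitted for brevity.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch
from scipy.stats import entropy as shannon_entropy
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GroupShuffleSplit
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

FS = 128  # assumed EEG sampling rate (Hz), for illustration only
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed band edges

def bandpass(signal, low, high, fs=FS, order=4):
    """Zero-phase Butterworth bandpass filter, as in the preprocessing step."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def zscore(signal):
    """Z-score normalization to standardize amplitudes across subjects."""
    return (signal - signal.mean()) / signal.std()

def eeg_features(channel):
    """Band power (via Welch PSD) and spectral entropy for one EEG channel."""
    filtered = zscore(bandpass(channel, 0.5, 45))
    freqs, psd = welch(filtered, fs=FS, nperseg=FS * 2)
    powers = [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]
    return powers + [shannon_entropy(psd / psd.sum())]

# Placeholder data standing in for DREAMER trials (hypothetical shapes):
# 414 trials of 14-channel EEG, 10 s each, with subject IDs for grouping.
rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 414, 14, FS * 10
X = np.array([[f for ch in trial for f in eeg_features(ch)]
              for trial in rng.standard_normal((n_trials, n_channels, n_samples))])
y = rng.integers(0, 2, n_trials)        # baseline (0) vs. stimulus (1)
groups = rng.integers(0, 23, n_trials)  # subject IDs, 23 subjects

# Subject-exclusive split: no subject appears in both train and test sets
train_idx, test_idx = next(GroupShuffleSplit(test_size=0.2, random_state=0)
                           .split(X, y, groups))
clf = GradientBoostingClassifier().fit(X[train_idx], y[train_idx])
pred = clf.predict(X[test_idx])
print(f"accuracy={accuracy_score(y[test_idx], pred):.4f}",
      f"F1={f1_score(y[test_idx], pred):.4f}",
      f"precision={precision_score(y[test_idx], pred):.4f}",
      f"recall={recall_score(y[test_idx], pred):.4f}")
```

Under these assumptions, replacing the placeholder arrays with the actual DREAMER trials and appending per-trial HRV features from the 2 ECG channels would reproduce the structure of the evaluated pipeline.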
Keywords: physiological signals, emotions, machine learning, feature extraction, classification
© 2025 Princeton Journal of Interdisciplinary Research.