UniPhyNet: A Unified Network for Multimodal Physiological Signal Classification

Renxiang Qiu1

1 University of Copenhagen

In this study, we introduce UniPhyNet, a novel neural network architecture for classifying cognitive load from multimodal physiological data, specifically EEG, ECG, and EDA signals. Inspired by the EEGNet architecture, UniPhyNet incorporates several enhancements: parallel convolutional blocks with 1D kernels of varying sizes capture multi-scale temporal features, and EEGNet's depthwise separable convolutions are replaced with ResNet blocks equipped with CBAM attention, which dynamically emphasizes the most informative features along both the channel and spatial dimensions. This design allows UniPhyNet to process and integrate diverse physiological signals effectively, improving cognitive-load classification accuracy. The model offers two configurations: a single-modal version trained on an individual signal type (EEG, ECG, or EDA) and a multimodal version that combines all three signal types through an intermediate fusion mechanism.
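The two core ideas above, parallel 1D convolutions at multiple kernel sizes followed by CBAM-style channel attention, can be sketched in plain NumPy. This is a minimal illustration, not the paper's implementation: the kernel sizes (3, 7, 15), filter counts, reduction ratio, and random weights are illustrative assumptions standing in for learned parameters, and the spatial-attention branch of CBAM is omitted for brevity.

```python
import numpy as np

def conv1d(x, kernels):
    # x: (in_channels, time); kernels: (out_channels, in_channels, k).
    # 'Same' padding, stride 1 (odd k assumed so output length equals input length).
    out_ch, in_ch, k = kernels.shape
    pad = k // 2
    xp = np.pad(x, ((0, 0), (pad, pad)))
    T = x.shape[1]
    out = np.zeros((out_ch, T))
    for o in range(out_ch):
        for t in range(T):
            out[o, t] = np.sum(kernels[o] * xp[:, t:t + k])
    return out

def multiscale_block(x, kernel_sizes=(3, 7, 15), filters_per_branch=4, seed=1):
    # Parallel branches with different temporal kernel sizes, concatenated
    # along the channel axis to form a multi-scale feature map.
    rng = np.random.default_rng(seed)
    branches = []
    for k in kernel_sizes:
        kern = rng.standard_normal((filters_per_branch, x.shape[0], k)) * 0.1
        branches.append(conv1d(x, kern))
    return np.concatenate(branches, axis=0)  # (len(kernel_sizes)*filters, T)

def channel_attention(feats, reduction=2, seed=0):
    # CBAM-style channel attention: average- and max-pool over time,
    # pass both through a shared bottleneck MLP, sum, sigmoid-gate the channels.
    C = feats.shape[0]
    rng = np.random.default_rng(seed)
    w1 = rng.standard_normal((C // reduction, C)) * 0.1
    w2 = rng.standard_normal((C, C // reduction)) * 0.1
    avg, mx = feats.mean(axis=1), feats.max(axis=1)        # (C,) each
    z = w2 @ np.maximum(w1 @ avg, 0) + w2 @ np.maximum(w1 @ mx, 0)
    gate = 1.0 / (1.0 + np.exp(-z))                        # sigmoid, (C,)
    return feats * gate[:, None]                           # re-weighted channels

# Toy example: a 4-channel window of 128 samples.
x = np.random.default_rng(7).standard_normal((4, 128))
feats = multiscale_block(x)         # (12, 128): three branches of 4 filters
attended = channel_attention(feats) # same shape, channels re-weighted
```

Stacking such blocks with residual connections, as in the ResNet blocks described above, yields the trainable backbone; here the emphasis is only on the multi-scale and attention mechanics.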

We evaluate UniPhyNet on the publicly available CL-Drive dataset, which contains EEG, ECG, and EDA recordings collected from subjects in driving scenarios designed to induce different levels of cognitive load. The single-modal version of our model is tested on EEG, ECG, and EDA data separately, while the multimodal version integrates all three data types through feature-level fusion. Our results demonstrate that UniPhyNet outperforms baseline feature-based models on both binary and ternary classification tasks, with significant improvements in accuracy and robustness.
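The intermediate (feature-level) fusion used by the multimodal configuration can be illustrated with a toy sketch: each modality is encoded separately, the resulting embeddings are concatenated, and a shared head produces class probabilities. The random-projection "encoders", embedding sizes, and weights below are hypothetical placeholders for the learned per-modality branches, shown only to make the fusion point concrete.

```python
import numpy as np

def encode(signal, dim=8, seed=0):
    # Placeholder for a modality-specific branch (EEG, ECG, or EDA):
    # a random projection of simple summary statistics, standing in
    # for the learned convolutional feature extractor.
    rng = np.random.default_rng(seed)
    stats = np.array([signal.mean(), signal.std(), signal.min(), signal.max()])
    W = rng.standard_normal((dim, stats.size)) * 0.1
    return np.tanh(W @ stats)

def fuse_and_classify(embeddings, n_classes=3, seed=42):
    # Intermediate fusion: concatenate per-modality embeddings at the
    # feature level, then apply a shared linear head with softmax
    # (ternary cognitive-load classes in this example).
    z = np.concatenate(embeddings)
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_classes, z.size)) * 0.1
    logits = W @ z
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Toy signals for the three modalities.
eeg = np.sin(np.linspace(0, 10, 256))
ecg = np.cos(np.linspace(0, 5, 256))
eda = np.linspace(0, 1, 256)
probs = fuse_and_classify([encode(eeg, seed=0),
                           encode(ecg, seed=1),
                           encode(eda, seed=2)])  # (3,) class probabilities
```

Fusing at this intermediate stage, rather than averaging per-modality predictions (late fusion), lets the classification head exploit cross-modal feature interactions, which is the motivation for the design described above.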