Title:

Neurofeedback-driven emotional regulation training in a virtual reality environment: a machine learning approach using OpenBCI

Summary:

This paper addresses the design and development of an advanced neurofeedback system for training in emotional regulation skills and competencies; the system integrates a Virtual Reality (VR) platform with a 16-channel OpenBCI device for real-time capture of electroencephalographic (EEG) signals. The main objective of the research lies in the application of machine learning algorithms, specifically Random Forest and K-Nearest Neighbors (KNN), for the classification of emotional states in terms of valence and arousal. These algorithms achieve an accuracy of up to 83% for arousal classification and 90% for valence. EEG signals are processed and classified in real time, and the results are integrated into a virtual reality environment created in Unity. This adaptive environment changes according to the detected emotional states, allowing for more precise regulation. In addition, a diaphragmatic breathing protocol has been developed within the virtual reality environment as an intervention strategy for emotional regulation. The system is in the final stage of piloting to establish its efficacy.
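As a rough illustration of the kind of pipeline the summary describes (16-channel OpenBCI acquisition, feature extraction, and Random Forest / K-Nearest Neighbors classification of valence and arousal), the sketch below uses the BrainFlow driver and scikit-learn. The board model (Cyton + Daisy), serial port, 4-second window, and the use of average band powers as features are assumptions made for illustration; the article does not disclose the authors' exact preprocessing or feature set.

```python
# Illustrative sketch only -- not the authors' published code. Assumptions:
# an OpenBCI Cyton+Daisy board (16 channels) read through BrainFlow, 4-second
# windows, and BrainFlow's average band powers (delta..gamma) as features.
import numpy as np
from brainflow.board_shim import BoardShim, BoardIds, BrainFlowInputParams
from brainflow.data_filter import DataFilter
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

BOARD_ID = BoardIds.CYTON_DAISY_BOARD.value  # assumed 16-channel setup
EEG_CHANNELS = BoardShim.get_eeg_channels(BOARD_ID)
FS = BoardShim.get_sampling_rate(BOARD_ID)


def connect(serial_port="COM3"):
    """Open a streaming session on the board (the port is a placeholder)."""
    params = BrainFlowInputParams()
    params.serial_port = serial_port
    board = BoardShim(BOARD_ID, params)
    board.prepare_session()
    board.start_stream()
    return board


def latest_window(board, seconds=4):
    """Return the most recent `seconds` of data from an already-streaming board."""
    return board.get_current_board_data(seconds * FS)


def band_power_features(window):
    """Average delta/theta/alpha/beta/gamma power across the EEG channels."""
    avg, _std = DataFilter.get_avg_band_powers(window, EEG_CHANNELS, FS, True)
    return np.asarray(avg)


def train_classifiers(X, y_valence, y_arousal):
    """Fit one model per target; Random Forest and KNN are interchangeable here."""
    valence_clf = RandomForestClassifier(n_estimators=200).fit(X, y_valence)
    arousal_clf = KNeighborsClassifier(n_neighbors=5).fit(X, y_arousal)
    return valence_clf, arousal_clf


def classify_current_state(board, valence_clf, arousal_clf):
    """Label the latest EEG window with discrete valence/arousal classes."""
    feats = band_power_features(latest_window(board)).reshape(1, -1)
    return valence_clf.predict(feats)[0], arousal_clf.predict(feats)[0]
```

In a live session, `classify_current_state` would run periodically on the incoming stream and its two labels would be forwarded to the Unity environment.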

Saved in:

ISSN: 2011-2084

eISSN: 2011-7922

Volume: 17

Published: 2024-09-03

Pages: 113–118

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

http://purl.org/coar/access_right/c_abf2

info:eu-repo/semantics/openAccess

id metarevistapublica_unisanbuenaventura_internationaljournalofpsychologicalresearch_21-article-7467
record_format ojs
spelling Entrenamiento en regulación emocional impulsado por neurofeedback en un entorno de realidad virtual: un enfoque de aprendizaje automático utilizando OpenBCI
Neurofeedback
Artículo de revista
Entrenamiento
OpenBCI
aprendizaje automático
realidad virtual
Entrenamiento en regulación emocional impulsado por neurofeedback en un entorno de realidad virtual: un enfoque de aprendizaje automático utilizando OpenBCI
regulación emocional
Este artículo aborda el diseño y desarrollo de un sistema avanzado de neurofeedback para el entrenamiento en habilidades y competencias de regulación emocional; el sistema integra una plataforma de Realidad Virtual (VR) con un dispositivo OpenBCI de 16 canales para la captura en tiempo real de señales electroencefalográficas (EEG). El principal objetivo de la investigación radica en la aplicación de algoritmos de aprendizaje automático, concretamente Random Forest y K-Nearest Neighbors (KNN), para la clasificación de estados emocionales en términos de valencia y excitación. Estos algoritmos logran una precisión de hasta el 83% para la clasificación de la excitación y del 90% para la valencia. Las señales de EEG se procesan y clasifican en tiempo real y los resultados se integran en un entorno de realidad virtual creado en Unity. Este entorno adaptativo cambia según los estados emocionales detectados, permitiendo una regulación más precisa. Además, se ha desarrollado un protocolo de respiración diafragmática dentro del entorno de realidad virtual como estrategia de intervención para la regulación emocional. El sistema se encuentra en su etapa final de prueba para establecer la eficacia del sistema.
info:eu-repo/semantics/article
Inglés
http://creativecommons.org/licenses/by-nc-nd/4.0
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Text
http://purl.org/coar/access_right/c_abf2
info:eu-repo/semantics/openAccess
http://purl.org/coar/version/c_970fb48d4fbd8a85
info:eu-repo/semantics/publishedVersion
http://purl.org/coar/resource_type/c_6501
https://revistas.usb.edu.co/index.php/IJPR/article/view/7467
Universidad San Buenaventura - USB (Colombia)
International Journal of Psychological Research
17
This paper addresses the design and development of an advanced neurofeedback system for training in emotional regulation skills and competencies; the system integrates a Virtual Reality (VR) platform with a 16-channel OpenBCI device for real-time capture of electroencephalographic (EEG) signals. The main objective of the research lies in the application of machine learning algorithms, specifically Random Forest and K-Nearest Neighbors (KNN), for the classification of emotional states in terms of valence and arousal. These algorithms achieve an accuracy of up to 83% for arousal classification and 90% for valence. EEG signals are processed and classified in real time and the results are integrated into a virtual reality environment created in Unity. This adaptive environment changes according to the detected emotional states, allowing for more precise regulation. In addition, a diaphragmatic breathing protocol has been developed within the virtual reality environment as an intervention strategy for emotional regulation. The system is in its final stage of piloting to establish the efficacy of the system.
Camelo Roa, Sandra Milena
Rodríguez, Belman Jahir
Neurofeedback
Emotional regulation
Virtual reality
Machine learning
OpenBCI
Training
2
Núm. 2, Año 2024: Interdisciplinary Approaches for Human Cognition: Expanding Perspectives on the Mind
Journal article
application/pdf
Publication
2011-2084
118
113
https://revistas.usb.edu.co/index.php/IJPR/article/download/7467/5567
2024-09-03T00:00:00Z
2024-09-03T00:00:00Z
2024-09-03
https://doi.org/10.21500/20112084.7467
10.21500/20112084.7467
2011-7922
institution UNIVERSIDAD DE SAN BUENAVENTURA
thumbnail https://nuevo.metarevistas.org/UNIVERSIDADDESANBUENAVENTURA_COLOMBIA/logo.png
country_str Colombia
collection International Journal of Psychological Research
title Entrenamiento en regulación emocional impulsado por neurofeedback en un entorno de realidad virtual: un enfoque de aprendizaje automático utilizando OpenBCI
spellingShingle Entrenamiento en regulación emocional impulsado por neurofeedback en un entorno de realidad virtual: un enfoque de aprendizaje automático utilizando OpenBCI
Camelo Roa, Sandra Milena
Rodríguez, Belman Jahir
Neurofeedback
Entrenamiento
OpenBCI
aprendizaje automático
realidad virtual
regulación emocional
Neurofeedback
Emotional regulation
Virtual reality
Machine learning
OpenBCI
Training
title_short Entrenamiento en regulación emocional impulsado por neurofeedback en un entorno de realidad virtual: un enfoque de aprendizaje automático utilizando OpenBCI
title_full Entrenamiento en regulación emocional impulsado por neurofeedback en un entorno de realidad virtual: un enfoque de aprendizaje automático utilizando OpenBCI
title_fullStr Entrenamiento en regulación emocional impulsado por neurofeedback en un entorno de realidad virtual: un enfoque de aprendizaje automático utilizando OpenBCI
title_full_unstemmed Entrenamiento en regulación emocional impulsado por neurofeedback en un entorno de realidad virtual: un enfoque de aprendizaje automático utilizando OpenBCI
title_sort entrenamiento en regulación emocional impulsado por neurofeedback en un entorno de realidad virtual: un enfoque de aprendizaje automático utilizando openbci
description Este artículo aborda el diseño y desarrollo de un sistema avanzado de neurofeedback para el entrenamiento en habilidades y competencias de regulación emocional; el sistema integra una plataforma de Realidad Virtual (VR) con un dispositivo OpenBCI de 16 canales para la captura en tiempo real de señales electroencefalográficas (EEG). El principal objetivo de la investigación radica en la aplicación de algoritmos de aprendizaje automático, concretamente Random Forest y K-Nearest Neighbors (KNN), para la clasificación de estados emocionales en términos de valencia y excitación. Estos algoritmos logran una precisión de hasta el 83% para la clasificación de la excitación y del 90% para la valencia. Las señales de EEG se procesan y clasifican en tiempo real y los resultados se integran en un entorno de realidad virtual creado en Unity. Este entorno adaptativo cambia según los estados emocionales detectados, permitiendo una regulación más precisa. Además, se ha desarrollado un protocolo de respiración diafragmática dentro del entorno de realidad virtual como estrategia de intervención para la regulación emocional. El sistema se encuentra en su etapa final de prueba para establecer la eficacia del sistema.
description_eng This paper addresses the design and development of an advanced neurofeedback system for training in emotional regulation skills and competencies; the system integrates a Virtual Reality (VR) platform with a 16-channel OpenBCI device for real-time capture of electroencephalographic (EEG) signals. The main objective of the research lies in the application of machine learning algorithms, specifically Random Forest and K-Nearest Neighbors (KNN), for the classification of emotional states in terms of valence and arousal. These algorithms achieve an accuracy of up to 83% for arousal classification and 90% for valence. EEG signals are processed and classified in real time and the results are integrated into a virtual reality environment created in Unity. This adaptive environment changes according to the detected emotional states, allowing for more precise regulation. In addition, a diaphragmatic breathing protocol has been developed within the virtual reality environment as an intervention strategy for emotional regulation. The system is in its final stage of piloting to establish the efficacy of the system.
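The abstract above states that classification results are integrated into the Unity-built VR environment in real time, but it does not describe the transport between the classifier process and the Unity scene. A lightweight local UDP bridge is one plausible arrangement; in the sketch below the port number, the JSON message schema, and the 1 Hz update rate are all assumptions for illustration.

```python
# Hypothetical Python-to-Unity bridge -- the article does not specify the actual
# transport. Port 5005, the JSON schema, and the 1 Hz update rate are assumptions.
import json
import socket
import time

UNITY_ADDR = ("127.0.0.1", 5005)  # Unity would listen on this placeholder port


def send_state(sock, valence, arousal):
    """Send one prediction to the VR process as a single UDP datagram."""
    payload = json.dumps({"valence": int(valence), "arousal": int(arousal)})
    sock.sendto(payload.encode("utf-8"), UNITY_ADDR)


def feedback_loop(predict_fn, period_s=1.0):
    """Poll `predict_fn` (e.g. the trained classifiers applied to the newest
    EEG window) and forward each (valence, arousal) pair to the VR scene."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        while True:
            valence, arousal = predict_fn()
            send_state(sock, valence, arousal)
            time.sleep(period_s)
    finally:
        sock.close()
```

On the Unity side, a small listener reading the same datagrams could map the two labels to scene parameters such as lighting, ambience, or the pacing of the diaphragmatic breathing guide, which is consistent with the adaptive-environment behaviour described in the abstract.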
author Camelo Roa, Sandra Milena
Rodríguez, Belman Jahir
author_facet Camelo Roa, Sandra Milena
Rodríguez, Belman Jahir
topicspa_str_mv Neurofeedback
Entrenamiento
OpenBCI
aprendizaje automático
realidad virtual
regulación emocional
topic Neurofeedback
Entrenamiento
OpenBCI
aprendizaje automático
realidad virtual
regulación emocional
Neurofeedback
Emotional regulation
Virtual reality
Machine learning
OpenBCI
Training
topic_facet Neurofeedback
Entrenamiento
OpenBCI
aprendizaje automático
realidad virtual
regulación emocional
Neurofeedback
Emotional regulation
Virtual reality
Machine learning
OpenBCI
Training
citationvolume 17
citationissue 2
citationedition Núm. 2, Año 2024: Interdisciplinary Approaches for Human Cognition: Expanding Perspectives on the Mind
publisher Universidad San Buenaventura - USB (Colombia)
ispartofjournal International Journal of Psychological Research
source https://revistas.usb.edu.co/index.php/IJPR/article/view/7467
language English
format Article
rights http://creativecommons.org/licenses/by-nc-nd/4.0
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
http://purl.org/coar/access_right/c_abf2
info:eu-repo/semantics/openAccess
references_eng Adorni, R., Brugnera, A., Gatti, A., Tasca, G. A., Sakatani, K., & Compare, A. (2019). Psychophysiological Responses to Stress Related to Anxiety in Healthy Aging: A Near-Infrared Spectroscopy (NIRS) Study. Journal of Psychophysiology, 33(3), 188–197. https://doi.org/10.1027/0269-8803/a000221
Allen, J. J. B., Coan, J. A., & Nazarian, M. (2004). Issues and assumptions on the road from raw signals to metrics of frontal EEG asymmetry in emotion. Biological Psychology, 67(1–2), 183–218. https://doi.org/10.1016/j.biopsycho.2004.03.007
Drossos, K., Floros, A., & Giannakoulopoulos, A. (2014). BEADS: A dataset of Binaural Emotionally Annotated Digital Sounds. IISA 2014 – 5th International Conference on Information, Intelligence, Systems and Applications, 158–163.
Eisenberg, N., Hofer, C., & Vaughan, J. (2007). Effortful control and its socio-emotional consequences. In J. J. Gross (Ed.), Handbook of emotion regulation (pp. 287–306). Guilford Press.
Gross, J. J. (2014). Emotion regulation: Conceptual and empirical foundations. In J. J. Gross (Ed.), Handbook of emotion regulation (2nd ed., pp. 3–20). Guilford Press.
Hermann, E. (2022). Neural responses to positive and negative valence: How can valence influence frontal alpha asymmetry? Tilburg University.
Honda, S., Ishikawa, Y., Konno, R., Imai, E., Nomiyama, N., Sakurada, K., … Nakatani, M. (2020). Proximal Binaural Sound Can Induce Subjective Frisson. Frontiers in Psychology, 11, Article 316, 1–10. https://doi.org/10.3389/fpsyg.2020.00316
Hsu, B. W., & Wang, M. J. J. (2013). Evaluating the effectiveness of using electroencephalogram power indices to measure visual fatigue. Perceptual and Motor Skills, 116(1), 235–252. https://doi.org/10.2466/29.15.24.PMS.116.1.235-252
Hughes, S., & Kearney, G. (2015). Fear and Localisation: Emotional Fine-Tuning Utilising Multiple Source Directions. AES 56th International Conference, London, UK.
Katsigiannis, S., & Ramzan, N. (2018). DREAMER: A Database for Emotion Recognition Through EEG and ECG Signals from Wireless Low-cost Off-the-Shelf Devices. IEEE Journal of Biomedical and Health Informatics, 22(1), 98–107. https://doi.org/10.1109/JBHI.2017.2688239
Kim, J., Kim, W., & Kim, J.-T. (2015). Psycho-physiological responses of drivers to road section types and elapsed driving time on a freeway. Canadian Journal of Civil Engineering, 42, 881–888. https://doi.org/10.1139/cjce-2014-0392
Lepa, S., Weinzierl, S., Maempel, H. J., & Ungeheuer, E. (2014). Emotional impact of different forms of spatialization in everyday mediatized music listening: Placebo or technology effects? 136th Audio Engineering Society Convention (Convention Paper 9024), 141–148.
Li, Y., Cai, J., Dong, Q., Wu, L., & Chen, Q. (2020). Psychophysiological responses of young people to soundscapes in actual rural and city environments. Journal of the Audio Engineering Society, 68(12), 910–925. https://doi.org/10.17743/JAES.2020.0060
Nair, S. (2016). Reverse Engineering Emotions in an Immersive Audio Mix Format. IBC, 1–5.
Navea, R. F., & Dadios, E. (2015). Beta/Alpha power ratio and alpha asymmetry characterization of EEG signals due to musical tone stimulation. Project Einstein 2015.
Posner, J., Russell, J. A., & Peterson, B. S. (2005). The circumplex model of affect: An integrative approach to affective neuroscience, cognitive development, and psychopathology. Development and Psychopathology, 17(3), 715–734.
Shen, F., Dai, G., Lin, G., Zhang, J., Kong, W., & Zeng, H. (2020). EEG-based emotion recognition using 4D convolutional recurrent neural network. Cognitive Neurodynamics, 14(6), 815–828. https://doi.org/10.1007/s11571-020-09634
Sotgiu, A. De, Coccoli, M., & Vercelli, G. (2020). Comparing the perception of “sense of presence” between a stereo mix and a binaural mix in immersive music. 148th Audio Engineering Society Convention (Convention e-Brief 588), 1–5.
Subramanian, R., Wache, J., Abadi, M. K., Vieriu, R. L., Winkler, S., & Sebe, N. (2018). ASCERTAIN: Emotion and personality recognition using commercial sensors. IEEE Transactions on Affective Computing, 9(2), 147–160. https://doi.org/10.1109/TAFFC.2016.2625250
Suhaimi, N. S., Mountstephens, J., & Teo, J. (2020). EEG-Based Emotion Recognition: A State-of-the-Art Review of Current Trends and Opportunities. Computational Intelligence and Neuroscience, 2020, Article 8875426. https://doi.org/10.1155/2020/8875426
Yang, Y., Wu, Q., Qiu, M., Wang, Y., & Chen, X. (2018). Emotion Recognition from Multi-Channel EEG through Parallel Convolutional Recurrent Neural Network. In 2018 International Joint Conference on Neural Networks (IJCNN) (pp. 1–7). https://doi.org/10.1109/IJCNN.2018.8489331
type_driver info:eu-repo/semantics/article
type_coar http://purl.org/coar/resource_type/c_6501
type_version info:eu-repo/semantics/publishedVersion
type_coarversion http://purl.org/coar/version/c_970fb48d4fbd8a85
type_content Text
publishDate 2024-09-03
date_accessioned 2024-09-03T00:00:00Z
date_available 2024-09-03T00:00:00Z
url https://revistas.usb.edu.co/index.php/IJPR/article/view/7467
url_doi https://doi.org/10.21500/20112084.7467
issn 2011-2084
eissn 2011-7922
doi 10.21500/20112084.7467
citationstartpage 113
citationendpage 118
url2_str_mv https://revistas.usb.edu.co/index.php/IJPR/article/download/7467/5567
_version_ 1832800480288309248