Presenting Author

Nayarah Shabir

Presentation Type

Oral Presentation

Discipline Track

Biomedical ENGR/Technology/Computation

Abstract Type

Research/Clinical

Abstract

Background: Emotion refers to a person's mental state, as it relates to their thoughts, feelings, and actions. There is considerable evidence that health affects emotion; the nature of a person's emotions should therefore reveal something about their health. Emotions are expressed through facial expressions, which are controlled by muscular motor actions. Brain health may affect the functioning of these muscles, producing emotional changes that can be extracted from facial images.

Methods: A dataset of facial images annotated with matching emotion labels is the first step in applying convolutional neural networks (CNNs) to facial expression emotion recognition. A CNN architecture is then selected, with particular layers intended for feature extraction and a final layer for classification. The model is trained on the labelled images, and its hyperparameters are tuned to maximize performance. It is then evaluated on an independent test set to measure its accuracy. This supports a more sophisticated understanding of the emotions linked with underlying diseases, such as cancer.
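The pipeline described above, feature-extracting convolutional layers followed by a final classification layer, can be sketched as a toy forward pass in plain NumPy. This is an illustrative sketch only, not the study's implementation: the 48x48 input size, the four-emotion label set, and the single random filter are all assumptions chosen for clarity; a real system would use a deep-learning framework and a labelled facial-image dataset.

```python
import numpy as np

# Hypothetical label set for illustration (not the study's actual labels).
EMOTIONS = ["angry", "happy", "neutral", "sad"]

def conv2d(image, kernel):
    """Valid 2-D convolution: the feature-extraction step of a CNN layer."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(z):
    """Convert final-layer scores into class probabilities."""
    e = np.exp(z - z.max())
    return e / e.sum()

def classify(image, kernel, weights, bias):
    """Feature extraction (convolution + ReLU + global average pooling)
    followed by a dense softmax layer, mirroring the Methods description."""
    features = np.maximum(conv2d(image, kernel), 0.0)  # ReLU activation
    pooled = features.mean()                           # global average pool
    return softmax(weights * pooled + bias)            # emotion probabilities

rng = np.random.default_rng(0)
image = rng.random((48, 48))                  # a grayscale face crop (assumed size)
kernel = rng.standard_normal((3, 3))          # one 3x3 filter (would be learned)
weights = rng.standard_normal(len(EMOTIONS))  # final-layer weights (would be learned)
bias = np.zeros(len(EMOTIONS))

probs = classify(image, kernel, weights, bias)
print("predicted:", EMOTIONS[int(np.argmax(probs))])
```

In a trained network the filter and dense weights come from gradient descent on the labelled images, and many such filters are stacked in depth; this sketch only shows how an input image flows from pixels to a probability over emotion classes.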

Results: Our initial CNN-based examinations of facial images and the emotions identified in them suggest a clear connection between muscular actions and brain diseases, warranting further exploration for potential applications in detecting underlying brain-related illnesses. Beyond conveying a person's emotions, facial muscle activity may also contribute to early cancer detection, since such diseases can manifest in symptoms and functional limitations such as paralysis, facial weakness, and alterations in facial appearance.

Conclusion: This study introduces an innovative CNN-based approach to emotion recognition, which has proved to be a potent and successful method for facial expression emotion recognition. CNNs excel at automatically deriving hierarchical features from images, making them well suited to identifying complex patterns in facial expressions. The algorithm could also be useful for applications in clinical psychology and for extracting facial expressions linked with the early detection of symptoms of various types of cancer.

Academic/Professional Position

Graduate Student


Emotion Recognition as a Novel Indicator for Assessing Brain Health: A Machine Learning Approach
