Effective caregiving for infants requires the precise and prompt identification of their emotional states, which are conveyed primarily through facial expressions and cries. Caregivers often face substantial difficulty interpreting these emotions, which can delay responses to an infant's needs. This research develops an AI-based model that identifies and classifies newborns' emotions from their facial expressions, using a Convolutional Neural Network (CNN) architecture (Inception V3) trained on a dataset of baby facial expressions. The system integrates multiple sensors to collect multi-modal data, improving cry emotion classification. The CNN model assesses the newborn's emotional state, while sensor inputs combined with fuzzy logic identify potential causes of distress, such as hunger or discomfort. The classification report for the infant emotion recognition model shows an overall test accuracy of 0.85, and the precision, recall, and F1 scores for the happy, neutral, and sad classes indicate the model's effectiveness, particularly in identifying happy and sad emotions. These results demonstrate the model's potential for real-time caregiving applications, providing timely and accurate emotional assessments to improve the quality of infant care.
Keywords: Artificial Intelligence; Emotions; Infant; Facial Expressions
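The fuzzy-logic stage mentioned in the abstract maps sensor readings to degrees of candidate distress causes. A minimal sketch of that idea is shown below; the sensor variables (`hours_since_feed`, `skin_temp_c`), the triangular membership breakpoints, and the two rules are illustrative assumptions, not the system's actual configuration.

```python
# Minimal sketch of fuzzy inference over sensor inputs.
# All variable names, thresholds, and rules are hypothetical.

def tri(x, a, b, c):
    """Triangular membership function: rises from a to b, falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def distress_causes(hours_since_feed, skin_temp_c):
    """Return fuzzy membership degrees for two candidate distress causes."""
    hunger = tri(hours_since_feed, 1.0, 3.0, 5.0)    # assumed to peak ~3 h after feeding
    discomfort = tri(skin_temp_c, 36.5, 38.0, 39.5)  # assumed to peak at mild fever
    return {"hunger": round(hunger, 2), "discomfort": round(discomfort, 2)}

print(distress_causes(3.0, 36.0))  # strong hunger signal, no thermal discomfort
```

In a fuller system, such degrees would feed a fuzzy rule base and a defuzzification step, combined with the CNN's emotion output, to rank likely causes of distress.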