
Understanding Human Emotional Models in Robotics
Explore the role of Affective Computing and AI in developing human emotional models for robots. Learn about emotion recognition, processing, and expression, along with a framework for emotional understanding in AI interactions.
Presentation Transcript
Dr. SNS Rajalakshmi College of Arts & Science (Autonomous), Coimbatore - 641049
Accredited by NAAC (Cycle III) with A+ Grade
(Recognized by UGC, Approved by AICTE, New Delhi and Affiliated to Bharathiar University, Coimbatore)
Department of B.Sc CS (Artificial Intelligence & Data Science)
Intelligent Systems and Robotics
Course code: 20UAI801
Mr. S. Sanjai, Student, Department of Computer Science (AI&DS)
HUMAN EMOTIONAL MODEL
The human emotional model in robotics is based on Affective Computing, which enables robots to:
- Detect human emotions (via facial expressions, voice tone, and body language)
- Process and understand emotions (using AI, deep learning, and psychology-based models)
- Respond appropriately (through speech, gestures, and facial expressions)
Examples: Humanoid robots like Sophia can recognize emotions and respond with facial expressions, and AI assistants like Alexa or Siri detect voice tone to adjust their responses.
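As a rough, hypothetical sketch of the detect-process-respond loop described above (not code from any actual robot platform), the Python fragment below maps a detected emotion label to a spoken response. The detect_emotion stub and the RESPONSES table are illustrative assumptions.

# Minimal sketch of the detect -> process -> respond loop.
# detect_emotion() is a hypothetical stand-in for a real recognizer
# (facial expression, voice tone, or body-language analysis).
def detect_emotion(observation: str) -> str:
    if "smile" in observation:
        return "happy"
    if "frown" in observation:
        return "sad"
    return "neutral"

# Hypothetical response table: emotion label -> what the robot says.
RESPONSES = {
    "happy": "That's great to hear!",
    "sad": "I'm sorry to hear that. Can I help?",
    "neutral": "Tell me more.",
}

def respond(observation: str) -> str:
    emotion = detect_emotion(observation)   # 1. detect
    return RESPONSES[emotion]                # 2. process and 3. respond

print(respond("the user greets the robot with a smile"))   # -> That's great to hear!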
Components of the Emotional Model in Robotics
- Emotion Recognition: Facial Recognition, Speech & Voice Analysis, Physiological Sensors, Sentiment Analysis
- Emotion Processing (AI & Deep Learning): Cognitive Architectures, Psychological Models
- Emotion Expression (Robot Responses): Facial Expressions, Speech Modulation, Gestures & Body Language
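For reference, the same three components can be written down as a simple Python dictionary. This merely restates the diagram above; the component and technique names come from the slide, while the structure itself is only an illustration.

# The emotional model's components, restated as a dictionary.
EMOTIONAL_MODEL = {
    "Emotion Recognition": [
        "Facial Recognition", "Speech & Voice Analysis",
        "Physiological Sensors", "Sentiment Analysis",
    ],
    "Emotion Processing (AI & Deep Learning)": [
        "Cognitive Architectures", "Psychological Models",
    ],
    "Emotion Expression (Robot Responses)": [
        "Facial Expressions", "Speech Modulation", "Gestures & Body Language",
    ],
}

for component, techniques in EMOTIONAL_MODEL.items():
    print(component + ":", ", ".join(techniques))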
THE HUMAN EMOTIONAL MODEL: A Framework for AI's Emotional Understanding
The robot also needs a model of the human it is talking to, so it can make different responses based on how the human is feeling. We model four emotions for the robot to use in formulating its responses, arranged along two axes: happy/sad and welcoming/distant. We can put emotion tags into our patterns in the script file with [happy], [sad], [welcome], or [distant] to mark the emotions of responses.
Emotion Model
For example, if we are not getting answers to our questions, we can mark that response with [distant] to note that our subject is not being cooperative:
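The original slide shows this example as an image that is not reproduced in the transcript. The fragment below is only a hypothetical sketch of what a [distant]-tagged script line and a simple tag extractor might look like; the response wording and the regex-based parsing are assumptions, not the course's actual script format.

import re

# Hypothetical script line: the response text carries its emotion tag.
response_line = "You have not answered my last few questions. [distant]"

# Pull the emotion tag ([happy], [sad], [welcome], or [distant]) off the response.
TAG = re.compile(r"\[(happy|sad|welcome|distant)\]")

match = TAG.search(response_line)
emotion_tag = match.group(1) if match else None           # -> "distant"
spoken_text = TAG.sub("", response_line).strip()          # text the robot says

print(emotion_tag, "|", spoken_text)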
Explanation
- Our human emotion model uses a Python dictionary data structure to hold the model.
- We have two axes: the Happy/Sad axis and the Welcome/Distant axis.
- We move the Happy/Sad index up or down based on responses.
- If we think a response expresses happy thoughts ("Do you like school?" answered with "Yes"), the program moves the emotion index up in the happy direction.
- We use the intersection of these axes to set the current emotional index.
- If the human is near the center, we note this as neutral, our starting point:
Sample Program

class HumanEmotionEngine():
    def __init__(self):
        self.emostate = [90, 0]          # current emotional state vector (starts neutral)
        self.emoText = "neutral 50"      # text label for the current state
        # Index for each emotion; all start at the neutral midpoint of 50
        self.emotions = {"happy": 50, "sad": 50, "welcome": 50, "distant": 50}
        # Opposing emotion pairs on the two axes (happy/sad and welcome/distant)
        self.emotBalance = {"happy": "sad", "welcome": "distant"}
        # Angle, in degrees, assigned to each emotion on the two-axis plane
        self.emotionAxis = {'distant': 315, 'welcome': 135, 'sad': 225, 'happy': 45}
        self.update()                    # combine the indices (update() not shown on this slide)
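The slide stops at __init__, so the update() method it calls is not shown. Purely as a sketch, and assuming the angles in emotionAxis are meant to place the four emotions on a plane, update() could sum each index as a vector at its assigned angle and report neutral while the result stays near the center. The adjust() helper and all numeric thresholds below are likewise assumptions for illustration, not the course's actual code.

import math

class HumanEmotionEngine():
    def __init__(self):
        self.emostate = [90, 0]
        self.emoText = "neutral 50"
        self.emotions = {"happy": 50, "sad": 50, "welcome": 50, "distant": 50}
        self.emotBalance = {"happy": "sad", "welcome": "distant"}
        self.emotionAxis = {'distant': 315, 'welcome': 135, 'sad': 225, 'happy': 45}
        self.update()

    def adjust(self, emotion, amount=10):
        # Hypothetical helper: a tagged response (e.g. [distant]) raises that
        # emotion's index, lowers its opposite, and refreshes the state.
        opposite = self.emotBalance.get(emotion) or \
                   {v: k for k, v in self.emotBalance.items()}[emotion]
        self.emotions[emotion] = min(100, self.emotions[emotion] + amount)
        self.emotions[opposite] = max(0, self.emotions[opposite] - amount)
        self.update()

    def update(self):
        # Sum every emotion index as a vector at its assigned angle.
        x = sum(v * math.cos(math.radians(self.emotionAxis[e]))
                for e, v in self.emotions.items())
        y = sum(v * math.sin(math.radians(self.emotionAxis[e]))
                for e, v in self.emotions.items())
        angle = math.degrees(math.atan2(y, x)) % 360
        strength = math.hypot(x, y)
        self.emostate = [round(angle), round(strength)]
        if strength < 15:
            # Near the center counts as neutral, our starting point.
            self.emoText = "neutral %d" % round(strength)
        else:
            nearest = min(self.emotionAxis, key=lambda e: min(
                abs(self.emotionAxis[e] - angle),
                360 - abs(self.emotionAxis[e] - angle)))
            self.emoText = "%s %d" % (nearest, round(strength))

engine = HumanEmotionEngine()
engine.adjust("happy")      # e.g. "Do you like school?" answered with "Yes"
print(engine.emoText)       # -> "happy 20" with these assumed numbers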
Thank You All
Do you have any questions about this topic? Please ask, and if I know the answer I will clear your doubts.