Hatice Gunes is a Turkish computer scientist who is Professor of Affective Intelligence & Robotics at the University of Cambridge, where she leads the Affective Intelligence & Robotics Lab. Her research considers human–robot interaction and the development of emotionally intelligent technologies.

Hatice Gunes
Nationality: Turkish
Alma mater: University of Technology Sydney; Yıldız Technical University
Scientific career
Institutions: University of Cambridge; Queen Mary University of London; Alan Turing Institute
Thesis: Vision-based multimodal analysis of affective face and upper-body behaviour (2007)

Early life and education

Gunes was an undergraduate student at Yıldız Technical University.[citation needed] She moved to the University of Technology Sydney for her doctoral research, where she was awarded an Australian Government International Postgraduate Research Scholarship (IPRS) to work on vision- and machine-learning-based analysis of affective face and upper-body behaviour.[1][2] Her doctoral research showed that affective face and body displays are simultaneous but not strictly synchronous; that explicit detection of temporal phases (onset–apex–offset) can improve the accuracy of affect recognition; that recognition from fused face and body modalities performs better than recognition from either modality alone; and that synchronized feature-level fusion achieves better performance than decision-level fusion.[3] She created the Bimodal Face and Body Gesture Database (FABO), a collection of labelled videos of posed affective face and body displays for the automatic analysis of human nonverbal affective behavior.[4] After earning her doctorate, she was appointed an Australian Research Council postdoctoral fellow and worked on airport and railway security through object and human tracking.[citation needed] In 2008, Gunes moved to Imperial College London, where she worked alongside Maja Pantić in the Intelligent Behaviour Understanding Group (iBUG).[5][6] The project aimed to build a dialogue system that could interact with humans via a virtual character.[7]

Research and career

In 2011, Gunes was appointed a lecturer at Queen Mary University of London, where she became an associate professor in 2014.[citation needed] She moved to the University of Cambridge in 2016 and was later promoted to Professor of Affective Intelligence and Robotics. In 2019, she was awarded an Engineering and Physical Sciences Research Council fellowship and was appointed a Faculty Fellow of the Alan Turing Institute.[8] Her fellowship considered human–robot interaction and the development of robot emotional intelligence through the study of human–human interactions.[9][10] She investigated the relationships between humans and their companion robots and sought to design robots with enhanced socio-emotional skills.[9]

Gunes was appointed President of the Association for the Advancement of Affective Computing in 2017. She is interested in how technologies such as affective virtual reality and autonomous and tele-presence social robots can enhance a sense of wellbeing.[11]

Selected publications

  • Evangelos Sariyanidi; Hatice Gunes; Andrea Cavallaro (1 June 2015). "Automatic Analysis of Facial Affect: A Survey of Registration, Representation, and Recognition". IEEE Transactions on Pattern Analysis and Machine Intelligence. 37 (6): 1113–1133. doi:10.1109/TPAMI.2014.2366127. ISSN 0162-8828. PMID 26357337. Wikidata Q38584262.
  • Hatice Gunes; Massimo Piccardi (12 August 2008). "Automatic temporal segment detection and affect recognition from face and body display". IEEE Transactions on Systems, Man, and Cybernetics. 39 (1): 64–84. doi:10.1109/TSMCB.2008.927269. ISSN 1083-4419. PMID 19068431. Wikidata Q51860563.
  • Nicolaou, M. A.; Gunes, H.; Pantic, M. (2011). "Continuous Prediction of Spontaneous Affect from Multiple Cues and Modalities in Valence-Arousal Space". IEEE Transactions on Affective Computing. 2 (2): 92–105. doi:10.1109/t-affc.2011.9. ISSN 1949-3045. S2CID 9296806.

References

  1. ^ Gunes, Hatice (2007). Vision-based multimodal analysis of affective face and upper-body behaviour (Thesis). OCLC 271213807.
  2. ^ Rossi, Alessandra; Holthaus, Patrick; Moros, Sílvia; Scheunemann, Marcus; Lakatos, Gabriella. "Programme". SCRITA 2021. Retrieved 2022-06-24.
  3. ^ Gunes, H.; Piccardi, M. (2009). "Automatic Temporal Segment Detection and Affect Recognition from Face and Body Display". IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics. 39 (1): 64–84. doi:10.1109/TSMCB.2008.927269. PMID 19068431. S2CID 1921438. Retrieved 2022-06-24.
  4. ^ Gunes, H.; Piccardi, M. (August 2006). "A Bimodal Face and Body Gesture Database for Automatic Analysis of Human Nonverbal Affective Behavior". 18th International Conference on Pattern Recognition (ICPR'06). Vol. 1. pp. 1148–1153. doi:10.1109/ICPR.2006.39. hdl:10453/2540. ISBN 0-7695-2521-0. S2CID 117447.
  5. ^ "i·bug - people". ibug.doc.ic.ac.uk. Retrieved 2022-06-24.
  6. ^ "i·bug - people - Hatice Gunes". ibug.doc.ic.ac.uk. Retrieved 2022-06-24.
  7. ^ "Hatice Gunes's Home Page at Cambridge: Research". www.cl.cam.ac.uk. Retrieved 2022-07-29.
  8. ^ Gunes, Hatice (2019-12-18). "Prof Hatice Gunes". www.cst.cam.ac.uk. Retrieved 2022-06-24.
  9. ^ a b "Affective Mechanisms for Modelling Lifelong Human-Robot Relationships". 2018.
  10. ^ Gunes, Hatice. "Data-driven Artificial Social Intelligence: From Social Appropriateness to Fairness". Retrieved 2022-06-24.
  11. ^ "Advancing Wellbeing Seminar Series: Hatice Gunes". MIT Media Lab. Retrieved 2022-06-24.