Cambridge Ideas – The Emotional Computer



Cambridge Ideas: transforming tomorrow.

Bollards! Now what, clever clogs? I love gadgets like GPS satellite navigation systems, despite the fact that they're so difficult to use they must have been designed by sadists. The problem is that computers don't react to how I feel, whether I'm pleased or annoyed; they just ignore me. No, not really, thank you. It won't go away.

My name is Peter Robinson, and I'm building emotionally intelligent computers: ones that know how I feel, ones that can read my mind. When I talk to people, I pick up emotions from things like their facial expression and their tone of voice, and I change tack accordingly. Computers are really good at understanding what we're typing, or even what we're saying, but they need to understand not just what we're saying but how we're saying it.

My facial expression is one key way of understanding me, so I'm making a computer system that can sense my feelings in the same way. A camera tracks feature points on my face and calculates head gestures and facial expressions. These are then interpreted as combinations of 400 predefined mental states, so frowning while shaking my head might indicate disagreement, but just shaking my head alone might indicate confusion.

"I really think we should go straight on."

Another system analyzes my tone of voice, the tempo, the pitch and the energy, and interprets these using the same predefined mental states.

"That looks like a really bad idea to me."

When I speak, the way that I say things is almost as important as what I say. Very stupid machine. That's it.

I also express my feelings through movement of my body. When I'm angry my gestures are big and forceful; when I'm sad they're small and gentle. My system tracks my body posture and gesture and interprets them using the same set of mental states. Combining these three measures, the computer can correctly read my mind over 70% of the time, and that's as well as most people can understand me. But that's not enough. I want the computer to respond; I want it to be emotionally expressive.

"Okay, now we're going to film a couple of your facial expressions. Can you do angry, please? Okay... now cheeky."

Now I can make an expressive face on the computer screen, but how do I bring it out into my world so that we can really communicate? I started with Virgil the chimp and found that people responded even to his expressions, but clearly we needed something that looked a bit more like a human, so we tried Elvis. And this is Charles. I've just had him built. He has cameras in his eyes so that he can monitor my expression, and there are twenty-four motors controlling muscles in his face so that he can respond expressively as well. I'm trying him out as a navigator in my driving simulator, to see if he's a bit more friendly than my GPS.

"Let's go home." "In 100 meters, turn left." "Hmm, it's crowded. Shall we try straight on?" "Good idea, let's try that." "Thank you." "My pleasure."

The way that Charles and I can communicate shows us the future of how people are going to interact with machines. Oh, Charles, I think this is the beginning of a beautiful friendship.
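The pipeline described in the video, with separate analyses of face, voice and body posture each mapped onto the same set of mental-state labels and then combined into a single judgement, is an instance of what machine-learning practitioners call late fusion. The snippet below is only a minimal sketch of that idea, not the Cambridge system's actual code; the labels, weights and per-channel scores are invented for illustration.

```python
# A rough, hypothetical sketch of the "late fusion" idea described above:
# three independent channels (face, voice, posture) each score the same
# set of mental-state labels, and a weighted average picks the winner.
# Labels, scores, and weights here are invented for illustration.
from collections import Counter

def fuse(face, voice, posture, weights=(0.5, 0.3, 0.2)):
    """Weighted average of per-channel scores over a shared label set."""
    combined = Counter()
    for weight, channel in zip(weights, (face, voice, posture)):
        for state, score in channel.items():
            combined[state] += weight * score
    return combined.most_common()  # best-matching mental state first

# Example: a frown plus a head shake dominates the face channel,
# while the voice and posture channels are more ambiguous.
face = {"disagreeing": 0.7, "confused": 0.2, "bored": 0.1}
voice = {"disagreeing": 0.4, "confused": 0.4, "interested": 0.2}
posture = {"confused": 0.5, "disagreeing": 0.3, "bored": 0.2}

print(fuse(face, voice, posture))
# approximately [('disagreeing', 0.53), ('confused', 0.32), ...]
```

Presumably each channel in the full system is a trained classifier producing scores over the shared label set; the 70% accuracy figure quoted in the video refers to the combined estimate, not to any single channel.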

35 thoughts on "Cambridge Ideas – The Emotional Computer"

  1. It's a fabulous video indeed. I love the idea of interacting emotionally with a machine like this doll man. I hope to see this invention developed and fully equipped for daily public use in the future.

  2. The whole project is very promising, as all humans interpret the world through all their senses and then link them to certain feelings and emotions.
    However, in the effort to create a humanoid that people will feel familiar with, you have entered deep inside the uncanny valley, as first observed by Masahiro Mori, negating all usefulness, since very few people would feel connected to and trust such a being.

  3. While not wishing to disparage his work, I for one do not want a computer to try to read my emotions and reflect them, because I'll know it's all fake. I want it to perform its task correctly, or account for why it has failed. To pretend to empathise with me is infantilising. Perhaps satnav is the wrong application for this research; it may find a better home in other applications, though none come to mind yet.

  4. I see everyone is very keen to give advice to the developers – hey, if you've got so many great ideas, why aren't you doing it yourself, instead of being armchair critics?

  5. Amazing project, though I doubt anybody would need a plastic doll in the car.
    I'm curious how you taught the computer to recognize your facial expressions.
    Also, did you use supervised machine learning techniques? The weak point of that approach is that it can only be applied to people who can convincingly express their emotions during the training phase.
    Overall, quite impressive.

  6. Allowing computers to read your mind is the greatest mistake you can ever make for humankind, Mr. Robinson.

  7. It's a great idea, Mr Robinson! If you ever have difficulties funding your research, you can commercialise human-size dolls "who" can say "yes" and mean it!
