A singing robot has been programmed to sing better than any robot before it. How, you ask? Unlike last year’s singing robot model, the new and improved HRP-4C Humanoid actually sings songs in its own voice, using a human singer as its model.
The Wall Street Journal reported that the significance of this new female robot singer is that she analyzes a song, breaks it down into its individual notes, and then reproduces it in her own natural-sounding voice. The Cybernetic Human HRP-4C also simulates facial expressions and pauses for breaths while singing, which makes for a very interesting and realistic display.
The humanoid robot software was developed by Japan’s National Institute of Advanced Industrial Science and Technology. “We hope the entertainment industry will be able to make widespread use of robots,” says Masataka Goto, a researcher at the institute.
The HRP-4C has been programmed with new technology called VocaWatcher, which allows her to take breaths and pauses as human singers do during performances. She also mimics the gestures of human singers, replicating the squinting of the eyes, the opening of the mouth, and the tilting of the head. Eight motors control her facial expressions, which, according to Technovelgy.com, have been designed to include two of the seven universal facial expressions: anger and surprise. Mr. Goto insists that this is one of the many steps humanoids must take in order to mimic humans in a natural way.
So why was HRP-4C created? Well, simply to produce a more realistic-looking, acting, and emoting robot using the newest technological advances available.
Robots are being used in a wide variety of ways. Timothy Bickmore, a computer scientist at Northeastern University, is using new technology to create a robot that will serve as a virtual companion and caretaker for the elderly. Realistic robots that are virtually indistinguishable from a real person would be of great benefit to many people with special needs. People’s lives could be affected in a more positive way if they could form emotional connections with a virtual human without knowing it was a robot. Bickmore notes that elderly people who live in isolation have three times the chance of dying over a five-year period compared with those who live active lives.
In an earlier blog post about emoting robots, we noted that robots are being created to befriend autistic and special-needs children by reading and responding to their emotions.
The HRP-4C debuted last year but has since been greatly improved. Her gestures are more naturalistic, and she can now learn songs on her own, whereas last year engineers had to program her singing. It takes the robot anywhere from about 10 minutes to an hour to learn a new song.
Take a look at the following video. What do you think? Is this new humanoid’s singing really that realistic and impressive? Perhaps in several years we will no longer be able to tell the facial expressions of a robot from those of a human.
Photo courtesy of the National Institute of Advanced Industrial Science and Technology website:
http://www.aist.go.jp/aist_j/press_release/pr2009/pr20090316/pr20090316.html
I think this thing is really cool. From what I gather, these are only about $250–300k. That’s an amazing price for something this advanced so early in an emerging market. If I were of means, I would look into what other capabilities these things have and maybe even consider buying one, depending on what all it can do, just because I’ve always loved robots. From as much as I can hear in this video (there aren’t many others of this version around yet), it’s a pretty decent singer for a robot. I would love to play with this technology and see what it can do!
But I find it really odd that the first emotions they chose for it to display are surprise and anger. I gather that means they wanted to keep the internal mechanisms simple in the beginning, so they focused on mouth opening, which is used for singing, and eyebrow raising/lowering, which lets it emote surprise and anger to some degree. I would think happiness would be the next logical step on the drawing board, and then sadness.
I would like to see what this robot looks like with some clothes on, or in different colors. I suspect it’s the color contrast that makes her hands look so disproportionately large, since she’s supposed to be built from a database of average Japanese female measurements.
I’d actually kind of like to work on a way to simulate more of the facial musculature to make it move more realistically. I have some interesting ideas on how to do it; I wonder if I could get involved somehow. Hmm… time to save my pennies and see if I can experiment on my own to build a resume. 🙂