Researcher Lijun Yin, an associate professor of computer science at Binghamton University, has been developing a new universal algorithm that creates a digital reading of the human face to identify specific human emotions. “Many scientists are working on similar issues; we are unique in our three-dimensional approach,” Yin stated.
The Pipe Dream website reported that the research has been funded by sources including the National Science Foundation, the Air Force Research Laboratory and the New York State Office of Science, Technology and Academic Research.
Yin said, “We have developed several different things in the past few years regarding a camera’s ability to analyze the human face. Working with [BU associate professor of psychology] Peter Gerhardstein, we are trying to get the computer to distinguish between basic human emotions consistently and reliably.”
He suggests that applications of such technology are widespread, ranging from cameras in hospitals that measure patients’ pain levels to operating a computer without a mouse or keyboard.
According to Yin, “Even humans have significant difficulty visually identifying the emotional state of their counterparts quickly and effectively.” He went on to note the limitations of this type of software: “Different people display emotions differently, and a computer can only understand a certain pattern.”