ECU scientist talks to computers

Better watch your language! An East Carolina University scientist wants computers to do a better job of responding to spoken words and has received a research award to iron out some of the wrinkles that go along with talking to machines.

In the research, Dr. Ronnie W. Smith, an ECU computer science professor, will examine some of the reasons for miscommunication among people and between humans and computers. The project is supported by the National Science Foundation (NSF). Computers are already talking to people, but communication is limited mostly to switchboard operations, with machines responding to callers and to signals from telephone buttons. Smith believes that if computers were actually taking the calls and responding to spoken questions and requests, people would be better served. “It’s a matter of trying to make all this wonderful information that we have more readily available to the average person who may or may not be computer proficient or even have the resources to have a computer,” he said. “The medium of the telephone and voice and natural language is something that most people can do.”

The project is software-oriented. Smith said he will purchase the best available hardware to handle the speech side of the problem and will focus his attention on knowledge modeling and the underlying algorithms that control the way the computer responds. The NSF grant will provide about $45,000 a year over a three-year period. Part of the money will go toward part-time employment for two graduate students enrolled in ECU’s computer science master’s degree program.

An issue underlying the whole project, according to Smith, is the question of why people and machines should need to talk to each other at all. The focus, he said, will be on the need to accomplish a particular task: a person wants to do something, or is in the process of doing something, and uses the computer to get the job done. As an example, he said people call help lines all the time for assistance with various problems and usually end up waiting long periods before talking to someone who can help them. The problem is the shortage of people available to operate the help lines.

If computers could do a better job of answering questions and responding to help-line requests, people would be better served. The challenge is making sure that the computer understands the information it gets from a person. In his study, Smith hopes to make improvements in dialog processing software that he helped to develop at Duke University in 1991.

“The big thing to me that was not resolved (in the Duke study) was that the computer did not handle misunderstanding very well,” said Smith. At the time, computers had a vocabulary of 125 words that were recognized through pre-recorded sound wave patterns. The words were often misunderstood because of differences in human dialects and variations in vocal tone.

The Duke project led to the development of the Natural Language Dialog System. In addition to writing the software code, the project involved a speech-based computer system that gives directions to a person trying to fix misplaced wires in an electronic circuit. In the Duke experiment, one person described to the computer the placement of wires on an electronic circuit board, while another person familiar with the system stayed in the room and told the first when the computer had misunderstood a statement or question.