Computers speak a language of their own and can only be programmed by those who know the code. Computer scientists at Karlsruhe Institute of Technology (KIT) are working on software that translates natural language directly into machine-readable source code. In this way, users could create their own computer applications in just a few sentences.
The challenge is that people do not always describe processes in strictly chronological order. A new analysis tool developed by the KIT researchers automatically arranges the commands in the order in which the computer is to execute them.
"We want to get away from complicated rules for users -- which is what programming languages are -- towards smart computers that enter into a dialog with us," says Mathias Landhäußer, a scientist at KIT's Institute for Program Structures and Data Organization (ITP). So far, programs can be controlled by speech only if the manufacturer has designed them accordingly; sending short messages on a smartphone is one example. The KIT computer scientists are working on software that adds a speech interface to any kind of program, so that users can not only open but also operate their apps by spoken command. The scientists have already incorporated such an interface into an application that controls the heating, lighting, and windows of smart homes.
"It will take some time before complex software can not only be operated but also programmed in natural language," Landhäußer says. A central communication problem between man and machine -- the problem of order -- has now been solved by the scientists, using English as a first example. "Let's have a look at the sentence 'Before the car starts, the garage door opens.' In everyday language, this description is perfectly normal," Landhäußer says. If the process is to take place in a virtual world on the computer, however, a problem arises: the computer executes commands one after another, in the order they arrive. In the example, the computer first receives the information "the car starts" and only then "the garage door opens." Hence, the car would hit the garage door. "If such a chain of actions is not envisaged by the program, nothing happens in the best case. In the worst case, the computer crashes," the computer scientist says.

The new software developed by the KIT scientists analyzes temporal signal words indicating that the spoken text does not describe the process in strictly linear order. Such signal words show whether something takes place "before" or "after," "first" or "last," irrespective of where the information appears in a sentence. The computer scientists assign logical formulas to these words in order to produce a chronological order in the source code. Applied to the example above, the formula for the signal word "before" automatically moves the main clause to the front. The result: the garage door opens before the car starts.
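The article does not disclose how the KIT tool is implemented, but the reordering idea it describes can be illustrated with a minimal sketch: detect a temporal signal word ("before" or "after") and swap the clauses into execution order. The function name and the simple pattern matching below are assumptions for illustration only; a real system would use full linguistic analysis.

```python
import re

def order_commands(sentence: str) -> list[str]:
    """Return the clauses of a sentence in execution order.

    Sketch only: handles sentences of the form
    "Before/After <subordinate clause>, <main clause>."
    """
    s = sentence.strip().rstrip(".")
    m = re.match(r"(?i)^(before|after)\s+(.+?),\s*(.+)$", s)
    if not m:
        # No leading signal word: assume spoken order equals execution order.
        return [s]
    signal, subordinate, main = m.groups()
    if signal.lower() == "before":
        # "Before A, B" means B happens first, then A.
        return [main, subordinate]
    # "After A, B" means A happens first, then B.
    return [subordinate, main]

print(order_commands("Before the car starts, the garage door opens."))
# -> ['the garage door opens', 'the car starts']
```

With this ordering, the garage door opens before the car starts, matching the chronology the speaker intended rather than the order in which the clauses were spoken.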
According to the researchers, requiring users to speak in a computer-tailored way is not a reliable alternative. Initial tests show that subjects with and without programming knowledge do not speak in strictly chronological order, even when asked to do so; instead, they keep using signal words unconsciously. "It is our objective that the computer adapts to the way the user speaks, and not the other way around," Landhäußer says.

Apart from the order problem, the scientists have identified further challenges for programming in natural language. The test subjects replaced some words with synonyms or pronouns. Computers do not automatically understand that the term "car" means the same as "vehicle" or "it" in a following sentence. "People understand these relationships because the situation flashes like a film before their inner eye. We are working on giving compu
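The synonym and pronoun problem mentioned above can be sketched in a few lines. The alias table and function below are purely illustrative assumptions, not the researchers' method: a real system would need genuine coreference resolution rather than a fixed lookup.

```python
# Hypothetical alias table mapping synonyms and pronouns to a canonical
# entity name. Assumes, for this example, that "it" refers to the car.
ALIASES = {
    "vehicle": "car",
    "it": "car",
}

def normalize(word: str) -> str:
    """Map a synonym or pronoun to its canonical entity name."""
    return ALIASES.get(word.lower(), word.lower())

for w in ("car", "vehicle", "it"):
    print(w, "->", normalize(w))
# car -> car
# vehicle -> car
# it -> car
```

The hard part, which this sketch sidesteps, is building such a mapping on the fly from context, which is exactly what human listeners do effortlessly.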