Natural Language in Multimodal Human-Computer Interface
April 1994 (vol. 9 no. 2)
pp. 40-44

Two prototype information-access applications show how the integration of natural language and hypermedia produces systems that allow users and systems to exchange more information. The author considers how the Natural Language Group designed the ALFresco prototype to provide information on 14th-century Italian frescoes and monuments and to suggest other masterpieces that might interest the user. He also discusses the MAIA project which integrates components developed by IRST researchers in speech recognition, natural language, knowledge representation, vision, reasoning, and other areas of AI.

Citation:
Oliviero Stock, "Natural Language in Multimodal Human-Computer Interface," IEEE Intelligent Systems, vol. 9, no. 2, pp. 40-44, April 1994, doi:10.1109/64.294134