Research Group of Meta-Synthesis and Knowledge Science, Academy of Mathematics and Systems Science, CAS
Cognitive Robotics: Baby steps to learnin...
Speaker: Professor David Powers (Flinders University)    Venue: Room N514, South Building, CAS    Date: 2014-12-12

Abstract: 

Turing in 1950 proposed his imitation game as a test of intelligence or thinking, a test that is primarily about language but indirectly about ontology, or understanding the world. Turing said that he expected to win the game 30% of the time within 50 years by "buying the best sensors money could buy" and making the computer learn. Turing himself developed one of the earliest self-organizing models, one very similar to later models by von der Malsburg and Kohonen. The key here is that learning language is as much about learning about the world, and how we interact successfully with it, as about learning linguistic knowledge. Piaget in the 1930s and Block in the 1960s emphasized the functional aspects of this: the things we can do or use become building blocks for more complex purposes, including as building blocks of language - the sensory-motor world is where we ground our nouns and verbs.
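The self-organizing models mentioned above share one core mechanism: units compete for each input, and the winning unit and its lattice neighbours are pulled toward the stimulus, so nearby units come to represent similar inputs. A minimal sketch of a Kohonen-style 1-D map in Python illustrates the idea; all parameter names and values here are illustrative assumptions, not drawn from Turing's or Kohonen's original formulations.

```python
import numpy as np

def train_som(data, n_units=10, n_iters=500, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 1-D Kohonen-style self-organizing map on 2-D inputs.

    Returns learned weight vectors (n_units x dim). Parameter defaults
    are illustrative choices for this toy example.
    """
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    weights = rng.random((n_units, dim))
    for t in range(n_iters):
        x = data[rng.integers(len(data))]                     # random input
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
        lr = lr0 * (1 - t / n_iters)                          # decaying learning rate
        sigma = sigma0 * (1 - t / n_iters) + 0.5              # shrinking neighbourhood
        dist = np.abs(np.arange(n_units) - bmu)               # lattice distance to BMU
        h = np.exp(-(dist ** 2) / (2 * sigma ** 2))           # Gaussian neighbourhood
        weights += lr * h[:, None] * (x - weights)            # pull units toward x
    return weights

# Toy data: two well-separated clusters in the plane.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.0, 0.05, (50, 2)),
                  rng.normal(1.0, 0.05, (50, 2))])
w = train_som(data)
```

After training, units in the map settle near the cluster centres, with neighbouring units covering neighbouring regions of input space - the same topology-preserving behaviour that makes such maps attractive as models of sensory self-organization.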

Powers in the 1980s emphasized that language is negotiated more than learned, and that children and robots need to learn a sensory-motor ontology, which Harnad in the 1990s characterized as the symbol grounding problem. At the same time, Cognitive Linguistics emerged, emphasizing that the nature of language reflects the nature of the world, often using metaphor to describe the relationship - but Powers argues that it is more than a metaphor and is key to the nature of human learning. Luc Steels in the 1990s approached this question from an evolutionary perspective, looking at how robots could evolve a primitive language from scratch. The message here is that our brain and our learning are targeted at dealing with and surviving in our environment, and that interpreting our social, cultural and linguistic environment is just part of this.

Powers' STANLIE system in 1984 operated in a toy robot world and learned about the world (semantics) using exactly the same algorithms it used for learning grammar (syntax). It learned grammatical information first, and classes such as nouns and verbs then emerged. By 1992, eight levels of self-organization were apparent within a text-based system, and in 1993 self-organization of phonological information from speech vectors was demonstrated using the same algorithm. In the late 1990s a robot baby was developed, and in the 2000s a series of Talking, Thinking and Teaching Heads followed, exploiting a mix of real-world and virtual-world grounding. The interesting point is that by the 2010s we have come full circle: the Teaching Heads now perform useful tasks in real-world applications as varied as education, health and defence, and they are the teachers rather than the learners, having spawned a startup company, Clevertar, that puts computerized companions on your Android, iPhone or iPad.

