Cheek to Chip: Dancing Robots and AI's Future
March/April 2008 (vol. 23 no. 2)
pp. 74-84
Jean-Julien Aucouturier, University of Tokyo
Katsushi Ikeuchi, University of Tokyo
Hirohisa Hirukawa, National Institute of Advanced Industrial Science and Technology
Shin'ichiro Nakaoka, National Institute of Advanced Industrial Science and Technology
Takaaki Shiratori, University of Tokyo
Shunsuke Kudoh, University of Tokyo
Fumio Kanehiro, National Institute of Advanced Industrial Science and Technology
Tetsuya Ogata, Kyoto University
Hideki Kozima, National Institute of Information and Communications Technology
Hiroshi G. Okuno, Kyoto University
Marek P. Michalowski, Carnegie Mellon University
Yuta Ogai, University of Tokyo
Takashi Ikegami, University of Tokyo
Kazuhiro Kosuge, Tohoku University
Takahiro Takeda, Tohoku University
Yasuhisa Hirata, Tohoku University
More and more AI researchers are trying to make robots dance to music. This installment of Trends and Controversies (T&C) features five essays showing how this research addresses issues that are central to dance.

Index Terms:
robotics, autonomous behavior, chaotic itinerancy, Dance Partner Robot, humanoid robots, intermodal mapping, Keepon, neural mapping, rhythmic intelligence, situated knowledge, social intelligence, symbol grounding, synesthesia, task models
Citation:
Jean-Julien Aucouturier, Katsushi Ikeuchi, Hirohisa Hirukawa, Shin'ichiro Nakaoka, Takaaki Shiratori, Shunsuke Kudoh, Fumio Kanehiro, Tetsuya Ogata, Hideki Kozima, Hiroshi G. Okuno, Marek P. Michalowski, Yuta Ogai, Takashi Ikegami, Kazuhiro Kosuge, Takahiro Takeda, Yasuhisa Hirata, "Cheek to Chip: Dancing Robots and AI's Future," IEEE Intelligent Systems, vol. 23, no. 2, pp. 74-84, March-April 2008, doi:10.1109/MIS.2008.22