[1] P. Desain and H. Honing, in Journal of New Music Research (1999). Computational Models of Beat Induction – The Rule-Based Approach.

[2] N. Todd, R. Cousins, and C. Lee, in Empirical Musicology Review, 2, pp. 1-13 (2007). The Contribution of Anthropometric Factors to Individual Differences in the Perception of Rhythm.

[3] Nick Collins, in 1st-year Ph.D. report (2004). Beat Induction and Rhythm Analysis for Live Audio Processing.

[4] F. Gouyon, G. Widmer, X. Serra, and A. Flexer, in Music Perception 24(2), pp. 181-194 (2006). Acoustic Cues to Beat Induction – A Machine Learning Perspective.

[5] E. Lee, U. Enke, J. Borchers, and L. de Jong, in Proceedings of the 2007 Conference on New Interfaces for Musical Expression (NIME07), New York, USA (2007). Towards Rhythmic Analysis of Human Motion using Acceleration-Onset Times.

[6] Carlos Guedes, in Proceedings of the 2006 IEEE International Conference on Systems, Man, and Cybernetics (2006). Extracting Musically-Relevant Rhythmic Information from Dance Movements by Applying Pitch Tracking Techniques to a Video Signal.

[7] Jean-Julien Aucouturier and Yuta Ogai, in Proceedings of the 14th International Conference on Neural Information Processing (ICONIP), Kitakyushu, Japan (2007). Making a Robot Dance to Music Using Chaotic Itinerancy in a Network of FitzHugh-Nagumo Neurons.

[8] Marek P. Michalowski, Selma Sabanovic, and Hideki Kozima, in 16th IEEE International Conference on Robot & Human Interactive Communication, Jeju, Korea (2007a). A Dancing Robot for Rhythmic Social Interaction.

[9] Marek P. Michalowski and Hideki Kozima, in 16th IEEE International Conference on Robot & Human Interactive Communication, Jeju, Korea (2007b). Methodological Issues in Facilitating Rhythmic Play with Robots.

[10] M. P. Michalowski, S. Sabanovic, and P. Michel, in 15th International Symposium on Robot and Human Interactive Communication (RO-MAN), Hatfield, UK (2006). Roillo: Creating a Social Robot for Playrooms.

[11] Gil Weinberg, Scott Driscoll, and Mitchell Parry, in Proceedings of the IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN), Nashville, TN (2005). Musical Interactions with a Perceptual Robotic Percussionist.

[12] G. Weinberg and S. Driscoll, in Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (2007). The Perceptual Robotic Percussionist – New Developments in Form, Mechanics, Perception and Interaction Design.

[13] Gil Weinberg, Roberto Aimi, and Kevin Jennings, in Proceedings of the 2002 Conference on New Instruments for Musical Expression (NIME-02), Dublin, Ireland (2002). The Beatbug Network – A Rhythmic System for Interdependent Group Collaboration.

[14] A. Arsenio and P. Fitzpatrick, in Proceedings of the 2nd International Conference on Computational Intelligence, Robotics, and Autonomous Systems, Singapore (2003). Exploiting Cross-Modal Rhythm for Robot Perception of Objects.

[15] A. Arsenio and P. Fitzpatrick, in International Journal of Humanoid Robotics, 2(2), pp. 125-143 (2005). Exploiting Amodal Cues for Robot Perception.

[16] E. Sahin, in Lecture Notes in Computer Science, pp. 10-20, Berlin/Heidelberg: Springer-Verlag (2005). Swarm Robotics – From Sources of Inspiration to Domains of Application.

[17] Jacques Penders, in The Guardians Project, Sheffield Hallam University, UK (2005). Robot Swarming Applications.

[18] F. Tanaka and H. Suzuki, in Proceedings of the 2004 IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN), pp. 419-424, Kurashiki, Japan (2004). Dance Interaction with QRIO: A Case Study for Non-boring Interaction by Using an Entertainment Ensemble Model.

[19] F. Tanaka, B. Fortenberry, K. Aisaka, and J. Movellan, in Proceedings of the 4th IEEE International Conference on Development and Learning (ICDL), pp. 142-147, Osaka, Japan (2005). Plans for Developing Real-time Dance Interaction between QRIO and Toddlers in a Classroom Environment.

[20] F. Gouyon and S. Dixon, in Computer Music Journal 29(1) (2005). A Review of Automatic Rhythmic Description Systems.

[21] F. Gouyon, in Ph.D. dissertation, Universitat Pompeu Fabra, Barcelona (2005). A Computational Approach to Rhythm Description – Audio Features for the Computation of Rhythm Periodicity Functions and their Use in Tempo Induction and Music Content Processing.

[22] Simon Dixon, in Journal of New Music Research, 30(1), pp. 39-58 (2001). Automatic Extraction of Tempo and Beat from Expressive Performances.

[23] C. Guedes, in Ph.D. thesis, New York University, New York, USA (2005). Mapping Movement to Musical Rhythm – A Study in Interactive Dance.

[24] Axel G. E. Mulder, in Technical Report, NSERC Hand Centered Studies of Human Movement Project (1994). Human Movement Tracking Technology.

[25] P. Desain, in Music Perception, 9(4), pp. 439-454 (1992). A (De)composable Theory of Rhythm Perception.

[26] P. Fraisse, in Deutsch (Ed.), The Psychology of Music, pp. 149-180, Orlando, FL; London: Academic Press (1982). Rhythm and Tempo.

[27] R. Parncutt, in Music Perception, 11, pp. 409-464 (1994). A Perceptual Model of Pulse Salience and Metrical Accent in Musical Rhythm.

[28] Urs Enke, in Diploma thesis, Media Computing Group, RWTH Aachen University (2006). DanSense – Rhythmic Analysis of Dance Movements using Acceleration-Onset Times.

[29] M. Goto and Y. Muraoka, in Proceedings of the 1995 International Computer Music Conference, pp. 171-174 (1995). A Real-time Beat Tracking System for Audio Signals.

[30] H. Longuet-Higgins and C. Lee, in Perception 11(2), pp. 115-128 (1982). The Perception of Musical Rhythms.

[31] Allen and Dannenberg, in International Computer Music Conference, International Computer Music Association, pp. 140-143 (1990). Tracking Musical Beats in Real Time.

[32] D. Rosenthal, in Computer Music Journal 16(1), pp. 64-76 (1992). Emulation of Human Rhythm Perception.

[33] F. Lerdahl and R. Jackendoff, Cambridge, MA: MIT Press (1983). A Generative Theory of Tonal Music.

[34] D. J. Povel and P. J. Essens, in Music Perception, 2, pp. 411-441 (1985). Perception of Temporal Patterns.

[35] R. Rowe, in Computer Music Journal 16(1) (1992). Machine Listening and Composing with Cypher.

[36] A. Tanguiane, Springer (1993). Artificial Perception and Music Recognition.

[37] A. Schloss, in Ph.D. thesis, Stanford University (1985). On the Automatic Transcription of Percussive Music – From Acoustic Signal to High-Level Analysis.

[38] E. Scheirer, in Journal of the Acoustical Society of America, 103(1), pp. 588-601 (1998). Tempo and Beat Analysis of Acoustic Musical Signals.

[39] P. Cariani, in Journal of New Music Research (2001). Temporal Codes, Timing Nets, and Music Perception.

[40] W. A. Sethares and T. W. Staley, in Journal of New Music Research, Vol. 30, No. 2 (2001). Meter and Periodicity in Musical Performance.

[41] E. W. Large, in Proceedings of the Eighteenth Annual Conference of the Cognitive Science Society (1996). Modeling Beat Perception with a Nonlinear Oscillator.

[42] A. T. Cemgil, H. J. Kappen, P. Desain, and H. Honing, in Journal of New Music Research, 28(4), pp. 259-273 (2001). On Tempo Tracking – Tempogram Representation and Kalman Filtering.

[43] P. Desain, in Contemporary Music Review, 9, pp. 239-254 (1993). A Connectionist and a Traditional AI Quantizer, Symbolic versus Sub-Symbolic Models of Rhythm Perception.

[44] C. Lee, in Representing Musical Structure, pp. 59-127, Academic Press (1991). The Perception of Metrical Structure – Experimental Evidence and a New Model.

[45] Antonio Camurri, in 7th Int. Conference on Digital Audio Effects (DAFX-04), Naples, Italy, pp. 5-8 (2004). Multimodal Interfaces for Expressive Sound Control.