References:

[1] P. Desain and H. Honing, in Journal of New Music Research (1999). Computational Models of Beat Induction – The Rule-Based Approach:
http://www.nici.kun.nl/mmm/papers/dh-100/dh-100.pdf

[2] N. Todd, R. Cousins and C. Lee, in Empirical Musicology Review, 2, pp. 1-13 (2007). The Contribution of Anthropometric Factors to Individual Differences in the Perception of Rhythm:
https://kb.osu.edu/dspace/bitstream/1811/24478/1/EMR000021a-Todd-etal.pdf

[3] Nick Collins, in 1st-year Ph.D. report (2004). Beat Induction and Rhythm Analysis for Live Audio Processing:
http://www.cus.cam.ac.uk/~nc272/papers/pdfs/report1.pdf

[4] F. Gouyon, G. Widmer, X. Serra and A. Flexer, in Music Perception, 24(2), pp. 181-194 (2006). Acoustic Cues to Beat Induction – A Machine Learning Perspective:
http://www.ofai.at/cgi-bin/get-tr?download=1&paper=oefai-tr-2006-14.pdf

[5] E. Lee, U. Enke, J. Borchers and L. de Jong, in Proceedings of the 2007 Conference on New Interfaces for Musical Expression (NIME07), New York, USA (2007). Towards Rhythmic Analysis of Human Motion using Acceleration-Onset Times:
http://itp.nyu.edu/nime/2007/proc/nime2007_136.pdf

[6] Carlos Guedes, in Proceedings of the 2006 IEEE International Conference on Systems, Man, and Cybernetics (2006). Extracting Musically-Relevant Rhythmic Information from Dance Movements by Applying Pitch Tracking Techniques to a Video Signal:
http://gmem.free.fr/smc06/papers/4-Guedes-ExtractingMusicREV.pdf

[7] Jean-Julien Aucouturier and Yuta Ogai, in Proceedings of the 14th International Conference on Neural Information Processing (ICONIP), Kitakyushu, Japan (2007). Making a Robot Dance to Music Using Chaotic Itinerancy in a Network of FitzHugh-Nagumo Neurons:
http://www.jj-aucouturier.info/papers/ICONIP-2007.pdf

[8] Marek P. Michalowski, Selma Sabanovic and Hideki Kozima, in Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2007), Arlington, VA, USA (2007a). A Dancing Robot for Rhythmic Social Interaction:
http://www.cs.cmu.edu/~marekm/publications/HRI07MichalowskiEtal.pdf

[9] Marek P. Michalowski and Hideki Kozima, in 16th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), Jeju, Korea (2007b). Methodological Issues in Facilitating Rhythmic Play with Robots:
http://www.cs.cmu.edu/~marekm/publications/ROMAN07MichalowskiKozima.pdf

[10] M.P. Michalowski, S. Sabanovic and P. Michel, in 15th International Symposium on Robot and Human Interactive Communication (RO-MAN), Hatfield, UK (2006). Roillo: Creating a Social Robot for Playrooms:
http://www.cs.cmu.edu/~marekm/publications/ROMAN06MichalowskiEtal.pdf

[11] Gil Weinberg, Scott Driscoll and Mitchell Parry, in Proceedings of the IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN), Nashville, TN (2005). Musical Interactions with a Perceptual Robotic Percussionist:
http://coa.gatech.edu/~gil/Haile_RoManF.pdf

[12] G. Weinberg and S. Driscoll, in Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (2007). The Perceptual Robotic Percussionist – New Developments in Form, Mechanics, Perception and Interaction Design:
http://coa.gatech.edu/%7Egil/The%20Perceptual%20Robotic%20Percussionist%20-%20New%20Developments%20in%20Design,%20Mechancisand%20Perception%20submitted.pdf

[13] Gil Weinberg, Roberto Aimi and Kevin Jennings, in Proceedings of the 2002 Conference on New Interfaces for Musical Expression (NIME-02), Dublin, Ireland (2002). The Beatbug Network – A Rhythmic System for Interdependent Group Collaboration:
http://www.media.mit.edu/hyperins/papers/weinberg_NIME02.pdf

[14] A. Arsenio and P. Fitzpatrick, in Proceedings of the 2nd International Conference on Computational Intelligence, Robotics, and Autonomous Systems, Singapore (2003). Exploiting Cross-Modal Rhythm for Robot Perception of Objects:
http://people.csail.mit.edu/paulfitz/pub/arsenio03exploiting.pdf

[15] A. Arsenio and P. Fitzpatrick, in International Journal of Humanoid Robotics, 2(2), pp. 125-143 (2005). Exploiting Amodal Cues for Robot Perception:
http://people.csail.mit.edu/paulfitz/pub/arsenio05exploiting.pdf

[16] E. Sahin, in Lecture Notes in Computer Science, pp. 10-20, Springer-Verlag, Berlin Heidelberg (2005). Swarm Robotics – From Sources of Inspiration to Domains of Application:
http://www.kovan.ceng.metu.edu.tr/~erol/publications/pdf/METU-CENG-TR-2005-01.pdf

[17] Jacques Penders, in The Guardians Project, Sheffield Hallam University, UK (2005). Robot Swarming Applications:
http://www.cs.unimaas.nl/jaap60/papers/B31_penders.pdf

[18] F. Tanaka and H. Suzuki, in Proceedings of the 2004 IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN), pp. 419-424, Kurashiki, Japan (2004). Dance Interaction with QRIO: A Case Study for Non-boring Interaction by using an Entertainment Ensemble Model:
http://mplab.ucsd.edu/~boom/paper/Tanaka_ROMAN-04.pdf

[19] F. Tanaka, B. Fortenberry, K. Aisaka and J. Movellan, in Proceedings of the 4th IEEE International Conference on Development and Learning (ICDL 2005), pp. 142-147, Osaka, Japan (2005). Plans for Developing Real-time Dance Interaction between QRIO and Toddlers in a Classroom Environment:
http://mplab.ucsd.edu/~boom/paper/Tanaka_ICDL-05.pdf

[20] F. Gouyon and S. Dixon, in Computer Music Journal, 29(1) (2005). A Review of Automatic Rhythm Description Systems:
http://www.iua.upf.es/mtg/publications/CMJ2004-GouyonDixon.pdf

[21] F. Gouyon, Ph.D. dissertation, Universitat Pompeu Fabra, Barcelona (2005). A Computational Approach to Rhythm Description – Audio Features for the Computation of Rhythm Periodicity Functions and their Use in Tempo Induction and Music Content Processing:
http://mtg.upf.edu/publications/9d0455-PhD-Gouyon.pdf

[22] Simon Dixon, in Journal of New Music Research, 30(1), pp. 39-58 (2001). Automatic Extraction of Tempo and Beat from Expressive Performances:
http://www.elec.qmul.ac.uk/people/simond/pub/2001/jnmr.pdf

[23] C. Guedes, Ph.D. thesis, New York University, New York, USA (2005). Mapping Movement to Musical Rhythm – A Study in Interactive Dance:
http://homepage.mac.com/carlosguedes/.Public/MappingMovement.pdf

[24] Axel G.E. Mulder, in Technical Report, NSERC Hand Centered Studies of Human Movement Project (1994). Human Movement Tracking Technology:
http://www.xspasm.com/x/sfu/vmi/HMTT.pub.pdf

[25] P. Desain, in Music Perception, 9(4), pp. 439-454 (1992). A (De)composable Theory of Rhythm Perception:
http://www.nici.kun.nl/mmm/papers/d-92-a.rtf

[26] P. Fraisse, in D. Deutsch (Ed.), The Psychology of Music, pp. 149-180, Orlando, FL; London: Academic Press (1982). Rhythm and Tempo:
Book: http://opac.porbase.org/ipac20/ipac.jsp?profile=porbase&uri=full=3100024@!780333@!0&ri=1&aspect=basic_search&menu=search&ipp=20&staffonly=&term=&index=&uindex=&aspect=basic_search&menu=search&ri=1

[27] R. Parncutt, in Music Perception, 11, pp. 409-464 (1994). A Perceptual Model of Pulse Salience and Metrical Accent in Musical Rhythm:
http://www-gewi.uni-graz.at/staff/parncutt/publications/Pa94_Pulse.pdf

[28] Urs Enke, Diploma thesis, Media Computing Group, RWTH Aachen University (2006). DanSense – Rhythmic Analysis of Dance Movements using Acceleration-Onset Times:
http://media.informatik.rwth-aachen.de/materials/publications/enke2006.pdf

[29] M. Goto and Y. Muraoka, in Proceedings of the 1995 International Computer Music Conference, pp. 171-174 (1995). A Real-time Beat Tracking System for Audio Signals:
http://staff.aist.go.jp/m.goto/PAPER/ICMC95/bts.html

[30] H. Longuet-Higgins and C. Lee, in Perception, 11(2), pp. 115-128 (1982). The Perception of Musical Rhythms:
http://www.perceptionweb.com/abstract.cgi?id=p110115

[31] P. Allen and R. Dannenberg, in Proceedings of the International Computer Music Conference, International Computer Music Association, pp. 140-143 (1990). Tracking Musical Beats in Real Time:
http://www.cs.cmu.edu/~rbd/papers/bticmc.pdf

[32] D. Rosenthal, in Computer Music Journal, 16(1), pp. 64-76 (1992). Emulation of Human Rhythm Perception:
http://dspace.mit.edu/bitstream/1721.1/12855/1/27778737.pdf

[33] F. Lerdahl and R. Jackendoff, Cambridge, MA: MIT Press (1983). A Generative Theory of Tonal Music:
Book (Google Books, 1996 edition):
http://books.google.com/books?hl=pt-PT&lr=&id=6HGiEW33lucC&oi=fnd&pg=PR13&dq=%22Lerdahl%22+%22A+Generative+Theory+of+Tonal+Music%22+&ots=amGKhEA0Ks&sig=NI3DfE7-8ClDpTMHB-inMd5iR3g#PPP1,M1

[34] D. J. Povel and P. J. Essens, in Music Perception, 2, pp. 411-441 (1985). Perception of Temporal Patterns:
http://doi.apa.org/?uid=1986-19088-001
Slides: http://www-classes.usc.edu/engr/ise/599muscog/2003/week12/Chen-TempPatterns.ppt

[35] R. Rowe, in Computer Music Journal, 16(1) (1992). Machine Listening and Composing with Cypher:
R. Rowe's publications: http://homepages.nyu.edu/~rr6/publications.html

[36] A. Tanguiane, Springer-Verlag (1993). Artificial Perception and Music Recognition:
http://www.springer.com/computer/artificial/book/978-3-540-57394-4

[37] A. Schloss, Ph.D. thesis, Stanford University (1985). On the Automatic Transcription of Percussive Music – From Acoustic Signal to High-Level Analysis:
http://ccrma.stanford.edu/STANM/stanms/stanm27/stanm27.pdf

[38] E. Scheirer, in Journal of the Acoustical Society of America, 103(1), pp. 588-601 (1998). Tempo and Beat Analysis of Acoustic Musical Signals:
http://www.iro.umontreal.ca/~pift6080/documents/papers/scheirer_jasa.pdf

[39] P. Cariani, in Journal of New Music Research (2001). Temporal Codes, Timing Nets, and Music Perception:
http://homepage.mac.com/cariani/CarianiWebsite/JNMR2001.pdf

[40] W. A. Sethares and T. W. Staley, in Journal of New Music Research, 30(2) (2001). Meter and Periodicity in Musical Performance:
http://eceserv0.ece.wisc.edu/~sethares/paperspdf/jnmr2001.pdf

[41] E. W. Large, in Proceedings of the Eighteenth Annual Conference of the Cognitive Science Society (1996). Modeling Beat Perception with a Nonlinear Oscillator:
http://www.ccs.fau.edu/~large/Publications/Large1996.pdf

[42] A. T. Cemgil, H. J. Kappen, P. Desain and H. Honing, in Journal of New Music Research, 28(4), pp. 259-273 (2001). On Tempo Tracking – Tempogram Representation and Kalman Filtering:
http://www-sigproc.eng.cam.ac.uk/~atc27/papers/cemgil-tt.pdf

[43] P. Desain, in Contemporary Music Review, 9, pp. 239-254 (1993). A Connectionist and a Traditional AI Quantizer, Symbolic versus Sub-Symbolic Models of Rhythm Perception:
http://www.nici.kun.nl/mmm/papers/d-93-a.rtf

[44] C. Lee, in Representing Musical Structure, pp. 59-127, Academic Press (1991). The Perception of Metrical Structure – Experimental Evidence and a New Model:
http://scitation.aip.org/getabs/servlet/GetabsServlet?prog=normal&id=JASMAN0000810000S1000S91000002&idtype=cvips&gifs=yes

[45] Antonio Camurri, in Proceedings of the 7th International Conference on Digital Audio Effects (DAFX-04), Naples, Italy, pp. 5-8 (2004). Multimodal Interfaces for Expressive Sound Control:
http://dafx04.na.infn.it/WebProc/Proc/P_001.pdf