dc.creator: Huepe, Cristián
dc.creator: Cádiz, Rodrigo F.
dc.creator: Colasso, Marco
dc.date.accessioned: 2022-05-16T13:00:28Z
dc.date.available: 2022-05-16T13:00:28Z
dc.date.created: 2022-05-16T13:00:28Z
dc.date.issued: 2012
dc.identifier: 10.1109/ACC.2012.6315529
dc.identifier: 9781457710964
dc.identifier: 2378-5861
dc.identifier: 9781457710957
dc.identifier: 0743-1619
dc.identifier: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=6315529
dc.identifier: https://doi.org/10.1109/ACC.2012.6315529
dc.identifier: https://repositorio.uc.cl/handle/11534/63926
dc.description.abstract: We explore different approaches for generating music from the flocking dynamics of groups of mobile autonomous agents following a simple decentralized control rule. By developing software that links these dynamics to a set of sound wave generators, we study how each approach reflects sonically the transition to collective order and which produces musically interesting results. We consider three qualitatively different ways of translating flocking dynamics into music: (1) A direct approach that maps agent positions to sounds, (2) a synchronization approach where each agent carries an oscillator that couples to neighboring agents, and (3) a physics-inspired approach that mimics the sound that would result from an effective friction between neighboring agents. We find that all approaches allow the listener to discriminate between different phases in the system, but that the second and third can yield more musically interesting and appealing results.
dc.language: en
dc.publisher: IEEE
dc.relation: American Control Conference (ACC) (2012 : Montreal, Canada)
dc.rights: restricted access
dc.subject: Oscillators
dc.subject: Heuristic algorithms
dc.subject: Spectrogram
dc.subject: Noise
dc.subject: Music
dc.subject: Friction
dc.subject: Real-time systems
dc.title: Generating music from flocking dynamics
dc.type: conference paper
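
To make the first mapping described in the abstract more concrete, below is a minimal, hypothetical Python sketch of the direct approach: a standard Vicsek-type alignment rule drives the agents, and each agent's vertical position is mapped to the frequency of a sine oscillator. The parameter values, the 200-800 Hz frequency band, and all names are illustrative assumptions; this is not the software described in the paper.

```python
# Hypothetical sketch: Vicsek-style flocking with a direct position-to-pitch
# sonification (approach 1 in the abstract). Parameter values and the
# frequency mapping are illustrative assumptions, not the authors' code.
import numpy as np
import wave

# --- flocking and audio parameters (assumed values) ---
N = 30           # number of agents
L = 10.0         # side of the periodic square arena
R = 1.0          # interaction radius
V = 0.05         # constant speed
ETA = 0.3        # angular noise amplitude
STEPS = 400      # simulation steps
SR = 22050       # audio sample rate (Hz)
STEP_DUR = 0.05  # seconds of audio rendered per simulation step

rng = np.random.default_rng(0)
pos = rng.uniform(0, L, size=(N, 2))
theta = rng.uniform(-np.pi, np.pi, size=N)

def vicsek_step(pos, theta):
    """One update of the decentralized alignment rule with angular noise."""
    # pairwise displacement vectors on the periodic arena (minimal image)
    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)
    neighbors = (d ** 2).sum(-1) < R ** 2
    # each agent adopts the mean heading of its neighbors (itself included)
    mean_sin = (neighbors * np.sin(theta)[None, :]).sum(1)
    mean_cos = (neighbors * np.cos(theta)[None, :]).sum(1)
    theta = np.arctan2(mean_sin, mean_cos) + ETA * rng.uniform(-np.pi, np.pi, N)
    pos = (pos + V * np.column_stack((np.cos(theta), np.sin(theta)))) % L
    return pos, theta

audio = []
phase = np.zeros(N)
n_samp = int(SR * STEP_DUR)
for _ in range(STEPS):
    pos, theta = vicsek_step(pos, theta)
    # direct mapping: y-coordinate -> sine frequency in a 200-800 Hz band
    freqs = 200.0 + 600.0 * pos[:, 1] / L
    t = np.arange(n_samp) / SR
    chunk = np.sin(phase[:, None] + 2 * np.pi * freqs[:, None] * t).mean(0)
    phase = (phase + 2 * np.pi * freqs * STEP_DUR) % (2 * np.pi)
    audio.append(chunk)

samples = np.concatenate(audio)
samples = np.int16(0.8 * 32767 * samples / np.abs(samples).max())
with wave.open("flock_sonification.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SR)
    f.writeframes(samples.tobytes())
```

Under this sketch, a small noise amplitude ETA lets the headings align and the agents travel together, so the sine cluster drifts coherently; a large ETA keeps the motion disordered and the texture noisy, which is one way to hear the kind of phase contrast the abstract refers to.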

