Thursday, January 29, 2015

Watch a human musician and his robots improvise together



This performance by Shimon and the Shimi Robots showcases the PhD research of Georgia Tech doctoral student Mason Bretan on machine improvisation, path planning and embodied cognition. (Mason Bretan/YouTube)
This is a performance showcasing part of my PhD research in robotic musicianship at Georgia Tech including 
  • machine improvisation, 
  • path planning, and 
  • embodied cognition. 
The smaller Shimi robots figure out how to move based on an analysis of the music, while Shimon improvises over a precomposed chord progression using a generative algorithm that jointly optimizes for higher-level musical parameters and the robot's physical constraints.
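The idea of jointly optimizing musical goals and physical constraints can be illustrated with a minimal sketch. Everything below is a hypothetical toy heuristic, not the lab's actual algorithm: the scoring functions, the candidate range, and the `weight` parameter are all illustrative assumptions.

```python
# Toy sketch: pick the next note by trading off musical fit (chord tones)
# against physical cost (how far a mallet arm must travel).
# All names and weights are illustrative, not Shimon's real algorithm.

C_MAJOR_CHORD = {0, 4, 7}  # pitch classes of a C major triad

def musical_score(pitch):
    """Reward chord tones over non-chord tones (toy heuristic)."""
    return 1.0 if pitch % 12 in C_MAJOR_CHORD else 0.2

def movement_cost(current_pos, pitch):
    """Physical cost: distance the arm travels, in octaves."""
    return abs(pitch - current_pos) / 12.0

def choose_next_note(current_pos, candidates, weight=0.5):
    """Maximize musical value minus weighted motion cost."""
    return max(candidates,
               key=lambda p: musical_score(p) - weight * movement_cost(current_pos, p))
```

With the arm sitting on middle C (MIDI 60), the cheapest strong note is C itself; but starting on a non-chord tone like C# (61), the optimizer accepts a longer reach to land on the chord tone G (67), because the musical gain outweighs the motion cost.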

The piece is called “What You Say” and is inspired by the high-energy funk piece “What I Say” from Miles Davis’ Live-Evil album. The incredible brilliance of the musicians on that album (as well as that of the numerous other great musicians around the world) is not only an inspiration to me and my own musical and instrumental aspirations, but also sets the standard for the level of musicianship that I hope machines will one day achieve. Through the power of artificial intelligence, signal processing, and engineering, I firmly believe it is possible for machines to be artistic, creative, and inspirational.

I hope you enjoy!



Rest assured that when our future robotic overlords come on the scene, they'll have a sweet sense of rhythm.

The Robotic Musicianship Group at Georgia Tech has been working on Shimon, a musical robot that can improvise melodic accompaniment, for about six years now. And for the past three years, they've added Shimi — a small, smartphone-connected bot that can respond to music with dance and sound — to the mix.

Shimi shimmies. (Mason Bretan via The Washington Post)
Shimon and the Shimis (which is a great band name, by the way) are showcased in the above video, in which they jam along with one of their creators, PhD student Mason Bretan. He gave them an arrangement of what he'd be playing and recorded some tracks and cues for them, but in between (when you hear funky electronic noises) they're doing their own thing based on his chord progressions. And the mallet solo in the middle is completely robot-improvised.
(Mason Bretan via The Washington Post)
When Bretan joined the lab five years ago, Shimon was being taught how to compose jazz music on the fly based on music theory. "I jumped right in," Bretan said. "And with Shimi — which don't just generate music, we call them 'robotic musical companions' because you can talk to them and use them to interact with your playlist — with Shimi I've been there from the start."

"I'm always trying something new with the robots, and sometimes they surprise me with something that's sort of out there or pretty cool," he added.

His dissertation, which he hopes to turn in by the end of 2015, centers around teaching the robots to understand their physical constraints and abilities.
"So the goal is that if you gave the same input to a robot with 20 arms, it would perform differently than an eight-armed robot because it would be optimizing its performance," he said. "Combined with the new algorithm we have for jazz music improvisation, these skills really allow them to more optimally achieve musical goals."
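Why arm count changes the plan can be sketched with a toy note-to-arm assignment. This is a minimal illustrative example, not the lab's planner: the greedy nearest-arm strategy, the positions, and the function names are all assumptions.

```python
# Toy sketch: assign simultaneous notes to mallet arms so total travel
# is small. More arms generally means shorter reaches, so the same
# musical input yields a different physical plan. Purely illustrative.

def assign_notes(arm_positions, notes):
    """Greedily give each note to the nearest still-free arm."""
    free = list(arm_positions)
    plan = {}
    for n in sorted(notes):
        arm = min(free, key=lambda a: abs(a - n))
        plan[n] = arm
        free.remove(arm)
    return plan

def total_travel(arm_positions, notes):
    """Sum of distances every assigned arm must move."""
    plan = assign_notes(arm_positions, notes)
    return sum(abs(arm - n) for n, arm in plan.items())
```

For the notes 47 and 93, a four-armed robot parked at positions 40, 60, 80, 100 travels a total of 14 units, while an eight-armed robot at 40, 50, ..., 110 covers the same notes with only 6 — the planner's output depends on the body it is planning for.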

And while he certainly doesn't want to replace human musicians like himself with robots, he's excited about the mechanical abilities they have that we don't.
"I mean, Shimon already has four arms and can hold eight mallets," he said. "So it can already do something a person can't."



ORIGINAL: Washington Post
By Rachel Feltman
Jan 14, 2015
