The Robotic Musicianship group at the Georgia Tech Center for Music Technology is working on several robotic musician projects that enable real-time musical collaboration between humans and robots, combining the unique abilities of both to create new and fascinating music.
One of these, the music-playing robot Guitar Bot, helps its user play the guitar with ease, manipulating the strings like a virtuoso.
In another project, FOREST, the group aims to enhance human-robot interaction through non-verbal emotional communication channels such as sound and gesture. Both the emotion-carrying sounds that accompany the robotic gestures and the human-inspired gestures themselves are generated by a deep learning network combined with a rule-based AI system.
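To make the idea of a rule-based fallback concrete, here is a minimal, purely hypothetical sketch: a lookup from an emotion label to gesture and sound parameters, standing in for the learned models the article describes. All names and parameter values here are illustrative assumptions, not part of the FOREST system.

```python
# Hypothetical rule-based mapping from an emotion label to gesture and
# sound parameters; a stand-in for FOREST's learned + rule-based pipeline.
EMOTION_RULES = {
    # emotion: gesture speed in [0, 1] and a pitch contour for the sound
    "joy":     {"gesture_speed": 0.9, "pitch_contour": "rising"},
    "sadness": {"gesture_speed": 0.2, "pitch_contour": "falling"},
    "calm":    {"gesture_speed": 0.4, "pitch_contour": "flat"},
}

def emotion_to_behavior(emotion: str) -> dict:
    """Return gesture/sound parameters for an emotion, defaulting to calm."""
    return EMOTION_RULES.get(emotion, EMOTION_RULES["calm"])
```

In a real system the table would be replaced by model inference; the point is only that each recognized emotion drives both a gesture channel and a sound channel at once.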
In the Skywalker project, amputees are able to play music with the aid of a prosthetic hand that uses an ultrasound sensor and deep learning algorithms to predict muscle patterns in the residual limb. The predicted muscle patterns are then mapped onto the robot to control the movements of the robotic fingers.
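The two-stage pipeline described above (predict muscle activity, then map it to finger motion) can be sketched in a few lines. This is a toy illustration under loose assumptions: the "model" is a placeholder that clips values into per-finger activations, and the linear mapping to servo angles is an invented stand-in for the real control scheme.

```python
# Hypothetical sketch of a Skywalker-style pipeline: predict per-finger
# flexion (0..1) from an ultrasound frame, then map it to servo angles.
from typing import List

def predict_flexion(ultrasound_frame: List[float]) -> List[float]:
    # Placeholder for the deep learning model: take five values from the
    # frame and clip them into the [0, 1] activation range.
    return [min(max(x, 0.0), 1.0) for x in ultrasound_frame[:5]]

def flexion_to_servo_angles(flexion: List[float],
                            max_angle: float = 90.0) -> List[float]:
    """Linearly map flexion estimates to finger servo angles in degrees."""
    return [f * max_angle for f in flexion]

frame = [0.1, 0.8, 1.3, -0.2, 0.5]          # one fake ultrasound frame
angles = flexion_to_servo_angles(predict_flexion(frame))
```

The separation matters: the prediction stage can be retrained per user, while the mapping stage stays fixed to the hand's mechanics.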
Shimon is a music-playing robot that can collaborate with humans to create novel and captivating music in real time. Shimon has so far performed with human musicians at dozens of concerts and festivals.
The robotic drummer is a prosthesis consisting of two drumsticks attached to the amputee's arm. The first stick is controlled by the musician's arm movements and electromyography (EMG) muscle sensors; the second stick listens to the music being played and improvises accordingly.
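The two control strategies above can be sketched side by side. This is a hypothetical simplification: the first stick strikes when EMG activation crosses an assumed threshold, and the second "improvises" by subdividing the beat interval it hears. Thresholds, function names, and the subdivision rule are all illustrative, not the actual prosthesis logic.

```python
# Hypothetical sketch of the drumming prosthesis's two control modes.

def emg_strike(emg_value: float, threshold: float = 0.6) -> bool:
    """Stick one: strike whenever muscle activation exceeds the threshold."""
    return emg_value > threshold

def improvise_interval(onset_times: list) -> float:
    """Stick two: listen to recent note onsets and play at half the
    average inter-onset interval, i.e. double-time over the heard beat."""
    gaps = [b - a for a, b in zip(onset_times, onset_times[1:])]
    return (sum(gaps) / len(gaps)) / 2
```

The contrast is the interesting part: one stick is a direct human-controlled channel, while the other is an autonomous musical agent sharing the same arm.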
In conclusion, robotic musicians can collaborate with humans to produce complex, sophisticated, and innovative music never heard before.