by Chelsea Finn, Eric Jang · Feb 08, 2022
Dancing robots are a cultural sensation. They’ve captured our imaginations and shown us how agile, fun, and expressive robots can be. But the potential benefits of dance to robotics may stretch beyond demonstrating how impressive a robot is. Robots will increasingly be a part of our everyday lives, and imbuing their behaviors with dance knowledge is one approach to making them more legible, accepted, and delightful.
I’m a roboticist and choreographer. For the last five years, I’ve danced with eight different robots from around the world and am now pursuing my PhD in Mechanical Engineering at Stanford. Since 2020, I’ve been an Artist-in-Residence at Everyday Robots, where I use my dance and performing arts knowledge to inform the way robots move and interact with people.
This project was inspired by questions like: How can we use movement to create sound? How can we artistically shift the way our robots exist in the world? How can we give the impression that the robots may be “dancing,” even as they do regular, everyday tasks?
Working with the team at Everyday Robots supports my personal mission to create robots that make people feel empowered and capable. This is important to me because I witnessed my dad feel alienated by and afraid of essential, health-monitoring machines when he was in the hospital. I spoke about this experience in a TED talk in 2018. Even if a machine is designed to help us, whether it’s a heart monitor or a robot, how it makes you feel has a big impact on your perception and acceptance of it. The team at Everyday Robots understands this and looks to build helper robots that are useful while being legible and welcomed in everyday life.
In the past, robots have been cloistered, operating in cages and on factory floors far from people. Their tasks have largely been planned from start to finish, and human interaction hasn’t been a priority. In the future, robots will work near people and navigate our everyday spaces independently. They must be able to learn, improvise, and move around us in understandable ways. This new embodied intelligence will be interdisciplinary, drawing insights from fields like anthropology, architecture, and dance.
Robots could be among the only things moving in our environment that are not part of nature. We’re accustomed to a tree swaying in the breeze or a bird flying overhead, but we have few precedents for how a robot should move in our offices and homes. Practically, these robots will need to navigate the unpredictability of the real world seamlessly. Emotionally, they will need to do so in a way that feels ‘natural’ to us and can inspire trust and human ingenuity, in the same way computers sparked a wave of innovation after they moved from industrial settings into people's homes.
I explicitly choreographed the robot’s movements in this video; however, this musical experiment can run on the robot at any time. We can hear the music of a robot picking up an object, wiping a table, or navigating from one place to another.
Over the past few months, my teammates and I have been working on an experiment that transforms the robots from everyday physical tools to musical instruments. We map the joint velocities of the robot onto musical tracks, so the robot makes music while it moves - something akin to a ‘music mode’. Each joint is mapped to a different sound - for example, the robot’s base will make a different sound than its wrist joint. This project was inspired by a set of questions around the relationship between dance and music: so often, music is played and robots dance to it, but what if the robot itself could become the musical instrument? What if the movement could dictate the music, rather than the other way around? It is an artistic human-robot-interaction experiment and is the first of several novel artistic projects we’re experimenting with on our robots.
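To make the idea concrete, here is a minimal sketch of one way such a joint-to-sound mapping could work: each joint drives its own tone, and the joint’s speed sets that tone’s volume, so faster motion plays louder. The joint names, pitches, and audio details below are illustrative assumptions, not the actual ‘music mode’ implementation.

```python
# Minimal sketch of a joint-velocity-to-sound mapping (illustrative only).
# Assumes a stream of per-joint velocity readings; joint names and pitches
# are hypothetical, not the robot's real configuration.
import math
import struct
import wave

SAMPLE_RATE = 22050      # audio samples per second
FRAME_SEC = 0.05         # each velocity reading covers 50 ms of audio

# Hypothetical mapping: each joint gets its own sine "track" at a fixed pitch (Hz).
JOINT_PITCHES = {
    "base": 110.0,       # low tone for the mobile base
    "shoulder": 220.0,
    "elbow": 330.0,
    "wrist": 440.0,      # high tone for the wrist joint
}

def velocities_to_audio(frames, max_vel=1.0):
    """Turn a sequence of {joint: velocity} dicts into one mono audio buffer.

    Louder motion -> louder tone: each joint's |velocity| scales the amplitude
    of that joint's pitch, so the robot 'plays' its own movement.
    """
    samples = []
    phase = {joint: 0.0 for joint in JOINT_PITCHES}
    n_per_frame = int(SAMPLE_RATE * FRAME_SEC)
    for frame in frames:
        for _ in range(n_per_frame):
            value = 0.0
            for joint, pitch in JOINT_PITCHES.items():
                gain = min(abs(frame.get(joint, 0.0)) / max_vel, 1.0)
                value += gain * math.sin(phase[joint])
                phase[joint] += 2 * math.pi * pitch / SAMPLE_RATE
            samples.append(value / len(JOINT_PITCHES))
    return samples

def write_wav(path, samples):
    """Save the buffer as a 16-bit mono WAV file."""
    data = b"".join(
        struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767)) for s in samples
    )
    with wave.open(path, "w") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(data)

if __name__ == "__main__":
    # Toy trajectory: the base speeds up while the wrist slows down.
    trajectory = [{"base": t / 20.0, "wrist": 1.0 - t / 20.0} for t in range(20)]
    write_wav("music_mode_demo.wav", velocities_to_audio(trajectory))
```

In a sketch like this, the choice of which joint maps to which pitch is itself an artistic decision; a real system would also stream the audio live rather than writing a file.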
This video shows ‘music mode’ in action with a short, minimalist ballet. It’s a glance through a future-facing telescope where robots and humans will move alongside each other in everyday life.
We’ve designed this music experiment so it can run on the robot at any time - while the robots are opening doors, navigating, sorting trash, and more. In this way, the robot’s ordinary tasks become extraordinary, sonically amplified to hear the music of everyday life. In the short time we’ve been experimenting with music on our robots, it has fundamentally shifted people’s impressions of the robot, sparking inspiration and curiosity.
This experiment has shown us that movement-based sounds can make a significant difference in how people relate to our robots. We will continue to explore the intersection of dance and robotics to help inform how we build robots that can learn to help people with a multitude of tasks in our everyday spaces.
Peter, Tom, and I experimented with different soundscapes and joint-to-sound mappings on the robot. We discovered that artistic and technical requirements can be aligned or at odds with each other.
This experiment was created and conceptualized with Tom Engbersen. Music by Peter Van Straten, software development with Tom Engbersen, Adrian Li, Daniel Lam, and Emre Fisher, and additional support from Kyle Jeffrey, Sara Ahmadi and Will Nail.