Title:
Materials Make the Bot: Directly Embedding Actuation and Perception into Robotic Structures
|
Speaker:
Ms. Lillian Chin
|
Affiliation:
MIT Computer Science and Artificial Intelligence Lab
|
When:
Thursday, October 20, 2022 at 11:00 AM
|
Where:
MRDC Building, Room 4211
|
Host:
Dr. Michael Leamy
Abstract:
Materials are the primary interface through which a body interacts with the outside world, making them critical for robust robot designs. Most robots, however, continue to be made of rigid metal and plastic, limiting their potential for environmental interaction. Soft robotics and architected materials offer a promising alternative, but they often lack key robotic functionality such as robust actuation, integrated perception, and simple control. In this talk, I argue that robots should be designed from a materials-centric approach, so that their abilities are a direct consequence of the materials that comprise them. I demonstrate the power of this approach by focusing on how auxetic metamaterials can be explicitly designed as the foundation for a robot's movement and sensing capabilities. First, I show how applying actuation within an auxetic shell can create modular volumetric actuators. Since these robots' expansion is determined directly by their geometry, I have significantly more control over their design space than with traditional robots and can design for a desired compliance, expansion ratio, force output, or speed. Then, I introduce a new class of materials called handed shearing auxetics, whose internal geometry couples twisting with extension. This material is significantly easier to actuate and fabricate than robots built with traditional pneumatic or cable-based techniques, empowering the creation of more robust and more efficient soft robots. Finally, I present a method for sensorizing lattice structures (such as auxetics) by embedding internal fluidic channels within the struts themselves as the structure is being 3D printed. This technique offers proprioceptive feedback with minimal hysteresis, enabling accurate pose reconstruction from these fluidic sensors alone. I close my talk with a discussion of how these methods can be generalized beyond auxetics and how computational design can further enhance robotic material design.
Biography:
Lillian Chin is a PhD candidate at MIT's Computer Science and Artificial Intelligence Lab, working with Daniela Rus. She is interested in designing robotic bodies and their materials for optimized interaction with their environment through embedded perception and computational design. Lillian received her bachelor's and master's degrees in Electrical Engineering and Computer Science from MIT. She is the recipient of several fellowships, including the NSF Graduate Research Fellowship, the Hertz Foundation Graduate Fellowship, and the Paul and Daisy Soros Fellowship for New Americans. Her work has been published in Science and Science Advances and has been recognized with awards such as the 2019 IEEE RoboSoft Best Poster Award, the 2019 ACM CS and Law Best Paper Award, and the 2022 Leventhal City Prize. Lillian has also focused heavily on research mentorship, mentoring 24 undergraduate and 2 master's students over the course of her PhD to write 8 papers. Nearly half of these students were women and other gender minorities, over a third were underrepresented racial minorities, and over a third were co-authors on papers.
Notes:
Meet the speaker