Andrew Spielberg's paper A Simple, Inexpensive, Wearable Glove with Hybrid Resistive‐Pressure Sensors for Computational Sensing, Proprioception, and Task Identification has been published in Advanced Intelligent Systems. In this paper, Andy and his collaborators from Professor Daniela Rus' lab present a fully soft, wearable glove capable of real‐time hand pose reconstruction, environment sensing, and task classification. The design is easy to fabricate from low-cost, commercial off‐the‐shelf items in a manner amenable to automated manufacturing. The system can reconstruct user hand pose and identify sensory inputs such as holding force, object temperature, conductivity, material stiffness, and user heart rate, all with high accuracy. Check out this article at Advanced Science News to see more exciting applications unlocked by this new technology!
Two papers on the application of multi-objective optimization techniques to supervised learning and reinforcement learning have been accepted to ICML 2020. Prediction-Guided Multi-Objective Reinforcement Learning for Continuous Robot Control, from Jie Xu, Yunsheng Tian, and Pingchuan Ma, proposes an efficient evolutionary learning algorithm that finds a Pareto set approximation for continuous robot control problems, and Efficient Continuous Pareto Exploration in Multi-Task Learning, from Pingchuan Ma and Tao Du, presents a novel, efficient method that generates locally continuous Pareto sets and Pareto fronts for multi-task learning problems. Details about the two papers will be available soon.
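Both papers center on Pareto optimality: a solution is on the Pareto front if no other solution is at least as good on every objective and strictly better on one. A minimal illustrative sketch of this idea (not the algorithms from either paper; the candidate values below are made up) is:

```python
# Minimal sketch of extracting a Pareto front from candidate solutions,
# where each candidate is scored on two objectives, both to be maximized.
def pareto_front(points):
    """Return the non-dominated points among (f1, f2) pairs."""
    front = []
    for p in points:
        # p is dominated if some other point is >= on both objectives.
        dominated = any(q[0] >= p[0] and q[1] >= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (objective 1, objective 2) scores for five candidates.
candidates = [(1.0, 5.0), (2.0, 4.0), (3.0, 1.0), (1.5, 3.0), (2.5, 4.5)]
front = pareto_front(candidates)
```

Here (2.0, 4.0) drops out because (2.5, 4.5) beats it on both objectives, while the remaining points each trade off one objective against the other. The papers go further, efficiently exploring continuous families of such trade-off solutions rather than filtering a fixed candidate set.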
The paper Physical Realization Of Elastic Cloaking With A Polar Material was recently accepted to Physical Review Letters (PRL). In this paper, Wan Shou and Beichen Li collaborated with researchers at the University of Missouri and Dalian University of Technology to propose a novel elastic cloak, designing and fabricating a new class of polar materials with a distribution of body torque that exhibits asymmetric stresses. The work sets a precedent in the field of transformation elasticity and should find applications in mechanical stress shielding and stealth technologies. Check out their paper to learn more about this breakthrough in materials science!
Dr. Petr Kellnhofer's work "Gaze360: Physically Unconstrained Gaze Estimation in the Wild" was accepted to ICCV 2019. In this paper, we present Gaze360, a large-scale gaze-tracking dataset and a method for robust 3D gaze estimation in unconstrained images. Our proposed 3D gaze model extends existing models to include temporal information and to directly output an estimate of gaze uncertainty. Finally, we demonstrate an application of our model to estimating customer attention in a supermarket setting. This paper was a collaboration with Professor Antonio Torralba's group and the Toyota Research Institute.
Andrew Spielberg's paper "Learning-In-The-Loop Optimization: End-To-End Control And Co-Design of Soft Robots Through Learned Deep Latent Representations" was recently accepted to NeurIPS 2019. This paper tackles the problem of controlling soft robots, which is very challenging due to their infinite degrees of freedom. Our solution marries hybrid particle-grid simulation with deep variational convolutional autoencoder architectures that capture salient features of robot dynamics with high efficacy, and we demonstrate our dynamics-aware feature learning algorithm on both 2D and 3D soft robots.
Our very own postdoctoral researcher Changil Kim is joining Facebook in Seattle as a research scientist in September, after spending two and a half wonderful years in our lab. During his time here, Changil actively contributed to many research projects, including motion magnification, oil painting replication, speech-to-face synthesis, and video processing. Congratulations to Changil; we wish him all the best in Seattle!
Over the past week, Alex's two recent papers on knitting, Neural Inverse Knitting: From Images to Manufacturing Instructions (ICML 2019) and Knitting Skeletons: A Computer-Aided Design Tool for Shaping and Patterning of Knitted Garments (UIST 2019), have received extensive media attention, including a recent interview with the BBC. The story was also widely reported by TechCrunch, 7News Boston, Mashable, Fortune, VentureBeat, Engadget, ZDNet, Geek, Market Research Finance, Interesting Engineering, Fibre2Fashion, and MIT News.
Alex's latest work, Knitting Skeletons: A Computer-Aided Design Tool for Shaping and Patterning of Knitted Garments, has been officially accepted to UIST 2019. In this paper, Alex and Liane present a novel interactive system for simple garment composition and surface patterning. Both casual and advanced users can benefit from their system. Check out these beautiful samples to explore the new possibilities brought by their tool!
Jie's SIGGRAPH paper Learning to Fly: Computational Controller Design for Hybrid UAVs with Reinforcement Learning has been covered by media outlets including Engadget, UASWeekly, Mashable, VentureBeat, SlashGear, Robotics Business Review, Electronics360, UberGizmo, DroneLife, Unmanned Aerial, Science Daily, and AI News in the past week. If you are at SIGGRAPH, come to room 151 and check out Jie's presentation today!
In our latest work published in Science Advances, our group presented an automated system that designs and 3-D prints complex robotic actuators optimized according to an enormous number of specifications. We demonstrate the system by fabricating actuators that show different black-and-white images at different angles. One of our actuators portrays a Vincent van Gogh portrait when laid flat and the famous Edvard Munch painting "The Scream" when tilted at an angle. We also 3-D printed floating water lilies with petals equipped with arrays of actuators and hinges that fold up in response to magnetic fields run through conductive fluids.
The research paper Topology optimization and 3D printing of multimaterial magnetic actuators and displays was published in Science Advances last Friday, and MIT News covered our story on the same day.
Our latest work on human grasping, "Learning the signatures of the human grasp using a scalable tactile glove", was published in Nature. In this paper, we use a scalable tactile glove and deep convolutional neural networks to show that sensors uniformly distributed over the hand can be used to identify individual objects, estimate their weight, and explore the typical tactile patterns that emerge while grasping objects. Using a low-cost (about US$10) scalable tactile glove sensor array, we record a large-scale tactile dataset with 135,000 frames, each covering the full hand, while interacting with 26 different objects. This set of interactions reveals key correspondences between different regions of the human hand during object manipulation. Insights from the tactile signatures of the human grasp—through the lens of an artificial analog of the natural mechanoreceptor network—can thus aid the future design of prosthetics, robot grasping tools, and human-robot interactions.
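The underlying classification task, mapping a per-frame pressure map to an object identity, can be illustrated with a toy sketch. Everything below is made up for illustration (tiny 8x8 synthetic frames and a nearest-centroid baseline): the paper itself uses a 548-sensor glove and deep convolutional networks.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_frames(n, peak):
    """Synthetic 8x8 'tactile frames': Gaussian sensor noise plus a
    pressure bump at a class-specific contact location."""
    r, c = peak
    frames = rng.normal(0.0, 0.1, size=(n, 8, 8))
    frames[:, r, c] += 1.0  # object leaves a distinctive contact pattern
    return frames

# Three hypothetical object classes, each with a distinct contact point.
peaks = {0: (1, 1), 1: (4, 6), 2: (6, 2)}
train = {k: make_frames(50, p) for k, p in peaks.items()}

# Nearest-centroid baseline: the mean training frame of each class.
centroids = {k: f.mean(axis=0) for k, f in train.items()}

def classify(frame):
    dists = {k: np.linalg.norm(frame - c) for k, c in centroids.items()}
    return min(dists, key=dists.get)

# Accuracy on fresh synthetic frames (20 per class).
correct = sum(classify(f) == k
              for k, p in peaks.items()
              for f in make_frames(20, p))
accuracy = correct / 60
```

Because each synthetic class has a well-separated contact pattern, even this trivial baseline classifies almost perfectly; the real dataset's 26 objects produce overlapping, full-hand pressure distributions, which is why the paper turns to convolutional networks.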
The paper was led by Subramanian Sundaram, who obtained his Ph.D. from our group in 2018 and continued this work after his graduation.