Our very own postdoctoral researcher Changil Kim is joining Facebook in Seattle as a research scientist in September, after spending two and a half wonderful years in our lab. During his time here, Changil actively contributed to many research projects, including motion magnification, oil painting replication, speech-to-face, and video processing. Congratulations to Changil, and we wish him all the best in Seattle!
Over the past week, Alex's two recent papers on knitting, Neural Inverse Knitting: From Images to Manufacturing Instructions (ICML 2019) and Knitting Skeletons: A Computer-Aided Design Tool for Shaping and Patterning of Knitted Garments (UIST 2019), have received a lot of media attention, including a recent interview with the BBC. The story was also widely reported by TechCrunch, 7News Boston, Mashable, Fortune, VentureBeat, Engadget, ZDNet, Geek, Market Research Finance, Interesting Engineering, Fibre2Fashion, and MIT News.
Alex's latest work Knitting Skeletons: A Computer-Aided Design Tool for Shaping and Patterning of Knitted Garments has been officially accepted to UIST 2019. In this paper, Alex and Liane present a novel interactive system for simple garment composition and surface patterning that benefits both casual and advanced users. Check out these beautiful samples to explore the new possibilities their tool brings!
Jie's SIGGRAPH paper Learning to Fly: Computational Controller Design for Hybrid UAVs with Reinforcement Learning has been covered over the past week by media outlets including Engadget, UASWeekly, Mashable, VentureBeat, SlashGear, Robotics Business Review, Electronics360, Ubergizmo, DroneLife, Unmanned Aerial, Science Daily, and AI News. If you are at SIGGRAPH, come to Room 151 to check out Jie's presentation today!
In our latest work published in Science Advances, our group presented an automated system that designs and 3-D prints complex robotic actuators optimized according to an enormous number of specifications. We demonstrate the system by fabricating actuators that show different black-and-white images at different angles. One of our actuators portrays a Vincent van Gogh portrait when laid flat and the famous Edvard Munch painting "The Scream" when tilted at an angle. We also 3-D printed floating water lilies with petals equipped with arrays of actuators and hinges that fold up in response to magnetic fields run through conductive fluids.
The research paper Topology optimization and 3D printing of multimaterial magnetic actuators and displays was published in Science Advances last Friday, and MIT News covered our story on the same day.
Our latest work on human grasping "Learning the signatures of the human grasp using a scalable tactile glove" was published in Nature. In this paper, we use a scalable tactile glove and deep convolutional neural networks to show that sensors uniformly distributed over the hand can be used to identify individual objects, estimate their weight and explore the typical tactile patterns that emerge while grasping objects. Using a low-cost (about US$10) scalable tactile glove sensor array, we record a large-scale tactile dataset with 135,000 frames, each covering the full hand while interacting with 26 different objects. This set of interactions with different objects reveals the key correspondences between different regions of a human hand while it is manipulating objects. Insights from the tactile signatures of the human grasp—through the lens of an artificial analog of the natural mechanoreceptor network—can thus aid the future design of prosthetics, robot grasping tools, and human-robot interactions.
The paper was led by Subramanian Sundaram, who obtained his Ph.D. from our group in 2018 and continued this work after his graduation.
Our PhD student Jie Xu will present his latest work "Learning to Fly: Computational Controller Design for Hybrid UAVs with Reinforcement Learning" at SIGGRAPH 2019 this summer. In this paper, Jie proposes a novel neural network controller design for hybrid UAVs, aerial robots that are challenging to control due to their complex aerodynamic effects. His method allows a controller trained in simulation to be applied directly to real hybrid UAV hardware without any modification. Check out his website and the project page to find out more!
Alex and Tae-Hyun's latest paper on neural inverse knitting is accepted to ICML 2019. In this paper, they introduce the new problem of automatic machine instruction generation using a single image of the desired physical product. Check out the project page to learn more about the way they tackle the problem and see a dataset of real knitting samples!
After spending one and a half wonderful years in our lab, our very own postdoctoral researcher, Tae-Hyun Oh, has joined Facebook AI Research in Cambridge, MA, located just across the street. During his stay at MIT, Tae-Hyun contributed to a wide range of research projects, including Motion Magnification, Soft Segmentation, Speech2Face, and Inverse Knitting. Congratulations to Tae-Hyun, and we wish him the best of luck in the future!
Yuanming's latest paper ChainQueen was accepted to ICRA 2019. In this paper, he presents a differentiable deformable body simulator that opens up many possibilities for efficient controller and geometry design algorithms in robotics. Check out this video to see the amazing work that can be done using his simulator!