$1 million for first responder open-source robots

Tomorrow’s firefighters and other first responders could team up with robotic assistants capable of traversing wilderness and disaster areas, thanks to a University of Michigan research project funded by a new million-dollar grant from the National Science Foundation.

A key goal of the three-year project is to allow robots to navigate in real time, without the need for a pre-existing map of the terrain they must traverse.

The project aims to take bipedal (two-legged) walking robots to a new level, equipping them to adapt on the fly to dangerous terrain, dodge obstacles and decide whether a given area is safe for walking. The technology could allow robots to enter areas too dangerous for humans, including collapsed buildings and other disaster areas. It could also lead to prosthetic legs that are more intuitive for their users.

“I imagine a robot that can walk autonomously through the forest here on North Campus and find an object we have hidden. That’s what robots need to be useful in search and rescue, and currently no robot can do it,” said Jessy Grizzle, principal investigator on the project and the Elmer G. Gilbert Distinguished University Professor of Engineering at UM.

Grizzle, an expert in walking robots, is partnering on the project with Maani Ghaffari Jadidi, an assistant professor of naval architecture and marine engineering and an expert in robotic perception. Grizzle says the pair’s complementary areas of expertise will enable them to tackle bigger pieces of the technology than has been possible in the past.

To achieve this, the team will take an approach called ‘full-stack robotics’, integrating a series of new and existing technologies into a single open-source perception and motion system that can be adapted to robots beyond those used in the project itself. The technology will be tested on the Digit and Mini Cheetah robots.

A Mini Cheetah robot at the University of Michigan. Photo: Robert Coelius / Michigan Engineering.

“What full-stack robotics means is that we attack all layers of the problem at once and integrate them,” Grizzle said. “So far, many roboticists have solved very specific individual problems. With this project, we aim to integrate what has already been done into a cohesive system, then identify its weak points and develop new technology where needed to fill in the gaps.”

One area of particular interest will be mapping: the project aims to find ways for robots to build rich, multidimensional maps from real-time sensory input so that they can determine the best way to traverse a given plot of land.
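
To make that idea concrete, here is a minimal sketch of what such a “multidimensional” map could look like: a single grid whose cells carry several layers of information, updated from incoming sensor points. All names, sizes and the running-mean update below are assumptions for illustration, not the project’s actual software.

```python
import numpy as np

# Hypothetical layered terrain map (names and sizes are assumptions).
CELL = 0.25   # cell size in meters
N = 200       # grid covers a 50 m x 50 m area

layers = {
    "height": np.zeros((N, N)),      # running mean ground height per cell
    "count": np.zeros((N, N)),       # number of points seen per cell
    "walkability": np.ones((N, N)),  # 1.0 = safe to step, 0.0 = avoid
}

def integrate(points):
    """Fold a batch of (x, y, z) sensor points into the map layers."""
    ix = np.clip((points[:, 0] / CELL).astype(int), 0, N - 1)
    iy = np.clip((points[:, 1] / CELL).astype(int), 0, N - 1)
    for x, y, z in zip(ix, iy, points[:, 2]):
        c = layers["count"][x, y]
        # Running mean, so the map can update in real time, scan by scan.
        layers["height"][x, y] = (layers["height"][x, y] * c + z) / (c + 1)
        layers["count"][x, y] = c + 1
```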

“When we humans are hiking, it’s easy for us to recognize areas that are too difficult or dangerous and steer clear,” Ghaffari said. “We want a robot to be able to do something similar by using its perception tools to create a real-time map that looks multiple steps ahead and includes a measure of walkability. That way, it will know to stay away from dangerous areas and can plan a route that uses its energy efficiently.”
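
The route-planning idea Ghaffari describes can be illustrated with a standard graph search over such a map: treat each cell’s energy cost as a step weight and skip cells judged unwalkable. This is a generic Dijkstra sketch under those assumptions, not the project’s planner.

```python
import heapq

def plan(cost, walkable, start, goal):
    """Cheapest path on a grid; cost[r][c] is the energy to enter a cell."""
    rows, cols = len(cost), len(cost[0])
    best = {start: 0.0}
    came_from = {}
    heap = [(0.0, start)]
    while heap:
        c, cell = heapq.heappop(heap)
        if cell == goal:  # reconstruct the route back to the start
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        if c > best[cell]:
            continue  # stale heap entry
        r, q = cell
        for nbr in ((r + 1, q), (r - 1, q), (r, q + 1), (r, q - 1)):
            nr, nq = nbr
            if 0 <= nr < rows and 0 <= nq < cols and walkable[nr][nq]:
                nc = c + cost[nr][nq]
                if nc < best.get(nbr, float("inf")):
                    best[nbr] = nc
                    came_from[nbr] = cell
                    heapq.heappush(heap, (nc, nbr))
    return None  # no safe route exists

# Example: a 3x3 grid whose center cell is judged unsafe.
cost = [[1, 1, 1],
        [1, 5, 1],
        [1, 1, 1]]
walkable = [[True, True, True],
            [True, False, True],
            [True, True, True]]
print(plan(cost, walkable, (0, 0), (2, 2)))  # detours around the hazard
```

A real planner would also account for step placement and robot dynamics; this sketch only captures the trade-off between energy cost and walkability.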

Grizzle predicts that robots will be able to do this using mathematics, for example by calculating the standard deviation of variations in ground height or the slipperiness of a surface. He plans to create more sophisticated perception tools that help robots collect data by analyzing what their limbs are doing. Slipping on an icy surface or kicking a bump, for example, would generate a new data point. The system will also help robots navigate soft ground and moving objects, such as rolling branches.
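
As a rough illustration of the statistic Grizzle mentions: the standard deviation of sampled ground heights within a map cell rises sharply on rubble compared with pavement, and a detected slip could be folded back into that cell’s walkability score. The function names and penalty value below are invented for this sketch.

```python
import numpy as np

def roughness(cell_heights):
    """Std. dev. of sampled ground heights (meters) within one map cell."""
    return float(np.std(cell_heights))

def update_walkability(walkability, slipped, penalty=0.3):
    """Lower a cell's walkability score after a limb reports a slip."""
    return max(0.0, walkability - penalty) if slipped else walkability

# Flat pavement vs. rubble: a higher std. dev. means rougher ground.
flat = roughness([0.02, 0.01, 0.02, 0.015])   # ~0.004 m
rubble = roughness([0.05, 0.30, 0.12, 0.41])  # ~0.14 m
```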

A Digit robot at the University of Michigan. Photo: Joseph Xu / Michigan Engineering.

Rich, easily understood maps, Ghaffari explains, will be just as important for the humans who may one day work with these robots remotely in search and rescue or other applications.

“A shared understanding of the environment between humans and robots is essential, because the more a human team can see, the better it can interpret what the robot team is trying to accomplish,” Ghaffari said. “And that can help humans make better decisions about what other resources to bring in or how the mission should proceed.”

In addition to developing new technology, the project team will collaborate with the UM School of Education on outreach to a Detroit high school, sharing project materials with students and working to build their interest in robotics. Grizzle and Ghaffari previously piloted a first-year course at UM that showed students how engineers use math to solve problems such as building maps from LiDAR point clouds, applying machine learning so robots can learn from the map data they collect, and running feedback control on robots and other mobile platforms.
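
As a small taste of the feedback-control topic that course touches on, here is a generic PID update, the textbook feedback loop, with illustrative gains; it is not material from the course itself.

```python
def pid_step(target, measured, state, kp=1.2, ki=0.1, kd=0.05, dt=0.02):
    """One control update; state carries (integral, previous_error)."""
    integral, prev_error = state
    error = target - measured
    integral += error * dt
    derivative = (error - prev_error) / dt
    command = kp * error + ki * integral + kd * derivative
    return command, (integral, error)

# Example: nudge a robot's measured forward speed (0.0 m/s) toward 0.5 m/s.
command, state = pid_step(0.5, 0.0, (0.0, 0.0))
```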

Grizzle is also the Jerry W. and Carol L. Levin Professor of Engineering, academic program director of the UM Robotics Institute, and a professor of electrical and computer engineering and of mechanical engineering.

The project is called “Integrated control architecture and learning-assisted semantic perception for the locomotion and navigation of legged robots in nature.”
