The Four-Legged Robot That Can Crawl, Crouch, Clean and Fight COVID

January 25, 2021

A team of USC master’s students creates a disinfection robot to aid in COVID-19 prevention.

Meet LASER-D, an animal-like robot that can crawl, crouch and disinfect to fight COVID-19. PHOTO/ QUAN NGUYEN.

There are so many things robots can do on wheels, but in narrower, more confined spaces—like between desks in a classroom or on a stairwell—wheels can be limiting. Enter LASER-D (Legged Agile Smart Efficient Robot for Disinfection), a four-legged robot created by a team of researchers at the USC Viterbi School of Engineering. The animal-like LASER-D combines—for the first time—locomotive agility and chemical disinfection to fight COVID-19, among other applications.

Led by professors SK Gupta and Quan Nguyen, a team of seven USC Viterbi master’s students created LASER-D, which builds on an earlier project, a UV disinfection robot, and adapts Nguyen’s legged robot platform.

“This is the first time we’ve combined a legged robot with the disinfection task,” said Nguyen, an assistant professor of aerospace and mechanical engineering. “This can be challenging because we need to maintain mobility while positioning for disinfection. LASER-D conserves energy by walking and positioning its body simultaneously. Thus, we can just use the robot’s orientation to control the spraying of disinfectant, instead of attaching an extra arm to perform this task.”
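
The article describes this orientation-based spraying only at a high level. As a rough illustration of the idea, and not the team’s actual controller, the sketch below computes the body yaw and pitch that would point a body-fixed nozzle at a target spot, assuming the nozzle is mounted along the body’s forward axis and using hypothetical coordinates.

```python
# Minimal sketch of aiming a body-fixed spray nozzle by re-orienting the robot
# body itself (no extra arm). The frames, nozzle mounting and coordinates are
# illustrative assumptions, not the team's actual controller.
import math

def body_orientation_for_target(body_pos, target_pos):
    """Return (yaw, pitch) in radians that point the body's forward axis at target_pos."""
    dx = target_pos[0] - body_pos[0]
    dy = target_pos[1] - body_pos[1]
    dz = target_pos[2] - body_pos[2]
    yaw = math.atan2(dy, dx)                     # rotation about the vertical axis
    pitch = math.atan2(-dz, math.hypot(dx, dy))  # nose-down is positive pitch here
    return yaw, pitch

# Example: spray a spot on the floor 1 m ahead while the body sits 0.4 m high.
yaw, pitch = body_orientation_for_target((0.0, 0.0, 0.4), (1.0, 0.0, 0.0))
print(f"yaw = {math.degrees(yaw):.1f} deg, pitch = {math.degrees(pitch):.1f} deg")
```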

The USC Viterbi master’s students leading this project are Yiyu Chen, Tailun Liu, Anthony Nguyen, Abhinav Pandey, Pornrawee Thonapalin and Ruiqi Wang from the Department of Aerospace and Mechanical Engineering, and Zhiwei Deng from the Ming Hsieh Department of Electrical and Computer Engineering.

LASER-D has been tested around campus. While LASER-D is not yet autonomous, greater autonomy and a larger working distance between the human operator and the robot are long-term development goals.

Said master’s student Abhinav Pandey, “Like a human, the system should be able to identify what has to be disinfected, whether or not the disinfection took place properly during the first round, and whether or not the robot should move on or perform a second round of disinfection. These are all targets of the future version of LASER-D.”

LASER-D has implications beyond sanitization during the current global pandemic. With its ability to move while spraying, the team believes it could be useful in different areas, including agriculture.

“A robot like LASER-D could perform very localized agricultural tasks like precision pesticide dispensing or precision irrigation,” said Gupta, Smith International Professor of Mechanical Engineering and Computer Science.

Cleaning applications that focus less on disinfection and more on aesthetics—for example, cleaning shopping malls and cluttered office spaces—are also possibilities, Gupta noted.

What Does LASER-D See?

LASER-D is an ambitious platform, aimed at becoming an autonomous system that can take on human tasks that are repetitive, tedious and dangerous, and that also require precision.

Crucial in this process is LASER-D’s vision system, which, Pandey said, is based on machine learning. Traditionally, training such a system would have required a lot of data, he said—50,000 images or so. But the team had limited data.

“We trained the machine on a pair of images instead,” Pandey said. “One image was of the surface prior to disinfection and the second image was of the surface post-disinfection. This created a higher level of accuracy in terms of LASER-D identifying whether or not an object had been disinfected—and adequately.”
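
The team’s model is not described in detail in the article. As a minimal sketch of the pair-based idea (comparing a pre-disinfection and a post-disinfection image of the same surface and predicting whether coverage is adequate), here is an illustrative example assuming PyTorch. The architecture, image sizes and variable names are assumptions, not the team’s actual network.

```python
# Minimal sketch of a pair-based "was this surface adequately disinfected?" classifier.
# Assumes PyTorch; the architecture and inputs are illustrative, not the team's model.
import torch
import torch.nn as nn

class DisinfectionChangeDetector(nn.Module):
    """Compares a pre- and post-disinfection image of the same surface."""
    def __init__(self):
        super().__init__()
        # Shared encoder applied to both images.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Classifier over the concatenated pre/post features.
        self.head = nn.Sequential(
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 1),  # logit: adequately disinfected or not
        )

    def forward(self, pre_img, post_img):
        features = torch.cat([self.encoder(pre_img), self.encoder(post_img)], dim=1)
        return self.head(features)

# Example forward pass on dummy 128x128 RGB image tensors.
model = DisinfectionChangeDetector()
pre = torch.rand(1, 3, 128, 128)   # surface before spraying
post = torch.rand(1, 3, 128, 128)  # same surface after spraying
prob = torch.sigmoid(model(pre, post))
print(f"Estimated probability the surface was adequately disinfected: {prob.item():.2f}")
```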

The vision system not only helps the human operator review what the robot is seeing, but it also allows the operator to weigh in during the process, versus only at the end, Pandey said. The dual image recognition also helps deal with a diversity of images—for example multiple types of surfaces in a classroom.

Added Professor Nguyen: “When lighting conditions aren’t good, human eyes might not be able to detect areas that were not coated by the disinfectant; however, the vision system can.”

“This is the most challenging aspect,” Pandey said. “Identifying whether an object is a table or a chair has been done before, but identifying if an area has been disinfected is a new problem we had to solve.”

Identifying the Most Efficient Path

According to Professor Nguyen, LASER-D is programmed to move through its space based on “waypoints,” or particular points defined within the overall map. Human operators are first presented with a map of the overall space and then asked to designate specific waypoints to help guide the robot more accurately and efficiently through the space.
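
The waypoint scheme is described only at a high level in the article. The sketch below is a minimal, illustrative Python version of following an operator-supplied list of waypoints in sequence; the coordinates, step size and tolerance are hypothetical, not the team’s navigation code.

```python
# Minimal sketch of waypoint following: a robot position is driven toward an
# operator-designated list of (x, y) waypoints in order. Names, tolerances and
# the fixed-step motion model are illustrative assumptions.
import math

def follow_waypoints(start, waypoints, step_size=0.05, tolerance=0.1):
    """Yield successive (x, y) positions as the robot walks to each waypoint in order."""
    x, y = start
    for wx, wy in waypoints:
        while math.hypot(wx - x, wy - y) > tolerance:
            # Take a fixed step along the straight line toward the current waypoint.
            heading = math.atan2(wy - y, wx - x)
            x += step_size * math.cos(heading)
            y += step_size * math.sin(heading)
            yield (x, y)

# Operator-designated waypoints on a map of the space (meters, illustrative).
route = [(1.0, 0.0), (1.0, 2.0), (3.0, 2.0)]
path = list(follow_waypoints(start=(0.0, 0.0), waypoints=route))
print(f"Reached the last waypoint after {len(path)} steps, ending near {path[-1]}")
```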

In the future, the researchers hope to enhance the vision system further so the robot can avoid obstacles in its environment by adjusting its path autonomously.

Said Gupta: “Currently the robot has to be very close to the human operator, but in the future, people could be all the way across campus, or even in a completely different location.”

“Locomotion of legged robots in rough terrain is a challenging problem in and of itself,” Nguyen said, “but the problem for us right now is combining locomotion with disinfection.” For example, LASER-D would need to maintain its ability to move and spray surfaces while crouching under a table.

Teamwork in Quarantine 

Collaborative problem-solving is a critical part of an engineer’s toolset. During the global pandemic, working together as a team became an additional hurdle to conducting impactful research.

But the students working on LASER-D were able to overcome these challenges with trial and error and technology.

Said master’s student Ruiqi Wang: “This is the first time I worked on a team of complete strangers. It was pretty hard to bond without any face-to-face interactions, but our shared and ambitious goal helped us to bond. We ended up being a really great team.”

Even remotely, students were exposed to skill areas in which they counted themselves novices. Anthony Nguyen worked on the simulation software and the robot’s operating system. With a big vision and tight time constraints, he said, the team had to hit the ground running. “As a mechanical engineering student, I had no background in C++ coding or anything like that. I was lucky—there was open source code and a lot of resources I could use to learn. The project relied on a lot of coding—we had to use what was available and then engineer further solutions.”

Said Pandey, “The project was ambitious. As a team, we focused on working on problems which had not been solved before. That is our added value.”

