Stop reading for a moment, close your eyes and touch your elbow. Easy? That’s because of something called proprioception: the internal awareness of your own body in relation to the environment. Often referred to as a “sixth sense,” it means you can move through space without the need to see or feel every aspect of that movement.
It’s what allows you to type, dance, cook, or steer a bicycle. It’s also thought to be one of the basic physiological building blocks of sentience or bodily self-awareness — and it’s something we humans take for granted, but robots still struggle with.
“Sure, a Roomba vacuum robot easily moves around the house, but we wouldn’t consider it to be self-aware,” said Jasmine Berry, a recent PhD computer science graduate focusing on brain theory. “Its internal states are pre-programmed to avoid certain objects and it learns the space it’s in over time. But that’s very different from biological proprioception.”
Berry, who was a member of the USC Brain Body Dynamics Laboratory, hopes to change that.
In the future, she envisions biologically inspired robots working and living alongside humans. If we can create machines with human-like proprioceptive awareness, the thinking goes, we might be one step closer to building truly intelligent AI in the form of self-aware machines.
“I would love to see self-aware robots that can autonomously operate themselves and construct a subjective perspective of the world,” said Berry.
Above: Jasmine Berry presents at the Artificial Life conference.
In the lab of Berry’s co-advisor, Professor Francisco Valero-Cuevas, robots are built with artificial nervous systems, so their body representations come closer to those of vertebrates than to those of traditional robotic systems.
In fact, Valero-Cuevas has been investigating artificial proprioception in tendon-driven systems for several years. For instance, a cat-like robot developed in his lab currently uses evolutionary algorithms to teach itself how to behave in a feline manner.
His work with Berry started from an interesting premise: how does the body influence self-awareness in the brain?
It turns out, body movement and proprioception are inextricably linked. Movement produces reams of information that, when processed by the brain, give you an understanding of your body representation and its relationship to the environment.
“You’re getting a flood of sensory information into your nervous system all the time and your brain has to make sense of that,” said Berry. “My research looks at how the brain forms a body representation out of this flood of sensory information.”
A robot with simulated physiological proprioception, said Berry, could build an internal representation of its body and how it moves in physical space. It could then use that model to guide its unique set of behaviors.
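As a toy illustration of that idea (a hedged sketch, not the lab's actual model; the arm, its link lengths, and every name below are hypothetical), a robot could infer part of its own body representation, here the lengths of its two arm segments, purely from its own movement data:

```python
# Hypothetical sketch: a robot "discovers" its own arm segment lengths
# from movement data alone (joint angles plus fingertip positions),
# building a crude self-model instead of being given one.
import math

# Ground truth the robot does NOT know: a 2-link planar arm (lengths in metres).
TRUE_L1, TRUE_L2 = 0.30, 0.25

def fingertip(theta1, theta2, l1=TRUE_L1, l2=TRUE_L2):
    """Forward kinematics of the real (unknown-to-the-robot) arm."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# The robot "babbles": it sweeps its joints and observes where its hand lands.
samples = [(0.1 * i, 0.05 * i + 0.3) for i in range(1, 40)]
observations = [(t1, t2, *fingertip(t1, t2)) for t1, t2 in samples]

# Fingertip position is linear in the unknown lengths (l1, l2):
#   x = l1*cos(t1) + l2*cos(t1+t2),  y = l1*sin(t1) + l2*sin(t1+t2)
# so the robot can estimate them by least squares; for two parameters the
# normal equations can be accumulated and solved by hand (Cramer's rule).
a11 = a12 = a22 = b1 = b2 = 0.0
for t1, t2, x, y in observations:
    for c1, c2, target in ((math.cos(t1), math.cos(t1 + t2), x),
                           (math.sin(t1), math.sin(t1 + t2), y)):
        a11 += c1 * c1; a12 += c1 * c2; a22 += c2 * c2
        b1 += c1 * target; b2 += c2 * target

det = a11 * a22 - a12 * a12
l1_est = (b1 * a22 - b2 * a12) / det
l2_est = (a11 * b2 - a12 * b1) / det
print(round(l1_est, 3), round(l2_est, 3))  # recovers ~0.30 and ~0.25
```

The point of the sketch is only that a body model can be an output of movement rather than an input to it: nothing about the arm's geometry is hard-coded into the estimator.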
To do this, Berry and Valero-Cuevas turned to Gestalt theory: perceptual grouping laws from psychology that describe how humans organize raw sensory information into coherent percepts.
These laws are usually applied to visual information. The pair instead applied them to the sensory and motor processing of proprioceptive signals from muscle spindles, the stretch receptors within the body of a muscle.
They were able to create a computational model that processes proprioception much as humans do. Not only that, they provided a methodology for simulating it in robots with artificial nervous systems, using algorithms designed to mimic each of these laws. The robots, developed in Valero-Cuevas’ lab, are known as NeuRoBots.
“The goal is that, if you understand all these aspects that go into sensing your body in the world, then you can recreate them using algorithms,” said Berry.
For example, the bodily movement of transitioning from sitting to squatting employs the Gestalt rules of continuity and closure. The idea is that, using these rules, a NeuRoBot will be able to generate its own self-model completely from scratch, the way humans do.
The NeuRoBot’s self-model will also let it accurately execute tasks, such as picking up a block and placing it in a box or opening a door, without requiring specific training for each task. Ultimately, Valero-Cuevas and Berry posit, a combination of these laws in the sensorimotor space may prompt the emergence of a self.
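One way to picture a Gestalt-style grouping law acting on proprioceptive signals (again a toy sketch, not the NeuRoBot algorithms; the muscles, joints, and thresholds below are invented) is "continuity" over time: spindle-like signals that co-vary are grouped onto the same inferred joint, a first step toward a body map built from raw sensory flow:

```python
# Toy sketch of continuity-based grouping: simulated muscle-spindle signals
# that change together over time are assumed to belong to the same joint.
import math
import random

random.seed(0)

def spindle_signal(joint_angle, gain=1.0, noise=0.02):
    """Stretch-receptor reading: roughly proportional to muscle length."""
    return gain * joint_angle + random.gauss(0.0, noise)

# Two joints moving independently; four spindles, two per joint.
T = 200
elbow = [math.sin(0.1 * t) for t in range(T)]
knee  = [math.sin(0.07 * t + 1.0) for t in range(T)]

spindles = {
    "biceps":     [spindle_signal(a, gain=+1.0) for a in elbow],
    "triceps":    [spindle_signal(a, gain=-1.0) for a in elbow],
    "hamstring":  [spindle_signal(a, gain=+1.0) for a in knee],
    "quadriceps": [spindle_signal(a, gain=-1.0) for a in knee],
}

def correlation(x, y):
    """Pearson correlation of two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = math.sqrt(sum((a - mx) ** 2 for a in x))
    vy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (vx * vy)

# "Continuity" grouping: spindles whose signals strongly co-vary
# (|r| near 1, including antagonist pairs with r near -1) share a joint.
groups = []
for name in spindles:
    for group in groups:
        if abs(correlation(spindles[name], spindles[group[0]])) > 0.9:
            group.append(name)
            break
    else:
        groups.append([name])

print(groups)  # expect biceps/triceps on one joint, hamstring/quadriceps on the other
```

Nothing tells the program which sensors belong together; the structure of the body emerges from the statistics of its own movement, which is the flavor of self-organization the article describes.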
A big leap
It’s a big leap from body representations to robots thinking profound thoughts, but Berry believes that the difference is one of degree. “There are many different definitions of self-awareness. And within each and every one of those definitions, there’s a gradient, or a spectrum of how self-aware that entity is,” said Berry.
“We’re a long way from humanoid robots taking over the world. But if you give these machines the ability to build a model of themselves on their own or give them their own identity, that will have an effect on how they’re able to perceive the world in their own unique way, without having human operators or human programmers to explicitly do that for them.”
Crucially, this could ultimately make robots much more resilient. Consider driverless cars: you want them to detect, reliably, that something has gone wrong. A self-model also affords robots more flexibility in moving from one task to another, since the same model can be used to learn and plan the new task.
There is also a feedback loop: biologically inspired robots can help us better understand humans, serving as test beds for physiology and biology without the need to operate on a person or animal.
“For me, why we build robots in this way is because we see them as becoming more integrated into society and we’ll need to interact with them, just like we interact with other human beings,” said Berry, who plans to continue this line of research in industry after graduation.
“And because of that, we need to build these robots in a way that can understand human interactions, our behaviors, so we can live together feasibly and safely. To have robots and systems running around with their own identities, and with a very small flavor of self-awareness would be a dream of mine, so I am going to start the only way we know how—one bit at a time.”
Published on October 30th, 2020
Last updated on May 16th, 2021