Construction Robots that Read Your Mind
Penn State researchers have devised a way for robots to read the minds of their human co-workers and adjust their work accordingly.
One day, a worker will only have to imagine giving a robot a specific command in order to have the robot carry it out. Think of the implications: you think it, and it gets done.
The robots of the not-too-distant future will be helping us with the jobs that would soon tire us out: things like repeatedly lifting heavy objects, building brick walls, or making delivery runs to restock supplies.
But such human-robot cooperation won’t come without drawbacks, said Houtan Jebelli, assistant professor of architectural engineering at Penn State. A worker suspended in a harness 30 feet in the air doesn’t have the ability to type in even a few commands for a robotic helper to follow, for example.
Workers’ mental health can also suffer when they don’t trust the robots around them to perform as needed in such high-stakes situations, Jebelli said.
Anticipating these future scenarios, he and his team are working to design construction robots that act upon workers’ thoughts as they imagine a robot carrying out a specific action, like picking up a saw and moving it to another part of the build site. In their proof-of-concept system, robots acted upon a worker’s imagined commands with 82 percent accuracy.
So how exactly do the robots carry out these nebulous “commands”? Because they can’t glean information from subtle forms of human communication, such as facial expressions or bodily gestures, the robots instead measure that person’s brainwaves, said Jebelli.
Analyzing Commands
The next question becomes: how do the robots access and then make sense of these brainwaves? The workers essentially send their brainwaves to robots via wearable electroencephalogram (EEG) devices that record electrical activity in the brain.
The EEG device sends the brainwave data to a cloud server, where specialized software analyzes it and generates commands. “Simultaneously, these decoded signals will be transferred into the robot’s motion planner for change of action,” he said.
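The decode-and-dispatch loop described above can be sketched in a few lines. This is a hypothetical illustration only: the label names, the `decode_command` mapping, and the `MotionPlanner` class are assumptions for the sketch, not the Penn State team’s actual software.

```python
from dataclasses import dataclass, field

# Illustrative mapping from decoded brain-signal labels to robot actions.
COMMANDS = {
    "imagine_right_grab": "pick_up_object",
    "imagine_left_grab": "release_object",
    "rest": "hold_position",
}


@dataclass
class MotionPlanner:
    """Toy stand-in for the robot's motion planner."""
    queue: list = field(default_factory=list)

    def update(self, action: str) -> None:
        # In a real system this step would replan the robot's trajectory;
        # here we simply record the requested action.
        self.queue.append(action)


def decode_command(label: str) -> str:
    """Translate a classifier label into a robot action; default to holding still."""
    return COMMANDS.get(label, "hold_position")


# Simulate a stream of decoded EEG labels arriving from the cloud.
planner = MotionPlanner()
for label in ["imagine_right_grab", "rest", "unknown"]:
    planner.update(decode_command(label))

print(planner.queue)  # ['pick_up_object', 'hold_position', 'hold_position']
```

Note the defensive default: an unrecognized label maps to `hold_position`, since a misread command on a job site should leave the robot idle rather than moving unpredictably.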
If a worker is too stressed to think straight, the robot will recognize the problem and reduce its pace to provide a safer work environment, he added.
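That stress-aware slowdown might look something like the following minimal sketch. The 0-to-1 “cognitive load” scale, the threshold, and the halved pace are all assumptions made for illustration.

```python
def safe_speed(base_speed: float, cognitive_load: float,
               threshold: float = 0.7) -> float:
    """Halve the robot's pace when the worker's estimated cognitive load is high.

    cognitive_load is assumed to be a normalized 0-1 estimate derived from
    the worker's physiological signals (an assumption of this sketch).
    """
    return base_speed * 0.5 if cognitive_load > threshold else base_speed


print(safe_speed(1.0, 0.9))  # 0.5 -- stressed worker, robot slows down
print(safe_speed(1.0, 0.3))  # 1.0 -- relaxed worker, full pace
```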
The researchers focused on construction robots because they operate in rugged workspaces where the layout and equipment change throughout the day, said Mahmoud Habibnezhad, a postdoctoral fellow who works on the project. Also, people use their hands on construction sites for balance and to do their jobs. They don’t have the time or capability to program and control robots at the same time.
Yet, the people who work on the sites still have to work with and control robots, including drones, he added. A drone might fly in to bring something as minute as a nail to the person who issued the command.
Machine Learning
To develop their system, the researchers had 14 participants view images of specific actions, such as grabbing bricks with their right hands, and then imagine performing those actions. When someone imagines their right hand grabbing an object, the right cortex of their brain generates a higher EEG signal than the left, explained Jebelli.
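The left-versus-right comparison described above can be illustrated with a toy classifier that compares average signal power over electrodes on each hemisphere. The power proxy, the channel grouping, and the decision rule are assumptions of this sketch, not the study’s actual method.

```python
def band_power(samples: list[float]) -> float:
    """Mean squared amplitude of one channel's samples (a crude power proxy)."""
    return sum(s * s for s in samples) / len(samples)


def classify_imagined_hand(right_hemisphere: list[float],
                           left_hemisphere: list[float]) -> str:
    """Guess which hand the worker imagined using, via the asymmetry rule above."""
    right_power = band_power(right_hemisphere)
    left_power = band_power(left_hemisphere)
    # Per the article: imagining the right hand yields a stronger signal
    # over the right cortex than over the left.
    return "right_hand" if right_power > left_power else "left_hand"


# Synthetic readings with stronger activity over the right hemisphere.
print(classify_imagined_hand([0.9, 1.1, 0.8], [0.2, 0.3, 0.1]))  # right_hand
```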
Machine learning algorithms will help the robots refine how they interpret and act on EEG signals, he said.
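One generic way such learning could work is a nearest-centroid classifier trained on per-trial EEG feature vectors: average the features for each imagined action, then assign new trials to the closest average. This is a standard textbook technique used here for illustration, not the team’s actual algorithm.

```python
def centroid(rows: list[list[float]]) -> list[float]:
    """Element-wise mean of equal-length feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]


def train(labeled_trials: dict[str, list[list[float]]]) -> dict[str, list[float]]:
    """Build one centroid per label from labeled training trials."""
    return {label: centroid(rows) for label, rows in labeled_trials.items()}


def predict(model: dict[str, list[float]], features: list[float]) -> str:
    """Assign the label whose centroid is nearest (squared Euclidean distance)."""
    def dist(c: list[float]) -> float:
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(model, key=lambda label: dist(model[label]))


# Toy training data: two-channel power features per imagined action.
model = train({
    "right_hand": [[1.0, 0.2], [0.9, 0.3]],
    "left_hand": [[0.2, 1.0], [0.3, 0.9]],
})
print(predict(model, [0.95, 0.25]))  # right_hand
```

As more labeled trials accumulate from a given worker, retraining the centroids would gradually sharpen the decision boundary, which is one plausible reading of how accuracy improves over time.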
“This level of communication can instill trust in human-robotic controls and facilitate future endeavors in safety design of collaborative robotics,” Jebelli said, adding, “This is one of the first studies that tries to measure and quantify workers’ cognitive load continuously, in near-real-time, based on their physiological responses.”
Just imagine: before long, construction workers may only have to imagine laying bricks to get a robot to actually do it.
Jean Thilmany is a science and technology writer living in St. Paul, MN.