Dear Commons Community,
Professor Matthias Scheutz of Tufts University and his team have programmed several robots to read one another’s thoughts. Below is an article describing this development, courtesy of Science on Tap – American Association for the Advancement of Science. Here is a link to YouTube videos demonstrating Scheutz’s robots.
Mind Sharing Robots Learn from Each Other
By Suzannah Weiss
Imagine you’re an astronaut aboard a spacecraft orbiting Mars, scouting it out for a colonization mission. But suddenly, there’s a breakdown. Luckily, two autonomous robots on board each know what the other is doing and perceiving, without any outward communication, and are able to get to work repairing the ship. This special capability is known as “mind sharing.”
Professor Matthias Scheutz of Tufts University and his team created this exact scenario in a virtual reality environment and had humans navigate it under two conditions: when the robots had mind-sharing capabilities and when they didn’t. They discovered that when the robots mind shared, the task was completed more quickly, and more tubes aboard the spacecraft were repaired.
While this Mars mission may exist only in virtual reality, robot mind sharing does not. Scheutz’s team has programmed several real-life robots to essentially read one another’s thoughts, working to coordinate tasks like one giant hive mind.
The technology began with a neural chip that ran on one computer, which multiple robots could access to rapidly process images. “That was how we set up a system that allowed robots to access a joint resource, and it became a way to share anything on the fly on an as-needed basis,” Scheutz recounts.
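The shared-resource idea described above can be illustrated with a minimal sketch. All names here (`SharedPerceptionService`, `Robot`, the toy classification rule) are hypothetical, not the team's actual system; the point is simply that several robots query one joint processing resource on an as-needed basis instead of each running its own model.

```python
class SharedPerceptionService:
    """One processing resource that any robot can query on demand."""

    def classify(self, image):
        # Stand-in for the shared neural chip: label by a trivial rule.
        return "tube" if sum(image) > 10 else "unknown"

class Robot:
    def __init__(self, name, service):
        self.name = name
        self.service = service  # joint resource, shared by all robots

    def perceive(self, image):
        # No local model needed; borrow the shared one as needed.
        return self.service.classify(image)

service = SharedPerceptionService()
r1, r2 = Robot("r1", service), Robot("r2", service)
print(r1.perceive([5, 6, 7]))  # both robots use the same joint resource
print(r2.perceive([0, 1, 2]))
```

Because both robots hold a reference to the same service, anything that resource learns or computes is instantly available to every robot, with no per-robot copies to keep in sync.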
Scheutz was inspired by a concept in organizational psychology called shared mental models — people’s conceptions of the teams they belong to and the tasks they’re collaborating on. The more similar the mental models within a group, the more successful it tends to be.
“In the case of robots, we can organize and build them so there’s no longer a need to continuously update them,” he explains.
While communication among machines isn’t new, what is unique about this system is that robots can share tasks even if the physical structures performing those tasks are very different. For example, robots with two different types of arms can still share information about how to pour liquid out of a container. In addition, a robot that lacks a certain component can essentially borrow it from another robot; for instance, one that needs to pick up an object, but cannot see, could use visual information from another robot’s camera to obtain the object.
Scheutz sees many applications for this technology, one being the coordination of tasks by household robots.
“If you teach one robot in the kitchen context how to slice a cucumber, then if another robot is being taught how to make salad and you get to the point where you’re trying to explain how to slice a cucumber, it already knows how to do that,” he explains. “It’s very practical because you basically save the effort of retraining every single machine.”
This same setup can also benefit robots in industrial settings, like factories. If each robot is being taught by a human, they all will get smarter and smarter in tandem.
In another, more lighthearted application, shared on the Tufts University Human-Robot Interaction Laboratory’s website, robots synchronize a dance performance as each picks up on moves taught to other members of the ensemble.
Scheutz is currently in talks with several companies interested in adapting his technology. One of them, Thinking Robots, Inc., is integrating it with office robots so that they can seamlessly delegate tasks within a workplace. A video on the Thinking Robots website shows two robots — a mobile platform and a stationary arm — responding to a person’s command to pick up and deliver an object, each knowing which part of the task to complete. Scheutz expects that these robots will be on the market within a year or so.
We could also soon see these kinds of robots popping up in stores, where they could work together to stock shelves, clean up items that have been dropped or spilled, or fetch staff members when customers need help.
The next step for Scheutz is to figure out how these machines can better accommodate human preferences. For instance, it seems that people don’t like to be left out of robots’ conversations. In Scheutz’s research, if someone told one robot to instruct another to leave the room, participants preferred that the robot did this out loud, even if it did not need to.
“Humans have to do it the hard way by communicating explicitly and giving each other updates, and robots don’t have to do that,” Scheutz explains. Going forward, we’ll have to strike a balance between taking advantage of this capability and allowing it to coexist with our own limited, self-contained minds, Scheutz says.