Artificial eyes and “guts” are not the only features of robots that will soon become reality, as researchers at Carnegie Mellon University and University of California at Berkeley have found.
The researchers have shown that robots can have “eyes” and “tears” similar to those of humans, a combination other researchers have described as “bio-mechanical.”
Their paper was published in the journal Nature Communications in January.
“It seems that [the robots] have a kind of physical structure,” said lead author Anastasia Efimov, an assistant professor in the department of robotics engineering.
“There are two distinct features.
The first is that they have the capacity to move,” she told Business Insider in an interview.
“This means they have muscles that move, including muscles that move the eyes.”
The second feature is that, in order to move, they need some source of mechanical power, which can be stored as electrical or chemical energy.
“That energy can be turned into kinetic energy that drives a motor,” she explained.
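As a back-of-the-envelope illustration of that energy-to-motion relationship (the figures below are assumptions for illustration, not values from the paper), stored energy divided by motor power gives a rough runtime:

```python
# Rough sketch: how long can stored energy drive a motor?
# All figures here are illustrative assumptions, not values from the study.

stored_energy_wh = 50.0   # assumed battery capacity, in watt-hours
motor_power_w = 25.0      # assumed motor draw, in watts

# Energy (watt-hours) divided by power (watts) gives runtime in hours.
runtime_hours = stored_energy_wh / motor_power_w
print(runtime_hours)  # → 2.0 hours of continuous motion
```

Under these assumptions, a 50 Wh store driving a 25 W motor yields two hours of motion before the energy is exhausted.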
Efimov explained that if the robot has a task to perform, such as fetching something, it needs a certain amount of power to do so, and this is what the researchers examined in their study.
In this case, the researchers were interested in seeing how the muscles in the robot could be manipulated to move in certain ways.
The robots have four eyes, each with a pupil the same size as a human’s.
The pupils are connected to a pair of lenses, each of which is connected to a motor that moves the robot’s pupil.
This mechanism, called “eye movement,” can be seen in the video below.
As the researchers noted in the paper, the robots’ eyes could be made large enough to be clearly visible, yet small enough that their movements looked like those of a real human eye.
“We also showed that the muscles that move these eyes could also drive the muscles connected to the motor,” Efimov said.
“So in order for a robot to make a movement, there needs to be a motor which can drive the eye movement, and it’s a very good idea that these motors have a little bit of flexibility.”
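The article doesn’t give implementation details, but a motor driving an eye movement toward a gaze target is often sketched as a simple proportional controller; the function, gain, and angles below are illustrative assumptions rather than anything from the paper:

```python
# Minimal sketch of motor-driven eye movement as a proportional controller.
# The gain, update count, and angles are illustrative assumptions,
# not parameters from the study.

def step_eye(angle_deg: float, target_deg: float, gain: float = 0.5) -> float:
    """Move the eye a fraction of the remaining distance to the target."""
    error = target_deg - angle_deg
    return angle_deg + gain * error

angle = 0.0           # eye starts looking straight ahead
target = 20.0         # gaze target, 20 degrees to one side
for _ in range(10):   # each iteration is one motor update
    angle = step_eye(angle, target)
print(round(angle, 2))  # → 19.98, converging toward 20 degrees
```

The gain is where the “little bit of flexibility” Efimov mentions would live: a lower gain makes the eye settle more slowly and smoothly, a higher gain makes it snap toward the target.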
As the robot is moved around, it has to change direction, so the researchers wanted to know if the movement could also be controlled by the robot’s eyes.
To test this, the robots moved around while also performing actions that would not be possible if their eyes were fixed on the ground.
The robot can “sense” the movements of both the human and itself, allowing it to see what is happening in the environment around it.
It can also take pictures of objects, such as the way it “sees” the human body in the image below.
And it can even see a human in the next room, which is something it can’t do with its eyes on the floor.
“In the end, we showed that these are very similar to our human eyes, and we think that they could be a big step toward replacing the human eye in robots,” Efimov said.
The authors of the paper said that they did not have a way to tell whether the robots’ eyes are “seeing” the world or merely looking through a window, and they didn’t know if there were other “gaps” in the eyes that the robots could be using.
They also said they didn’t know whether the robot can be programmed to move its eyes in specific ways, or to have an eye that doesn’t move when a human is looking at it.
“You can’t really see what’s going on in the head of the robotic eye,” Efimov said in the interview.
That being said, she added that they were confident that the technology could be “a huge boon for robots” and that there were “many ways that the eye could be controlled.”
This is a big leap forward for robots, Efimov said, because “people can’t control the eyes of a robot.”
Efimov told Business Insider that the research has not only helped researchers “improve their understanding of robots” but also “reduced the complexity of robotics.”
Efimov said she was “extremely proud” of the research, and that “people are going to be able to have very different roles in the future.”
Efimov told Business Insider that she plans to start using robots to help “build cities that can be automated.”
For example, an engineer could take a robot and make it more efficient, something robotics may enable in the near future.
For more, check out the video above.