September 13, 2010
by Niall Firth
It sounds like something straight out of Stanley Kubrick’s 2001: A Space Odyssey.
But, in a chilling echo of the computer HAL from the iconic film, scientists have developed robots that are able to deceive humans and even hide from their enemies.
An experiment by researchers at the Georgia Institute of Technology is believed to be the first detailed examination of robot deception.
The team developed computer algorithms that let a robot ‘decide’ whether it should deceive a human or another robot, and gave it strategies offering the best chance of not being found out.
The development may alarm those who are concerned that robots capable of deception are not safe to work alongside humans.
But researchers say that robots that are capable of deception will be valuable in the future, particularly when used in the military.
Robots on the battlefield with the power of deception will be able to successfully hide and mislead the enemy to keep themselves and valuable information safe.
‘Most social robots will probably rarely use deception, but it’s still an important tool in the robot’s interactive arsenal because robots that recognise the need for deception have advantages in terms of outcome compared to robots that do not recognise the need for deception,’ said the study’s co-author, Alan Wagner, a research engineer at the Georgia Tech Research Institute.
A search-and-rescue robot, for instance, may need to deceive a panicking victim in order to calm them down or gain their cooperation.
The results were published online in the International Journal of Social Robotics.
To develop programs that successfully produced deceptive behaviour, the researchers looked at how one robot could attempt to hide from another.
Their first step was to teach the deceiving robot how to recognise a situation that warranted the use of deception.
Wagner and his co-author, Ronald Arkin, used interdependence theory and game theory to develop algorithms that tested the value of deception in a specific situation.
A situation had to satisfy two key conditions to warrant deception – there must be conflict between the deceiving robot and the seeker, and the deceiver must benefit from the deception.
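The two conditions described above can be sketched as a simple payoff comparison. This is a hypothetical illustration, not the authors' actual code; the function name, payoff tables, and numbers are all assumptions.

```python
# Sketch of the study's two-condition test for when deception is
# warranted. Each agent has a payoff for the honest outcome and for
# the deceptive outcome (arbitrary illustrative numbers below).

def deception_warranted(deceiver_payoffs, seeker_payoffs,
                        honest_outcome, deceptive_outcome):
    """Return True only if both conditions from the study hold."""
    # Condition 1: conflict -- the outcome the deceiver prefers is one
    # the seeker would act against (their interests are opposed).
    conflict = (deceiver_payoffs[deceptive_outcome] > deceiver_payoffs[honest_outcome]
                and seeker_payoffs[deceptive_outcome] < seeker_payoffs[honest_outcome])
    # Condition 2: benefit -- deceiving must leave the deceiver
    # strictly better off than acting honestly.
    benefit = deceiver_payoffs[deceptive_outcome] > deceiver_payoffs[honest_outcome]
    return conflict and benefit

# Hider gains by deceiving; seeker loses -- deception is warranted.
deceiver = {"honest": 0, "deceive": 5}
seeker = {"honest": 5, "deceive": 0}
print(deception_warranted(deceiver, seeker, "honest", "deceive"))  # True
```

If either condition fails, say the robots' interests are aligned, the check returns False and the robot would behave honestly.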
Once a situation was deemed to warrant deception, the robot carried out a deceptive act by laying a false trail about its movements.
The robot was even able to tailor its deception based on how much it knew about the particular robot it was trying to trick.
To test their algorithms, the researchers ran 20 hide-and-seek experiments with two autonomous robots. Coloured markers were lined up along three potential pathways to locations where the robot could hide.
The hider robot randomly selected a hiding location from the three location choices and moved toward that location, knocking down coloured markers along the way.
Once it reached a point past the markers, the robot changed course and hid in one of the other two locations. The presence or absence of standing markers indicated the hider’s location to the seeker robot.
‘The hider’s set of false communications was defined by selecting a pattern of knocked over markers that indicated a false hiding position in an attempt to say, for example, that it was going to the right and then actually go to the left,’ explained Wagner.
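The false trail Wagner describes can be sketched in a few lines. This is a minimal illustration of the idea, assuming three hiding locations; the function and variable names are hypothetical, not taken from the experiment's software.

```python
import random

# Three paths of coloured markers lead to three hiding spots. The
# hider knocks over the markers on a decoy path, then doubles back
# and hides at one of the OTHER two locations, so the standing and
# knocked-down markers point the seeker the wrong way.

def lay_false_trail(locations=("left", "centre", "right")):
    decoy = random.choice(locations)  # path whose markers get knocked down
    # Double back: actually hide somewhere the markers do not indicate.
    true_hiding_spot = random.choice([loc for loc in locations if loc != decoy])
    return decoy, true_hiding_spot

decoy, spot = lay_false_trail()
# The seeker, reading the knocked-down markers, searches the decoy
# path while the hider sits at a different spot -- the two never match.
assert decoy != spot
```

The 25 percent failure rate reported below came not from this selection logic but from the physical act of knocking over the wrong markers.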
The hider robots were able to deceive the seeker robots in 75 percent of the trials, with the failed experiments resulting from the hiding robot’s inability to knock over the correct markers to trick the ‘finding’ robot.
‘The experimental results weren’t perfect, but they demonstrated the learning and use of deception signals by real robots in a noisy environment,’ said Wagner.
‘The results were also a preliminary indication that the techniques and algorithms described in the paper could be used to successfully produce deceptive behavior in a robot.’
August 9th, 2010
By: Heidi Blake
The humanoid machine, called Nao, hunches its shoulders when it feels sad and raises its arms for a hug when it feels happy.
It has been designed to mimic the emotional skills of a one-year-old child and is capable of forming bonds with people who treat it with kindness.
Nao is able to detect human emotions through a series of non-verbal “cues”, such as body language and facial expressions, and becomes more adept at reading a person’s mood through prolonged interaction.
It uses video cameras to detect how close a person comes and sensors to work out how tactile they are.
The wiring of the robot’s “brain”, designed to mirror the neural network of the human mind, allows it to remember its interactions with different people and memorise their faces.
This understanding, along with a set of basic rules about what is “good” and “bad” for it, allows the robot to indicate whether it is “sad” or “happy”.
The actions used to display each emotion are preprogrammed but Nao decides by itself which feeling to display, and when.
“We’re modelling the first years of life,” said Lola Cañamero, a computer scientist at the University of Hertfordshire who led the project to create Nao’s emotions.
“We are working on non-verbal cues and the emotions are revealed through physical postures, gestures and movements of the body rather than facial or verbal expression.”
Cañamero believes that robots will act as human companions in future.
“Those responses make a huge difference for people to be able to interact naturally with a robot,” she said.
“If people can behave naturally around their robot companions, robots will be better-accepted as they become more common in our lives.”
Nao was developed as part of a project called Feelix Growing, funded by the European Commission.
Though some scientists believe that robots could be used to help around the house, or to care for the elderly, in the future, others have warned that the humanoids could spin out of control and attack their owners by accident.
July 23, 2010
By: Rhodri Phillips
A ROBOT billed as a ‘toaster on legs’ has smashed a world record by walking 14.3 miles in 11 hours.
The Ranger, built by US scientists, set the record for ‘untethered robotic walking’ earlier this month.
Guided by students using a remote control, the odd-looking robot strode 108.5 times around a 212m indoor track.
It completed about 70,000 steps before it needed a recharge.
Andy Ruina, 57, lab manager on the project at New York’s Cornell University, said: “The Ranger is nothing special to look at but the motion is rather graceful in a way you don’t see in many robots.”
The Ranger, whose parts alone cost 15,000, took four years to perfect; it runs on batteries and costs less than 1p to travel four miles.
It smashed the previous world record set by Boston Dynamics’ BigDog in Boston by 1.5 miles.
The Ranger is energy efficient because it copies the physics of human walking, using gravity and momentum to roll its legs forward.
While the movements of other robots are controlled by motors, the Ranger has a more laid-back gait.
July 14, 2010
South Korea has tested two robots with surveillance, tracking, firing and voice recognition systems, integrated into a single unit, a defence ministry spokesman said.
The 400 million won (£220,000) unit was installed last month at a guard post in the central section of the Demilitarised Zone which bisects the peninsula, Yonhap news agency said.
It quoted an unidentified military official as saying the ministry would deploy sentry robots along the world’s last Cold War frontier if the test was successful.
The robot uses heat and motion detectors to sense possible threats, and alerts command centres, Yonhap said.
If the command centre operator cannot identify possible intruders through the robot’s audio or video communications system, the operator can order it to fire its gun or 40mm automatic grenade launcher.
South Korea is also developing highly sophisticated combat robots armed with weapons and sensors that could complement human soldiers on battlefields.
It has a largely conscripted military of 655,000 against Pyongyang’s 1.2 million-strong force, but a falling birth rate means Seoul will struggle in the future to maintain troop numbers.
April 27, 2010
Imagine something between a computer game and a pet that helps make you slim. One inventor did just that and came up with Autom — a robot that will look dieters in the eye and tell them what they need to hear.
Users can have daily conversations with the 38-centimetre-tall (15-inch) robot, which will crunch calories and provide feedback and encouragement on their weight-loss progress.
For those who hate manuals — there isn’t one. Switch Autom on and it’s ready to go.
Its blue eyes open and its head swivels as a computer inside its head searches for a human face in front of it and maintains eye contact.
“Hello, I’m Autom! Press one of the buttons below to talk to me,” it says in a robotic female voice with an American accent. “I’m ready to get started. Let’s keep working together.”
Users tap their details onto the robot’s screen in response to its spoken questions about weight, diet, exercise regime and goals. Over time it builds up a knowledge of the dieter’s strengths and weaknesses and tailors its questions and advice accordingly.
The information is also processed to provide graphs on their progress and habits over time.
The brainchild of Cory Kidd, a graduate of the Massachusetts Institute of Technology with a doctorate in human-robot interaction, Autom hits the US market later this year, retailing for about 500 dollars.
The 80-billion-dollar US weight loss market has already been targeted by Nintendo with its Wii Fit and My Weight Loss Coach games but Kidd is banking on Autom offering dieters a more personalised way of using technology to slim down.
It is a so-called sociable robot, a new generation of robots that adapt their behaviour in order to interact with humans.
Autom looks fairly simplistic, with a head and neck attached to a rectangular box-shaped body on two stumpy legs. Its face has no nose and only the hint of a mouth.
But the cutting-edge field of human-robot interaction combines insights from the social sciences as well as technology and medicine.
“It draws heavily on human psychology — so understanding how we as people interact with one another,” Kidd told AFP. “It relies on cues that people use in everyday communication.”
April 23, 2010
A US Air Force unmanned spacecraft has blasted off from Florida, amid a veil of secrecy about its military mission.
The robotic space plane, or X-37B, lifted off from Cape Canaveral atop an Atlas V rocket at 7:52 pm local time (2352 GMT) Thursday, according to video released by the military.
“The launch is a go,” Air Force spokeswoman Major Angie Blair told AFP.
The lift-off appeared to proceed as planned without major problems, judging by the commentary in the Air Force webcast.
Resembling a miniature space shuttle, the plane is 8.9 meters (29 feet) long and has a wing-span of 4.5 meters.
The reusable space vehicle has been years in the making and the military has offered only vague explanations as to its purpose or role in the American military’s arsenal.
The vehicle is designed to “provide an ‘on-orbit laboratory’ test environment to prove new technology and components before those technologies are committed to operational satellite programs,” the Air Force said in a recent release.
Officials said the X-37B would eventually return for a landing at Vandenberg Air Force Base in California, but did not say how long the inaugural mission would last.
“In all honesty, we don’t know when it’s coming back,” Gary Payton, deputy undersecretary for Air Force space programs, told reporters in a conference call this week.
Payton said the plane could stay in space for up to nine months.
Flight controllers plan to monitor the vehicle’s guidance, navigation and control systems, but the Air Force has declined to discuss what the plane is carrying in its payload or what experiments are scheduled.
Pentagon officials have sidestepped questions about possible military missions for the spacecraft, as well as the precise budget for its development — estimated at hundreds of millions of dollars.
The results of the test flight will inform “development programs that will provide capabilities for our warfighters in the future,” Payton said.
Industry analysts have speculated the Pentagon must have military capabilities in mind for the unmanned spacecraft or else would not have invested so much time and money in the effort.
The space plane — manufactured by Boeing — began as a project of NASA in 1999, and was eventually handed over to the US Air Force Rapid Capabilities Office.
Once in space, the vehicle is powered by solar cells and lithium-ion batteries.
The Air Force has plans for a second X-37B, scheduled to launch in 2011.
February 26, 2010
By Carolyn Thompson
One of the first things Mike Ameroso asked while contemplating robotic surgery for his prostate cancer was how many surgeries his doctors had done with the robot.
He liked the idea of the robot’s smaller incision and steady miniature “hands” and the promise of less pain and a quick recovery — but had his doctors put in time at the controls?
After all, “an aircraft is only as good as the pilot who flies it,” concurred Thenkurussi Kesavadas as he and Ameroso took part Thursday in the rollout of a new robotic surgery simulator that lets surgeons practice endlessly in a field that’s growing by leaps and bounds.
The “RoSS” simulator closely approximates the touch and feel of the widely used da Vinci robotic surgical system. It was developed through a collaboration between the Roswell Park Cancer Institute and University at Buffalo, where Kesavadas heads the Virtual Reality Lab.
Nearly all prostate surgeries in the United States are now performed by robot, with doctors peering through a viewfinder at a magnified image and moving instruments in the air to control the ones inside the patient. Robotic systems are increasingly being used in everything from weight loss surgery to children’s operations.
Ameroso’s successful 2007 surgery made him a believer. The 68-year-old Amherst resident came out of it not only cancer-free but pain-free and with only a half-inch incision.
But “it is never about the machine,” said Dr. Khurshid Guru, a surgeon and director of the Center for Robotic Surgery at Roswell Park in Buffalo. “What’s more important than the machine is the person who manages or operates the machine.”
Guru and Kesavadas co-founded a spin-off company, Simulated Surgical Systems LLC, to commercialize the RoSS simulator and have already taken five orders for the roughly $100,000 machines.
The simulator uses virtual reality technology developed over 10 years at UB to let surgeons practice anything from cutting tissue and sewing incisions to full procedures and versions of procedures where complications arise.