AlphaGalileo is a service for the media.
Humanity came one step closer in January to being able to replicate itself, thanks to the EU’s approval of funding for the Human Brain Project. Danica Kragic, a robotics researcher and computer science professor at KTH Royal Institute of Technology in Stockholm, says that while the prospect of living among humanoid robots calls to mind terrifying scenarios from science fiction, the reality of how humans cope with advances in robotics will be more complex and subtle.
“Robots will challenge the way we feel about machines in general,” Kragic says. “A completely different kind of society is on the way.”
The Human Brain Project will involve 87 universities in a simulation of the cells, chemistry and connectivity of the brain in a supercomputer, in order to understand the brain’s architecture, organisation, functions and development. The project will include testing brain-enabled robots.
“Will we be able to – just by the fact that we can build a brain – build a human? Why not? What would stop you?” Kragic asks.
Nevertheless, consumer-grade robots are a long way from reality, says Kragic, who in addition to serving as Director of KTH’s Centre for Autonomous Systems, is also head of the Computer Vision and Active Perception Lab.
She says that in order for robots to offer some value to households, researchers and developers will have to overcome some daunting technological challenges. Robots will have to multitask and perhaps even have emotional capacities programmed into their logical processes, she says.
“Based on the state of the environment and what is expected of the robot, we want the outcome action to be acceptable to humans,” she says. “Many things that we do are based not just on facts, so should machines somehow have simulated emotions, or not? Either way, it is difficult to predict how that will affect their interaction with humans.”
Kragic sees robots making a largely positive contribution to society. But they will also present some novel problems for which humans have few reference points, such as what the social norms for interacting with robots should be.
“There is a discussion about robot ethics and how we should treat robots,” Kragic says. “It’s difficult to say what’s right and wrong until you are actually in the situation where you need to question yourself and your own feelings about a certain machine – and the big question is how your feelings are conditioned by the fact that you know it’s a machine, or don’t know whether it’s a machine.”
Kragic predicts that one of the most popular consumer applications of robots will be as housekeepers, performing the chores that free up time for their owners. They could also take over jobs that are repetitive, such as operating buses or working in restaurants. On the other hand, the robot industry will expand and create jobs, she predicts.
As for the possibility that one day robots will turn on us, Kragic is skeptical. “A robot rebellion - that’s the ultimate science fiction scenario, right? It’s worth placing some constraints on robots, such as (author Isaac) Asimov’s Three Laws of Robotics. At the same time, we have rules as humans, which we break. No one is 100 percent safe, and the same can happen with machines.”
Human rebellion against robots is far more likely, she says, pointing out that even as society’s attitudes toward automation evolve over generations, the debate over whether humans have the right to “play God” will likely continue. “There will be people for and against it,” she says. “But what is wrong with building a human? We have been raised in a society that thinks this is wrong, that this is playing God.
“Subsequent generations could have a different view.”