Robots have moved from the realms of science fiction and may well cause some social friction as they migrate from the factory environment into society. These new so-called social robots will be interacting more and more with humans in ever-expanding domains. These include physical therapy, care for the sick and elderly, household chores and entertainment. People are afraid that these increased interactions between mankind and robot-kind will be fraught with dangers leading to death and injury.
To allay such fears, and indeed for robots to be successful in their prescribed roles, they will have to be endowed with social behaviours and rules appropriate for their domains of interaction. Humans learn the rules of engagement for a large variety of situations. So the rules of engagement, or behaviour if you wish, in the classroom are different from those in the restaurant or at a social event like a wedding.
In the restaurant we may talk and laugh loudly (well, Trinis tend to be loud) and, of course, eat. That would not be acceptable in a classroom or during a wedding service. Similarly, robots can be endowed with the rules of engagement for their interactions. In the factory environment, which is strictly regulated for both humans and robots, the interactions are quite safe.
The home environment, however, is a lot more complex than that of industry, and hence the robots themselves must be programmed to regard human safety as their highest priority. Collision, with fixed objects, humans or pets, is a danger that immediately comes to mind. The faster the robot moves, the greater the likelihood of a collision. Robots must therefore move at fairly low speeds when they are in the home environment.
Collision avoidance has now become quite a mature technology with which many vehicles are being equipped. Similar technology could easily be adapted to make household or social robots collision-proof. People have come to realise that robots can make quick and good decisions, but there is a lingering fear that robots have no feelings and hence would have neither remorse nor compassion. This could make them dangerous, like cold-blooded killers and criminals.
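The safety logic described above, move slowly, and slow further or stop as obstacles get near, can be sketched in a few lines. This is a toy illustration, not a real robot's control software; the distance thresholds and speed limit are invented for the example.

```python
# A minimal sketch of proximity-based collision avoidance for a household
# robot. The thresholds and speeds below are illustrative assumptions,
# not values from any actual robot.

SAFE_DISTANCE_M = 0.5   # stop if the nearest obstacle is closer than this
SLOW_DISTANCE_M = 1.5   # begin slowing down inside this range
MAX_SPEED = 0.4         # m/s -- household robots move slowly by design

def choose_speed(obstacle_distance_m: float) -> float:
    """Return a safe forward speed given the nearest obstacle distance."""
    if obstacle_distance_m <= SAFE_DISTANCE_M:
        return 0.0  # too close: stop immediately
    if obstacle_distance_m <= SLOW_DISTANCE_M:
        # scale speed linearly between the stop and slow-down thresholds
        fraction = ((obstacle_distance_m - SAFE_DISTANCE_M)
                    / (SLOW_DISTANCE_M - SAFE_DISTANCE_M))
        return MAX_SPEED * fraction
    return MAX_SPEED
```

A real household robot would feed such a rule with readings from sonar, infrared or laser rangefinders, the same kinds of sensors used in vehicle collision-avoidance systems.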
This is far-fetched, as robots do not suffer from the anti-social tendencies and inadequacies of human beings. Nevertheless, the concern is a legitimate one, as humans are gregarious creatures who need and form emotional bonds. So if robots are to provide companionship, in addition to household help, then programming robots with some sort of ability to recognise and show emotions would be necessary.
A few months ago, a Japanese robot named Pepper was launched which, according to its creators, could recognise and demonstrate emotions. The Japanese have invested, and are investing, heavily in this area, as their robots are being targeted for companionship and care of an ageing population. There are many who think that the goal of a caring, companionable robot cannot be achieved. It is, however, very possible, and desirable.
When we are speaking to someone, they generally nod their heads or murmur at appropriate intervals, to acknowledge that they are listening and understanding what we are saying. An undergraduate engineering student can design and build a robot that can nod and murmur "uhmm" at intervals that are either random or periodic, realistically simulating human behaviour. Some may argue that the robot does not understand what we are saying and is merely mimicking human-like actions.
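The student project described above can be simulated in software before any motors are wired up. The sketch below generates the schedule of nods and "uhmm"s over a conversation, at either periodic or random intervals; the function name and the interval values are assumptions made for the illustration.

```python
import random

# A toy simulation of the "nodding listener" robot: it schedules
# acknowledgement cues (a nod or a murmured "uhmm") over a conversation
# of a given length. Intervals may be periodic or random, as described
# in the text. All timings here are illustrative assumptions.

def listening_cues(duration_s, periodic=False, seed=None):
    """Return a list of (time_s, cue) acknowledgement events."""
    rng = random.Random(seed)
    cues = []
    t = 0.0
    while True:
        # periodic mode: a fixed 3-second beat; random mode: 2-5 seconds
        gap = 3.0 if periodic else rng.uniform(2.0, 5.0)
        t += gap
        if t >= duration_s:
            break
        cues.append((round(t, 1), rng.choice(["nod", "uhmm"])))
    return cues
```

On real hardware, each event would trigger a servo motion for the nod or play a short sound clip for the murmur, which is well within the reach of a hobby microcontroller board.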
But is it not true that many a time we are unable to tell whether the human to whom we are speaking really cares about what we are saying, and may in fact be mocking or humouring us? We can be sure that the robot would not.