robotik

HUMAN INTERACTION

Kismet can produce a range of facial expressions.
If robots are to work effectively in homes and other non-industrial environments, the way they are instructed to perform their jobs, and especially how they will be told to stop, will be of critical importance. The people who interact with them may have little or no training in robotics, so any interface will need to be extremely intuitive. Science fiction authors also typically assume that robots will eventually be capable of communicating with humans through speech, gestures, and facial expressions, rather than through a command-line interface. Although speech would be the most natural way for the human to communicate, it is quite unnatural for the robot. It will be quite a while before robots interact as naturally as the fictional C-3PO.
Speech recognition: Interpreting the continuous flow of sounds coming from a human (speech recognition), in real time, is a difficult task for a computer, mostly because of the great variability of speech. The same word, spoken by the same person, may sound different depending on local acoustics, volume, the previous word, whether or not the speaker has a cold, and so on. It becomes even harder when the speaker has a different accent.[52] Nevertheless, great strides have been made in the field since Davis, Biddulph, and Balashek designed the first "voice input system", which recognized "ten digits spoken by a single user with 100% accuracy" in 1952.[53] Currently, the best systems can recognize continuous, natural speech, up to 160 words per minute, with an accuracy of 95%.[54]
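The timing variability described above, where the same word is spoken faster or slower, is one reason classic recognizers used dynamic time warping, which aligns two sequences that differ in speed before comparing them. A minimal sketch of the idea in Python, using toy one-dimensional "feature" sequences rather than real audio:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences.

    Allows one sequence to be a stretched or compressed version of
    the other -- exactly the timing variability that makes spoken
    words hard to match sample-by-sample.
    """
    INF = float("inf")
    n, m = len(a), len(b)
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # skip a sample of b
                                 d[i][j - 1],      # skip a sample of a
                                 d[i - 1][j - 1])  # match both
    return d[n][m]

# A "template" word and the same shape spoken twice as slowly:
template = [0, 1, 2, 3, 2, 1, 0]
slow     = [0, 0, 1, 1, 2, 2, 3, 3, 2, 2, 1, 1, 0, 0]
other    = [3, 3, 3, 0, 0, 0, 3]

# DTW matches the stretched version far better than the unrelated one.
print(dtw_distance(template, slow) < dtw_distance(template, other))  # True
```

Real systems work on spectral feature vectors rather than scalars, and modern recognizers use statistical models instead, but the alignment problem this sketch illustrates is the same.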
Gestures: One can imagine, in the future, explaining to a robot chef how to make a pastry, or asking directions from a robot police officer. On both of these occasions, making hand gestures would aid the verbal descriptions. In the first case, the robot would be recognizing gestures made by the human, and perhaps repeating them for confirmation. In the second case, the robot police officer would gesture to indicate "down the road, then turn right". It is quite likely that gestures will make up a part of the interaction between humans and robots.[55] A great many systems have been developed to recognize human hand gestures.[56]
Facial expression: Facial expressions can provide rapid feedback on the progress of a dialog between two humans, and soon they may do the same for humans and robots. Hanson Robotics has constructed robotic faces from Frubber, whose elastic rubber facial coating, together with embedded subsurface motors (servos), allows a wide range of facial expressions.[57] The coating and servos are built on top of a metal skull. A robot should know how to approach a human, judging by their facial expression and body language. Whether the person is happy, frightened, or crazy-looking affects the type of interaction expected of the robot. Likewise, robots like Kismet and the more recent addition, Nexi,[58] can produce a range of facial expressions, allowing them to have meaningful social exchanges with humans.[59]
Artificial emotions: Artificial emotions can also be embedded, composed of a sequence of facial expressions and/or gestures. As can be seen from the movie Final Fantasy: The Spirits Within, the programming of these artificial emotions is complex and requires a great amount of human observation. To simplify this programming, the producers of the film created presets together with a special software program, which dramatically reduced the time required to make the film. These presets could possibly be transferred for use in real-life robots.
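One simple way such presets could be represented on a robot face is as named servo keyframes, with intermediate poses interpolated over time. The sketch below is purely illustrative: the servo names, angles, and timings are invented, and a real animated face would have many more actuators and keyframes per emotion.

```python
# Illustrative sketch: emotion "presets" as servo keyframes.
# Servo names, angles, and durations here are assumptions, not
# taken from any real robot.

PRESETS = {
    # Each preset: list of (time_seconds, {servo: angle_degrees}).
    "smile": [
        (0.0, {"mouth_corner_l": 0, "mouth_corner_r": 0, "brow": 0}),
        (0.5, {"mouth_corner_l": 25, "mouth_corner_r": 25, "brow": 5}),
    ],
    "frown": [
        (0.0, {"mouth_corner_l": 0, "mouth_corner_r": 0, "brow": 0}),
        (0.5, {"mouth_corner_l": -20, "mouth_corner_r": -20, "brow": -10}),
    ],
}

def pose_at(preset, t):
    """Servo angles at time t, linearly interpolated between the
    first and last keyframe of a preset (two-frame presets only,
    to keep the sketch short)."""
    frames = PRESETS[preset]
    (t0, p0), (t1, p1) = frames[0], frames[-1]
    if t <= t0:
        return dict(p0)
    if t >= t1:
        return dict(p1)
    w = (t - t0) / (t1 - t0)
    return {servo: p0[servo] + w * (p1[servo] - p0[servo]) for servo in p0}

print(pose_at("smile", 0.25))  # halfway into the smile
```

The appeal of presets, in film and in robotics alike, is that an expression is authored once and then replayed or blended, rather than re-animated servo by servo each time.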
Personality: Many of the robots of science fiction have a personality, and that is something which may or may not be desirable in the commercial robots of the future.[60] Nevertheless, researchers are trying to create robots which appear to have a personality:[61][62] i.e. they use sounds, facial expressions, and body language to try to convey an internal state, which may be joy, sadness, or fear. One commercial example is Pleo, a toy robot dinosaur, which can exhibit several apparent emotions.[63]

TWO ROBOT SNAKES

Two robot snakes: the left one has 64 motors (with 2 degrees of freedom per segment), the right one 10.
Snaking: Several snake robots have been successfully developed. Mimicking the way real snakes move, these robots can navigate very confined spaces, meaning they may one day be used to search for people trapped in collapsed buildings.[43] The Japanese ACM-R5 snake robot[44] can even navigate both on land and in water.[45]
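The serpentine gait these robots mimic is commonly generated by driving each joint with the same sinusoid, phase-shifted along the body, so that a travelling wave propagates from head to tail (the "serpenoid curve" introduced by Hirose). A sketch of the joint-angle computation, with amplitude and frequency values chosen arbitrarily for illustration:

```python
import math

def serpentine_joint_angles(n_joints, t, amplitude=0.5,
                            phase_step=2 * math.pi / 8, omega=1.0):
    """Joint angles (radians) for a snake robot at time t.

    Each joint follows the same sinusoid, offset in phase along the
    body, producing a travelling wave that pushes the robot forward.
    All parameter values here are illustrative assumptions.
    """
    return [amplitude * math.sin(omega * t + phase_step * i)
            for i in range(n_joints)]

# Ten joints (like the smaller snake in the photo) at t = 0:
angles = serpentine_joint_angles(10, t=0.0)
print([round(a, 3) for a in angles])
```

Varying the amplitude, phase step, and wave speed changes how tightly the body curves, which is how such robots adapt their gait to confined spaces.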
Skating: A small number of skating robots have been developed, one of which is a multi-mode walking and skating device, Titan VIII. It has four legs with unpowered wheels, which can either step or roll.[46] Another robot, Plen, can use a miniature skateboard or roller skates to skate across a desktop.[47]
Swimming: It is calculated that some fish can achieve a propulsive efficiency greater than 90% when swimming.[48] Furthermore, they can accelerate and maneuver far better than any man-made boat or submarine, and produce less noise and water disturbance. Therefore, many researchers studying underwater robots would like to copy this type of locomotion.[49] Notable examples are the Essex University Computer Science Robotic Fish[50] and the Robot Tuna built by the Institute of Field Robotics, to analyze and mathematically model thunniform motion.[51]
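Mathematical models of thunniform motion typically describe the body's lateral deflection as an amplitude envelope, growing toward the tail, multiplied by a travelling wave. A sketch of one such commonly used form; the coefficient values are illustrative assumptions, not taken from any specific robot:

```python
import math

def body_wave(x, t, c1=0.02, c2=0.08, k=2 * math.pi, omega=2 * math.pi):
    """Lateral deflection y(x, t) of a fish-like body wave.

    y(x, t) = (c1*x + c2*x^2) * sin(k*x - omega*t)

    x is the normalized position along the body (0 = nose,
    1 = tail tip); the quadratic envelope keeps the head nearly
    still while the tail sweeps widely, as in thunniform swimmers.
    Coefficients here are illustrative only.
    """
    return (c1 * x + c2 * x * x) * math.sin(k * x - omega * t)

# The head barely moves while the tail produces most of the thrust:
print(abs(body_wave(0.1, 0.25)) < abs(body_wave(1.0, 0.25)))  # True
```

A robot fish controller would sample such a curve along the body and drive its tail joints to track it over time.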

Friday, April 3, 2009