Japanese researchers showed off a brand-new HRP-4C female fashion-model robot at Fashion Week in Tokyo. It (she?) has 42 motion motors to mimic the movements and facial expressions of human fashion models.
Although its walking and facial motions are still far from those of real humans, there's absolutely no question that those problems will be solved in the near future. But how about touch? Current humanoid robots have many sensors to control their walking: accelerometers, gyroscopes, cameras, encoders, and so on, to name a few. Touch sensors (in robotics, "tactile" is the more appropriate term, but let me keep "touch" in what follows) are also used, but they merely deliver foot pressure readings to a central motion controller.
Robots should feel touch because they are designed to operate among us. That means literally the whole surface of a robot should be covered with large, flexible (or at least curved) touch sensors. Many researchers have been working on such sensors. Robots with touch will interact with humans naturally and harmlessly.
Mechanics, AI, control, and robot touch sensors have all been studied for a long time, but I think there is one missing link among these research areas: interpreting touch data. Touch data from robot skins will be very different from ordinary sensor data: it is distributed and large. A robot's brain has to process it very fast and control the body accordingly. Lumelsky et al. [1] pointed out exactly this:
Sensitive skin devices will include thousands and millions of elements that generate and process tremendous amounts of information, in parallel and in real time. This will hence be a new physical basis of information technology. With the eventual ubiquity of the sensing skin on various machinery, it is likely to bring the biggest leap in information technology hardware since the introduction of computers.
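To make "thousands and millions of elements" concrete, here is a rough back-of-the-envelope estimate of the raw data rate of a full-body skin. All of the numbers (skin area, sensor pitch, update rate, sample size) are my own assumptions, not figures from the paper:

```python
# Back-of-the-envelope estimate of raw data from a full-body sensitive skin.
# Every parameter below is an assumed, illustrative value.

skin_area_m2 = 2.0       # assumed total skin area of a humanoid
taxel_pitch_m = 0.005    # assumed 5 mm spacing between tactile elements
sample_rate_hz = 1000    # assumed 1 kHz update rate
bytes_per_sample = 2     # assumed 16-bit pressure reading

taxels = round(skin_area_m2 / taxel_pitch_m**2)
bytes_per_s = taxels * sample_rate_hz * bytes_per_sample

print(f"{taxels} taxels, {bytes_per_s / 1e6:.0f} MB/s raw")  # → 80000 taxels, 160 MB/s raw
```

Even under these modest assumptions, the skin alone produces on the order of 10^5 sensing elements and 10^8 bytes per second that must be handled in real time.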
Because I totally agree with their viewpoints, I quote a few more paragraphs from their paper:
A. Sensing and Dynamic Control
Consider our home Helper mentioned above. When its arm senses an obstacle, the control system must analyze it and modify the motion accordingly. Here, the arm dynamics, sensing resolution, and the allowed speed of motion are all tied in some relationship. For example, if the sensing is “myopic” and the arm is heavy, the Helper will move slower, “fearing” a collision, no matter how good its control system is. Since the Helper’s arm is a highly nonlinear system, realizing good real-time control is a challenging problem of control theory.
B. Need for New Control Theory
Note, for example, that the admittedly complex control system of today’s flying aircraft focuses primarily on achieving desired properties of motion at a single point of the aircraft — say, its center of gravity. Other characteristics, such as accounting for body dynamics, appear as constraints on control. However, when controlling a sensitive skin-equipped machine, the control system should be able to focus intermittently on various single and multiple points of potential collision on the machine’s body, and modify the control accordingly, all within the draconian constraints of real-time operation and changing kinematics of the body. Perhaps a better analogy is the control of a bat flying among tree branches, or attempts of reconfigurable control for the changing shape of a jet fighter in battle. These complications call for novel, exciting control theory.
C. Motion Planning Based on Sensitive Skin Data
This research is likely to make use of tools from graph theory, search algorithms, computational geometry, differential geometry, and topology. One serious issue, largely not addressed today, is the symbiosis of real-time motion planning algorithms with control of the machine’s body dynamics and with nonholonomic constraints on the machine motion. That is, based on the continuous stream of data from the sensitive skin, the planning algorithm not only has to produce collision-free motion for every point of the machine’s body, but it has to do it within the system’s dynamic constraints (masses, accelerations, etc.) and the design constraints of its actuators (e.g., an automobile cannot turn on the spot and must move along a curve).
E. Man–Machine Systems
Human and machine intelligence could be merged in real-time motion planning. Envision, for example, a pilot trying to keep his helicopter hovering low above the ground to collect samples from the rain forest without colliding with the underbrush. Besides informing the pilot of any undesirable contacts with plants below, data from the sensitive skin-covered helicopter underbody can be directly used for automatic control of the hovering height to avoid collision. Prototypes of control systems that combine human and machine intelligence in real time have already been demonstrated.
Though not directly related to touch, Zbikowski [2] made an interesting comparison between a jet fighter and a fly:
Whereas the F-35 Joint Strike Fighter, the most advanced fighter plane in the world, takes a few measurements—airspeed, rate of climb, rotations, and so on—and then plugs them into complex equations, which it must solve in real time, the fly relies on many measurements from a variety of sensors but does relatively little computation.
The fly attains remarkable performance, yet is computationally quite simple, relying on extensive, distributed measurement of parameters of interest. From an engineering viewpoint, this opens up new possibilities in control, as well as in sensors and instrumentation.
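The fly's strategy — many distributed measurements, little computation — is also a natural fit for skin data. As a purely illustrative sketch (not from either paper), here is how a patch of robot skin, modeled as a small pressure grid, can be reduced to distinct contact regions using nothing heavier than thresholding and flood fill:

```python
# Illustrative sketch: extracting contact regions from a tactile patch
# with simple local processing (threshold + flood fill). The grid values
# and threshold are arbitrary, assumed units.

THRESHOLD = 0.5  # pressure above this counts as contact

def contact_regions(grid):
    """Return a list of contact regions, each a set of (row, col) cells."""
    rows, cols = len(grid), len(grid[0])
    seen, regions = set(), []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= THRESHOLD and (r, c) not in seen:
                # flood-fill one connected region of active cells
                region, stack = set(), [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if (y, x) in seen:
                        continue
                    seen.add((y, x))
                    region.add((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] >= THRESHOLD):
                            stack.append((ny, nx))
                regions.append(region)
    return regions

# A 4x5 patch being touched in two separate places
patch = [
    [0.0, 0.9, 0.8, 0.0, 0.0],
    [0.0, 0.7, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.6, 0.7],
    [0.0, 0.0, 0.0, 0.6, 0.0],
]
print(len(contact_regions(patch)))  # → 2
```

Each contact region could then feed a local reflex (back away from this part of the arm) without any central model of the whole body — which is the point of the fly analogy.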
Multi-touch is not just for touch screens…
[1] V. Lumelsky, M. Shur, and S. Wagner, "Sensitive skin," IEEE Sensors Journal, vol. 1, pp. 41-51, 2001.
[2] R. Zbikowski, "Fly like a fly [micro-air vehicle]," IEEE Spectrum, vol. 42, pp. 46-51, 2005.
Links on HRP-4C
- HRP-4C fashion model robot, Boing Boing
- HRP-4C Female Fashion Robot, Aerospace and Paranormal and UFOs News
- HRP-4C Robot Woman Is A Cybernetic Humanoid, Technovelgy.com
- HRP-4C robot ready to put human fashion models out of work, DVICE