Our fascination with Japan's fascination with robots continues. The Expo 2005 Prototype Robot Exhibition in Aichi Prefecture ended on Sunday after eleven days of synthetic marvel — from dancers to companions, surveyors to guides, laborers to entertainers, musicians to simulacrums, the Exhibition presented sixty-three robots and countless possibilities. Alongside the hardwired societal hopefuls were the ethical, aesthetic and metaphysical questions that have followed robotics through decades of straddling science and science fiction. When is a robot a supplement and a welcome substitute? When does it displace a human being who could and should have undertaken the task it performs? Are programming flaws worth the trade for man's emotional shortcomings?
One of the robots, Kawada Industries' humanoid HRP-2, played a wadaiko based on the movements of a master percussionist, much as a MIDI driver executes a sequenced musical score. Nicknamed "Promet" by its body designer, anime old hand Yutaka Izubuchi, HRP-2 is a personified study in bipedal robotic agility, equilibrium, recovery and cooperative physical activity. Kawada intends to "rent" HRP-2 out to private and public designers alike as "a humanoid robot R&D platform," an arrangement that sounds remarkably like open source. The Expo drum performance, likely intended for the headlines and pool photographs it duly received, does pose an interesting challenge to the human sense of musical rhythm and tempo. Mechanical timekeeping is nothing new: since the introduction of commercially available sequencers in the early 1980s, music, pop especially, has almost completely conformed to a standard of quantization, or the limitation of musical time values to a discrete set. Just as high compression is today's familiar and requisite sound, consumers' ears have grown accustomed to steady beats. A bubblegum tune without a dozen dancehall remixes is a song that will not sell, and remixing is practically impossible with arbitrary tempo fluctuations.
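That standard is simple enough to sketch. The toy Python function below is written for illustration only, not drawn from any actual sequencer's code: it snaps recorded note onsets to the nearest point on a fixed grid, and every human fluctuation smaller than the grid simply disappears.

    # Illustrative sketch of sequencer-style quantization (not any particular
    # product's implementation): note onsets measured in beats are snapped to
    # the nearest subdivision of the beat, here a sixteenth-note grid.

    def quantize(onsets_in_beats, grid=0.25):
        """Snap each onset (in beats) to the nearest multiple of `grid`.

        grid=0.25 corresponds to a sixteenth-note grid in 4/4 time.
        """
        return [round(onset / grid) * grid for onset in onsets_in_beats]

    if __name__ == "__main__":
        # A slightly loose human performance of four straight quarter notes.
        played = [0.02, 0.97, 2.05, 3.01]
        print(quantize(played))  # [0.0, 1.0, 2.0, 3.0] -- the drift is gone

Run on a real performance, the same operation flattens exactly the push and pull that a listener registers as feel.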
At the same time, some minimalist compositions of the late twentieth century, particularly those of Steve Reich, are both aurally and theoretically compelling precisely because they are live performances on traditional instruments of music normally associated with, and relegated to, synthesizers and sequencers. Metronomic and repetitive, a Reich piece like the 1970 Drumming would fail if produced electronically; on marimbas and glockenspiels it is a seventy-minute masterpiece, wholly dependent on the happy medium between human consistency and imperfection. A friend's old college classmate was recently praised by Reich for the "tightest performance" of Music for 18 Musicians the composer had ever attended. Run through a computer, there would have been no tension, no risk; excitement is a difficult element to code. Digital clocks may prevail in the making of organized sound, but the subtle judgment that brings nuance, as Kawada's HRP-2 project demonstrates, is yet beyond replication.
Which is not to say that robotics designers are without their share of little victories:
Repliee Q1 appeared yesterday at the 2005 World Expo in Japan, where she gestured, blinked, spoke, and even appeared to breathe. Shown with co-creator Hiroshi Ishiguro of Osaka University, the android is partially covered in skinlike silicone. Q1 is powered by a nearby air compressor, and has 31 points of articulation in its upper body. Internal sensors allow the android to react "naturally." It can block an attempted slap, for example. But it's the little, "unconscious" movements that give the robot its eerie verisimilitude: the slight flutter of the eyelids, the subtle rising and falling of the chest, the constant, nearly imperceptible shifting so familiar to humans.
Or ethical dilemmas. How soon until the Nexus-6?
UNCANNY: Repliee in motion.