June 25th, 2010 by Aaron Saenz
With its cyclops eye and shiny plastic shell, the Myon robot may not exactly look like your typical eight year old, but its designers hope that you’ll think there’s a vague resemblance. The Myon project is aimed at understanding how robots could gain and develop language skills. For that, it needs to interact easily with human partners, and so it’s been given the rough size and shape of a young child. Unlike any human kid, however, each of Myon’s limbs is completely modular, with its own power supply and controls. Hack off an arm and the robot can continue working with no problems. Myon recently made its debut at the DMY International Design Festival in Berlin. Check out a simplified version of the bot taking a walk in the video below.
Robots designed to look and learn like children aren’t a new concept. We’ve seen the iCub work its way towards understanding visual cues and eye contact, and the Diego-san robot from UCSD and Kokoro was made for cognitive experiments (as well as terrifying everyone who sees it). Without seeing Myon tackle actual language tasks, it’s hard to know how it stacks up to these other projects. However, Myon is part of the ambitious and rather successful Artificial Language Evolution on Autonomous Robots (ALEAR) project, which is a point in its favor.
Where Myon seems to stand out is its hardware. It has 192 sensors, 48 joints, and a touchscreen on its chest. Most impressively, it has a modular design so that it can function even after losing a major portion of its body. Myon was developed by the Lab of Neuro Robotics at Humboldt University in Germany, in collaboration with design firm Frackenpohl Poulheim. They hope that the modular design may help answer questions such as what happens when a learning robot swaps out a limb with an untrained sibling. Its shiny shell is made of Makrolon, a polycarbonate product from Bayer Material Sciences. The external covering has structural importance – it provides support against the torsion forces generated in movement. Still, I’m not too impressed with the walking skills shown in the video below. It’s not a bad bit of locomotion, but I’ve seen better from Lego bots.
I think it’s kind of interesting that the pursuit of understanding artificial cognition almost always pushes researchers into acts of mimicry. Myon and similar projects use robots in the rough shape of children. The Blue Brain Project is trying to simulate a human neocortex. We’ve also seen robots helping shed light on evolutionary forces in animals. All of this copy-catting is cool to watch, but I always wonder: when will we just let robots be robots? The most impressive thing about Myon is that it might be trained and then share that education by trading away the relevantly experienced limb. That’s a style of learning unique to robots (and bacteria, I guess). In the future it should be interesting to see whether projects like Myon can find original ways to study cognition, with or without following the human intelligence paradigm.
A warning to the machines: stop trying to train your kids to learn like human children, or I’ll start training my (hypothetical) human children to learn like robots.