It's spring training at Carnegie Mellon's MultiRobot Lab. On a 6-by-4-meter, green-felt field, little robot dogs run through drills: shooting, passing, goaltending. Every Wednesday, the Sony AIBOs line up for a full scrimmage, their heads swiveling to find the ball and their rumps pointed to the sky. It's last week's code against this week's code—may the best robots win.
The CMU robot dogs, known as CMDash'05, are the defending champions in the four-legged division of the RoboCup U.S. Open. After going for a repeat title in May, they'll head for the RoboCup world championships in Osaka, Japan, and a potential matchup with the juggernaut defending champs from Germany. The championships will include more than 150 robot teams in five leagues: simulation, small-size, middle-size, four-legged, and humanoid. Each division has the same goal: "By the year 2050, develop a team of fully autonomous humanoid robots that can win against the human world soccer champion team."
Will a team of robots beat the World Cup champions—or at least the best team in MLS—in our lifetime? After a couple of days spent watching foot-long, plastic dogs waddle after a bright orange ball, I admit it's hard to imagine. But even if it does take longer than a half-century, the robot-soccer scientists will one day meet their objective. And when the robots do kick us into submission, it won't be because they've unlocked some dominant strategy or because they've dramatically surpassed our stamina, coordination, and flexibility. It'll be because the robots have finally learned to see as well as you and I do every day.
Computer scientists have proved pretty successful at tackling strategic challenges. When researchers started playing around with robot soccer 10 years ago, they ran into the "Little League" problem: Every little bot dumbly chased after the ball. Today's robo-athletes look more like middle-schoolers: Some teams play zone defense, and several have developed effective passing routines. In RoboCup's small-size league, where the simple robots receive perfect information about the state of the game, the most effective teams change tactics during the match to exploit their opponents' failings. Watching these little bots play even a rudimentary game undermines your faith that some peculiar human intelligence is required to win at sports. It's not so hard to imagine that, within a decade, we might be learning soccer (and football and basketball) strategy from machines.
Robots still have a lot to learn from the human body, however. The coordination and balance required for everyday activities, much less professional athletics, set a very high bar for a bipedal being of our size and shape. Human stamina, which might seem outclassed against a team of tireless terminators, is likewise difficult to match. Keeping a human-sized robot running, jumping, and kicking for 90 minutes will require spectacular feats of energy storage and generation.
Even so, robots are getting more agile all the time. Honda's ASIMO can run (albeit less than 2 mph), and Sony's new QRIO balances on uneven terrain and does something rhythmic its creators call "dancing." Last month, NASA's Jet Propulsion Laboratory hosted a showcase for artificial muscles. It's now a matter of getting these beasts pumped up to sprint faster and with greater coordination—and without being plugged into the wall.
The eyes, though, still present a huge problem. Today's soccer bots primarily use color-coding to identify objects on the field: white lines, orange ball, yellow-and-pink posts. But color-coding is a very brittle way to perceive the world. If the tint or intensity of the lighting changes, so do the colors—if a cloud passes overhead, the robots are effectively blind. To confuse them, a human team would need only to keep the ball in the air, where the shifting background and light would make the colors almost impossible to identify.
The researchers I spoke with all marveled at the complexity of human visual perception, how we're able to gather data in a distracting environment and assemble it into a seamless, real-time model of the world. But with the help of enhanced visual algorithms like edge detection, the robots are slowly improving. A few RoboCups ago, spectators couldn't wear orange shirts for fear of confusing the competitors. This year, the championships will be played under ambient light, and organizers hope matches will move outdoors in the very near future.
So, let's assume the day has arrived: A fleet of robot Pelés can beat the Brazilian national team. Will we let them? The best professional athletes may fall back on a singularly human quality: cowardice. Will a future Freddy Adu or LeBron James be willing to risk the embarrassment—and the loss of endorsement dollars—that will come with losing to a bucket of bolts?
After a human soccer team finally does agree to play a bunch of mechanized superbots, there will be all sorts of fair-play questions, too. Will the robots be allowed to surpass human strength and speed? Will they be permitted to recharge at halftime? How about wireless communication of superaccurate field-position data? (Current FIFA rules are silent on the subject of nonhuman players.)