The $16,000 Robot That Lets You Be in Two Places at Once

May 1, 2014, 11:44 PM

Wish I Were There

The Beam telepresence robot lets you be in two places at once.

Seth Stevenson via Beam telepresence robot.

Photo by Juliana Jiménez Jaramillo.

To view the promotional video for the Beam Pro telepresence robot is to glimpse a strange, disquieting future. Observe as a corporate executive (ensconced in the comfort and privacy of her own home) fires up her global fleet of remote robot slaves, logging into each one, seeing what they see and hearing what they hear. She leads a meeting during which she commands flesh-and-blood underlings from within the fortress of a cold, robot shell. She sneaks up deskside on a surprised colleague, using the robot’s near-silent motile capabilities. And—perhaps most chillingly—she engages in a hallway conversation with another robot, screen facing screen, motors whirring. Scary days are just over the horizon, my fellow humans.

Telepresence robots had a coming-out moment in March when NSA leaker Edward Snowden used a Beam to appear in robot form on stage at a TED Talk. It was unclear how using the Beam was better than simply projecting Snowden’s face onto a screen at the front of the auditorium. But the novelty intrigued me: I couldn’t help but wonder what life was like from behind the controls of one’s own personal, mechanized avatar.

My borrowed Beam arrived at Slate’s New York office packed in a giant trunk that looked like something out of a magician’s attic. When I unclasped its locks and rolled out the robot, I encountered a big screen mounted atop a pair of long poles that emerged from a motorized, wheeled base. I plugged in the Beam’s charging platform, linked the robot to the office Wi-Fi, and, after a few hiccups and a quick call to tech support, I was operating my electrically powered buddy.


You control the Beam using the arrow keys on your laptop, receiving a constant audio/video feed of the physical environment surrounding the robot. It’s amazingly intuitive, and I had no difficulty maneuvering through the office from my perch on a couch in the waiting area. A front-facing camera and set of microphones on the Beam let you see with the robot’s eyes and hear with its ears as it rolls around. A downward-facing camera is there for safety, so you can tell whether your wheels are about to hit any obstructions. The webcam and microphone on your own computer let you broadcast your face onto the robot’s screen and throw your voice over its speakers.

Seth Stevenson is a frequent contributor to Slate. He is the author of Grounded: A Down to Earth Journey Around the World.

The first thing I noticed was how well the thing generally works. The video feed is clear, and you can zoom in if necessary (like, say, if someone wants to show the robot something small on a piece of paper). The microphones are sensitive, and I could hear clearly as I moved about the room (even picking up the moments when people talked rudely about the robot behind its back).

The Beam moves swiftly and silently—so silently that I could roll right into colleagues’ offices without them realizing I’d entered until I said “hi.” I nudged one friend’s desk chair with my wheels, prompting her to turn around and gasp in surprise when she found herself suddenly staring at my enormous video face hovering over her shoulder.

I did have a couple of connectivity issues. The robot shuddered to a halt in the office kitchen area and, while I could still see and hear, I lost movement control. “Hey,” I shouted to a startled woman who walked past, “can you roll me back out into the hallway?” She kindly obliged, and the robot whirred back to life. In another instance, I lost all contact in the middle of a robot-to-human conversation in my editor’s office. The Beam automatically flashed up a message on its screen asking any nearby human to gently jostle it until it reawakened.

I also found it hard to express nonverbal cues using the robot’s body. When I’m in the office and I stop at someone’s desk to chat, I’m able to indicate when it’s time for the conversation to wrap up so I can walk away. This may involve subtle movements—turning the body slightly, or backing up half a step—that suggest an imminent departure. But these sorts of subtleties are impossible to convey when operating the Beam. Any motion the robot makes is a choice by the user to press a button, and thus cannot be passed off as subconscious body language. The Beam is blunt, and socially inept.