Archive | May, 2010

Ready for the Robotics Virtual Conference?

The Robotics Summit Virtual Conference Series starts soon, and I’m curious to see how it goes. The first event is Autonomy, Navigation, & Mobility Solutions on June 16.

I’ve worked as an exhibitor at dozens of industrial trade shows and conferences in the past, from the general industrial robotics market (Robot & Vision) to specific processes (Packaging, Assembly, Semiconductor, Solar, Medical Manufacturing, etc.), so I know what it takes to prepare and staff a booth in the real world. But I’m not entirely sure what goes on at a virtual event.

Instead of talking face to face with someone about their technology, I’ll be “chatting live” with people from the company. This just conjures up memories of the online customer service from companies like Comcast.

Some of the presentations should be worthwhile – after all, it’s the same presentation whether I’m watching it in a room crammed with people or at my desk.

It’s the physicality of robots that I’m going to miss, though. I’m in this industry because I like robots – I like making stuff move with just a program, and I like the intricacy of the mechanical parts all moving together. The robotics world (industrial and mobile) was created by tinkerers and hands-on geeks, people who like to take things apart and make them better (or at least different). I don’t know if you can generate the same level of excitement about a gadget with a virtual conference.

Despite my reservations, I’ll be attending, and am looking forward to learning a bit. If you’re interested, check out the agenda and register (it’s free, so what have you got to lose?). I’m curious to know what other people get out of it as well. I must say, though, as an engineer at an industrial robotics company I’m a little more interested in the next conference, New Applications for Industrial Robotics on September 15. I’ll try to give a full report here of what I learn at the event, and how I think it goes.


Flying robot with amazing agility

Autonomous UAV

Normally I don’t pay a whole lot of attention to flying vehicle tech. It’s not that I’m not interested – on the contrary, there are some really cool videos out there. It’s just not my area of expertise; I’m not all that familiar with them. But like any good tech addict, I love a good demonstration of amazing ability.

One such demonstration (found thanks to Evan at BotJunkie) is this Quadrotor autonomous UAV, from GRASP Lab at the University of Pennsylvania.


The innovative people at the GRASP Lab seem to be working on a number of cool projects, including being one of the 11 members of Willow Garage’s PR2 Robot Beta Program.

PR2 Robot

Willow Garage PR2 Robot

Their proposal for the PR2:

University of Pennsylvania, GRASP Laboratory with the proposal PR2GRASP: From Perception and Reasoning to Grasping

The GRASP Lab proposal aims to tackle some of the challenges facing household robotics. These challenges include tracking people and planning for navigation in dynamic environments, and transferring handheld objects between robots and humans. Their contributions will include giving PR2 a tool belt to change its gripper on the fly, helping it track and navigate around people, and performing difficult two-arm tasks like opening spring-loaded doors.


The Future Is Very Far Away (but in the same room)

At the Stanford Robot Block Party in April (part of National Robotics Week) I got to spend a little time with one of Willow Garage’s engineers via Texai, their telepresence robot. It turns out they’ve had engineers telecommuting using these devices for quite a while. This is a whole new strain of telecommuting: not just working on your laptop in your pajamas from home while logged in to the corporate network, but actually being able to roam the halls, hang out by the water cooler, and catch up with coworkers.

Figure 1: Texai telepresence at Robot Block Party (not me in the picture)

Thanks to some recent reporting by Singularity Hub at a fundraiser in San Francisco (Tony Robbins, Sergey Brin Become Robots – The Telepresence Revolution), we see a Google co-founder and other “stars” of the tech world rubbing mechanical elbows with people. Expect to see a great deal more of these devices at media-covered events and parties, as acceptance of mechanical visitors increases and the requisite infrastructure (bandwidth, power) becomes available.


But you should also prepare to see more of them in “the real world.” I’d say nearly everyone reading this has used a conference call before, and the majority have probably used an online meeting service such as WebEx or GoToMeeting, or even made a video call using Skype or iChat. But those don’t really get you there. Voice chat and webcams just don’t give you the sense of actually being at the event. Being able to whirl around to see who’s sneaking up on you, or to scan the room and read the expressions on people’s faces, can really change the way you experience a remote event.

Are they perfect? Far from it. The telepresence devices I’ve seen so far don’t seem particularly stable, they don’t appear to be able to handle terrain more adventurous than shag carpet, and their reliance on off-the-shelf components makes them seem like a jumble of random stuff stuck together (monitor, webcam, wifi router, mobile base).

I anticipate a rapid maturing of this technology. In general, people seem to be very accepting of the devices roaming through a crowd. We’ll be seeing them get a little more friendly-looking, more capable, and cheaper. The QB from AnyBots is one example, although without a screen to announce who you are, it’s probably going to make people feel like they’re being spied on. And that’s important – telepresence isn’t about spying on your friends or coworkers from far away, that’s what surveillance cameras are for. It’s about joining the conversation as a person.

Joining the conversation as a person is where the future of telepresence lies.

How far is it going to go? Just consider how many people are striving to make a humanoid robot. Consider the movie “Surrogates”. It didn’t get a lot of attention, and wasn’t really that good. But it’s worth watching for anyone interested in the future of robotic technology, because something like it may well arrive in the foreseeable future. In the movie, most people interact with the world solely through a “surrogate”, a mechanical body made to look like them, and driven remotely from the safety of their own bed. That’s pretty far off, but the seeds of that future may already be planted.

Consider telepresence the “killer app” for those who are trying to make robots that look like people. Creepy people, but people.

