Archive | mobile robots

Flying robot with amazing agility

Autonomous UAV

Normally I don’t pay much attention to flying-vehicle tech. It’s not that I’m not interested (there are some really cool videos out there); it’s just not my area of expertise. But like any good tech addict, I love a good demonstration of amazing ability.

One such demonstration (found thanks to Evan at BotJunkie) is this quadrotor autonomous UAV from the GRASP Lab at the University of Pennsylvania.

[youtube=http://www.youtube.com/watch?v=MvRTALJp8DM]

The innovative people at the GRASP Lab seem to be working on a number of cool projects, including being one of the 11 members of Willow Garage’s PR2 Robot Beta Program.


Willow Garage PR2 Robot

Their proposal for the PR2:

University of Pennsylvania, GRASP Laboratory with the proposal PR2GRASP: From Perception and Reasoning to Grasping

The GRASP Lab proposal aims to tackle some of the challenges facing household robotics. These challenges include tracking people and planning for navigation in dynamic environments, and transferring handheld objects between robots and humans. Their contributions will include giving PR2 a tool belt to change its gripper on the fly, helping it track and navigate around people, and performing difficult two-arm tasks like opening spring-loaded doors.


The Future Is Very Far Away (but in the same room)

At the Stanford Robot Block Party in April (part of National Robotics Week) I got to spend a little time with one of Willow Garage’s engineers via Texai, their telepresence robot. It turns out they’ve had engineers telecommuting using these devices for quite a while. This is a whole new strain of telecommuting: not just working on your laptop in your pajamas while logged in to the corporate network, but actually being able to roam the halls, hang out by the water cooler, and catch up with coworkers.

Figure 1 Texai telepresence at Robot Block Party (not me in the picture)

Thanks to some recent reporting by Singularity Hub from a fundraiser in San Francisco (Tony Robbins, Sergey Brin Become Robots – The Telepresence Revolution), we see Google co-founders and other “stars” of the tech world rubbing mechanical elbows with people. Expect to see a great deal more of these devices at media-covered events and parties as the acceptance of mechanical visitors increases and the requisite infrastructure (bandwidth, power) becomes available.

[youtube=http://www.youtube.com/watch?v=KIett_q3eHo]

But you should also prepare to see more of them in “the real world.” Nearly everyone reading this has used a conference call, and the majority have probably used an online meeting service such as WebEx or GoToMeeting, or made a video call using Skype or iChat. But those don’t really get you there. Voice chat and webcams just don’t give the same sense of actually being at the event; being able to whirl around to see who’s sneaking up on you, or to scan the room and read the expressions on people’s faces, can really change the way you experience a remote event.

Are they perfect? Far from it. The telepresence devices I’ve seen so far don’t seem particularly stable, they don’t appear to handle terrain more adventurous than shag carpet, and their reliance on off-the-shelf components makes them look like a jumble of random parts stuck together (monitor, webcam, wifi router, mobile base).

I anticipate a rapid maturing of this technology. In general, people seem to be very accepting of the devices roaming through a crowd. We’ll be seeing them get a little more friendly-looking, more capable, and cheaper. The QB from AnyBots is one example, although without a screen to announce who you are, it’s probably going to make people feel like they’re being spied on. And that’s important – telepresence isn’t about spying on your friends or coworkers from far away, that’s what surveillance cameras are for. It’s about joining the conversation as a person.

Joining the conversation as a person is where the future of telepresence lies.

How far is it going to go? Just consider how many people are striving to make a humanoid robot. Consider the movie “Surrogates”. It didn’t get a lot of attention, and wasn’t really that good, but it’s worth watching for anyone interested in the future of robotic technology, because something like it may be in our foreseeable future. In the movie, most people interact with the world solely through a “surrogate”: a mechanical body made to look like them, driven remotely from the safety of their own bed. That world is still a long way off, but the seeds may already be planted.

Consider telepresence the “killer app” for those who are trying to make robots that look like people. Creepy people, but people.


Robot waiters, bartenders

From Singularity Hub:

Bangkok’s First Robot Restaurant Full of Motoman Waiters (video) Singularity Hub

[youtube=http://www.youtube.com/watch?v=cpzrcnMimUA]

OK, so I’ve seen Motoman’s DexterBot before, most notably as the RoboBar:

[youtube=http://www.youtube.com/watch?v=fnnblb5HbYs]

And I think it’s really cool. It shows a very different concept of human-robot interaction and collaboration. What’s most unique about this, in my opinion, is that it’s an industrial arm that’s been built to do more humanoid things, as opposed to many of the mechanisms out there, which seem to come from the hobby field. Mechanisms from such different backgrounds, while both considered “robots”, have very different characteristics, good and bad.

The industrial robots are going to be structurally much more solid, because the manufacturers have had a decade or two to fine-tune them for their markets: typically assembly and factory lines, and (especially for articulated arms) welding and painting. These applications require a very stiff mechanism, so the engineers have tuned the resonant frequencies to reduce jitter and improve settling time, packed in the most efficient motors and gearboxes, and balanced the weight of the links against the power required to move them and the resulting performance. They are marvels of engineering, but tend to be rather expensive ($40–60k for a reasonably small robot arm), and are often ugly and clunky-looking.

(As an example, see the SV-3 from Motoman, the precursor to the HP3 used in the DexterBot.)

Motoman SV-3

On the other hand, mechanisms that come from the hobby industry tend to be significantly cheaper, sexier, and friendlier-looking.

But the DexterBot is starting to bridge that gap. It’s derived from an industrial arm (the HP3), but it has no use in an industrial environment. It’s almost completely useless to the current factory market: way too slow to be a suitable replacement for human workers, and way too expensive to be competitive with automation solutions. Likewise, it has almost no value in the non-industrial market for the same reasons; it’s certainly too slow and expensive to compete with a minimum-wage cook.

So what’s the point?

It’s a step. A money-loser, but a market-grower. It opens the eyes of the industrial robot manufacturers to the concepts of human-robot collaboration, shared workspaces, and consumer uses for industrial products. Honda’s Asimo robot wasn’t cost-effective either, but it did its share of climbing obstacles (figuratively and literally) to open our eyes to real walking robots. Don’t expect to see the DexterBot serving you drinks at your local dive bar, but do expect more industrial robots to show up in consumer-friendly spaces.


National Robotics Week

National Robotics Week 2010 has come and gone, but left some cool robotics tech in its wake.

Wednesday’s Robot Block Party at Stanford was a blast; I got to see some really cool displays from Willow Garage, Intuitive Surgical, and Adept Technology, plus the Stanford Stickybot and a DARPA Urban Challenge vehicle.

Robot Block Party:

[youtube=http://www.youtube.com/watch?v=LQfw_NI5gHc]

Then on Friday Adept had an Open House with lots of demos to show:

A WiiMote-controlled demo featuring an Adept Quattro robot picking and placing parts on a moving robot, controlled by a user wielding a Nintendo Wii controller:

[youtube=http://www.youtube.com/watch?v=mnJqYHp4vHw]

Here, the visitor draws graffiti onto an Apple iPad using GraffitiAnalysis. The tags are uploaded to a server, then downloaded by the robot controller and replayed in real time:

[youtube=http://www.youtube.com/watch?v=LrsjVvy8HVQ]
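For the curious, the data flow in that demo (tablet uploads a tag, robot controller downloads and replays it) could be sketched roughly like this. This is purely illustrative Python; the names `TagServer` and `replay_tag` are my own inventions, not anything from Adept or GraffitiAnalysis, and a real controller would stream the points as Cartesian moves rather than just collect them:

```python
import json
from queue import Queue

class TagServer:
    """Stands in for the web server that buffers uploaded graffiti tags.
    (Hypothetical: the real demo's server protocol isn't public.)"""
    def __init__(self):
        self._tags = Queue()

    def upload(self, tag_json: str) -> None:
        # Tablet side: POST a finished tag to the server
        self._tags.put(tag_json)

    def download(self) -> str:
        # Robot side: fetch the next pending tag
        return self._tags.get_nowait()

def replay_tag(tag_json: str) -> list:
    """Turn a tag's stroke points into a motion path for the robot."""
    tag = json.loads(tag_json)
    return [(pt["x"], pt["y"]) for pt in tag["strokes"]]

# Usage: one stroke drawn on the tablet, then replayed by the "robot"
server = TagServer()
server.upload(json.dumps({"strokes": [{"x": 0, "y": 0}, {"x": 10, "y": 5}]}))
path = replay_tag(server.download())
print(path)  # [(0, 0), (10, 5)]
```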

And more demos (other videos may come later).

I may come back later and add more detail about some of the tech, but that’s a brief recap! Some pictures can be found on the Flickr page and some videos on YouTube; look for the tags roboweek and roboweek2010!

