11.18.09

Demonstrating again

Posted in General at 12:45 am by wouter

When we started this project, we defined our goals as designing a robust and interesting robot platform for research, education and, last but not least, promotion. In that context, we have recently been demonstrating what Zeppy can do at two different events, which resulted in some nice pictures.

The first event was in honour of our first-year students at the AI department receiving their first 60 points, their propedeuse. Their parents were all invited to come and see for themselves what their children actually do all day, and we were there to show off Zeppy. Though we must admit, it had been a while since Zeppy had actually flown, so some last-minute hardware changes had to be made.

But after that we were all set and ready to fly!

The second event was the open house the university held. All bachelor's programmes were there to promote what they're doing, and Zeppy was there, among other cool stuff, to show what AI can be like.

Both were really fun events and we were very happy that we could once again show our work.

11.17.09

Done with the framework

Posted in Hardware, Software at 7:07 pm by wouter

Time to start on some actual AI!

First of all, let me refresh your memory on the architecture we had in mind for our project. As you might recall, there are two computers involved in controlling our robot: the software on the Ground Station, called Uplink (introduced about three posts below), and Zeppy itself, flying around. Uplink can send UDP messages to Zeppy and receive them back, and the two communicate via Bluetooth, so we can tell Zeppy to 'do stuff' from Uplink, and Zeppy can then send back to Uplink what it senses (if needed). Our idea is that we can then implement different types of cognition in Zeppy, ranging from simple remote control, where the messages entirely determine which action is taken, to completely autonomous behavior, where Zeppy decides everything for itself (perhaps still communicating with Uplink along the way, so that hard calculations can be offloaded to the Ground Station).
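To make that a bit more concrete, here is a rough sketch in Python of what such a message exchange could look like on the Uplink side. The address, ports and command format below are made up for illustration (our actual protocol differs); the point is simply that Bluetooth can carry IP traffic, so plain UDP sockets do the job.

```python
import socket

# Hypothetical values: Zeppy's address, the ports and the command
# format are all invented for this sketch.
ZEPPY_ADDR = ("192.168.1.42", 9000)

def send_command(sock, command):
    """Send a plain-text command from Uplink to Zeppy over UDP."""
    sock.sendto(command.encode("ascii"), ZEPPY_ADDR)

def receive_sensor_data(sock, bufsize=1024):
    """Block until Zeppy reports back a sensor reading."""
    data, _addr = sock.recvfrom(bufsize)
    return data.decode("ascii")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 9001))                # Uplink listens here (assumed port)
send_command(sock, "THRUST 0.5")     # example command, format assumed
print(receive_sensor_data(sock))     # e.g. a distance or compass reading
```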

Well, as it stands now, the entire software architecture on Zeppy has been completed! The "remote control" cognition has also been implemented, so we have basically created an RC blimp (which can even be controlled with a joystick or a compass). This may not seem that impressive to you, but this framework is far more flexible and robust than what we had before, and it allows us to easily implement very different types of behavior.
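To give you an idea of what we mean by swappable cognitions, here is a hypothetical sketch. The class and method names are invented for this post, not taken from our actual code, but the idea is the same: each cognition decides an action every tick, and we can plug in anything from pure remote control to fully autonomous behavior.

```python
class Cognition:
    """Decides an action each tick, from Uplink messages and sensors."""
    def decide(self, message, sensors):
        raise NotImplementedError

class RemoteControlCognition(Cognition):
    """Messages from Uplink entirely determine the action."""
    def decide(self, message, sensors):
        return message          # e.g. "THRUST 0.5" is executed as-is

class AutonomousCognition(Cognition):
    """Zeppy decides for itself; Uplink messages become telemetry only."""
    def decide(self, message, sensors):
        # Invented sensor name and threshold, purely for illustration.
        if sensors.get("distance_ahead", 999) < 50:
            return "TURN LEFT"
        return "THRUST 0.3"
```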

Encouraged by this thought and by the ideas of our supervisors, we have chosen to take on the challenge of implementing SLAM: Simultaneous Localization and Mapping. This would mean that we could have Zeppy fly around a room, or even an entire building, for a while and tell us what it 'sees', while simultaneously working out where it is. To be realistic, we don't expect this to be an overnight implementation exercise. However, we're looking forward to trying, since the end result would be really cool.
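For the curious, here is a toy illustration of just the mapping half: updating an occupancy grid from a single range reading, assuming we already know Zeppy's pose. Everything here (grid size, cell size, update amounts) is made up for the sketch, and real SLAM is much harder precisely because the pose has to be estimated at the same time as the map.

```python
import math

GRID = [[0.5] * 20 for _ in range(20)]   # 0.5 = occupancy unknown

def update_grid(x, y, heading, distance, cell_size=0.25):
    """Mark cells along a range beam as free, the endpoint as occupied."""
    for i in range(int(distance / cell_size)):
        cx = int((x + math.cos(heading) * i * cell_size) / cell_size)
        cy = int((y + math.sin(heading) * i * cell_size) / cell_size)
        if 0 <= cx < 20 and 0 <= cy < 20:
            GRID[cy][cx] = max(0.0, GRID[cy][cx] - 0.2)  # more likely free
    ex = int((x + math.cos(heading) * distance) / cell_size)
    ey = int((y + math.sin(heading) * distance) / cell_size)
    if 0 <= ex < 20 and 0 <= ey < 20:
        GRID[ey][ex] = min(1.0, GRID[ey][ex] + 0.3)      # likely occupied

# From pose (2.5 m, 2.5 m), facing along x, a sensor reads 1.5 m.
update_grid(2.5, 2.5, 0.0, 1.5)
```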

However we’re unfortunately still held back by some other issues we have to resolve. A lot of work is still needed on the hardware level. First of all we noticed that the motors we use to propell the blimp are simply not strong enough. Also we’re still worried about the envelope we currently have. Although a lot of work has been put in by the engineering students, it still isn’t as robust as we hoped. Also it is still very big. We’re now looking into other options such as a professionally made blimp.

As you can see, although we have been silent here for a long time, there's quite a lot going on right now. Work in progress…