DroidToiz - Group 8.1 Status Report (2010-04-08)
We are definitely progressing toward a more modest deliverable for the course project. For our demonstration on the
13th, our goal is to show that we have achieved communication with two robots. One simply flashes some LEDs.
The other returns some environmental information (e.g. its current temperature). Each robot has a slightly different
hardware configuration, which we will explain in our presentation.
Our ideal remains for the robot to contain all information needed to run it, so that no storage of controls or download
of control schemes to the handset is necessary. Since Android layouts are XML-based, it seems logical that we can
simply perform a Bluetooth file transfer of the layout from the bot to the phone. However, we would then need to
figure out how to dynamically bind the resulting controls to events within an end user’s GUI. In addition, storing
image icons on the robot might present a space problem.
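The transfer-and-bind step might look roughly like the following. This is a plain-Java sketch rather than the Android inflater APIs, and the layout schema (a `<layout>` root with `<control>` elements carrying `id`, `type`, and `label` attributes) is entirely our own invention for illustration, not a real Android layout format:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.*;
import java.io.ByteArrayInputStream;
import java.util.*;

public class LayoutBinder {
    // A control description recovered from the robot's layout XML.
    static class Control {
        final String id, type, label;
        Control(String id, String type, String label) {
            this.id = id; this.type = type; this.label = label;
        }
    }

    // Parse a (hypothetical) layout received over Bluetooth and
    // return the controls it declares, in document order.
    static List<Control> parse(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        List<Control> controls = new ArrayList<>();
        NodeList nodes = doc.getElementsByTagName("control");
        for (int i = 0; i < nodes.getLength(); i++) {
            Element e = (Element) nodes.item(i);
            controls.add(new Control(e.getAttribute("id"),
                                     e.getAttribute("type"),
                                     e.getAttribute("label")));
        }
        return controls;
    }

    public static void main(String[] args) throws Exception {
        // Example layout a robot might send; schema is illustrative only.
        String xml = "<layout>"
                   + "<control id='led1' type='button' label='Flash LED 1'/>"
                   + "<control id='temp' type='readout' label='Temperature'/>"
                   + "</layout>";
        // Bind each control id to the command we would transmit on press.
        Map<String, Runnable> handlers = new HashMap<>();
        for (Control c : parse(xml)) {
            handlers.put(c.id,
                () -> System.out.println("send command for " + c.id));
        }
        System.out.println(handlers.size() + " controls bound");
    }
}
```

On Android itself the hard part remains mapping each parsed control to a real widget and listener at runtime, which is exactly the open question above.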
By the end of the course, we are still on track to deliver a basic set of Bluetooth functionality including discovery and
two-way communication. Along with this, we can deliver a few sample, hard-coded user interfaces to work with the
box. Future developers should be able to use this functionality with their own applications. We will deliver the more
ambitious features after the course, most likely.
- Bluetooth 2-way communication
- Basic GUI programming
- Formalization into a proper framework
- A formal messaging protocol
- The ability to generate GUIs dynamically (ideally, including image icons!)
- Batch queuing and sending of messages. Currently, we have implemented dialogue-based communication, where
  the robot requests each input from the user.
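The dialogue pattern we have now can be sketched as follows. This is plain Java with in-memory piped streams standing in for the Bluetooth socket, and the one-line `REQ`/`ACK` text exchange is illustrative only, not our final message format:

```java
import java.io.*;

public class DialogueDemo {
    // Robot side: ask the handset for each input, one prompt at a time.
    static void robot(BufferedReader in, PrintWriter out) throws IOException {
        out.println("REQ pulses_led1");           // robot requests an input
        int pulses = Integer.parseInt(in.readLine());
        out.println("ACK flashing " + pulses);    // robot confirms the action
    }

    // Handset side: answer each request as it arrives.
    static String handset(BufferedReader in, PrintWriter out) throws IOException {
        String prompt = in.readLine();            // e.g. "REQ pulses_led1"
        out.println(3);                           // user entered 3 pulses
        return in.readLine();                     // robot's acknowledgement
    }

    public static void main(String[] args) throws Exception {
        // Two piped stream pairs simulate the two directions of the link.
        PipedOutputStream toRobotOut = new PipedOutputStream();
        PipedInputStream toRobotIn = new PipedInputStream(toRobotOut);
        PipedOutputStream toHandsetOut = new PipedOutputStream();
        PipedInputStream toHandsetIn = new PipedInputStream(toHandsetOut);

        Thread robotThread = new Thread(() -> {
            try {
                robot(new BufferedReader(new InputStreamReader(toRobotIn)),
                      new PrintWriter(toHandsetOut, true));
            } catch (IOException e) { throw new UncheckedIOException(e); }
        });
        robotThread.start();

        String ack = handset(new BufferedReader(new InputStreamReader(toHandsetIn)),
                             new PrintWriter(toRobotOut, true));
        robotThread.join();
        System.out.println(ack);   // prints "ACK flashing 3"
    }
}
```

On the handset the same robot-initiated prompting happens, just over the Bluetooth socket’s streams instead of pipes.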
Phase A-1 = A hard-coded GUI exists. When the user presses an icon, the toy performs the corresponding action.
Toy capabilities are hard-coded.
At present we have more of a dialogue-based system working. We need to make this more graphical.
However, we do have our Toiz responding to commands from the handset over Bluetooth.
Phase B-1 = A user presses a sequence of icons (i.e. assembles a routine or macro), then transmits it, and the toy
performs the corresponding routine.
This is what we’re calling queued or batch commands. In a way, this is implemented, as we have the
FlashToi (LED flashing) robot accepting the number of pulses for each LED and then executing the
command. However, the “queueing” is taking place on the bot, rather than the handset. We want the user
to be able to send entire command sequences from the handset.
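Moving the queue to the handset might look something like this sketch, where the user’s icon presses accumulate into one routine that is transmitted in a single message. The command strings and the ";"-separated "RUN" encoding are made up for illustration, not our actual protocol:

```java
import java.util.*;

public class CommandBatch {
    private final List<String> queue = new ArrayList<>();

    // User presses an icon: append one command instead of sending it at once.
    public void add(String command) {
        queue.add(command);
    }

    // Serialize the whole routine into a single message for one transmission,
    // leaving the robot to step through the commands itself.
    public String toMessage() {
        return "RUN " + String.join(";", queue);
    }

    public static void main(String[] args) {
        CommandBatch batch = new CommandBatch();
        batch.add("FLASH led1 3");   // pulse LED 1 three times
        batch.add("FLASH led2 5");
        batch.add("READ temp");
        System.out.println(batch.toMessage());
        // prints: RUN FLASH led1 3;FLASH led2 5;READ temp
    }
}
```

The design point is that the bot only ever receives complete routines, so its parser stays simple even when the handset-side editing UI grows.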