
                                    (James) Elliott Coleshill
                           University of Guelph, Guelph, Ontario, Canada
                                           (647) 283-1154
                                         Fax: (905) 790-4400

                                        Dr. Alex Ferworn
                                         Ryerson University
                           350 Victoria Street, Toronto, Ontario M5B 2K3
                                     (416) 979-5000 ext. 6968
                                        Fax: (416) 979-5064


Techniques for synthesizing panoramic scenes are widespread. Such a scene can be
automatically created from multiple displaced images by aligning and overlapping them using an
image registration technique. The ability to generate panoramic scenes has many applications,
including the generation of virtual reality backgrounds, model-based video compression, and
object recognition. These techniques, and consequently their associated applications, share the
restriction that all scenes are limited to a 360° view of the horizontal plane at the particular
moment in time the images were taken. Until recently, there has been little motivation to develop
techniques for the presentation of complete spherical views in real time: scenes that present the
entire potential visible fields of view, through time. With the advent of space exploration and
associated micro-gravity environments, "up" and "down" are relative terms and locally fixed points
of reference are difficult to come by. It may be useful to rethink how video is captured and
presented to a user working in such an environment, employing extended notions of what a
panorama is. We have built a prototype camera which allows a user to view and pan/tilt through
arbitrary angles of view, including elevation and declination. This camera provides the view in
real time from a network of seven synchronized CCD video cameras whose video outputs are
selectively "stitched" together to provide a smooth transition between different camera fields of
view. In this way, the user can smoothly pan/tilt through all the fields of view that are generated
by the system. All video processing is done in software; there are no moving parts.

INTRODUCTION

Video cameras have always played a role in human space flight and space applications. From the initial construction of the International Space Station (ISS), video cameras mounted on the U.S. Space Shuttle and on the ISS robotic arm (Canadarm2) provided enough coverage to monitor and support the assembly and inspection activities associated with the project. However, as the ISS expands and grows, these cameras will be insufficient to support the task. The solution to this problem is a free-flying video camera system that can produce wide-angle panoramic video in real time and can be deployed and retrieved quickly.

The National Aeronautics and Space Administration (NASA) has developed a free-flying robotic camera "ball" called the Autonomous EVA Robotic Camera (AERCam). AERCam has been designed and developed to perform visual and non-visual inspection activities around the outside of the ISS. It is also used as a support tool for spacewalking astronauts, providing different views of work being performed. Unfortunately, AERCam has restricted utility in that its field of view is highly limited. With only two color video cameras pointing to the front of the robot, many adjustments to the robot's orientation are required when changing the angle of view.

In this paper we describe a Spherical Panoramic Video Camera (SPVC) system for supporting micro-gravity applications. The system is a multi-camera network. It employs several CCD cameras wrapped around a small sphere, allowing wide-angle imaging with no moving parts. We also describe the software controls and user interface to the camera system, which allow the system to change its field of view without re-orienting the robot itself.

THE SPVC

The Spherical Panoramic Video Camera (SPVC) is an inexpensive cluster of CCD cameras attached to a central hub. All cameras are synchronized and produce images through time that are relayed to a processing unit through a cable tether. The processing unit contains software that stitches the images together to form panoramic views, which can be made available for monitoring on a display.

Figure 1: SPVC Design

Due to limitations onboard the ISS for information storage and transmission, real-time video (30 frames/second) is not always possible. Therefore, the challenge for any robotic video system is to capture just "enough" data to support the task at hand. For visual inspections of static space hardware, the camera can stop and look; thus one to five frames per second is all that is required for the system to perform its duties.

HARDWARE IMPLEMENTATION

The current SPVC system configuration consists of seven analog video cameras connected to an interface board, which in turn is connected to a central processing unit.

Figure 2: System Architecture Block Diagram

Camera Design

The SPVC cameras are unique in design. Each camera consists of two electronic boards, a power board and a camera board, stacked on top of each other to reduce the physical size of the camera. The two electronic boards are connected using ribbon pin connectors. To avoid any confusion, the camera board is keyed to the power board to prevent it from being installed backwards.

Figure 3: Picture of Camera

The power board is responsible for all the power, data, and video routing to and from the camera and contains all the interfacing jacks. The camera board holds the surface-mounted CMOS camera chip, all electrical components required to configure the
camera chip, and the lens mounted on top of the CMOS chip.

Each camera is equipped with three interface jacks. The 120V AC adapter and RCA video jacks allow the camera to be plugged directly into a wall outlet and a TV for testing purposes.

Figure 4: Diagram Showing Test Setup

The SPVC is connected together using off-the-shelf Ethernet networking cable. All power, data, and video are passed back and forth between the interface board and camera through one cable.

Figure 5: Diagram Showing Interface Board

Interface Board

The SPVC interface board provides the main connection points between the cluster of video cameras and the processing computer. Its main responsibilities are supplying power to all the electronic components in the system, acting as a video multiplexer (MUX), and providing camera network synchronization.

The MUX is responsible for all the video routing within the system. Using commands received from the processing computer, the MUX opens and closes switches to route selected video from a specific camera to the video capture card located in the processing computer. The MUX consists of two main chips, a Decoder/Demultiplexer and two Quad Analog Switch Multiplexer/Demultiplexers.

Figure 6: SPVC Interface Board

Each camera is connected to its own analog switch on the interface board. As a camera is selected through the SPVC software, a binary signal is sent to the interface board through a parallel connection with the computer. The Decoder/Demultiplexer chip picks up this binary command, decodes it, and commands the corresponding analog switch to close, allowing the video feed to run out to the video capture card.

Figure 7: MUX Interface Diagram

SOFTWARE IMPLEMENTATION

The current SPVC software is a multithreaded system that produces panoramic video at a rate of one frame per second. It can be operated in three different control modes:

   •   Manual,
   •   Single Camera Video Recording, and
   •   Panoramic Video.

Manual mode allows the user to operate a single camera at a time and view the full 30 frames per second video in a preview window.
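The camera-selection path (a binary command decoded into a single closed analog switch) can be modeled in a few lines. The following is a minimal sketch, not the SPVC driver or firmware; the 3-bit command width and the one-hot, active-high select lines are assumptions for illustration.

```python
def decode_camera_select(command: int, num_cameras: int = 7) -> list[int]:
    """Model the decoder/demultiplexer: the binary camera-select command
    sent over the parallel connection closes exactly one analog switch
    (a one-hot output), routing that camera's video to the capture card.

    The 3-bit width and active-high outputs are illustrative assumptions,
    not taken from the SPVC schematics.
    """
    if not 0 <= command < num_cameras:
        raise ValueError(f"no camera wired to select line {command}")
    # One-hot output: a single closed switch per decoded command.
    return [1 if line == command else 0 for line in range(num_cameras)]

# Selecting camera 4 closes only the fifth switch:
# decode_camera_select(4) -> [0, 0, 0, 0, 1, 0, 0]
```

Because the decode is one-hot, at most one camera's analog switch is ever closed, which is what lets all seven feeds share a single video capture card.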
Using the camera selection buttons, the user can command the MUX on the interface board to route the video to the preview window.

Single camera video record mode is an extension of the manual mode. This mode allows the user to record an AVI video from the currently selected video camera. The system displays the current video in the preview window while recording the video stream to the local hard drive.

Panoramic video mode allows the user to stitch and stream panoramic images at a rate of one frame per second to the preview window. Using the Pan/Tilt controls located on the graphical user interface, a user can change the view of the panoramic video in real time without physically moving the cluster of video cameras.

Figure 8: Software User Interface

Software Architecture

The SPVC software architecture consists of three main modules:

   •   Application Thread,
   •   Shared Memory Interface, and
   •   Stitch Thread.

The application thread is responsible for grabbing the raw data that the stitch thread requires for producing the final panoramic image. This is accomplished by extracting one frame of data from each camera and storing the data in the shared memory interface. For each camera, a binary command is sent to the SPVC interface board to select the video stream from the individual cameras.

The shared memory interface is used to help speed up the processing time of the software. It is a set of pre-allocated memory that acts as a buffer for all raw video data. Both the application and stitch threads have read and write access to this memory.

Each video camera connected to the system has its own allocated space within the shared memory. As the application thread grabs raw data from the video cameras, it stores the raw data in the corresponding input buffer for the currently selected camera.

Figure 9: Shared Memory Interface

When signaled by the application thread that all raw data is ready, the stitch thread reads and converts the data stored in the shared memory to two-dimensional arrays of Red, Green, and Blue (RGB) pixel values. These two-dimensional arrays are then used to stitch together the final panoramic image, which is stored in the output buffer to be displayed to the user interface.

SYNCHRONIZATION

Synchronization plays a key role within the hardware and software of the SPVC system.

Software Synchronization

Software synchronization had to be maintained between the application and stitch threads. This was done by defining and implementing a "frame of operation" which allows the threads to communicate and wait for each other. One frame of operation consists of three steps. The first step is that the application thread performs all the video routing and stores one digital still
from each camera connected to the system in the shared memory. Once completed, it makes a call to the stitch thread to inform it that all the raw data is buffered and ready to be stitched. At this point the application thread waits for the stitching process to be completed. While the application thread is sleeping, the stitch thread extracts all the raw data from the shared memory, stitches together a virtual environment of the images, and writes the final panoramic output to the shared memory. The stitch thread then informs the application thread that the panoramic image is ready to be displayed and waits until the next set of raw data is ready for manipulation.

Hardware Synchronization

Hardware synchronization requires that the routing of the video through the interface board be synchronized with the software, and that the refresh rates of all the cameras be synchronized with each other.

The synchronization of the software with the video routing on the interface board is done with system messages and ready flags. The SPVC software defines an interrupt message within the operating system called WM_AUTO_CAPTURE. This interrupt is used to prevent the software from routing the video too quickly and is raised when a full capture of raw data has been collected.

Figure 10: Synchronization Diagram

The cameras designed for the SPVC system have the option of restarting their frame synchronization. The frame sync commands the image sensor to restart the drawing of the video output. The SPVC software takes advantage of this option and synchronizes all the cameras by sending a broadcast sync pulse through the interface board.

RESULTS

The current design and implementation of the SPVC demonstrates that the notion of a spherical panoramic camera is feasible; however, there are problems that need to be solved.

During the design and testing of the SPVC it was assumed that a predefined stitch point could be used during the generation of the panoramic image. When placing cameras around a spherical shape, a predefined stitch point is not possible. As the cameras are tilted up or down, the horizontal stitch point stayed intact, but the alignment between the upper and lower layers with respect to the horizon of the sphere rotated by approximately ten degrees.

Figure 11: Rotation Problem

The predefined stitch point also caused problems with the focal length of the cameras. Originally the SPVC was constructed in the lab, where the stitch point was defined according to the walls and tables around the cluster of cameras.

Figure 12: Predefined Stitch Point in Lab

When the SPVC was moved to another environment where the focal point was substantially different from the lab, the defined focal point no longer worked, causing the software to overlap the camera views when generating the panoramic image.
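The frame-of-operation handshake between the application and stitch threads can be sketched with two threads and a pair of events. This is an illustrative reconstruction in Python, not the SPVC's actual implementation (which used operating system messages such as WM_AUTO_CAPTURE); the grab and stitch callables and the per-camera buffer layout are placeholders.

```python
import threading

NUM_CAMERAS = 7  # matches the SPVC configuration described above

class FrameOfOperation:
    """Sketch of the application/stitch handshake: the application thread
    buffers one still per camera, signals the stitch thread, then sleeps
    until the panorama has been written back to shared memory."""

    def __init__(self):
        self.shared = {}            # per-camera input buffers
        self.panorama = None        # output buffer
        self.raw_ready = threading.Event()
        self.stitch_done = threading.Event()

    def application_thread(self, grab):
        # Step 1: route the video and store one still from each camera.
        for cam in range(NUM_CAMERAS):
            self.shared[cam] = grab(cam)
        # Step 2: tell the stitch thread the raw data is buffered,
        # then wait for the stitching process to complete.
        self.raw_ready.set()
        self.stitch_done.wait()

    def stitch_thread(self, stitch):
        # Step 3: read the buffers, stitch, and write the output back.
        self.raw_ready.wait()
        self.panorama = stitch([self.shared[c] for c in range(NUM_CAMERAS)])
        self.stitch_done.set()

def run_one_frame(grab, stitch):
    """Run a single frame of operation and return the stitched output."""
    frame = FrameOfOperation()
    worker = threading.Thread(target=frame.stitch_thread, args=(stitch,))
    worker.start()
    frame.application_thread(grab)
    worker.join()
    return frame.panorama
```

With a trivial grab (for example, one returning the camera index) and a stitch that concatenates the buffers, run_one_frame returns the stitched result once both events have fired; the events give the same wait-and-signal behavior as the ready flags described above.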
The current SPVC software captures one frame of data from each camera and stitches an environment view every second. The refresh rate is acceptable when the cameras are adjacent to each other. However, when a camera view is moved over a stitch point where there is a significant time elapse between the frame grabs of two cameras, the panoramic video does not refresh correctly, causing a moving object to be split.

Figure 13: Moving Object Split

Another problem discovered was in the movement of the camera cluster itself. The software does not handle the movement of the cluster very well. When the SPVC becomes the moving object, the refresh of the stitching produces a chaotic image. The speed at which the SPVC is moving determines how badly the panoramic video is generated.

CONCLUSION

In this paper we have described a prototype of a Spherical Panoramic Video Camera to support micro-gravity applications. The SPVC system is used to capture panoramic scenes of a working environment and stream them to a user. This process is accomplished via a multi-threaded software system controlling multiple camera streams through a multiplexer circuit board interface. Still digital images are captured from the raw video streams, stitched together using basic image processing techniques, and displayed to the user's interface at a rate of one frame per second.

The current design and implementation, while working, can be improved. The next stage of work will concentrate on refining the design and development of the hardware along with various software enhancements. Eventually we hope to deploy a free-floating version of the system requiring very little physical movement in order to view the entire environment.

REFERENCES

NASA's International Space Station Website, http://spaceflight.nasa.gov/station

NASA's Space Shuttle Website, http://spaceflight.nasa.gov/shuttle

MD Robotics Website, http://www.mdrobotics.ca

H. Choset (Carnegie Mellon University) and D. Kortenkamp (Texas Robotics and Automation Center), "Path Planning and Control for AERCam, a Free-Flying Inspection Robot in Space"

C. Fermuller, Y. Aloimonos, P. Baker, R. Pless, J. Neumann, B. Stuart (Computer Vision Laboratory, University of Maryland), "Multi-Camera Networks: Eyes from Eyes", IEEE, 2000

(James) Elliott Coleshill, "Spherical Panoramic Video – The Space Ball", Master's Thesis, Department of Computing and Information Science, University of Guelph, August 2003
