					Appendix                                                 Author: Chris Georgeiou

                INTERACTIVE 3D SPACE

                                   By Chris Georgeiou

                                      3rd May 2007

                       Supervised by Dr. Chris Kirkham


Interactive 3D Space Environment                                            71

I would like to thank Dr. Chris Kirkham, who, as my project supervisor, provided me with the
necessary help and guidance towards the successful completion of the project. Special thanks to Mr.
Toby Howard for guidance and advice on various graphics programming issues.


This document describes the development of an interactive 3D space environment. It provides
information about the background research undertaken and how it relates to the design, which in
turn, over several iterations, became the final implementation and the end result. The resulting
application is evaluated and a number of possible expansions are identified, in the hope of future
continuation and optimisation of this work.

As the name suggests, the software allows the user to interact with a 3D space environment.
Through the use of multiple custom engines and effect classes for the environment’s visual and
physical design, the user pilots a spacecraft with various basic tools with which to interact with
space. While some objects have a solidly defined behaviour, others are driven by the physics engine
and react to it directly. Various environmental objects use a particle engine for atmospheric effect,
and this works together with the physics engine to create varying levels of realistic and stylistic
effects.

The core aim was to learn and explore programming techniques, methodologies and frameworks for
interactive systems, with objectives designed to support this. While the aim was achieved, the
objectives evolved, as did the framework necessary to accommodate the growing size of the
application. This constant evolution meant the more interesting sections could not be developed to
achieve the desired objective.


List of Figures                                                                           i
List of Tables                                                                          iii

 1 Introduction                                                                               1
        1.1             The Project Idea                                                      1
        1.2             The Project Plan                                                      2
        1.2.1           Primary Objectives                                                    3
        1.2.2           Secondary Objectives                                                  3
        1.2.3           Further Possible Extensions                                           4
        1.3             Report Overview                                                       4

 2 Background                                                                                 5
        2.1             The OpenGL API                                                    5
        2.2             The Windows API                                                   5
        2.3             The Tutorials                                                     6
        2.3.1           Model Design Applications and Implementation Libraries            6
        2.3.2           NeHe and OpenGL                                                   8
                        Blending and the Z-Buffer                                         9
                        Texture Mapping                                                  10
                        Physics Engine                                                   11
                        Particle Engine                                                  11
                        Object Movement                                                  12
                        Mouse/Keyboard Interrupt Handling                                12
        2.3.3           Efficiency Techniques                                            13
 3 Design                                                                                    15
        3.1             Design Approach                                                  15
        3.2             Interface Design                                                 22
        3.2.1           Initial Ideas                                                    23
        3.2.2           Looking at Quake 3 source code                                   24
        3.2.3           Final Idea                                                       24
        3.3             Environment Design Techniques                                    24
        3.4             Player                                                           30
        3.5             Controls                                                         31
        3.6             Windows                                                          31
        3.7             Area Planning                                                    32
        3.8             Environment Objects                                              32
        3.9             Game Engine Structure                                            32
        3.9.1           Physics Engine                                                   33
        3.9.2           Particle Engine                                                  33

 4 Implementation                                                                            34
        4.1             Implementation Methodology                                       34
        4.2             The C++ Classes                                                  35

           4.2.1         Menu                                                      35
           4.2.2         Menu Effects                                              36
        Sliding and Fading Bar effects                            36
        Spot Effects                                              37
        Star Effects                                              38
           4.2.3         Delay and Random                                          39
           4.2.4         Menu Navigation (screensaver mode)                        39
           4.2.5         Engine                                                    41
           4.2.6         Image Loader                                              42
           4.2.7         Handler                                                   42
           4.2.8         Window                                                    44
           4.2.9         Physics Engine                                            45
           4.2.10        Particle Engine                                           45
           4.2.11        Math components                                           46
           4.2.12        Environment Objects                                       48
           4.2.13        Object View                                               50
           4.2.14        Object Movement                                           50
           4.3           The bigger picture                                        51
           4.4           Implementation Issues                                     51
           4.4.1         Updating Structures                                       52
           4.4.2         Real Time Expansion                                       52

 5 Testing and Results                                                                       54
         5.1             Testing Methodology                                       54
         5.2             Testing and Tweaking                                      55
         5.2.1           Environmental Objects                                     60
                         Visual & Behavioural Result Tweaking                      60
         5.2.2           Player                                                    64

 6 Evaluation and Conclusion                                                                 66
        6.1           Accomplishments                                              66
        6.2           Evaluation                                                   66
        6.3           Future Expansion                                             67
        6.4           Conclusion                                                   68

 References                                                                        69

 Appendix                                                                          71


List of Figures
Reference Name                                                         Page Number

2.1        Pistol Drawing                                                          7
2.2        Pistol Model                                                            7
2.3        3DS-MAX Complex Interface                                               8
2.4        Program using Qt library                                                9
2.5        Using the depth buffer in OpenGL                                        9
2.6        Using blending in OpenGL                                               10
2.7        2D & 3D Texture Mapping                                                11
2.8        Message passing example code                                           12
2.9        Triangle Strip explanation                                             14

3.1        UML Diagram Design1                                                    16
3.2        UML Diagram Design2                                                    17
3.3        UML Diagram Design3a                                                   18
3.4        UML Diagram Design3b                                                   19
3.5        UML Diagram Design4                                                    20
3.6        UML Diagram Design5                                                    21
3.7        UML Diagram Design6                                                    22
3.8        Interface Design example                                               23
3.9        Universe Example                                                       26
3.10       Terrain Generation Algorithm part 1                                    27
3.11       Terrain Generation Algorithm part 2                                    27
3.12       Terrain Generation Algorithm part 3                                    27
3.13       Terrain Generation Algorithm part 4                                    27
3.14       Terrain Generation Algorithm part 5                                    28
3.15       Terrain Generation Algorithm part 6                                    28
3.16       Recursive pseudo-random Grammar based algorithm part 1                 29
3.17       Recursive pseudo-random Grammar based algorithm part 2                 29
3.18       Collision Detection                                                    29

4.1        1st Spot Effect Routine                                                35
4.2        Sliding Bar Routine                                                    36
4.3        Fading Bar Routine                                                     37
4.4        2nd Spot Effect Routine                                     37

4.5        1st & 2nd Spot Effect Routines                              38
4.6        1st Star Effect Routine                                     38
4.7        2nd Star Effect Routine                                     38
4.8        Screensaver Calling Routine                                    40
4.9        Screensaver Routine Example                                    40
4.10       MenuNavi Routine Example                                       41
4.11       WinMain Engine Initialisation Routine                          41
4.12       EngineWatcher Example                                          42
4.13       Image Loader Class Toolset                                     42
4.14       Handler modeChange() Example                                   43
4.15       Main Window Drawing Routine Call                               44
4.16       Resistance Calculation                                         45
4.17       Universe Construction Example                                  48
4.18       Star System Attributes                                         49
4.19       Twinkle Texture Selection Example                              49
4.20       Twinkle Function Example                                       50

5.1        Application Loaded                                             55
5.2        Passive Texture Mapped Expanding Sphere Effect                 56
5.3        Spot Effect 1                                                  57
5.4        Spot Effect 2                                                  57
5.5        Warping Effect with and without Hyper Speed                    58
5.6        Cruising in a Star Field Effect                                58
5.7        Moving in the Star Field Effect Down and Left                  59
5.8        Screensaver Loaded                                             59
5.9        Example of error messages                                      60
5.10       Asteroid Example                                               61
5.11       Asteroid Stress Test                                           62
5.12       Particle Engine Example                                        62
5.13       Fire Algorithm Test                                            63
5.14       Shooting Star Test                                             63
5.15       Player Craft                                                   65


List of Tables
Reference    Name                             Page Number

2.2a         WinMain Function                              5
2.2b         WindowProc Function                           5
2.2c         CreateWindowEx Function                       6
3.4a         Initial Player Attributes                   30
3.4b         Final Player Attributes                     30
4.2.3a       Random Functions                            39
4.2.3b       Delay Functions                             39
4.2.7a       Handler Functions                           43
4.2.8a       Window Functions                            44


                                       CHAPTER 1

1.1 The Project Idea
Idea 1: The story provides the reasoning behind the choices made in the design of the application
(see Appendix: Initial Story Theme).

Initially the project was to be a first-person perspective shooter (FPS) action adventure in which the
player would navigate through a series of levels, each with unique environmental puzzles, collecting
various key items on which progress depended. Three main environments were envisioned:

The Surface: A surface-level terrain with mountainsides guiding the player to the goal. The terrain
is generated by fractals and populated with procedurally modelled objects such as plants, trees,
shrubs and bushes. This level would simulate Newtonian physics and require the careful
manipulation of these physics with the tools provided in the level to progress to the next stage; for
example, acquiring a shield and shooting a set of exploding barrels while jumping, using the
additional force to propel the player over a gap.
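The fractal terrain idea can be illustrated with a one-dimensional midpoint-displacement sketch, a classic fractal technique for height profiles such as mountainsides. This is a hedged illustration only; the function name and parameters are hypothetical, not the project's actual code.

```cpp
#include <cstdlib>
#include <vector>

// One-dimensional midpoint displacement: each pass halves the interval and
// perturbs the midpoint by a random offset that shrinks with every pass,
// giving self-similar "mountain ridge" detail at all scales.
std::vector<float> generateRidge(int passes, float roughness, unsigned seed)
{
    std::srand(seed);
    int n = (1 << passes) + 1;          // number of height samples
    std::vector<float> h(n, 0.0f);      // endpoints stay at height 0

    float amplitude = roughness;
    for (int step = (n - 1) / 2; step >= 1; step /= 2) {
        for (int i = step; i < n; i += 2 * step) {
            float mid = 0.5f * (h[i - step] + h[i + step]);
            float offset = amplitude * (std::rand() / (float)RAND_MAX - 0.5f);
            h[i] = mid + offset;
        }
        amplitude *= 0.5f;              // finer detail gets smaller bumps
    }
    return h;
}
```

The same idea extends to two dimensions (the diamond-square algorithm) for full terrain patches.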

The Sea: Much continues as before; here, an underwater effect will be overlaid on the viewport to
give the illusion of being underwater, reacting to the direction of the player’s movement. In addition
to this visual change, the particle engine will be disabled, or the particle life reduced, to give the
illusion that sparks burn out faster. Finally, the physics engine will be modified to reflect the
environment.

The Space: In this final environment the player has complete freedom to roam anywhere, and the
terrain is similar to that of “The Surface” but turned inwards. Everything is gently sucked towards
the centre of the screen, and the physics engine simulates a stylised space environment. The player
can damage the boss, a mysterious vortex, by making successive passes around the stage, collecting
‘attack’ objects and firing them at the vortex. Failure to do so causes a small loss of health.
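The "gentle suction" can be sketched as a per-frame physics step that accelerates every body along the unit vector towards the world centre. This is a hypothetical sketch of the described behaviour; Body, stepTowardCentre and pullStrength are illustrative names, not the project's classes.

```cpp
#include <cmath>

// Minimal body for the stylised space physics sketch.
struct Body {
    float x, y, z;    // position
    float vx, vy, vz; // velocity
};

// Each frame, apply a constant-magnitude acceleration directed at the
// origin, then integrate position (semi-implicit Euler).
void stepTowardCentre(Body& b, float pullStrength, float dt)
{
    float dist = std::sqrt(b.x * b.x + b.y * b.y + b.z * b.z);
    if (dist > 1e-6f) {
        float a = pullStrength / dist; // scales position into a unit vector
        b.vx -= a * b.x * dt;
        b.vy -= a * b.y * dt;
        b.vz -= a * b.z * dt;
    }
    b.x += b.vx * dt;
    b.y += b.vy * dt;
    b.z += b.vz * dt;
}
```

Because the pull magnitude is constant rather than inverse-square, the effect reads as a stylistic vortex rather than true gravity.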

Idea 2: Upon further development the over-ambitiousness of the first idea was realised, and the
three-level design was reduced to just one, which this report will attempt to explain.

Final idea: Further development saw the end of this dream and embraced a new concept entirely. In
conjunction with the FPS action-adventure idea, a menu was to be designed based on a space theme,
reflecting current techniques used in the games industry and evolving them to the next stage. A part
mini-game, part screensaver was created to keep the user entertained. While ambitious, all of this is

highly attainable with the proper framework: once a backbone exists that abstracts and
accommodates each factor of the system, abstraction can once again be used to develop the more
intricate details of the system, i.e. the design of objects and their uses; this is discussed in further
detail in Chapter 4. Essentially, the framework will be completed to accommodate an interactive
space environment, and various objects will be created for this environment to breathe life into its
black skies; the details are discussed in Chapters 3 and 4.

This report explains the transitions between ideas, when and why they occurred, and precautions
taken in the event of such problems.

1.2 The Project Plan
Given the scale of the project, much planning time was spent splitting it into achievable targets.
This was done so that, in the eventuality that something major was overlooked and things had to be
condensed due to limitations, the project could adapt to a new idea while retaining the work already
done. Thus the first steps were the initial development of the menu and the space environment, as
successful completion of these implied the framework required for the next stages of the project.

Tackling this project involved the implementation of two main aspects: firstly, a space environment
taking advantage of as many graphically advanced techniques as possible; and secondly, the player,
a functional device bound to various behavioural rules that determine its possible interaction
directives. Completing these stages meant that a series of engines providing the backbone for the
application would already have been created, i.e. the physics engine and the window and device
handlers (see Chapter 4 for more information). This in turn meant that the project could be adapted
and extended to achieve some kind of level implementation, as well as to complete some additional
objectives. Thus the menu and space environment were sandbox stages in the development, where
many things were tried and tested. Further progress is discussed in more detail in Chapter 4.

This was implemented using C++, an object-oriented language allowing for easily extensible
objects, which was handy for the careful extension of the environment. In addition, the OpenGL API
was used in conjunction with the glaux.h (GL auxiliary) library, and as the project was to be
designed on a Windows platform, the Windows API was used for handling the display window and
hardware (graphics card and input device) contexts; refer to Chapter 2 for more information on
these.

The prime task was to learn the fundamentals of the Windows API so that focus could quickly be
diverted to learning OpenGL, as this was to comprise the main aspects of the project, i.e. all visual
aspects and the core of player-world interaction. In addition, what makes an effective graphics
programmer is understanding how something can be represented and then, given the scenario and
its quirks, choosing the best technique: a skill trained through practice in the sandbox stages,
various tutorials and careful planning. Optimistically, it was preferable to design various items,
objects and effects using 3DS MAX or AC3D and to import them via the Maverick or Qt libraries.
Time was invested in learning how to use such programs via web-based tutorials, and a conclusion
was drawn as to the feasibility of integrating such objects based on the time scale, the current development of

the project, and a study of how the Maverick and Qt libraries imported such files; for more
information see Chapter 2.

A personal aim was to experience building a game (interactive application) from scratch; thus it was
my intention to use as few pre-made classes as possible. This implied the creation of all kinds of
managers, from device to image and window to engine. Planning of these came at the next major
breakpoint, once the project had substantial content.

Initially, graphically oriented work was done to ensure progress in aspects of the project that could
be demonstrated; then, when the project reached a size where it would be unmanageable with a very
primitive framework, each part of the framework was updated in turn. An additional benefit of this
approach was that, as the project progressed, decisions were inevitably made that changed various
initial design ideas, and the framework was able to adapt to cope with these changes. An alternative
approach would have been to design a more flexible and dynamic framework.

1.2.1 Primary Objectives
The primary objectives constitute:
   • Learning the basic OpenGL API functions to be able to create a graphically advanced and
       well optimised environment.
   • Learning the Windows API to be able to initialise an OpenGL application under windows
       operating systems.
   • Designing and creating a space environment.
   • Designing and creating a tool by which to interact with the environment in a fun way.
   • Designing and creating a solid framework which can cope with the scope of the project and
       handle expansion. This involves creation of an engine state handler, device handler, window
       handler, physics engine and particle engine.

1.2.2 Secondary Objectives
Upon successfully completing the primary objectives, implementation will follow to:
   • Design and create 3 levels.
   • Design and create a story engine that handles an abstraction of the many engine states which
      will have come into play due to the creation of the 3 levels. This handles the progress of the
      player and ensures certain things are and are not possible depending on what the player has
      done.
   • Implement an object draw handler which keeps a record of all the objects of the current
      environment loaded into memory, but draws only the select few which are visible or within
      some direct vicinity of the user during real-time interaction.
   • Replace the OpenGL z-buffer with a device of my own.
   • Extend the space environment to make it as vast and interesting as possible.
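The object draw handler proposed above can be sketched as a simple distance cull: everything stays in memory, but only objects within a radius of the player are flagged for the frame's draw pass. A minimal sketch, assuming hypothetical Object3D and markVisible names rather than the project's actual classes:

```cpp
#include <cstddef>
#include <vector>

// All environment objects remain loaded; only those within `radius` of the
// player are marked visible for this frame.
struct Object3D { float x, y, z; bool visible; };

std::size_t markVisible(std::vector<Object3D>& objs,
                        float px, float py, float pz, float radius)
{
    std::size_t drawn = 0;
    float r2 = radius * radius;   // compare squared distances: no sqrt needed
    for (Object3D& o : objs) {
        float dx = o.x - px, dy = o.y - py, dz = o.z - pz;
        o.visible = (dx * dx + dy * dy + dz * dz) <= r2;
        if (o.visible) ++drawn;
    }
    return drawn;
}
```

The render loop would then issue draw calls only for objects with `visible` set, keeping the polygon count per frame bounded regardless of how vast the environment becomes.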


1.2.3 Further Possible Extensions
The project is limited only by the imagination of the programmer. This, in turn, gives rise to a
number of possible extensions, namely the implementation of:
   • A player UI to have a visibly explicit method of using various ship functions and
      representing various things about the player state i.e. items, ship status, ammunition.
   • Fully functional menus, utilising mouse-over pop-ups to offer further information on window
      objects of interest.
   • Collision detection to harness the physics engine, adding a new level of interaction with the
      environment.
   • Any further development to the space environment i.e. more graphical content or extended
      functionality e.g. boundary zones to the infinite universe or a pseudo-random, dynamically
      created universe which has already to some extent been attempted.

1.3 Report Overview
This report aims to reflect the evolution of the objectives within the project and how these changes
were dealt with: minimising the damage where possible, but more so planning the implementation
in such a way that each section naturally led on to the next and the project was constantly at a “safe”
stage where it was only ever being extended. The approach taken was an iterative scheme, in which
each part is worked, tested, evaluated and reworked where required and time allowed.

Chapter 2 (Background) presents the background research that was required to pursue and complete
the project goals.

Chapter 3 (Design) describes the evolution of the design stage and the choices made for the final
design aspects of the project.

Chapter 4 (Implementation) presents implementation details including all developed classes and
their interconnections and details of some of the most challenging parts of coding.

Chapter 5 (Testing and Results) illustrates how the software tool works in practice by showing
how it is intended to be used by a user and describes how the software was tested.

Chapter 6 (Evaluation and Conclusion) evaluates whether or not everything works correctly, and
summarises what has been accomplished and what has been learnt from the whole process.
Furthermore, this chapter gives recommendations for further work that can be undertaken on the
software tool in the future.

                                    CHAPTER 2



2.1 The OpenGL API
As the main aspect of the project is graphical, it was important to know how OpenGL works at the
most basic level; only from there can advanced concepts be grasped. In addition to lessons in the
basics of OpenGL, the OpenGL red book (reference manual) was used, some early NeHe tutorials
were followed, and the OpenGL online reference manual [1] was consulted occasionally. After the
basics were grasped, advanced functions were studied, and finally the use of these functions to
create the various effects employed in the project.

2.2 The Windows API
Via the documentation on the website [2] it was possible to reference and study the C++ Windows-
specific code necessary for the project. However, as no framework was known, this took slightly
longer than expected. Fortunately, key aspects of the framework followed logically from what was
already the de facto standard, with additions only to adapt to the Windows operating system (OS)
framework, e.g. WinMain() (see Table 2.2a).

Table 2.2a
int WINAPI WinMain(                           The WinMain function is the user-provided entry
    HINSTANCE hInstance,                      point for a Microsoft Windows-based application.
    HINSTANCE hPrevInstance,
    LPSTR lpCmdLine,
    int nCmdShow
);

WinMain() also has bound to it WindowProc() (see Table 2.2b), which made it easy to see how an
independent device handler could be implemented.

Table 2.2b
LRESULT CALLBACK WindowProc(                  The WindowProc function processes messages sent to
    HWND hwnd,                                a window.
    UINT uMsg,
    WPARAM wParam,
    LPARAM lParam
);
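The device-handler idea behind this binding can be sketched in portable C++: WindowProc forwards each message id it receives to whatever callback the handler has registered for it. This is an illustrative sketch only; DeviceHandler and its methods are hypothetical names, not the project's actual classes.

```cpp
#include <functional>
#include <map>

// Independent device handler sketch: message ids map to callbacks, and
// dispatch() reports whether a message was consumed (unconsumed messages
// would fall through to DefWindowProc in a real Win32 application).
class DeviceHandler {
public:
    using Callback = std::function<void(int param)>;

    void bind(unsigned msg, Callback cb) { handlers_[msg] = std::move(cb); }

    bool dispatch(unsigned msg, int param) {
        auto it = handlers_.find(msg);
        if (it == handlers_.end()) return false;  // not handled here
        it->second(param);
        return true;
    }

private:
    std::map<unsigned, Callback> handlers_;
};
```

With this split, WindowProc stays a thin shim while keyboard, mouse and window messages are handled by dedicated, testable objects.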

Learning how to use the Windows API to set up a window made it possible later to devise a series
of classes to handle windows, making it possible to distinguish between them. While there would
only ever be one main window in which the application would run, it might be possible to have a
series of smaller borderless internal windows that act as a movable series of interfaces to the main
application. Another desirable interface concept was to have

a mouse-over help feature: upon the mouse being over a certain type of item for a period of time, a
new window would pop open at the position of the mouse, or at a preset position, with a description
of the item (see Table 2.2c). As these have essentially become industry standard, I wanted to
implement them in a fresh style.

Table 2.2c
HWND CreateWindowEx(                               The CreateWindowEx function creates an
DWORD dwExStyle, LPCTSTR lpClassName,              overlapped, pop-up, or child window with an
LPCTSTR lpWindowName, DWORD dwStyle,
int x, int y, int nWidth, int nHeight,             extended window style.
HWND hWndParent, HMENU hMenu,
HINSTANCE hInstance, LPVOID lpParam
);
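The mouse-over help idea can be sketched as a per-frame hover timer that fires once the cursor has rested on the same item long enough, at which point a pop-up window such as the one above would be created. Names and timings here are illustrative assumptions, not the project's code.

```cpp
// Tracks how long the cursor has rested on one item; update() returns true
// once the hover time passes the threshold, signalling that the help window
// should pop open at the mouse (or preset) position.
struct HoverTracker {
    int   itemUnderCursor = -1;   // -1 means no item under the cursor
    float hoverTime = 0.0f;
    float threshold;              // seconds before the pop-up appears

    explicit HoverTracker(float seconds) : threshold(seconds) {}

    bool update(int item, float dt) {
        if (item != itemUnderCursor) {  // cursor moved to a different item
            itemUnderCursor = item;
            hoverTime = 0.0f;
            return false;
        }
        hoverTime += dt;
        return item >= 0 && hoverTime >= threshold;
    }
};
```

Calling update() once per frame with the hit-tested item id keeps the pop-up logic independent of the windowing code.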

2.3 The Tutorials
Once the project was safely underway, research was done into the feasibility of incorporating 3D
models. Time was primarily spent inspecting tools for creating models, i.e. AC3D and 3DS MAX,
and importing them using the Maverick and Qt libraries.

2.3.1 Model Design Applications and Implementation Libraries
Via a series of online tutorials, [4], [5] & [6], it was possible to learn the basics and some advanced
techniques of 3DS MAX, e.g. texture mapping, shading, smoothing, morphing, modelling complex
objects and keeping the polygon count low. After two weeks of investigation it was concluded that
even relatively simple models required too much time and effort to transform into the desired state,
and this did not even involve importing the model into the application. It was thus decided that it
was too easy to lose focus on the project’s primary objectives given the time scale. This conclusion
came about as a result of the time spent modelling a pistol object and an analysis of the integration
time required against the time available. Modelling was an entirely new field and required practice
and experience to be used efficiently. In addition, each object generated too many polygons, so
creating many world objects would dramatically reduce performance without optimisation
techniques, which is in itself another issue.

Figure 2.1


Using this pre-made model of a pistol (left), the model (below) was constructed following the
instructions of the tutorial. (1420 polygons)

Figure 2.2

Study of the Qt library via website [7] showed that while useful classes existed, most were not
suited to the requirements of the project, and adapting them would require experience with the library
framework; reason enough was not found to invest further time in this path.
Qt is a good choice for GUI programming: it is easy to understand and fast for development, but Qt
aims to cover every level of development, which generates a high dependency on its source code;
this would become a problem given the project framework. A possible solution could be to use it
only for graphical display, but in that case translating data between the current object data structures
and Qt objects is complicated without the aforementioned experience. The Maverick library was
barely studied, for the reason that it seemed better to create everything from scratch.

Figure 2.3


An example of the complexity of 3DS MAX 8: above is just one perspective window and one
set of options for a particular mode of editing (on the right of the image), attained from [8].

2.3.2 NeHe and OpenGL
NeHe is a website with many useful OpenGL resources and tutorials on techniques commonly used
in graphics programming. While it provides an excellent run-through of what each technique
achieves, it uses very basic and often highly impractical solutions that demonstrate, in a simple
way, one means of representing the data. This was ideal, as the aim was only to learn the
concepts and adapt them to the project.

Game programming forums were consulted, along with some books, to examine the consensus on
implementing various game features that had initially been planned, e.g. terrain creation.
Techniques examined included random terrain generation, background and foreground
mapping, and various methods for modelling data, e.g. using a single class and a series of interlinked
struct-type objects describing the make-up of a particular aspect of that class, as opposed to using
many smaller classes and defining each aspect in more detail. In the end it came down to the
application: the aim was to keep things open-ended and versatile, so that if the project
took a major change of direction only a few class functions would have to be changed, rather than
entire classes being rendered useless, wasting weeks of work.

Figure 2.4


Above is a program using the Qt library for image loading, attained from [9]. These are the kinds of
applications Qt is more often used for.

Blending and the Z-Buffer
The z-buffer is one of the most important tools in a graphics programmer's arsenal, and its use is a
reflection of its user's artistic wisdom. Just as an artist starts with the background and works towards the
foreground, it is the job of the z-buffer, when enabled, to ensure that surfaces behind the current
object are not visible and, ideally, are not rendered. Unfortunately OpenGL's z-buffer is primitive,
so in future expansions it would be desirable, for efficiency purposes, to create a custom z-buffer.

Figure 2.5

                                       Enabling the Z-Buffer in OpenGL requires just this
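As the caption suggests, enabling depth testing takes very little code. The figure's exact contents are not reproduced in the text, so the following is a standard OpenGL sketch of what it most likely showed (it assumes a valid OpenGL context already exists):

```cpp
#include <GL/gl.h>

// Enable the z-buffer so fragments behind already-drawn surfaces are rejected.
void enableZBuffer()
{
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);   // accept fragments at nearer-or-equal depth
}

// The depth buffer must also be cleared at the start of every frame:
void clearFrame()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
}
```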

Blending works when one object is behind another and the front object's opacity is low, so that the
object behind shows through as a combination of both objects' properties. This requires disabling the
z-buffer to ensure the object behind is actually drawn; enabling both causes unexpected results. These
two techniques are the most commonly used when describing how to render a scene, so it is important
to know how they work and the relationship they have with other functions in the library. To get the
transparency effect correct, polygons must be drawn from the back of the scene to the front; this is
known as the 'Painter's Algorithm'.
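A typical OpenGL setup for the blending described above might look as follows. The choice of blend factors (GL_SRC_ALPHA, GL_ONE, common in the NeHe tutorials) is an assumption; the text does not state which factors the project used:

```cpp
#include <GL/gl.h>

// Assumed blending setup: additive blending scaled by source alpha, with the
// z-buffer disabled so that objects behind the blended polygon are drawn.
void enableBlending()
{
    glBlendFunc(GL_SRC_ALPHA, GL_ONE);   // set the blending options
    glEnable(GL_BLEND);                  // enable blending
    glDisable(GL_DEPTH_TEST);            // z-buffer off, as described above
    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);   // alpha on the blended polygon
}
```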

Figure 2.6

Set the blending options; enable blending; finally, the specified polygon must have an alpha value of
1.0f for the transparency effect to work.

Texture Mapping

Texture mapping is the technique of wrapping textures onto graphical objects by referring to points on
a two-axis grid. 2D texture mapping is manual: user-specified coordinates on the image
plane are mapped to specified vertices of the object. 3D procedural texture mapping 'creates' a
virtual texture dynamically by applying a procedural function at each pixel; OpenGL uses
functions with quadrics to provide a routine for automatically mapping image-plane
coordinates accurately onto more complex graphical objects. While the automatic method is simpler to
implement, it is often slower, as slightly more complex calculations are performed. It also cannot
be used for intricate objects such as blast trails; these must be dealt with manually, with the texture
positions updated as the object moves. Texture mapping moving and irregular
objects requires application-specific mathematical routines to accurately describe the behaviour and
shape of the object.
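The two approaches can be sketched side by side in immediate-mode OpenGL. This is an illustrative fragment, not the project's code; textureId and quadric are assumed to have been created elsewhere (via glGenTextures/glTexImage2D and gluNewQuadric respectively):

```cpp
#include <GL/gl.h>
#include <GL/glu.h>

// Manual 2D mapping: each vertex of a quad is paired with a user-specified
// (s, t) coordinate on the image plane.
void drawTexturedQuad(GLuint textureId)
{
    glBindTexture(GL_TEXTURE_2D, textureId);
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f, 0.0f);
        glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f,  1.0f, 0.0f);
    glEnd();
}

// Automatic mapping onto a GLU quadric, by contrast, is a single flag:
void enableAutoMapping(GLUquadric* quadric)
{
    gluQuadricTexture(quadric, GL_TRUE);
}
```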

To make the graphical style of the environment more unique, it would be possible to write a rendering
engine that uses a limited colour palette table. Not only is this highly efficient to render, it is easy to
manipulate: e.g. for objects receiving significantly more or less light, the coordinates of the polygon
can be taken and a planar surface created which maps directly on top of it and can be coloured with
varying levels of opacity. This approach requires completion of the project to a degree not achieved
and has many complications of its own, where OpenGL functions would have to be overridden.

Figure 2.7

This form of texture mapping works with bitmap images whose resolutions are scaled as squares.
(Left: 2D texture mapping. Below: 3D procedural texture mapping.)

Physics Engine
A physics engine is essentially a class which describes the physical properties an object can have
and contains functions that simulate how the object behaves. Depending on the properties of the
object and the environment it interacts with, it simulates the effects on the object under different
conditions to represent the environment. Almost all objects are affected by the environment in
some way, so such objects inherit from the physics class.
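A minimal sketch of such a physics base class, using simple Euler integration; class and member names are illustrative assumptions, not the project's actual code:

```cpp
// Illustrative physics base class: objects that react to the environment
// inherit from it and gain position, velocity and force integration.
struct Vector3 {
    float x, y, z;
};

class PhysicsObject {
public:
    Vector3 position{0.0f, 0.0f, 0.0f};
    Vector3 velocity{0.0f, 0.0f, 0.0f};
    float   mass = 1.0f;

    // Newton's second law, a = F / m, integrated over a small time step.
    void applyForce(const Vector3& f, float dt) {
        velocity.x += (f.x / mass) * dt;
        velocity.y += (f.y / mass) * dt;
        velocity.z += (f.z / mass) * dt;
    }

    // Advance the object's position by its current velocity.
    void step(float dt) {
        position.x += velocity.x * dt;
        position.y += velocity.y * dt;
        position.z += velocity.z * dt;
    }
};
```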
The motivation for the space environment experience was to have realism at little cost in CPU
cycles, so a system was designed to simulate the rules of Newtonian physics.

Particle Engine
A particle engine is a means of simulating phenomena which are hard to reproduce
with conventional rendering techniques, e.g. fire, explosions, smoke, clouds, dust and meteor trails.
While NeHe provided an example of a particle system, it was extremely primitive. With the clearer
description on website [10] it was possible to grasp the intricacies of a particle engine and thus
create a model best suited to the application. As with the physics engine, a description of a particle
was first required, followed by a particle emitter with functions describing the behaviour of a
particle in its environment under certain situations, creating a clear abstraction between the particle and its uses.
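The particle/emitter split described above can be sketched as follows. All names and behaviour rules are illustrative assumptions, not the project's actual classes: the particle only carries state, while the emitter owns the behaviour:

```cpp
#include <algorithm>
#include <vector>

// A particle carries only its state; behaviour lives in the emitter.
struct Particle {
    float pos[3];
    float vel[3];
    float life;      // remaining lifetime in seconds
};

class ParticleEmitter {
public:
    std::vector<Particle> particles;

    void emit(const Particle& p) { particles.push_back(p); }

    // Borrow only the movement step from the physics engine's rules,
    // then age each particle and drop the dead ones.
    void update(float dt) {
        for (Particle& p : particles) {
            for (int i = 0; i < 3; ++i)
                p.pos[i] += p.vel[i] * dt;
            p.life -= dt;
        }
        particles.erase(
            std::remove_if(particles.begin(), particles.end(),
                           [](const Particle& p) { return p.life <= 0.0f; }),
            particles.end());
    }
};
```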

While it would be possible to have particles inherit from physics objects, it is simpler for implementation
purposes to leave them as particles and use the desired sections from the physics engine's movement
function, then elaborate on the rules to create a more stylised particle system. Using physics
properties to define particle movement would have been ideal for ideas 1 and 2 (see Section 1.1);
however, for the final idea the versatility of the particles was limited only by imagination. Due to
problems updating the application framework, it was never possible to implement the Object_Movement
class as a class the particle engine would inherit from. That class was a series of fixed and
pseudo-random movements which could be applied to any moveable object. Its implementation
would have allowed interesting movement patterns to be assigned to the emitted particles, creating a
stylised visual effect.

Object Movement
Limited research was done on object movement, as the information discovered dealt with motion
planning for robots from an artificial intelligence standpoint and, in general, methods abstracted
from the source-code level. In turn, many of these applications dealt with problems of no
concern to the project, i.e. avoiding collisions, dealing with animated objects, and structures to support
objects such as k-d trees, octrees or variants of the bounding volume hierarchy (BVH). While these
were useful for future consideration and expansion, they were techniques better suited to once the
project had completed its main objectives.

Mouse/Keyboard Interrupt Handling
The Windows-specific API handles device callbacks by taking into account the active window instance
running the application: all messages are received in the function WinMain() and then re-routed,
as parameters, to WindowProc() via the DispatchMessage() function.

Figure 2.8

…some code…

Using a switch statement on the parameters it was then possible to define what to do in the event of a
key press/release, mouse movement, or mouse button press/release. As discussed previously (see
Section 2.3.1) it was impractical to adapt other libraries to the project, as they were not as versatile as
the designs in mind, so a custom handler was created. Its design is such that any keyboard or mouse
device can be customised and used, with flexible coding leaving the class readily
adaptable to other devices, e.g. a gamepad. The idea was simply to have the properties of the device
described and the functionality copied over, or the handler's knowledge of the device's existence
updated (see Sections 3.5 & 4.2.7).
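The switch-based routing described above takes the standard Win32 shape sketched below. This is an illustrative skeleton, not the project's actual handler:

```cpp
#include <windows.h>

// DispatchMessage() in WinMain's message loop delivers each message here
// as parameters; the switch decides what each event should do.
LRESULT CALLBACK WindowProc(HWND hWnd, UINT uMsg, WPARAM wParam, LPARAM lParam)
{
    switch (uMsg) {
    case WM_KEYDOWN:        // wParam holds the virtual key code
    case WM_KEYUP:
        return 0;
    case WM_MOUSEMOVE:      // lParam packs the cursor x, y position
    case WM_LBUTTONDOWN:
    case WM_LBUTTONUP:
        return 0;
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    default:
        return DefWindowProc(hWnd, uMsg, wParam, lParam);
    }
}
```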
Research into the Quake 3 source code for interface design ideas also showed how extensively
everything was linked together, working as one. It was from this that the idea came to treat a device
handler as something that specifies the types of devices active for use, i.e. the number of buttons and their
status, and then have functions describing what they do under different engine states. Additionally,
specialised functionality was to be abstracted, e.g. warping the mouse cursor to the screen centre.

2.3.3 Efficiency Techniques
Against the eventuality that the project neared completion but performed at a low frame rate
or too slowly, time was invested researching optimisation and efficiency-improving techniques.

One way of increasing efficiency is not rendering objects invisible to the viewer. This can be
achieved by supplying each face of an object with a direction vector and finding its dot product
with the user's direction vector: anything at an angle over 90° will not be visible and thus is not
drawn. This implies that a class must be created to handle the objects being drawn, which requires big
changes to the framework and further abstraction. It is a considerable task and was therefore left as a
consideration for future extension: the technique seemed impractical given the time constraints
and prerequisites, and would not be attempted until completion of the project. While
techniques requiring this framework were not used, the research below shows what could have been done.
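The test itself reduces to a single dot product; since dot(a, b) = |a||b|cos(θ), the angle exceeds 90° exactly when the dot product is negative, so no angle ever needs to be computed. A sketch following the text's convention (function names are illustrative):

```cpp
// Dot product of two 3-component direction vectors.
float dot3(const float a[3], const float b[3]) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// A face whose direction vector makes an angle of more than 90 degrees
// with the user's direction vector faces away and is culled.
bool faceVisible(const float faceDir[3], const float userDir[3]) {
    return dot3(faceDir, userDir) >= 0.0f;
}
```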

Anything outside the viewing frustum won't be visible and thus shouldn't be drawn. This is
relatively simple to implement: all objects to be drawn are saved in a list, and the area of the
frustum is calculated dynamically with respect to the user's position. By taking
the position of each object from the list and using its radius to form a sphere representing the space
of the object, each object can be tested: if it lies within the frustum it is drawn, otherwise not. For CPU
efficiency, though, it is enough to crudely take the distance between the player and the object position
and draw the object only if it lies within a certain range. A hierarchical tree structure could be defined
to describe objects and how they should be drawn, e.g. an asteroid is an object but the particles it emits
could be sub-structures, and depending on varying levels of priority only higher-level structures would
be tested, to avoid overworking the solution.
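The cruder range test described above can be sketched in a few lines; the object is represented by its bounding sphere, and it is drawn only if some part of the sphere lies within a set distance of the player (names are illustrative):

```cpp
#include <cmath>

// Bounding sphere of an object: centre position plus radius.
struct BoundingSphere {
    float x, y, z;
    float radius;
};

// Draw the object only if any part of its bounding sphere lies within
// `range` of the player at (px, py, pz).
bool withinDrawRange(const BoundingSphere& obj,
                     float px, float py, float pz, float range) {
    float dx = obj.x - px, dy = obj.y - py, dz = obj.z - pz;
    float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
    return dist - obj.radius <= range;
}
```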

This trade-off between the accuracy and the efficiency of a technique often falls in favour of efficiency:
it is simply not practical to implement certain optimisations because, for the often small frame-rate
gain, the cost in CPU cycles is often greater, which can lead to greater frame-rate drops.
Unfortunately this theory could not be put to the test: while clock ticks were counted, a CPU rate
counter was neither researched nor implemented.

More practical techniques, which could be implemented immediately, reduce the number of objects
by decreasing the size of the object arrays. This drastically reduces the stress on the
environment: stars, for instance, were once rendered as spheres so that they created a 3D effect from
any viewing direction; however, having many quadrics with automatic texture mapping immediately
slowed performance. A billboard technique was therefore used instead, in which each texture was
drawn on a square plane formed using a triangle strip.

Figure 2.9
A GL_TRIANGLE_STRIP is a drawing feature of OpenGL in which the first three vertices are
specified in counter-clockwise order, assuming the object is supposed to face the viewer. Each
additionally specified vertex is automatically joined to the previous two to create a chain of
triangles. This technique saves the graphics card from converting polygons to triangles.
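The square billboard plane can be drawn with four vertices and one strip. This is an illustrative fragment (the size and texture coordinates are assumptions, not the project's values):

```cpp
#include <GL/gl.h>

// A star billboard: a textured square plane built from one GL_TRIANGLE_STRIP
// (four vertices, two triangles).  Assumes a texture is already bound.
void drawStarBillboard()
{
    glBegin(GL_TRIANGLE_STRIP);
        glTexCoord2f(0.0f, 0.0f); glVertex3f(-0.5f, -0.5f, 0.0f); // bottom-left
        glTexCoord2f(1.0f, 0.0f); glVertex3f( 0.5f, -0.5f, 0.0f); // bottom-right
        glTexCoord2f(0.0f, 1.0f); glVertex3f(-0.5f,  0.5f, 0.0f); // top-left
        glTexCoord2f(1.0f, 1.0f); glVertex3f( 0.5f,  0.5f, 0.0f); // top-right
    glEnd();
}
```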

While this drastically improved performance, allowing an increase in the number of stars from the
100-400 range into the thousands, a new problem arose: each image plane had to be rotated to face the
viewer as the camera changed position. Solving this had the added advantage of making it possible to
apply multiple textures to stars and even switch between two to give a twinkle effect.


                                     CHAPTER 3


3.1 Design Approach
As the application was going to be built from scratch, ample research was done to ensure
implementation ran smoothly. Unfortunately the design at this stage overlooked the scale of planning
required, and the difficulty of the framework was underestimated; this is the fundamental reason for the
alterations in the UML class designs. While more research could have been done to ensure a stronger
framework for the program, a series of UML diagrams will show the application's evolution,
explaining the problems encountered and how they were dealt with.

Before referencing the UML designs in the appendix, see Appendix: Design Stage Reference User

(see Appendix: UML Diagram Design1 & UML Class Design1)
The UML class diagrams are designed to show the flow of the project and how it was initially
approached. The design is heavily flawed for a final application, but this is acceptable, as the first task
was to apply the research while preparing the first environment of the program. At this stage key
mechanisms are missing, as they are not required to realise the goals of this initial stage, e.g. the
physics engine, device handler, engine state handler, window handler etc.

Essentially the M_Effect class (with reference to the mentioned appendix entry) represents the base
properties of a menu effect. This then splits into two categories, textured and non-textured effects, i.e.
M_Textured and M_NonTextured respectively. Given the lack of properties in each class, it would
have been better to have all three as one class, or to keep M_Effect and include M_Textured and
M_NonTextured structures, to achieve the desired effect of saving memory while decreasing the
work the CPU has to do calling the parent class. For the moment this was of no concern.
Classes FX_Star, FX_Warphole and FX_Body provide various graphical features and were designed
to help with the learning curve of OpenGL. These were created to provide some menu functionality
that could immediately be accessed via hotkeys. As these were not menu effects, they were later
converted to individual entities and at some point became entities of the physics engine, providing a
more uniform behaviour in the environment. Classes FX_SlidingBar and FX_Spot were designed for
the purpose of animating the menu.


Figure 3.1

(see Appendix: UML Diagram Design2 & UML Class Design2)
In this next diagram, before any drastic changes were made, all repeated functionality was abstracted,
making some of the initial classes redundant. The abstraction was done at this stage so that when the
next level of the framework was implemented, an object's properties could simply be copied over
without affecting parent classes; essentially this is the ongoing technique used to allow stable project
growth. In preparation for extending the framework, some important classes handling various
graphics-related mathematical routines were implemented, i.e. Ray, Vector, Vertex and Quaternion.
The Matrix class was more of an afterthought, with the sole purpose of replacing OpenGL's
framework for expressing rotations, translations etc. using direct matrix transformations. The Matrix
class was also intended to replace all the maths performed on objects to improve efficiency; however,
time was not available to realise this.
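A sketch of the kind of graphics-maths helper class described; this is illustrative, not the project's actual Vector class:

```cpp
#include <cmath>

// Minimal 3D vector with the operations most graphics routines need.
struct Vector {
    float x, y, z;

    float dot(const Vector& o) const { return x * o.x + y * o.y + z * o.z; }

    Vector cross(const Vector& o) const {
        return { y * o.z - z * o.y,
                 z * o.x - x * o.z,
                 x * o.y - y * o.x };
    }

    float length() const { return std::sqrt(dot(*this)); }
};
```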


Figure 3.2

(see Appendix: UML Diagram Design3a & UML Class Design3a and UML Diagram Design3b &
UML Class Design3b)
Diagram 3a sees the mathematical classes integrated into the architecture, and some advanced
environmental objects are realised. A class Asteroid_Group is defined, inheriting
from FX_Asteroid, in which a collection of asteroid objects is governed as one body. As there was
now a variety of objects in space, the next step was to add some kind of animation to their
movement. The class Object_Movement was created to provide an abstraction for fixed and pseudo-
random movement patterns on objects; however, at this point only basic features are implemented. A
list of basic movements was created and, using procedural algorithms on a subset of these basic
movements, advanced movements were created. Variables were used to add an element of
randomness at key procedural points to create unpredictable patterns (see Section 4.2.14 for further detail).

The idea of a camera system is implemented with basic functionality, which required the extraction
of camera variables from other classes. At some point it was hoped to factor in a camera for every
object and have a "Director" class providing a series of fixed and interactive camera viewing
features. This adds to the user's sense of deep-space exploration and instant interaction with any
object in the space environment, a feature which regrettably did not make the final cut. It could even
have been used to give the application an expedition mode, e.g. any item the player discovers, i.e.
       comes into close proximity with, is marked and can be remotely viewed from anywhere in space.

Math classes are used initially in the description of the G_Player class, which was programmed under
the assumption that ideas 1-2 (see Section 1.1) were going to be implemented (see Appendix: UML Class
Design3a - G_Player). They were mainly used for camera rotations, movement calculations,
weapon/object use, jumping and other such functions. Their uses will be described in more detail
with reference to the final design and the implementation (see Section 4.1.13). Interactions are at
this point directly controlled by the crude callback handler soon to be replaced; as the functionality
had already been written, it was simply ported to the handler class once implemented.
In addition, the physics engine was written, along with a particle engine which borrows much of its
functionality from the physics engine. Fortunately this was not integrated with the rest of the classes
just yet.

Figure 3.3

This was a major turning point, at which the original idea was reformed to just an interactive
space environment to be built upon (see Section 1.1 - Final Idea & Appendix: UML Class
Design3b - G_Player). The player class was reformed to handle the interactions necessary for the
final idea, and it was now possible to integrate the physics engine classes with the player. A
G_Player is a Physics_Object and has Physics properties; this reasoning is reflected in the
inheritance hierarchy for the player-physics engine combination. Similarly, Asteroid_Group and
G_Player objects are particle emitters that have Particle_Groups, each of which is a Particle governed
by the Particle_Engine.

Figure 3.4

       (see Appendix: UML Diagram Design4 & UML Class Design4)
With the introduction of the mathematical and graphical classes it became possible to provide a better
architecture for some of the previous environmental classes. Although it is regrettable that this was
missed earlier on, it was simple to integrate these new structures into the overall architecture. The
FX_UFO and FX_Bird classes now use quaternions and are upgraded to use the physics engine,
giving them the added benefit of uniform behaviour with the rest of the objects in space.

New classes FX_UFO and FX_Bird (see REF: UML_Class_Design4 - FX_UFO and FX_Bird) were
created to flesh out the space environment. The idea was to inspire a limitless environment in which
anything could take place, as well as modelling some of the more exciting things already happening
in our cosmos. This led to the creation of UFOs and spirit birds, the latter being a line
drawing of a bird in which each vertex is represented as a star-like object utilising the FX_Star class.
The spirit bird has properties that, when enabled, allow particle emission from the bird, e.g. from
random vertices, from its position, from all vertices etc. While the bird can move and the particles lend
to the illusion of its movement, any focus on wing movement would be an afterthought, as there are
many complications involved in adopting this design, specifically in the physics engine: the
Physics_Object class would have to adopt the concept of having many physics entities, each
connected and their behaviour somehow interlinked.

Figure 3.5

       (see Appendix: UML Diagram Design5 & UML Class Design5)
During development of the FX_UFO and FX_Bird classes, it became apparent that the desired
behaviour would make these classes as complex as the G_Player class. While this was acceptable,
as many functional features already written for the player could be transferred, the framework for
making such a jump was not yet available. It was decided their development should be paused until
the framework had been vastly improved with a diverse device handler and window handler.

It was at this point important to introduce the heart of the program, the Engine class, which keeps
track of the program state and ensures the relevant content is loaded and removed with respect to that
state. This had to be done first, as the window and device handlers' various states depend on the
engine's states.

The device handler was designed with versatility in mind (see Section 4.2.7). A list of available
actions for each state was created, then the device descriptions, i.e. keyboard and mouse, were
abstracted. Next, default bindings for each engine state and a key-assigning function allowing user-
specified bindings were created; however, the latter function was not integrated, as there was no
interface to support it. For the time being, it was simple enough to change key bindings by simply
changing how they are stored in their default state, as the default binding is always loaded with
respect to its corresponding state.

The window handler was designed to support functionality not yet supported (see Section
4.2.8), and was thus broken into portions. The Window class describes the core aspects of any window;
the class Main then inherits from Window and represents the main window of focus, whose context is
updated on engine state changes. Classes representing program states were upgraded to inherit from
Window so their content could be used in Main, e.g. the Menu, SolarSystem and Game classes. An
added benefit of this design is the ability to create a SubWindow class inheriting from Window, which
could be used for things such as interface control menus or pop-up information windows; unfortunately,
due to time restrictions, these were never implemented.

Figure 3.6

       (See Appendix: UML Diagram Design6 & UML Class Design6)
This is the final design for the completed application. Upon successful completion of this design, a
user would not only be able to explore and interact with space, but be constantly captivated by a vast
universe devoid of repetition. Many classes have been redesigned to abstract out objects and functions
which don't really belong. The new model sees the introduction of Environment_Object, a class
which adds a logical relationship between objects, providing the polymorphism required to refer to and
render each object in the Universe class. The new Universe class describes a universe as a collection
of star systems (Star_System class), each containing a unique background of neighbouring star
systems. Aside from the usual bodies, stars and clouds, neighbouring star systems aren't in the star
system and thus are viewable as a background image. The respective environment objects (see
Figure 3.7) inherit from Object_View, so that one day a camera can be set up for them.

Figure 3.7

       3.2 Interface Design
Interfaces are often secondary devices that provide simplified access to more complex operations:
a means by which the user interacts with the program to achieve certain functionality.
While not an objective of high priority, the main menu was intended to have buttons for
manipulating the internal states of the program, i.e. to prepare for the use of the main
interactive application feature. Though the necessary framework existed to set up pop-up windows
describing each menu item and provide other handy features, there were more important things to do
in terms of building the environment and player interaction, so this task was deferred until
completion of the project. It is for this reason that such an interface is unavailable.


From a small study of the interfaces of a variety of space games [11] and popular guidelines [12], it
was found that the main application interface, where the user interacts with the environment,
demands clear and easy access to all tools, and that functionality should be grouped
sensibly, i.e. movement functions, environmental interaction functions and user state view functions.
While it would be preferable for all of these to appear as individual icons which the user could
group together, for practical purposes and under time constraints it was more logical to have
predefined button groups which can be enabled/disabled by the user via hotkeys. This allows the user
to choose whether or not to use the interface, and to place it anywhere on the view plane on demand.
The intention was to load images of equal size into a SubWindow split into sections, and then to
determine which section was clicked by dividing the click coordinates by the width and height of a
section. Essentially each button would provide an alternative route to the functions handled by the
device handler; unfortunately, time for this implementation was unavailable.
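The section lookup described above can be sketched as follows. The function name and grid layout are illustrative assumptions, not taken from the project code:

```cpp
// Map a click position inside an interface SubWindow to a button index.
// The window is split into a grid of equally sized sections; dividing the
// click coordinates by a section's width and height yields its column and
// row. (Hypothetical helper, sketching the design described in the text.)
int buttonAt(int clickX, int clickY,
             int winWidth, int winHeight,
             int cols, int rows)
{
    int sectionW = winWidth / cols;
    int sectionH = winHeight / rows;
    int col = clickX / sectionW;
    int row = clickY / sectionH;
    return row * cols + col;   // index into the predefined button group
}
```

Each returned index would then dispatch to the same function the device handler binds to the corresponding hotkey.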

Figure 3.8

   (Diagram: the main window alongside a separate interface window.)


The concept of mouse-over help could be achieved by storing the help text in the item's object as a
string with the same variable name, and then initialising the pop-up window class with this text. The
pop-up window would be scaled to hold the text and would appear at the position of the mouse, after
some simple checks to decide which side of the mouse the window should appear on, so as to avoid
being drawn off the screen. Unfortunately this was another feature researched and given planning
consideration on the assumption that it would be a simple addition to develop upon completion
of the application.
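The side-of-the-mouse check described above amounts to flipping the pop-up to the other side of the cursor whenever it would run off screen. A minimal sketch, with hypothetical names:

```cpp
#include <algorithm>

// Choose where to draw a mouse-over pop-up so it is never clipped by the
// screen edge: prefer the bottom-right of the cursor, and flip to the
// opposite side when the window would otherwise be drawn off screen.
struct Pos { int x, y; };

Pos popupPosition(int mouseX, int mouseY,
                  int popupW, int popupH,
                  int screenW, int screenH)
{
    int x = (mouseX + popupW <= screenW) ? mouseX : mouseX - popupW;
    int y = (mouseY + popupH <= screenH) ? mouseY : mouseY - popupH;
    return { std::max(0, x), std::max(0, y) };   // clamp to the screen origin
}
```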

The main reason these features could not be implemented lies in the run-time problems encountered
with the window framework when switching content in the main window.

3.2.1 Initial Ideas
Games of today often have menu designs which, while allowing access to the menu, also reflect
the environment the player has progressed to, such as in [13], or show a looping video
highlighting exciting features of the game. Rarely, some even navigate around a map where each
section represents a new portion of the menu [14]. The original inspiration was a 3D desktop [15]:
a cubical interface that rotated for each section of the menu. This simple approach eased the
learning curve towards a more adventurous design.

Each planet in the solar system could be assigned a menu section; when the player switches
sections, the solar view appears with the camera at the planet representing the current section.
The camera accelerates from the current planet and, upon reaching a certain distance, all space
animations stop and the view speeds through space until it is close enough to the target planet to
resume animation and align the view with that planet. This overcomes many complications caused by
the planets moving around, and provides entertainment without much delay. While the function
to perform this action was implemented, the structure assigning menu sections to planets was never
completed because the final framework was incomplete.

3.2.2 Looking at Quake 3 source code
Research into the Quake 3 source code [16] for ideas on how to implement a user interface led to the
conclusion that the project's interface did not require as much detail as a real game with online
multiplayer capabilities, and that the current structure was good enough to retain. In addition, the
source code was highly abstract, using many structures and classes together for a single purpose,
which made it harder to extract just the relevant information. Finally, it was written in C, making
it too time-consuming to convert to C++ and incorporate into an object-oriented design.

3.2.3 Final Idea
While the space environment state was designed for menu navigation, and a function was provided
for the desired behaviour of moving from menu section to section, the incomplete Window handler
framework made it impossible to evolve the final part of the structure required to realise each planet
as a section. Once that had been done, it would simply have been a case of creating a function to
monitor menu section changes and pass the appropriate parameters to the navigation function.

Even though the design of the space environment was further developed to produce pseudo-random
environment generation, there were problems to overcome before this could be fully realised. Firstly,
only one star system should be rendered per universe, with an ongoing check to determine whether
the player was about to travel to another star system so that, for efficiency, rendering could be
enabled on the new star system and disabled on the previous one. In addition, each star system
requires a tight check on the object count so that the programme always runs smoothly, since no
advanced rendering efficiency techniques are currently implemented. Dynamic solar systems took
careful planning to ensure bodies did not collide, and a vast amount of texture generation was
required to add diversity to the environment; this partially involved texture manipulation in
Photoshop and some internet research [17].

3.3 Environment Design Techniques
As the aim of the project was to provide an interactive space environment, it made sense to
begin with the design of space itself. Extending part of a solar system from a previous coursework
exercise gave a basic framework, making it possible to improve on what already existed and to work
towards more complex graphical techniques, balancing the learning curve.


To give life to the emptiness of space, stars were the first choice. Three types of stars were
designed. The majority never changed state, though some alternated textures for a twinkling effect.
The second type was a much smaller but still noticeably frequent number of stars that appeared for a
few seconds shooting across the sky, with a trail whose size varied in proportion to the speed of the
star. Lastly, occurring almost as a rarity, were exploding stars with a random number N of chunks
shooting out in random directions, where each piece was of size star/N to keep the debris in
proportion and, depending on the speed of the chunk, it might or might not have a trail.
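The exploding-star rule above can be sketched as follows; the structure and function names are hypothetical, and the real effect would also attach a trail to fast chunks:

```cpp
#include <cstdlib>
#include <cmath>
#include <vector>

// A random number N of chunks fly out in random directions, each chunk of
// size starSize / N so the debris stays in proportion to the star.
struct Chunk { float dx, dy, dz, size, speed; };

std::vector<Chunk> explodeStar(float starSize, int n)
{
    std::vector<Chunk> chunks;
    chunks.reserve(n);
    for (int i = 0; i < n; ++i) {
        // Random direction, normalised to unit length.
        float dx = (std::rand() % 201 - 100) / 100.0f;
        float dy = (std::rand() % 201 - 100) / 100.0f;
        float dz = (std::rand() % 201 - 100) / 100.0f;
        float len = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (len == 0.0f) { dx = 1.0f; len = 1.0f; }
        float speed = 1.0f + (std::rand() % 100) / 50.0f;   // 1.0 .. ~3.0
        chunks.push_back({ dx / len, dy / len, dz / len, starSize / n, speed });
    }
    return chunks;
}
```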

An afterthought was to create a function grouping stars in fives to fifteens, forming various random
constellations; however, given the number of stars, it was uncertain whether the patterns would
even be noticeable. A plausible solution was to draw, using the line function, trails highlighting the
constellation patterns at various intervals, as if the sky were speaking to the player.

Using an RGB calculator it was possible to find colour ranges that best suited a particular object or
effect. This was used to write the fire-emitting algorithm in the particle engine, which could
be applied to the sun, asteroids, the player craft, weapons and various other effects.

To create an exciting atmosphere efficiently, it was originally intended to split space into a class
built from blocks called star systems (see REF: UML Class Design6 - StarSystem). Essentially a star
system is a large area of space to which various stars, planets and other environmentally
enhancing features are assigned. While it was never possible to abstract the current space
environment to the stage where this manifested in its entirety, a lot of time and research was spent
in this domain, which was one of the major obstacles to achieving all the objectives.

A set of pre-rendered background textures was compiled, and it was intended that each star system
would be assigned an image. The focus of the background image would vary as the player looked
around, giving the illusion of being wrapped in space. It was also designed so that once the
player came within a certain distance of the edge of a star system, the background would begin to
fade out in proportion to the distance from the next star system, while the next star system's
background would increase in alpha in the same proportion, providing a seamless transition between
star systems.
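The crossfade described above keeps the two backgrounds' alpha values complementary, so they always sum to one. A minimal sketch, assuming a linear fade over a fixed range (names illustrative):

```cpp
#include <algorithm>

// Crossfade between the current and next star-system backgrounds: once the
// player is within `fadeRange` of the boundary, the current background's
// alpha falls with the remaining distance while the next one's rises by the
// same amount.
struct FadePair { float currentAlpha, nextAlpha; };

FadePair backgroundAlphas(float distToBoundary, float fadeRange)
{
    float t = std::clamp(distToBoundary / fadeRange, 0.0f, 1.0f);
    return { t, 1.0f - t };   // the two alphas always sum to 1
}
```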

In building the content of space and demonstrating the particle system, it was planned that
asteroids and similar objects would be ideal, placed in groups circling planets or sitting idly in
large clusters in the galaxy and star systems. In addition, smaller clusters of asteroids would shoot
across space leaking fire particles, giving an unpredictable edge to the environment. Such movement
patterns are the objectives of the Object_Movement class, whose further development was
abandoned in favour of fixing the framework (for more information on Object_Movement, see the
relevant section).

Figure 3.9

   A Universe can be 1, 2x2x2 or 3x3x3 star systems. A star system can have stars, planets, moons,
   suns, and clusters of particles forming clouds. It can have these items individually, making for a
   simple system, or grouped together. A crude recursive algorithm exists to ensure objects aren't
   touching or overlapping. Some star systems may have wacky colour schemes and represent bonus
   stages in which various ship power-ups and boosts can be found upon careful exploration.

The above figure gives reference to things which have been coded but only implemented at a very
basic level. To ensure progress continued on the player craft and user interactions it was necessary to
use a single build of the pseudo-random universe.
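The crude placement check mentioned in Figure 3.9 can be sketched as follows: every object is treated as a sphere (position plus radius), and a candidate position is accepted only if it overlaps nothing already placed. The names are hypothetical; the project's version retries recursively with new random positions:

```cpp
#include <cmath>
#include <vector>

struct Body { float x, y, z, radius; };

// Two spheres overlap when the distance between their centres is less than
// the sum of their radii.
bool overlaps(const Body& a, const Body& b)
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
    return dist < a.radius + b.radius;
}

// A candidate can be placed only if it overlaps no already-placed body.
bool canPlace(const Body& candidate, const std::vector<Body>& placed)
{
    for (const Body& b : placed)
        if (overlaps(candidate, b)) return false;
    return true;
}
```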

A lot of time was spent designing and developing ideas for the original aspect of the project:
ideas 1+2 and the final idea (see Section 1.1). It was intended to use some of the procedural
algorithms researched [18] & [19] to generate various objects in the environment; however, there
wasn't enough time to plan the data structures, implement and integrate such a task. Had there been
more experience in such activities, this would at least have been attempted. Following are
some of the random and procedural algorithms designed; the research was used in conjunction with
inspiration from nature.

This algorithm was inspired by and devised from observing the crumpling of a piece of paper.
Pseudo-code Algorithm:
Create a 2D square of a specified size as a structured group of vertices; this will contain the entire
terrain. Lines are cast across the terrain in various directions, and some factor determines,
upon an intersection with a previous line, whether to stop drawing the line or to continue
until the other side of the 2D square plane is reached. Lines are generated by taking any point along
the edge of the plane and using an angle that points into the square to specify a random direction.
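The two random decisions in the casting step can be sketched as below. The crumple-factor probability and the function names are assumptions for illustration; the real algorithm would also compute the actual segment intersections:

```cpp
#include <cstdlib>

struct Point { float x, y; };

// Pick a random start point on the bottom edge of the slab; the cast line
// then heads into the square at a random inward angle.
Point edgeStart(float slabSize)
{
    float u = static_cast<float>(std::rand()) / RAND_MAX;   // uniform [0,1]
    return { u * slabSize, 0.0f };
}

// At each intersection with an earlier line, a "crumple factor" in [0,1]
// decides whether the line stops there or carries on to the far side.
// Higher values stop lines earlier, producing more, smaller sections.
bool continuesPast(double crumple)
{
    double u = std::rand() / (RAND_MAX + 1.0);   // uniform [0,1)
    return u >= crumple;   // crumple 0 => always continue, 1 => always stop
}
```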

Step 1:
Figure 3.10 (the initial terrain slab)                              Figure 3.11 (the first lines cast)

Figure 3.10 shows the initial 2D terrain slab plane, created with some defined length and width.
In Figure 3.11, a few lines are drawn which intersect with nothing, and thus continue to the other side of the 2D square plane.

Each intersection, whether with the 2D plane edge or another line, has its position stored in an array
of positions. The number of lines used to split the slab depends on a terrain crumple factor, and
terrain contouring depends on a gradient factor, where the height is adjusted by considering the slab
as a series of smaller slabs (see Figures 3.14 & 3.15). After constructing the initial slab and
casting lines to split it, the slab is re-analysed and a series of interlinked slab structures is
created describing how the slabs are linked. Sections which share vertices and edges affect each
other and are thus grouped as neighbouring sections (see Figure 3.13).
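The neighbouring-section grouping can be sketched as below, treating two sections as neighbours when they share at least one vertex id. This is a simplified stand-in (names hypothetical); the project's structure also records shared edges:

```cpp
#include <algorithm>
#include <map>
#include <set>
#include <vector>

// Build the neighbour structure of Figure 3.13 from each section's vertex
// ids: sections sharing any vertex are recorded as neighbours of each other.
std::map<int, std::set<int>> neighbours(
    const std::map<int, std::vector<int>>& sectionVerts)
{
    std::map<int, std::set<int>> result;
    for (const auto& [idA, vertsA] : sectionVerts) {
        for (const auto& [idB, vertsB] : sectionVerts) {
            if (idA == idB) continue;
            for (int v : vertsA) {
                if (std::find(vertsB.begin(), vertsB.end(), v) != vertsB.end()) {
                    result[idA].insert(idB);   // shared vertex => neighbours
                    break;
                }
            }
        }
    }
    return result;
}
```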

Figure 3.12 (a line stopped with some probability)                  Figure 3.13 (numbered sections of a completed slab)

In Figure 3.12, a line is drawn and, with a certain probability, it is allowed to pass the first line
but fails to pass the second. Figure 3.13 shows a completed terrain slab; each enclosed section is
numbered and the following structure is created to recognise neighbouring sections (sections that
share edges and vertices):
1 = {2,5,6}, 2 = {1,5,6,3}, 3 = {2,4,6,7,9,10,11}, 4 = {3,7} etc.
This is just a simple example of what really happens; to capture the model accurately, we must go
through each vertex of each section, check which other sections share that vertex, and then build a
model of shared edges.

The next stage chooses one of the sections, as specified by the programmer, and a random vertex on
that slab, for which a height and a gradient factor are specified. If the gradient is set high, the
raised slab section affects only the closest neighbouring slabs, using the structure previously
built. A lower gradient means a smaller random number is subtracted from the original specified
height, and thus more surrounding slabs are affected, until the current height of a slab reaches the
original ground-level height on which the 2D plane was modelled. Ensuring the surfaces are almost
always at ground level simplifies the problem of joining multiple slabs to build complicated
terrains, and can be used for creating various rocks.
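The height/gradient rule can be sketched deterministically as below, replacing the random subtraction with a fixed gradient-scaled step per ring of neighbouring slabs; the function name and the exact scaling are assumptions:

```cpp
#include <vector>

// Starting at the chosen vertex with `height`, each ring of neighbouring
// slabs gets the previous ring's height minus a step scaled by the gradient
// factor (1-100), until ground level (0) is reached. A steep gradient
// reaches ground level in fewer rings, as in Figures 3.14 & 3.15.
std::vector<float> ringHeights(float height, int gradient)
{
    std::vector<float> rings;
    float step = height * gradient / 100.0f;   // gradient 100 => one ring
    float h = height;
    while (h > 0.0f) {
        rings.push_back(h);
        h -= step;
    }
    return rings;
}
```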

Figure 3.14 (terrain slab, ecliptic view; height begins at 0, gradient factor ranges from 1-100)    Figure 3.15 (Height = 10.0; Gradient = 70)
Figures 3.14 & 3.15 do not continue from the previous example; they simply demonstrate how the height and gradient
factors work. Figure 3.14 represents a completed plane in an ecliptic view. The first arrow on Figure 3.14 indicates the
vertex to apply the height to, while the second shows the direction of the gradient slope. Figure 3.15 shows that the
vertex is moved 10 units into the air and, due to the steepness of the gradient, it takes only a few moves to reach ground level.

This somewhat primitive technique allows many diverse slabs of varying sizes to be created simply,
giving an effectively flexible mould which ensures that almost any joined group of slabs adjusts to
form one fully joined terrain. While this would take substantial planning at the implementation
stage, it would provide a powerful terrain generation tool; if a particular slab had problems
joining, an adjusted replacement could be created quickly using the same template with slightly
varied parameters.

Fractal terrains were also researched [20], but time was not available to design a fitting
implementation which would utilise the diversity of fractals. However fractals would have been the
final choice for terrain design had the final idea not changed.

Environmental foliage generation was going to be done procedurally, using billboard techniques while
avoiding the problem of viewing the same image from multiple angles. Each section of the plant is a
rectangle calculated by the algorithm, whose vertices are passed to a function that creates the
rectangle using an OpenGL GL_TRIANGLE_STRIP, providing a planar surface ready to have the
appropriate textures mapped. Each piece can be rotated in turn with respect to the viewer so that it
always faces the viewer, and the depth buffer ensures pieces that are not visible aren't drawn,
giving a realistic effect from a perspective point of view.
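The face-the-viewer rotation amounts to a yaw angle derived from the camera-to-object vector; in OpenGL this angle would feed a rotation before drawing the GL_TRIANGLE_STRIP. A sketch with a hypothetical helper, since the project never reached this implementation:

```cpp
#include <cmath>

// Yaw angle (in degrees, about the vertical axis) that makes a billboard
// at (objX, objZ) face a camera at (camX, camZ).
float billboardYawDegrees(float objX, float objZ, float camX, float camZ)
{
    return std::atan2(camX - objX, camZ - objZ) * 180.0f / 3.14159265f;
}
```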

It is regrettable this algorithm was not created, as it could have been adapted to create odd-looking
objects in space with any textures imaginable: half-formed plant-like planets, or even odd space
entities emitting particles. Had the framework been ready, this would definitely have been the first
feature implemented next.

Figure 3.16                                                            Figure 3.17
  Procedure: Up, Left, Up.                                          Procedure: Up, Left, Up. Properties described
  Properties set as:                                                below.
  Up->dist= 10, Left->angle=45, Up->dist=5.


The algorithm begins with Figure 3.16, where a rectangular plane is drawn with a predefined initial height and width of
X and Y, respectively. Assume the level of detail selected is N levels of iteration. In Figure 3.17, which is iteration 1 of
N, from the top leaf the procedure repeats unchanged (Up, Left, Up) with the same properties. It should be noted that the
algorithm keeps track of the spine of the object, because this item is allowed to be unusually long. If this were applied
to the space idea, the rules would have been left very open-ended to allow very random objects to be rendered.

However, on the same iteration on the second leaf, the algorithm goes Up (notice it continues from the direction defined
as up from that particular point, which implies recording the states at leaf nodes). It then realises that turning an extra
45° would be an extreme angle change, as the branch is already bent at 45° (another 45° makes 90°, which is not very
useful if it continues like this), so with some random probability it chooses a random scaling factor on the angle, such
that this time Left implies only 15°. Also note that because it is a branch and not a spine, the distance to draw was
reduced by a random factor, the reduction being based on a simple probabilistic analysis. The further from the spine a
branch is drawn, the shorter the branch, i.e. in spine->branch1->…->branchN, branchN should be significantly shorter, as
it does not need to support its own branches.

The algorithm was adapted to deal with certain defined problematic situations. In addition, as the iterations approach N,
the width of each branch gets proportionally thinner.
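The extreme-angle rule above can be sketched as a small deterministic helper. The fixed 15° fallback stands in for the random scaling factor described in the text, and the names are illustrative:

```cpp
// A branch carries an accumulated bend; when one more turn would fold it
// to 90° or beyond, the turn is rescaled to a gentler 15° step instead
// (a deterministic stand-in for the random scaling factor).
float applyTurn(float currentBend, float turn)
{
    const float limit = 90.0f;
    if (currentBend + turn >= limit)
        return currentBend + 15.0f;   // scaled-down turn on extreme bends
    return currentBend + turn;
}
```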

As the application is meant to be interactive, one of the best features to express this is collision
detection. Many issues plague accurate collision detection, even with the sophistication of today's
techniques and programmers. Firstly, given the code structure, a representation must be selected to
model the situation efficiently; it may be difficult to represent the problem with the current data
structures, and the code may require restructuring. Deciding upon the level of detail of object
collisions is important, as it specifies the type of calculations that must occur. A ray can be cast
along the directional path of a moving object to check for collisions; upon a detected collision,
CPU-intensive calculations must be performed to assess which surfaces collide first. If this can be
calculated accurately, then, using the surface normal and the direction of the incoming impact, the
angle of collision can be found with the dot product. However, this is in itself a huge problem and,
due to lack of experience, each object would have been represented as a sphere, i.e. its position
plus a radius.
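The sphere representation makes the test cheap: a collision occurs when the distance between centres is less than the sum of the radii, and the midpoint of the centre-to-centre ray gives the collision position of Figure 3.18. A sketch with hypothetical names:

```cpp
#include <cmath>

struct Sphere { float x, y, z, r; };

// Returns true when the spheres collide; on collision, `mid` receives the
// midpoint between the two centres, i.e. the collision position.
bool collide(const Sphere& a, const Sphere& b, float mid[3])
{
    float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
    float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
    if (dist >= a.r + b.r) return false;
    mid[0] = (a.x + b.x) / 2.0f;
    mid[1] = (a.y + b.y) / 2.0f;
    mid[2] = (a.z + b.z) / 2.0f;
    return true;
}
```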

Figure 3.18


Two objects move towards each other (arrows represent direction), their positions displayed as dots; upon collision the
distance between their positions becomes less than the sum of their radii. A new ray is created between their positions
and its middle point found; this represents the position of the collision on each object. The angle between the direction
of an object and the collision can be used to determine the new directions of the objects with respect to the forces
exchanged. The lower object has more force, so the upper one moves in the new direction given by the angle θ.

If the angle is 0, all the force is transferred; higher angles imply less force at the point of
collision. Spin would be ignored, as it adds more complexity to the problem.

Such techniques were always intended to be implemented in the final stages; given the delay
caused by the window framework integration issues, which prevented full completion of the
environmental techniques used for generating the space environment, implementation and
integration of collision detection was never possible.

3.4 Player
The concept of the project changed, and so did the realisation of the player; fortunately much was
salvageable and the written source code was easy enough to update.

The player was designed to have hit points, mind points and stamina, defined as follows:

Table 3.4a
Name:                   Description:
Hit points (HP)         Points of health replenishable by items or naturally over time if the player’s
                        health falls below a certain point.
Mind points (MP)        Points of mind power replenishable by items or naturally over time if the
                        player’s mind power falls below a certain point. Primarily used to cast spells
                        or use abilities that require mind power.
Stamina points (SP)     Points of stamina replenishable by items or naturally over time. Primarily
                        used to cast spells or use abilities that require stamina.

Final ideas for a player system saw a versatile approach in which the player could expand his craft
depending on various interactions in space. A plan was made to enable certain features for the user
upon discovery of certain environments in space, by coming into close proximity. Such extensions
included upgraded technology, i.e. access to new weapons and new types of ship, which in turn
affects other factors. Other things, such as bonus items, gave a temporary boost in attack power or
boost speed, or increased or accelerated healing of one or many player attributes.

Table 3.4b
Ship Type:              Properties:
Normal                  Smallest size, lowest density, lowest weight carrying capacity, lowest thrust
                        speeds per gear (0-3 gears) and lowest weapon firing rate.
Fighter                 Slightly improved on all of the above.
Freighter               Reasonably better than Fighter.
Battler                 Reasonably better than Freighter.

In addition to the free range of ship movements, a teleport drive, a hyper-speed booster and a
gravity ray were added. Hyper speed was intended for long-distance travel and for avoiding
confrontations. The teleport, however, was intended to work through an interface where the user
enters xyz coordinates into a popup window; these are then passed to the teleport function to perform
instant travel, ideally to places that had already been visited. While features were written for
drawing text on screen, integrating this into the application would have been a finishing touch and
was thus left in favour of attempting to solve framework issues.

While weapons were planned for, they were never realised due to more pressing camera and control
issues with the ship. This was mainly because a window-specific function could not be found to keep
the mouse cursor inside the window frame rather than free to roam the desktop.

3.5 Controls
Initially the keyboard and mouse devices were identified and then defined. A keyboard has 256 keys;
the number generated from a key press is used to index an array and select the corresponding
functionality. The functionality was identified and then assigned, as a predefined list of features,
to various keys depending on the engine state; this solved the problem of binding a key to a
function. To find whether a key was pressed, the keyboard device structure also had a boolean array
indexed by key press: a key's entry was set to true when that key was pressed, and false when it was
released.
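The keyboard device described above can be sketched as two parallel 256-slot arrays, one holding the pressed flags and one the bound functionality for the current engine state. The structure and names are illustrative, not the project's code:

```cpp
#include <functional>

// 256 slots, one per key code: a pressed flag plus the function bound to
// that key for the current engine state. A key press indexes both directly.
struct Keyboard {
    bool pressed[256] = { false };
    std::function<void()> binding[256];

    void keyDown(unsigned char key)
    {
        pressed[key] = true;
        if (binding[key]) binding[key]();   // run the bound feature
    }
    void keyUp(unsigned char key) { pressed[key] = false; }
};
```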

The mouse was realised as a device with an x, y position and a sensitivity which became a scaling
factor for the total calculated mouse movement in each direction. Depending on the number of
buttons and mouse wheels, a similar system to the keyboard was set up. Depending on the button press
or wheel movement (up or down), the corresponding functionality is selected, and the button is
marked as pressed or released via a boolean variable.

As the functionality was abstracted from the device by predefining the features, all that was
required to add a new device was to formally describe it and then copy over the functionality. Using
the engine states for each separate set of controls provides a solid handle on the scale of the
application.

3.6 Windows
The design began as one window for all content, as content at the initial stages was very limited.
However, as content expanded and new engine states were realised, the window handler was improved.
For each engine state, i.e. Menu, Game, Screensaver, the handler loads that state's content into the
window; upon changing states, the appropriate garbage collection function is called to free memory
used by the current state before loading the next. The main purpose of the design is to keep the
minimum loaded at any one point. This was achieved by abstracting the concept of a window: a window
class was designed to encapsulate the properties of a window, which were mainly Windows-specific
code. Two classes, Main and PopUp, would then inherit from Window and describe the conditions for a
main window and a popup window; however, only the Main class was realised due to time constraints
(see Appendix: UML Class Design6).


3.7 Area Planning
Ideas 1+2 (see Section 1.1) had their first area completely pre-planned in the log book as a series
of sketches of level, object, character and storyboard designs, to give an idea of how the
environment would feel and play. The final idea was set in space, which meant anything goes. The
universe was split into star systems, and while each star system could be split into smaller
sections, it seemed limiting to describe a vast space in terms of cubes, granted that the universe
was designed that way. Instead, star systems would be populated completely at random upon every
visit, to give the universe its feel of infiniteness. In turn, the star system environment would
have a defined number of objects, all with positions and radii, so that objects can be checked upon
positioning for collisions with one another. This overcomes the limitations of pre-defining an area,
expresses to the user the vastness of space, and generates a pseudo-random environment.

3.8 Environment Objects
Environmental objects were designed for ideas 1+2 (see Section 1.1), with their purposes described in
the logbook. The objects for space were somewhat more obvious to design. Initially, obvious things
such as suns, planets, moons, stars and asteroids were designed, and then more advanced things were
sought. Dust clouds made of tiny colourful particles, forming patterns based on various mathematical
routines, e.g. fractals, were designed but never implemented; the functionality exists to implement
clouds, but more research into fractal and other interesting shape configurations was required.
UFOs were designed and implemented as objects which belong to a certain planet and go around
visiting or attacking others. The standard UFO cruises along the galaxy and stops for a moment at
any randomly chosen object in its vicinity to observe it for a short period, perhaps performing a
series of standard movements using the Object_Movement class it inherits from. If two UFOs of
disagreeing heritage come into contact, the more aggressive one may initiate a battle, in which case
it begins firing at the normal UFO, which then flies or teleports away, as it has more advanced
technology. A mistake was made in ambitiously creating this class at the start of the project, when
the framework was quite basic. As the project continued, more important aspects had to be
implemented, like the player class, and as a result the UFO class was left at a standstill.

3.9 Game Engine Structure
The engine, being the heart of the program, contains the WinMain function for initialising the
runtime environment. Based on a set of predefined states, the engine class begins by loading the
Menu window framework into a polymorphic Main window handler (see Sections 3.6 & 4.2.8). The device
context is loaded, and the device handler is always set up based on the engine state. The engine
keeps track of the current mode, whether the game is paused and whether the entire application is
paused. It is in charge of sending device callback inputs to their corresponding places and, upon a
mode change, it prepares and loads the corresponding window context.


The structure was chosen to best match the framework required by the project's aims. Each
interrelated application concept is abstracted, and calls to it in the engine are dealt with by
passing the call on, so that upon a mode change the minimum of work is done. The structure is
versatile: to add a state, it must first be defined, and corresponding instructions are then added
to each relevant function describing how to update the application to deal with the new state.

3.9.1 Physics Engine
As the engine had to cater for three different environments, the various generic environment
properties were recorded in the physics class, so that manipulating these properties allowed
different environments to be modelled. Fortunately, this meant that during the changing of ideas
(see Section 1.1) the physics engine was unaffected, thanks to well-planned abstraction.

Essentially the Physics class holds the physics properties of an item; Physics_Object then inherits
from this and specifies the more object-specific properties. The aim of the physics engine is to
provide a fun way of exploring space, so a certain degree of realism was achieved while keeping the
calculations for the various rules simple. For example, resistance to velocity is based on the
environment, object density and velocity: higher environmental resistance (0 = none, 10 = a lot)
causes significantly larger drag, which is greatly multiplied if the object’s density is high, and the
faster the object is moving the more resistance acts on it. Everything with physical properties
inherits from Physics_Object so that it can be described in a way that corresponds to the rest of the
environment and can make use of any extra features that define it, i.e. a planet rotating in a solar
system would use the physics engine so that its movement is consistent with the rest of the environment.
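The drag rule described above could look something like the following sketch. The exact formula and the 0.01 scaling constant are assumptions for illustration; the report only states that drag grows with environmental resistance, density and velocity.

```cpp
#include <cassert>

// Hypothetical drag rule: environmental resistance (0 = none, 10 = a lot)
// scaled by object density and current velocity reduces the velocity,
// and drag never reverses the direction of motion.
float applyDrag(float velocity, float envResistance, float density,
                float dt) {
    float drag = envResistance * 0.01f * density * velocity; // per second
    float v = velocity - drag * dt;
    return v > 0.0f ? v : 0.0f;
}
```

With zero resistance the velocity is unchanged; denser, faster objects lose speed more quickly, matching the behaviour the text describes.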

3.9.2 Particle Engine
The particle engine is built in much the same way and for the same reasons as the physics engine.
Essentially the Particle_Engine class like Physics describes the physical aspects involved in particle
manipulation i.e. position, speed, direction etc. Particle inherits these properties and describes the
properties of a particle i.e. colour, texture, size etc. Particle_Group again inherits and realises a
collection of particles as a group, these can then be assigned to a particular object i.e. the emitter (see
Appendix: UML Class Design3b and UML Diagram Design3b).

While the particle movement function behaves similarly to the physics engine, a fire algorithm was
devised to build on the effect of particle motion and colour behaviour: as the particles move away
from the source and run out of life, they fade out and their colour loses vibrancy. In addition, a
colour palette was devised, together with a set of functions utilising this palette to create various
predefined colour schemes. The purpose of these tools was to keep the environment exciting and
stylised, without too much emphasis on what should and shouldn’t be in space; instead they provide
the user with an imaginative interactive experience.


                                   CHAPTER 4

As ideas developed, properties common to existing objects required abstraction; the necessary
abstractions were made and further development of items was paused while this happened. This
process was iterated throughout the implementation to ensure the project was ready to handle the
next set of ideas.

The project had reached a size where it was not possible to handle everything in MMenu.cpp, so
objects were created and stored in a shared header file. It was hoped that this would evolve into
a separate object-rendering handler entity for drawing only what was visible on screen; however,
this never left the design phase.

4.1 Implementation Methodology
As OpenGL is coded in C, and a collection of well-abstracted interworking classes was desired, the
best choice of programming language was C++. C++ being an extension of C meant immediate
compatibility with any custom libraries that might be chosen, i.e. an “ArrayList” header file.
Additionally, C++ is a very efficient programming language, and OpenGL is a well-supported 3D
graphics library with a detailed specification available for general queries and referencing.

The implementation follows the design approach described in Chapter 3, with some changes made
on the fly that are discussed in this chapter. Consequently, the class groups are split into various
steps, which are discussed in further detail below (see Section 4.3). The common theme is building
the application in bite-sized pieces, then abstracting sets of classes to provide a clean way of
extending the content of each theme without affecting other classes in undesirable ways.

The project began with the requirement of working with the Windows-specific API and OpenGL,
and since neither was known to a satisfactory degree it was logical to start coding the basic
framework first. The initial idea was to create environmental objects for the universe concurrently
with learning OpenGL. This was a good approach to an extent, but without better planning the
project soon got out of hand. Abstraction was necessary to remove repeated variables shared
between classes and to provide a logical link between these objects so that they could easily be
manipulated as a group. After some planning, the initial ideas were revised and abstraction was done
in favour of object-oriented classes.

Eventually a point was reached where the current framework wasn't able to properly support the
objects used in the different states of the program. Initially, globally shared variables controlled
states and signalled new key bindings; these details were abstracted away and classes were created
to manage the device handler and engine state systems so as to accommodate every device and state.
In the end they worked seamlessly together as part of the engine.

Abstracted code is well suited to building large and complicated pieces of software. However, due
to hardware limitations and software performance, it was imperative that a program already using
the processor for complex drawing calculations had enough processing power left to run smoothly
through all of its routines.

4.2 The C++ Classes
The interactive 3D space environment implementation is divided over a number of interconnected
C++ classes that provide the necessary functionality. A list of these classes, together with a brief
description of their functionality, is given below.

4.2.1 Menu
The menu was built using background images produced in Adobe Photoshop, rendered onto a
quadrilateral using the OpenGL feature GL_TRIANGLE_STRIP (see Figure 2.9). A series of menu
effects was designed, each described in its own class; using an instance of each class to reference the
class functions, a drawing routine was described in the menu as a function activated when its
corresponding key is pressed. A small function was going to be written to create a pseudo-random
sequence of events using the effects. Additionally, the referencing variables were stored in the
Menu_Shared header file to provide access to these classes from anywhere in the application.

Figure 4.1
This function is found in the Main class; it uses the functions devised in the FX_Spot class to
describe the drawing routine for the effect.

/*
 * Drawing routine for Effect 1 - Spots drawn in random locations
 */
void Menu::effect1()
{
    if(resetSpots)    // If spot data needs to be reset
    {
        memset(spotsFX1, NULL, sizeof(spotsFX1)); // reset it
        resetSpots = false;   // don't do it again for this cycle
        // After a random delay
        fxSpot.initSpotsFX1();// Setup a new spot
    } // if
    // Draw Spots
} // Menu::effect1

The drawing routine in this class deals only with menu content; other modes have corresponding
routines. In addition, the menu has been treated as window content, meaning it has its own WinMain
to specify exclusively how to run this portion of the engine. Upon a mode change or ending the
application, WinMain, just before returning to the Main window class, calls a garbage collection
function to clear old content from memory.

While the rest of the menus didn’t yet exist, the functionality was provided to readily switch between
the rendering of each menu, i.e. main, options, save, load and exit.

4.2.2 Menu Effects
The menu has some active and passive effects to keep the screen interesting; the user can toggle
through some of these for their own amusement, while others remain active until the next engine
state is enabled.

Sliding and fading bar effects
Sliding and fading bar effects are passive effects which run on the main menu background. A fixed
number of fading bars appear at randomly chosen positions on the screen, either horizontal or
vertical, and remain there until they have faded out; they are then re-initialised and the process
repeats. Sliding bars start from one side of the screen and slide over to the other, either horizontally
or vertically, at varying speeds. These bars are initialised with some level of opacity so that the menu
background objects aren’t blocked. Additionally, the opacity and speed properties were tweaked to
ensure no items on screen are blocked from view and that the sliding sequence finishes in a timely
fashion.

Figure 4.2

Figure 4.3

Spot effects
The first spot effect (see Figure 4.1) sees spots appear at random positions on the screen, starting at
a size of 0, growing to their maximum size, then disappearing and being reinitialised for reuse. They
are a random colour and always transparent to some degree so that the background is visible. The
other effect is similar, except that shortly after the first spot appears, another spot appears a fixed
distance from the first; the position of each new spot is determined by an angle factor relative to the
previous one. This continues until a fixed number of spots appear in a line sequence; each, as in the
first effect, disappears upon reaching its maximum size. Only after the entire line of spots has
disappeared is it reinitialised, and a few of these sequences occur on screen at a time. It was
intended to have them travel in a sine pattern across the screen in various directions, but there was
never time to implement this. A third effect uses spot effects 1 and 2 simultaneously.

Figure 4.4

Figure 4.5

Star effects
The first effect sees an array of stars burst out from the centre of the screen towards the edges, each
according to its starting angle. When the stars reach the edge of the screen they are reinitialised for
reuse. In the middle of the screen there are two very small warp holes, which remain small to create
the illusion that the user is moving slowly. A secondary feature enables hyper speed: each star is
redrawn with an accelerated speed, and with a tail trail if it is moving beyond a certain speed. In
addition the warp holes become much larger, and their flickering creates the illusion that the user is
moving through space at great speed. Unfortunately the project never reached the stage where this
effect could be triggered by the user enabling the hyper-speed functionality.

The second effect has stars appearing from a random position close to the centre of the screen, as if
the user were flying through space and stars randomly came into focus; then, depending on the
directional button pressed, the stars all shift in the opposite direction. This was going to be used
while the player was in the first-person perspective of the craft.

Figure 4.6

Figure 4.7

4.2.3 Delay and Random
A helper class called Random was created, providing two basic random functions.

Table 4.2.3a
Function                             Description
static GLfloat randomNoNP()          Can be used in a static context to return a random
                                     number between -1 and 1.
static GLfloat randomNoP()           Can be used in a static context to return a random
                                     number between 0 and 1.
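The two helpers could plausibly be implemented as below. This is a sketch only: the use of std::rand and the GLfloat stand-in typedef are assumptions, as the original implementation is not shown.

```cpp
#include <cassert>
#include <cstdlib>

typedef float GLfloat; // stand-in so the sketch compiles without OpenGL headers

// Random number in [0, 1].
static GLfloat randomNoP() {
    return (GLfloat)std::rand() / (GLfloat)RAND_MAX;
}

// Random number in [-1, 1], built by rescaling randomNoP().
static GLfloat randomNoNP() {
    return 2.0f * randomNoP() - 1.0f;
}
```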

A delay feature set was incorporated to abstract various kinds of delays and counters required all
throughout the application.

Table 4.2.3b
Function                               Description
bool secDelay(unsigned int seconds);   Returns true after the given number of seconds
                                       is up.
bool msecDelay(unsigned int mseconds); Returns true after the given number of
                                       milliseconds is up.
long int secCounter();                 From the point of initialisation, returns the
                                       number of seconds elapsed, until it is reset.
long int msecCounter();                From the point of initialisation, returns the
                                       number of milliseconds elapsed, until it is reset.
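The delay feature set above could be sketched as follows. The real class presumably reads a system timer internally; here the current time in milliseconds is passed in as a parameter, which is an assumption made so the logic can be shown and tested in isolation.

```cpp
#include <cassert>

// Hypothetical Delay helper mirroring Table 4.2.3b.
class Delay {
public:
    explicit Delay(long startMs) : startMs(startMs) {}

    void reset(long nowMs) { startMs = nowMs; }

    // Elapsed time since initialisation or the last reset.
    long msecCounter(long nowMs) const { return nowMs - startMs; }
    long secCounter(long nowMs) const { return msecCounter(nowMs) / 1000; }

    // True once the given interval is up.
    bool msecDelay(long nowMs, long mseconds) const {
        return msecCounter(nowMs) >= mseconds;
    }
    bool secDelay(long nowMs, long seconds) const {
        return msecDelay(nowMs, seconds * 1000);
    }

private:
    long startMs;
};
```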

4.2.4 Menu Navigation (Screensaver mode)
It was intended that a function be written to load the solar system after the program was left idle for
some time; this was a half-completed endeavour. During the draw routine a function called
screensaver() is called to check for device feedback; if there is none for N seconds, the appropriate
files are loaded and the solar system starts. A basic random view generator was programmed to pick
a new view at random after a random amount of time. If the body view function is selected, a
random body from which to observe the sun is chosen. While this performs the function of a
screensaver, the screensaver was also going to utilise the free view, in which the camera would
teleport and cruise through random interesting sections of space, giving a real sense of the
application in an automatic context. This would have been achieved using the preset movement
functions defined in the Object_Movement class (see Section 4.2.14).

Figure 4.8

Figure 4.9

Above is a snippet from the screensaver drawing function demonstrating how a random view of the
space environment is selected.

A menu navigation function was created for the interface, allowing, upon a menu change, a piece of
scripted animation in which the view travels from the planet representing the current menu to the
planet representing the next. The biggest problem to take into account was planets being in
awkward positions while constantly moving, possibly at a rate faster than the camera, which could
mean the sequence never ends. This was overcome by allowing the camera to cruise for the parts of
the journey leaving the home planet and arriving at the destination, while the rest of the animation
takes place in a frozen time frame in which planetary movement isn’t active. It was never possible to
integrate this sequence, as the menu wasn’t finished; buttons were designed but never implemented
into the menu, so the user could not click on the next state as opposed to using a hotkey on the
keyboard.

Figure 4.10

4.2.5 Engine
The engine was designed to maintain the overall structure of the application; it keeps track of which
mode it is in and whether it is idle or paused. While the heart of the program resides here, the actual
operations reside in their own sections: WinMain() initialises the device and window handlers and
loads the main menu. Control is then passed to the main window and, upon exiting the application
and returning from main, garbage collection is performed for the Engine before it too ends. In
addition, a series of functions check for state changes, apply the appropriate window content
initialisation and update the device handler.

A function WindowProc() exists in which various callbacks have been abstracted, i.e. rendering the
window always occurs no matter the engine state. In addition, any key presses are registered here
and dispatched to the device handler.
Figure 4.11


Figure 4.12

engineWatcher() is called in the winMain() block of the Main class (see Section 4.2.8) so that when the current
content is finished with, the next state’s content is loaded.

4.2.6 Image Loader
Image loading occurs in almost every class, and at some point the created text and crosshair images
were going to be implemented, so functionality exists which isn’t fully used. Each instance of this
class stores the loaded textures and their filenames in a 2D array, along with the base address of the
display list for text textures, which is also useful for other sequenced images where clipping may be
needed to load the entire contents of an image. The arrays are 2D so that they can be accessed first
by image type, i.e. BMP or TGA, and then by image number; this design allows these single
variables to grow to any size and handle any number of image types, allowing easy future extension.
A series of functions exists for loading textures and fonts and for printing text (see Figure 4.15).

Figure 4.13

4.2.7 Handler
The handler was created much in the same way as described in the design (see Section 3.5).
“Handler_Misc.h” exists to hold a list of handler modes, i.e. keyboard only, mouse only, or both. It
also has a list of every action available for each device in each mode, i.e. MENU_AK_OPTIONS
represents, in menu mode (MENU), an action (A) for the keyboard (K) to load the options screen
(OPTIONS).
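The naming scheme and the per-key action table described in Table 4.2.7a could be sketched as below. The enumerator values and the bind/actionFor interface are assumptions for illustration; only the MENU_AK_OPTIONS name and the 256-entry keyAction array come from the report.

```cpp
#include <cassert>

// Hypothetical binding table following the <MODE>_A<DEVICE>_<ACTION>
// naming scheme described in the text.
enum Action {
    NO_ACTION = 0,
    MENU_AK_OPTIONS,   // menu mode, keyboard action: open options screen
    MENU_AK_EXIT,      // menu mode, keyboard action: exit (assumed name)
    GAME_AK_THRUST     // game mode, keyboard action: thrust (assumed name)
};

class Keyboard_Handle {
public:
    Keyboard_Handle() {
        for (int i = 0; i < 256; ++i) {
            keyAction[i] = NO_ACTION;
            keyPressed[i] = false;
        }
    }
    // Assign an action to a key for the current binding mode.
    void bind(unsigned char key, Action a) { keyAction[key] = a; }
    Action actionFor(unsigned char key) const { return keyAction[key]; }

private:
    Action keyAction[256]; // one assignable action per key
    bool keyPressed[256];  // current button status
};
```

Rebinding for a new mode then amounts to refilling keyAction, which is what the modeChange() function shown in Figure 4.14 appears to do.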
The handler class contains:

Table 4.2.7a
Property                  Description
int   handlerMode         Keeps track of device mode, only keyboard/mouse or both active
int   keyboardMode        Keeps track of the keyboard binding mode i.e. game/menu mode
int   mouseMode           Keeps track of the mouse binding mode i.e. game/menu mode
int   mouseMovement       Keeps track of the action mouse movement should perform i.e.
                          direct/indirect input to the application.
bool keyboardDefault      Is special case default keyboard mode on YES/NO?
bool mouseDefault         Is special case default mouse mode on YES/NO?
bool setHandleBinding     Does device binding need updating YES/NO?
bool setMenuModeBinding   Set the bindings for menu mode YES/NO? A function exists for
                          each corresponding mode.
Keyboard_Handle *keyboard An abstracted class defining the nature of a keyboard. Defined as an
                          array keyAction indexable by 256 different keys, where each one
                          can be assigned an action. Similarly keyPressed keeps track of the
                          button status.
Mouse_Handle *mouse       An abstracted class defining the nature of a mouse. This is defined
                          as a current and previous x,y position to keep track of the mouse
                          and see if it has moved from its previous state or remains idle.
                          Sensitivity is a scaling factor for x,y movement to make the mouse
                          move more or less; centre warp determines whether the cursor
                          should be centred or not. A series of variables then keeps track of
                          the number of buttons and scroll wheels, their corresponding
                          actions and their active status.

Essentially Mouse and Keyboard classes keep track of their bindings and callbacks, with handler
acting as an intermediary and manager to all devices.

Figure 4.14

This section of the modeChange() function shows how the handler bindings are kept up to date when the internal
engine state mode is updated.

4.2.8 Window

The window handler comprises three main parts, though there was no need or time to implement the
last. A Window class describes all the properties of a window; two children, the Main and PopUp
classes, inherit from it, the latter never having been written.

Table 4.2.8a
Property                          Description
WindowBox *windowProp             A structure defining the window properties.
bool isContentActive              Tracks if the window content is active.
bool fullscreen                   Tracks if the window is fullscreen or not.

For technical details of the window properties structure WindowBox, see Appendix: UML Class
Design6 – Window & Main classes. The class has a number of virtual functions, WinMain being the
one for which every main window application will have some kind of implementation routine; it in
turn calls DrawGL(), the drawing routine, by sending the WM_PAINT message to the
WindowProc() function. This works because DrawGL() is called through a polymorphic pointer.
The Reshape and regWindowClass functions also have their own unique implementations, as these
are also virtual. Additionally, the class includes tools for loading various window-specific properties
using the Windows API, i.e. making and closing windows, setting the resolution and toggling full
screen.

Figure 4.15

An extract from the WindowProc() function highlighting how rendering takes place from the main window content.

The Main class is similar; it encapsulates the concept of a window holding the main application. It
has a variable to keep track of whether the window is open so that, when it closes, the garbage
collection function can be called to free the memory currently in use. In addition, anything
inheriting from Main provides its own implementation of the virtual functions inherited from the
Window class.

4.2.9 Physics Engine
The physics engine consists of two main components: the Physics and Physics_Object classes (see
Section 3.9.1). The Physics class consists of variables defining the properties of something with
physical characteristics, i.e. xyz coordinate position, direction, velocity, resistance, acceleration,
force, momentum, kinetic energy and other variables for describing direction and position (see
Appendix: UML Class Design3b – Physics & Physics_Object). Physics is the groundwork of the
engine, while Physics_Object defines the properties of an object with physical attributes via radius,
density, mass, volume and the environmental resistance the object must deal with.

The engine itself runs from the movement function, which, if integrated, would be called
continuously. It keeps an ongoing counter sampling how long the object has been moving; when the
object stops moving this counter is reset. Currently it is not used to calculate anything but remains
there for the purpose of collision detection. Initially a check is done on the environmental
resistance; if this is not zero, a custom function calculates the resistance, which in turn is used to
reduce the velocity. In the resistance function, randomNoP() represents the slight variance in
resistance that is down to sheer randomness: just because the craft has a certain density doesn’t
mean things will always be entirely constant, they will just seem uniform to some degree. While this
has no immediate major effect, imagine being in the middle of a battle where manoeuvrability and
skill play major roles but sheer randomness also factors slightly into the equation.

Figure 4.16

Acceleration is sampled every second, and kinetic energy, force and momentum are calculated;
again, some of these serve no real purpose yet but provide a basis for the collision detection
function. Physics_Object inherits from Physics, and any object one wishes to make active in the
physics engine, i.e. any environmental object, simply inherits from Physics_Object and is initialised
appropriately in the object’s class.

4.2.10 Particle Engine
The particle engine trinity comprises the Particle_Engine, Particle and Particle_Group classes. The
Particle_Engine class is much like the Physics class: it describes the properties of the engine, such as
position, direction, velocity, acceleration, density, resistance, environmental resistance, lifetime and
fade. Particle simply describes a particle in terms of RGBA colour, size, various variables for effects
and loading images, and whether the particle should be drawn or not. By separating the properties
of the engine from the particle, it becomes clearer how each element can be expanded in more detail,
i.e. various functions for manipulating particle colour and size exist in Particle, while movement and
variants on particle behaviour are handled by the engine. Finally, Particle_Group is a collection of
particles and provides algorithms designed to describe the group behaviour of particles under
certain conditions, i.e. fire, smoke, clouds.

To create a particle-emitting object, the object simply inherits from Particle_Group, which inherits
from Particle, which inherits from Particle_Engine. The object then initialises and specifies how the
particles should behave in accordance with its purpose.

The engine’s movement function is largely a copy of the physics engine’s movement function,
simplified to deal with the specifics of particles. The Particle_Group class has functions to create
interesting particle colour transformations, such as particles with static or dynamically changing
rainbow colours, random colours or fire colours. In addition, a fire algorithm was written which
ensures fire sprays away from the direction of the emitter, alternating the angle of emission slightly
on each cycle by a random factor. The speed of the particle and its density also decrease in relation
to its life (energy); finally, depending on the particle type, a colour scheme is chosen. The flexibility
in choosing colour means objects in space aren’t limited to emitting fire: the colour can easily
change to smoke, or something more psychedelic. The same algorithm can in turn be used on
another group of particles on the same emitter for smoke emission; all that is required is to increase
the density of the smoke particles so they move more slowly and to choose the colour scheme.
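The life-dependent part of the fire algorithm could be sketched as follows. The particular update rule (multiplying speed and density by the remaining life fraction, and fading alpha with life) is an assumption consistent with the description, not the project's actual code.

```cpp
#include <cassert>

// Hypothetical fire-particle step: speed and density fall with the
// remaining life (energy), and the particle fades as life runs out.
struct FireParticle {
    float speed, density, life, alpha; // life and alpha in [0, 1]
};

void fireStep(FireParticle &p, float decay) {
    p.life -= decay;
    if (p.life < 0.0f) p.life = 0.0f;
    p.speed   *= p.life;   // slower as it dies
    p.density *= p.life;
    p.alpha    = p.life;   // fade out with remaining life
}
```

Swapping the colour scheme while keeping this update is what lets the same routine serve fire, smoke, or more psychedelic emissions, as the text notes.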

4.2.11 Math components

A few classes comprise the math components of the application; each will be discussed in turn,
along with any corresponding relationships.

The Vertex class defines a vertex and the possible states it can be in, via an xyz coordinate position
and a status definable by a custom enumerable type VertexStatus as INVALID or DEFAULT.
Function-wise, it overloads a few math.h operators and provides an optimised toolset for finding the
magnitude and the squared magnitude. In addition, functions to convert a vertex to a valid Vector or
Quaternion exist for convenience. (see Appendix: UML Class Design2 – Vertex)

Vector is extremely similar to Vertex; it too has xyz coordinate positions, with a status definable as
INVALID, DEFAULT or UNIT. Operator overloading supports an optimised toolset for finding the
magnitude, dot and cross products, the distance between two vectors, the unit vector, and a random
normal for a vector. Additionally, for convenience, there are functions for conversion to a Vertex
and a Quaternion. (see Appendix: UML Class Design2 – Vector)
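The core of the Vector toolset amounts to the standard formulas; a minimal sketch (struct layout and function names assumed) might look like this. Keeping magnitudeSquared separate avoids the sqrt when only comparisons are needed, which is likely the optimisation the text alludes to.

```cpp
#include <cassert>
#include <cmath>

// Minimal stand-in for the project's Vector class (status field omitted).
struct Vector {
    float x, y, z;
};

float dot(const Vector &a, const Vector &b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

Vector cross(const Vector &a, const Vector &b) {
    Vector c = { a.y * b.z - a.z * b.y,
                 a.z * b.x - a.x * b.z,
                 a.x * b.y - a.y * b.x };
    return c;
}

// Squared magnitude avoids the sqrt when only comparing lengths.
float magnitudeSquared(const Vector &a) { return dot(a, a); }
float magnitude(const Vector &a) { return std::sqrt(magnitudeSquared(a)); }
```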

Quaternion is defined by xyz and w coordinate positions and has the same statuses as defined for
Vector. The w coordinate is used to manipulate xyz under certain conditions, a bit like the 4th
element in a 4x4 matrix. It too uses operator overloading to optimise the function toolset, which
finds the quaternion length, makes a quaternion from an angle and another quaternion or vector
(essentially only an xyz is required from either; a vertex implementation was not provided as the
concept requires a notion of direction that vertices do not have), and performs quaternion cross
multiplication. (see Appendix: UML Class Design2 – Quaternion)
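The "quaternion from an angle and a vector" construction is presumably the usual axis-angle form, where w = cos(a/2) and xyz = axis·sin(a/2); a sketch follows, with the function names assumed rather than taken from the project.

```cpp
#include <cassert>
#include <cmath>

// Minimal stand-in for the project's Quaternion (status field omitted).
struct Quaternion { float x, y, z, w; };

// Axis-angle construction: w carries cos(angle/2), xyz the scaled axis.
Quaternion fromAxisAngle(float ax, float ay, float az, float angleRad) {
    float s = std::sin(angleRad * 0.5f);
    Quaternion q = { ax * s, ay * s, az * s, std::cos(angleRad * 0.5f) };
    return q;
}

float length(const Quaternion &q) {
    return std::sqrt(q.x * q.x + q.y * q.y + q.z * q.z + q.w * q.w);
}
```

Note that a unit axis always yields a unit quaternion, which is why only an xyz with a notion of direction (a Vector, not a Vertex) is usable here, as the text observes.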

A matrix has been defined as either 3x3 or 4x4, with a 2D array representation and possible statuses
INVALID and DEFAULT. Overloaded operators are used much as before, and the toolset of
functions is only half developed, with functions calculating the determinant, the transpose and the
inverse of a matrix. These functions aren’t used anywhere in the program; they were created in the
hope that they could later be used by a core rendering engine dealing in matrix form only, although
no planning has been done for this for feasibility reasons. (see Appendix: UML Class
Design2 – Matrix)

A triangle is any three vertices plus a status defined as INVALID, CLOCKWISE or
ANTICLOCKWISE. In addition, two array variables hold the vertex ordering for the clockwise and
anticlockwise orientations. A triangle can be created from either three sets of xyz coordinates or
three vertices. A function was to be designed which returned the status of the triangle in relation to
the camera. This would be done by creating a triangle with its front facing the programmer, then
marking the ordering of the vertices for this anticlockwise orientation in the appropriate variable;
the triangle would then be placed somewhere in the scene. Upon calling this function, the first two
vertices in the anticlockwise orientation variable are taken and a ray is formed; then, depending on
which side the 3rd point lies in relation to the camera, the orientation is known. This was going to be
used so that polygons facing away from the user would not be rendered, but it was never realised.
(see Appendix: UML Class Design6 – Triangle)
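In screen space, the "which side of the ray does the third point lie on" test reduces to the sign of a 2D cross product; a sketch follows (the function name and the positive-means-anticlockwise convention are assumptions).

```cpp
#include <cassert>

// Orientation of triangle (a, b, c) in 2D screen space: form the ray
// a->b and check which side c lies on via the z-component of the cross
// product. Positive = anticlockwise, negative = clockwise, 0 = collinear.
float orientation2D(float ax, float ay, float bx, float by,
                    float cx, float cy) {
    return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax);
}
```

A renderer could then skip triangles whose projected orientation is clockwise, i.e. those facing away from the viewer, which is the back-face culling use the text describes.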

A polygon is defined as either a triangle or any set of more than three vertices, and has a status of
INVALID, TRIANGLE, PLANAR, NON_PLANAR or UNKNOWN. Invalid indicates the polygon object is
not set up at all; triangle, planar and non-planar indicate what is known about the polygon; and
unknown refers to a polygon that has been created but not yet classified as any of the previous three
states. A function to calculate the status was not created because this class was never fully
developed; however, it would simply have involved finding the equation of the plane from three vertices
and then checking whether the remaining vertices all lie on that plane. This could also have been done with the
help of the Plane class for objects inheriting from polygon, i.e. Plane objects. This class is the first real step
in attempting to realise a rendering engine, as any non-planar object could be split into a set of
planar ones; however, this was never attempted. (see Appendix: UML Class Design6 – Polygon)
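The status calculation sketched in the text (fit a plane through the first three vertices, then test the remaining ones against it) was never written, but a minimal version might look like the following. Names, the enum values aside, are hypothetical, and the tolerance is an assumed choice.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct V3 { double x, y, z; };

static V3 sub(V3 a, V3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static V3 cross(V3 a, V3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static double dot(V3 a, V3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

enum PolyStatus { INVALID, TRIANGLE, PLANAR, NON_PLANAR };

// Fit a plane (n . p + d = 0) through the first three vertices, then check
// that every remaining vertex satisfies the equation within a tolerance.
PolyStatus classifyPolygon(const std::vector<V3>& v) {
    if (v.size() < 3) return INVALID;
    if (v.size() == 3) return TRIANGLE;
    V3 n = cross(sub(v[1], v[0]), sub(v[2], v[0]));
    double d = -dot(n, v[0]);
    for (std::size_t i = 3; i < v.size(); ++i)
        if (std::fabs(dot(n, v[i]) + d) > 1e-9) return NON_PLANAR;
    return PLANAR;
}
```

A polygon reported NON_PLANAR could then be handed to a splitting routine that carves it into planar pieces, as the text suggests.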

A plane is a polygon; it has a normal, an equation term and a status defined as NEITHER,
EQUATION, NORMAL or EQN_AND_NORM. Neither means we know nothing of the plane;
equation and normal imply we know those things respectively; and the last means we know both of the
previous two things about the plane. There are functions for working out the equation of the plane
using textbook methods and also for calculating the normal of the plane. There is also a function
which, given a point, calculates its relationship to the plane. This is another one of the functions
meant for the rendering engine, which unfortunately could not be realised due to the hold-ups in
developing various sections of the framework. (see Appendix: UML Class Design6 – Plane)
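The "textbook methods" referred to above amount to taking the normal from the cross product of two edges and substituting a known point to obtain the equation term; the point-relationship function then reduces to a sign test. A minimal sketch under those assumptions (names hypothetical):

```cpp
#include <cassert>

struct P3 { double x, y, z; };

static P3 sub(P3 a, P3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static P3 cross(P3 a, P3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static double dot(P3 a, P3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Plane { P3 normal; double d; };   // plane equation: n . p + d = 0

// Normal from the cross product of two edges; d from substituting point a.
Plane planeFromPoints(P3 a, P3 b, P3 c) {
    P3 n = cross(sub(b, a), sub(c, a));
    return { n, -dot(n, a) };
}

enum Side { IN_FRONT, ON_PLANE, BEHIND };

// Relationship of a point to the plane: the sign of n . p + d tells us
// which half-space the point lies in.
Side sideOfPlane(const Plane& pl, P3 p, double eps = 1e-9) {
    double s = dot(pl.normal, p) + pl.d;
    if (s > eps) return IN_FRONT;
    if (s < -eps) return BEHIND;
    return ON_PLANE;
}
```

In a rendering engine this side test is the basic building block for clipping and for splitting non-planar polygons against planes.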

Interactive 3D Space Environment                                                                      125
Appendix                                                                                   Author: Chris Georgeiou

4.2.12 Environment Objects
The environment was built in line with the design specification. Originally there was just one
solar system; however, this was expanded. At the top level exists a universe, which can contain from one
to NxNxN star systems, where N has been capped at three. This was going to be modified so that
only one star system would exist at a time: if the user left that system, a new star system would be
created dynamically and the old one deleted as soon as the user had completely entered the new
one. Although this means the user can never visit the same place twice, it provides for a more
interesting experience and encourages each environment to be fully explored before delving into the next.

Figure 4.17

This version of the universe constructor accepts a position, a radius, the number of star systems that the universe
comprises, and a width and height for the window resolution. This is an example of how a single call to universe can call all
the sub-parts of a universe, leading to the production of the full environment.

Each star system can consist of a set of properties defining its component parts, which are
set up using various defined states. Once the properties of the system are defined, the star system is
generated by making subsequent calls to the relevant classes. Additionally there is a function to
ensure each object placed in the star system never collides with another object.

Figure 4.18

A star system type can be: empty, containing nothing; stars only; cloud formations only; planets only; normal, a
mixed and well-balanced star system; bonus, implying it will be comprised of a psychedelic theme and
contain bonus items for the player to collect; rare, similar to bonus except that it will contain more rare bonus items,
such as the possibility of a ship expansion; or random, which creates a nonsensical environment.
A star system population can be very low, low, medium, high or very high, describing how crowded the environment
will be.
A star system colour theme can be one of the 4 elements and affects the textures chosen for planets, the colours of
suns, clouds etc.

Unfortunately this model was never fully implemented and only the solar system exists. This is
populated with part of our solar system: a sun with 4 planets representing Mercury, Venus, Earth and
Mars, and 3 moons: Earth's moon and Mars's moons Phobos and Deimos. Additionally, stars in
space are mapped onto quadrilaterals, and upon a view change the appropriate rotations are made to turn
the image plane so that the stars always face the viewer; the technical name for this is a billboard. Some of the
stars twinkle; this was achieved by using custom textures (see Figure 4.19) and alternating between
them to produce varying areas of dark and light.
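The twinkle logic itself is shown in Figure 4.20; the alternation it relies on can be sketched as a time-driven texture index with a per-star phase offset so that the star field does not pulse in unison. This is an illustrative sketch only, with hypothetical names and parameters rather than the project's actual code.

```cpp
#include <cassert>

// Pick which twinkle texture a star should display at a given time.
// 'switchInterval' is how long each texture is held (seconds); 'starPhase'
// is a per-star offset so neighbouring stars twinkle out of step.
int twinkleTexture(double timeSeconds, double switchInterval,
                   int textureCount, int starPhase) {
    int step = static_cast<int>(timeSeconds / switchInterval) + starPhase;
    return step % textureCount;
}
```

Each frame, the star's quad would be drawn with the texture returned for the current time, producing the alternating dark and light areas described above.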

Figure 4.19

                                                                      An example of how textures
                                                                      are selected for the twinkle
                                                                      effect and how their visual
                                                                      appearance varies.

Other stars can be seen shooting across space, while others still explode into chunks of equal size.
This effect requires some tweaking: most of the stars, although large in size, are far from the
viewer and thus appear in proportion, so exploding stars may seem to just get bigger until a slight gap
can be seen between the exploded pieces before they all disappear.

Unfortunately much more could not be implemented for the environment due to the time spent
on the framework. However, the functionality now exists to create such effects.
Figure 4.20


This is the logic section of how a star twinkle works.

4.2.13 Object View
The object view class is a series of properties describing the camera, plus various functions that
require knowledge of these properties for certain special rotations, e.g. the billboard effect. The class
simply provides an abstraction for the camera details; any object that wishes to make use of it needs
to inherit from it and then initialise the camera properties accordingly, e.g. to view a particular object,
the object's position is stored in the variable describing the camera's focus. Each camera parameter is
represented as a quaternion so that the same rotations can be applied to the player as to the camera, i.e. when a
player uses roll, pitch and yaw movements.
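Representing rotations as quaternions, as the camera parameters here are, makes it cheap to apply the same roll, pitch or yaw to both the player and the camera. A minimal sketch of the underlying operation (rotating a vector by a unit quaternion) follows; the types and function names are hypothetical, not the project's.

```cpp
#include <cassert>
#include <cmath>

struct Quat { double w, x, y, z; };
struct Vec  { double x, y, z; };

// Hamilton product of two quaternions.
Quat quatMul(Quat a, Quat b) {
    return { a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
             a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
             a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
             a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w };
}

// Build a unit quaternion from a rotation axis (assumed unit length) and angle.
Quat fromAxisAngle(Vec axis, double angle) {
    double s = std::sin(angle / 2.0);
    return { std::cos(angle / 2.0), axis.x * s, axis.y * s, axis.z * s };
}

// Rotate v by unit quaternion q: q * (0, v) * conjugate(q).
Vec rotate(Quat q, Vec v) {
    Quat p    = { 0.0, v.x, v.y, v.z };
    Quat conj = { q.w, -q.x, -q.y, -q.z };
    Quat r    = quatMul(quatMul(q, p), conj);
    return { r.x, r.y, r.z };
}
```

Composing two such quaternions (e.g. a pitch followed by a yaw) is a single quatMul, which is what makes this representation convenient for chaining player and camera rotations.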

4.2.14 Object Movement
Object movement has the same properties as a physics object, so it inherits from it in order
to describe various fixed rules of motion. A function move() was going to be implemented which takes
a parameter referring to a certain direction (up, down, left, right, forwards or backwards); it would then
be used by other functions with pre-defined motions to create more advanced movement patterns for
the object.

Object movement has no real properties of its own; instead it uses variables to describe various laws
of motion in 3D space, such as the longitude and latitude angles used to help describe the direction of
the object. It mainly takes note of a position, defines a plane and a normal to that plane of movement,
and has longitude and latitude angles and a radius. For static objects, a basicMovement() function exists
that defines movement inside a sphere. Increasing the radius moves the object further from the centre; either the
longitude or the latitude angle can be inverted, which could be used to create a swivelling motion,
or both can be inverted to completely reverse direction. Development of this class is still quite basic, as
time was reserved for more key components.
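The longitude/latitude/radius description above maps directly onto a standard spherical-coordinate conversion. The following sketch shows how basicMovement() might compute a position on the sphere; the function and parameter names are hypothetical, and the axis convention (latitude measured from the equator, y up) is an assumption.

```cpp
#include <cassert>
#include <cmath>

struct Pos { double x, y, z; };

// Position on a sphere around 'centre': longitude sweeps the equatorial
// plane, latitude moves between the poles, and increasing the radius moves
// the object further from the centre.
Pos sphericalPosition(Pos centre, double radius,
                      double longitude, double latitude) {
    return { centre.x + radius * std::cos(latitude) * std::cos(longitude),
             centre.y + radius * std::sin(latitude),
             centre.z + radius * std::cos(latitude) * std::sin(longitude) };
}
```

Inverting the sign of the longitude or latitude increment each frame then produces the swivelling or direction-reversal behaviour the class describes.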

4.3 The bigger picture
This section speaks briefly about how things work as a whole. Upon initialisation of the application, the
function WinMain() in the engine class is called; this sets up the engine state, the device handler and
the main window. Control is then passed to the main window, which continues setting up the window
properties to reflect the menu mode. Once the menu content is loaded, control is passed to the menu,
where the user can now manipulate internal states.

The drawing routine in the menu makes a quick pass through the screensaver routine, which checks whether the
application has been left inactive long enough to enable the screensaver; then the function
drawMenuMode() is called. In this function the mode is passed to a function which checks what the
current menu is and draws it. The remainder of the function enables the menu effects where toggled.
Upon leaving the menu to go to screensaver mode, whereby the universe becomes visible, the content
of the main window is emptied, the resources of main are freed, and the engine class runs a function to
detect that the mode has changed and then initialises the window properties once again.

The content for the solar system is now loaded and the corresponding initialisation functions are
called. The control bindings now reflect what the user can do in space mode. Changing the space
view sends a call to the setView() function, which uses switch statements to enable the corresponding
view. Once bored with the space view, the user can then run the main application.

The same occurs as before, except this time the player class is loaded in conjunction with space. The
view begins in first person perspective and the user takes control of the ship. Unfortunately third
person view doesn't work as well as first person mode, and various calls to player movements don't
respond well due to camera issues. The cause of these problems is still unknown, as time was tied up in
fixing the framework, which was holding back the integration of some functionality designed for the
craft. Additional functionality has been implemented for the player, e.g. the ability to teleport and
travel at hyper speed, but it has not been integrated because simpler, more core issues could not
be resolved.

When the application exits, the calls to WinMain() exit their loop and a garbage collector is called to
free up memory from each section of the program.

4.4 Implementation Issues

This next section highlights, in more detail, issues such as updating and the evolution of ideas that occurred during
implementation.

4.4.1 Updating Structures
Apart from the initial menu effects, everything has had its structure updated. Initially this
involved the evolution of ideas, but later, better planning of the data structures meant updating became a
necessity. As some of the classes had to be built later, once a general structure for
the program had been set up, these classes also needed some updating. The worst cases came about as
the project grew to a size where many classes became interlinked and
dependent on each other, as this created more opportunity for abstraction. Common concepts
occurring between classes were studied and a plan formed in the logbook. A class was then created
and much time was spent abstracting from other classes, which involved replacing interlinked details
with the ones the new class provided. While in most cases this was simply a tedious affair, some classes
were interlinked in such a way that changing them would involve practically rewriting the various
functions and how they behaved. Though the logic was rarely changed drastically, there were cases
where, in order to apply a new idea, much had to be restructured, as in the case of designing space as
a dynamic universe.

4.4.2 Real Time Expansion
Though the entire application was designed in an iterative process, this section will discuss a major
step added at the end of implementation and explain how and why it was chosen.

At the end of stage 6 of the project (see Section 3.1 - UML Diagram Design5) the application was in
a near complete state. The player could perform some degree of interaction with the environment
and the framework was stable, if incomplete in some areas. The choices for expansion lay in
developing the view class to provide a camera for every object and a means by which to examine the
environment. The object movement class was the next obvious choice, allowing more interesting
things to be done with objects in space; however, the data structures representing space were too messy
for either of these to be implemented without some kind of abstraction and
reorganisation. It was therefore chosen to upgrade the way space was represented by abstracting space
and introducing the universe class (see Section 3.3). Essentially a universe is comprised of various
environmental objects and is itself one. This allows space to be defined as multiple universes at
some point. The universe has a series of star systems which vary in content but generally contain
suns, planets, moons, stars, clouds and a unique perspective of a background (to be represented as a star
map). While these singular entities comprise a star system, they also to some extent comprise a solar
system; thus solar systems were defined as a separate entity which could be initialised and
included as an object in the star system.

The biggest issue here was time, and much of it was spent in the design stage planning the intricacies
of creating a pseudo-randomly generated space environment. This endeavour was a bit too
ambitious given the time remaining and left the environmental structure of the application messy.
Some functions remain to be written, while others simply remain to be integrated. The reason this option was taken,
and was thought to be feasible, was that a lot of the functionality already existed; it was the logic
that handled the various environmental objects, and the calls to the aforementioned functions, that required
rewriting.

The safety mechanisms were the only features worked on consistently throughout the entire flow of
the project. The application was designed such that all pieces of code were interconnected indirectly,
and direct connections occurred only between directly relevant classes. This meant that somewhere
along the line everything eventually joined to the heart of the code. The general safety mechanism
designed to protect against runtime errors was to terminate the application flow and roll back
operations, freeing memory where possible before quitting. This was achieved by returning a
message to the calling function informing it that it had failed its task. Depending on the severity of
the failed task, this message was propagated back to the main flow of the application by signalling via
various state flags. This could cause anything from exiting and producing various error messages to
simply not responding to a particular function. This in turn helps isolate the location of problems
while debugging, depending on how the program reacts to the error. Safety mechanisms were
originally designed for the debug mode in order to give the programmer feedback at various major
data flow points in the application.
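The propagation scheme described above (each layer freeing its own resources and passing a severity code upward) can be sketched in miniature as follows. All names here are hypothetical; the real application signals via state flags rather than these toy functions.

```cpp
#include <cassert>

// Severity codes propagated back towards the main flow of the application.
enum Status { STATUS_OK, STATUS_RECOVERABLE, STATUS_FATAL };

// Lowest layer: report failure upward rather than aborting in place.
Status loadTexture(bool fileOnDisk) {
    return fileOnDisk ? STATUS_OK : STATUS_FATAL;
}

// Middle layer: on failure, roll back whatever was partially set up,
// then pass the same severity code on to the caller.
Status initEffect(bool textureAvailable) {
    Status s = loadTexture(textureAvailable);
    if (s != STATUS_OK) {
        // free any partially initialised effect resources here
        return s;   // propagate the failure towards the main flow
    }
    return STATUS_OK;
}
```

Because each layer reacts to the code it receives, the point at which the program stops responding (or which error message it prints) localises the fault, which is exactly the debugging benefit described in the text.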


                                     CHAPTER 5

   Testing and Results

5.1 Testing Methodology
While it was possible for various aspects to be implemented and tested in parallel, this was not the
case for other aspects, and as such testing took a number of stages. Depending on the scale of each
section, testing was adapted to be time efficient and to bring the most errors to light. This was
achieved by breaking testing into two categories: first ensuring the logic of the code was correct, and
secondly checking that the object's visual appearance was consistent with the rest of the
environment. These were done, respectively, by observing the validity of the object's behaviour given user input
where possible, and through simple observation.

Evidently, larger sections comprising smaller components had their framework created and tested
first. Each of the components was added in turn, tested and tweaked; if any such component was too
large, this process was repeated recursively until only individual components were dealt with. An
exception lies where individual objects compose to provide a new feature. As composition of the
program was incremental (see Section 3.2), testing was functionality based until the code had the
main framework implemented. Here it was important to study the collection of classes and ensure
properties between linked classes were not repeated and accurately described the structure.

After having identified such collated objects it was possible, upon ensuring that they would
work together via careful implementation planning, to test them individually where plausible and
finally to ensure they work together correctly.

For example:
An asteroid is an environmental object: we want to ensure it is drawn correctly, and that it behaves
according to its pseudo-randomly generated, pre-defined behavioural pattern. These are tests done on
an individual scale.

However, depending on the type of asteroid, e.g. a flaming one, testing inherited functionality such as the
particle engine is difficult. It is important to first ensure the logic is correct and the asteroid behaves
correctly by specifying and observing the result on screen. If the particles are not emitted as we
expect from simple observation, the issue can be isolated more easily by creating example scenarios
and then figuring out which attributes and functions are the cause of the error.

With almost all objects some degree of tweaking was necessary, which involved simply playing with
values once things worked correctly to ensure the run-time visuals or behaviour looked and moved
naturally, or at least to some satisfactory degree.


5.2 Testing and Tweaking
Here we describe the kinds of testing and tweaking carried out in the context of our sub-sections
and give examples which show their relevance. General cases will be described and illustrated here,
so that briefer descriptions can be given in the sub-sections and we can focus on
more specific things.

From a top down approach, in terms of how the user would interact with the software, the first thing
available for scrutiny is the menu. The application runs through the engine setting up the window,
the engine states and the handler, which is difficult to prove actually occurs. Since the
success of the program relies on the framework, it was important to devise a scheme able to prove that the
engine works and sets up the appropriate devices. Logically it can be assumed that the window has
been loaded, as the window does indeed load and rendering of the main menu takes place. This
tells us the window handler acknowledges the internal state of initialising in menu mode, so we
know the state has been initialised correctly.

Figure 5.1

The fact that rendering occurs shows we are in the WinMain() of the menu context, because the
WM_PAINT message must be sent to WindowProc() in Engine in order for the generic DrawGL()
function to be called. The internal message handling system works, which means any callbacks will
be received, and if the handler doesn't respond then the error lies in the device handler devised and
not the window framework. All internal workings seem to have gone well; next, testing was done on
the menu effects designed.
In addition we know, as the menu has loaded without returning any error messages, that all the

classes required to provide functionality for menu mode have loaded successfully, as they were called
from the menu's initialisation function. This implies the image loading framework was successful, as it
has loaded .bmp and .tga extension files. Loading the effects classes implies initialising class
attributes, drawing routine effects and loading textures. This was tested by loading various aspects
of the program without the necessary files, to exercise the various error messages.

Before effects are explicitly enabled, the sliding and fading bars have loaded and seem to be working.
Visually they don't hide or obscure the menu beyond recognition; there aren't too many bars
cluttering the screen and they are visually enticing, subjectively speaking. The bars behave
according to their specification and everything looks natural. The fact that the sliding bars work at
all also implies that the internal workings of the menu substructure work; thus the logic
triggering events is correct and any problems will be in their related functions, not the framework.

Figure 5.2

Figure 5.2 shows that the passive effect works: after 30 seconds a sphere object, texture wrapped with
the menu texture, appears from the centre and grows until it fills the screen, covering any effects
that appear on screen. This animation routine takes at most 5 seconds and doesn't occur often
enough to obscure the menu from the user.

Figure 5.3


This is the first spot effect tested (see Figure 5.3); it shows that the corresponding
effect class has initialised correctly, loading its textures and setting up the effect. The logic for the
drawing routine is correct: randomly coloured spots with varying levels of alpha can be seen
growing in random places to different maximum sizes. Visually this passes all the tests; it keeps the
scene fun without taking focus from the screen. The same can be said for the second spot effect (see
Figure 5.4); spots behave as described in the implementation.
Further study of the two images shows that the randomness of the effect can sometimes work to its
downfall. Though it looks good most of the time, on occasion the effect can overlay with other
effects, and the build-up of alpha due to blending causes certain aspects of the screen not to be visible
for a short amount of time. This effect could be sped up a little in order to give an interesting
but sketchy feel to the menu.

Figure 5.4

The next test was on the first star effect, whose details and implementation were described earlier.
Figure 5.5 clearly shows the effect fulfilling its specification: both the stars and the warp
hole load, and the warp hole remains confined to the centre of the screen. Hyper speed was enabled
to check the second part of this function, and it worked successfully. The star object array was reset
and re-initialised with faster properties, which is evident as more of the stars have tails appearing; in
addition the warp hole is no longer confined. So far the device handler has responded to every

possible command issued, in both lower and upper case; in addition, all other keys were tested to
ensure they did not respond. This proves the device handler works, as devices are detected and only
the keys bound according to the engine mode respond.

Figure 5.5

The next effect was a stream of stars which appear in a random location in a confined space
within the screen centre and disperse to the edge of the screen (see Section 4.2.3 & Figure 5.6). The
secondary feature of this effect causes the stars to move in the same direction as indicated by the
user, simulating stars moving away from the viewer as he cruises around space. The directions tested
were down and left; Figure 5.7 shows this in the left and right images respectively. There is
more of a void of stars in the section of the screen opposite to the direction indicated,
i.e. in the left picture the down key was pressed and the top part of the screen has fewer stars.

Figure 5.6

Figure 5.7


The final test involved three parts, for the time it takes the screensaver to appear: first pressing a key on
the keyboard, then a button on the mouse, and finally just moving the mouse. While these actions
occurred the screensaver didn't load, and when left alone for the designated time the screensaver
loaded successfully (see Figure 5.8). This proves only that the interlinking structure of the universe
system works, as only a portion of its full potential is on display here, demonstrating a predefined system
with a random assortment of stars each time. More will be tested in the next section. Planets,
moons and suns are loaded as part of a solar system structure, and each body has a glow effect of its
corresponding base colour, i.e. earth -> light blue, sun -> white; even the stars twinkle. The
screensaver was allowed to run for an extensive period of time to test the behaviour of the algorithm
and allow it to run through each space view (see Figure 5.8).
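The idle-time condition exercised by this three-part test reduces to a simple comparison against the last recorded input time. A sketch of that check, with entirely hypothetical names (the application's actual screensaver routine is not shown in the text):

```cpp
#include <cassert>

// The screensaver is due only when no keyboard or mouse input has arrived
// for the designated idle threshold; any input resets lastInputSeconds.
bool screensaverDue(double nowSeconds, double lastInputSeconds,
                    double idleThresholdSeconds) {
    return (nowSeconds - lastInputSeconds) >= idleThresholdSeconds;
}
```

Pressing a key, clicking, or moving the mouse would update lastInputSeconds, which is why none of the three actions in the test allowed the screensaver to load.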

Figure 5.8


The menu loads as soon as the application is interrupted with any input from one of the peripheral
devices. Though switching to screensaver mode proves the engine state evolves, it does not imply the
main window content has been switched yet. Currently both the menu and the space system have their data
loaded, but when universe mode is activated, mainly for curiosity and for debugging the
environment, the menu content is freed. Below are some of the error messages that occur if the
content is not loaded properly into the main window by the window manager.


5.2.1 Environmental Objects
In terms of environmental objects, the screensaver manages to reflect part of the scenario in space.
Here the remainder of the successfully coded and integrated objects is tested, reflecting on the visual
appearance and the behaviour where possible.

Visual & Behavioural Result Tweaking
Fortunately no visual tweaking was required, because during implementation each section was
implemented, integrated, tested and tweaked on the fly, thanks to the iterative work model chosen to
tackle the project. This ongoing process of building, evaluating and testing meant the best

visual result given available resources was achieved.

Ideally, objects could have been modelled using 3D rendering software, or textures could have been
sharpened with better mapping techniques; however, this would have required a custom rendering engine. As can
be seen from the bottom left image in Figure 5.8, even with OpenGL's automatic texture mapping and
depth-buffering techniques there is some undesirable detail coming through the sun: some of the
stars and planets are momentarily visible through the sun due to rendering artefacts.

Below, in Figure 5.10, is again a very basic build of the program; the structure was simplified to
accommodate asteroids. The asteroid was positioned randomly in an enclosed vicinity near the
sun, then the solar system view was used to get the angle for the object. Immediately the crudeness of
the randomly generating asteroid algorithm is noticeable (the functions GBAAsteroids() and
createAsteroid() can be referred to in the Asteroid_Group.cpp file included in the source code). This
particular asteroid was created with only 3 layers so that the behaviour of the algorithm could be
observed. In essence, each vertex in every layer joins to the nearest vertex in the layer above, or
below for the top layer. Additionally, the drawing routine also succeeds in producing an asteroid with
only convex surfaces. In the image to the right, a more complex asteroid was created to show the
diversity of the algorithm. While not all rule-based generated asteroids look fantastic, they always
look like asteroids, and the line implementation means fields of asteroids can exist with very little
stress on the CPU and graphics card (see Figure 5.11).
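The joining rule described above (each vertex connects to the nearest vertex in the layer above) boils down to a nearest-neighbour search between two rings of vertices. A minimal sketch, with hypothetical names rather than the actual GBAAsteroids()/createAsteroid() internals:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

struct Vtx { double x, y, z; };

// Squared distance is enough for nearest-neighbour comparisons.
static double dist2(Vtx a, Vtx b) {
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// For a vertex in one layer, find the index of the nearest vertex in the
// layer above; an edge is then drawn between the two when building the mesh.
std::size_t nearestIndex(Vtx v, const std::vector<Vtx>& layerAbove) {
    std::size_t best = 0;
    for (std::size_t i = 1; i < layerAbove.size(); ++i)
        if (dist2(v, layerAbove[i]) < dist2(v, layerAbove[best])) best = i;
    return best;
}
```

Running this for every vertex in every layer yields the stitched, always-asteroid-like shapes described, at a cost linear in the product of adjacent layer sizes.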

Figure 5.10

While no behaviour has been built into the asteroids yet, they are still particle emitting objects. The
particle count was set to 100 particles per asteroid and tested (see results in Figure 5.12). This was
prior to any colouring being applied to the particles. The particles were emitted in random directions,
as the asteroid wasn't moving, proving the physics and particle engines work correctly together.
While the behaviour is acceptable, the visual appearance is somewhat lacking. Changes to particle texture
and colour can be applied to make it visually stimulating, but the problem herein lies with the size of
the asteroid. Simply put, larger asteroids require a proportionally larger number of particles to match
the visual theme aimed for. Figure 5.13 attempts to show the fire algorithm in action.

Figure 5.11


Figure 5.12

Though the fire algorithm works, it requires better use of random selection from the available
colour ranges specified. There are more red than yellow particles towards the centre; the inverse is
true when moving further from the centre, though this isn't as visible given that the lifetime of the
particle has nearly depleted, and as a direct result so has its alpha value. Visually the erratic
behaviour of the particles is enjoyable to look at, and apart from the colour selection scheme this
algorithm works as intended. It would have looked nicer with more particles set at a smaller size

perhaps, but this could have had the adverse effect of being computationally expensive.
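The red-at-the-centre, yellow-further-out distribution discussed above can be approximated deterministically by blending colour with distance and letting the remaining lifetime drive the alpha fade. This is an illustrative sketch only; the names and parameters are hypothetical, and the actual fire algorithm selects randomly from colour ranges.

```cpp
#include <cassert>

struct RGBA { double r, g, b, a; };

// Red (1,0,0) at the flame centre blending towards yellow (1,1,0) with
// distance; the particle's remaining lifetime fraction drives the alpha fade.
RGBA fireColour(double distance, double maxDistance, double lifeFraction) {
    double t = distance / maxDistance;   // 0 at centre, 1 at the flame edge
    if (t < 0.0) t = 0.0;
    if (t > 1.0) t = 1.0;
    return { 1.0, t, 0.0, lifeFraction };
}
```

Because alpha depletes with lifetime, far-from-centre particles fade before their yellow bias becomes obvious, which matches the visibility issue noted in the text.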

Figure 5.13

Figure 5.14

Upon exploration of space in first person mode with the craft, it was possible to spot various star
effects. The first is the shooting star, as shown in Figure 5.14, left, followed by a shot of the star
having disappeared in the right image.


Fortunately the texture used for the tail trail is small and stretched, and the soft bluish tint on the tail
section looks good and really highlights the shooting star. A third image is provided without the
highlighted zone to show how readily visible they are.

Unfortunately, upon close inspection the exploding stars had some behavioural issues. Various chunks
didn't render because of camera problems with the billboarding technique. Had the star object
inherited from the physics engine, the chunks could have been given a direction upon explosion,
which in turn could have been used to provide the rotations needed to ensure the billboards were
visible. In addition, the tail trails of the exploding chunks looked messy, as the size of a trail was
not always in proportion to the size of its chunk, creating visual imbalance.
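The billboard fix suggested above could be sketched as follows: given the chunk and camera positions, a yaw rotation computed with atan2 keeps the quad facing the viewer from any quadrant, so no chunk silently fails to render. This is an illustrative sketch, not the project's code; the names are assumptions:

```cpp
#include <cmath>
#include <cassert>

struct Vec3 { float x, y, z; };

// Hypothetical sketch: yaw angle (degrees) that rotates a chunk's billboard
// about the Y axis so its quad faces the camera. With a per-chunk direction
// from the physics engine this could replace the broken camera-only test.
float billboardYawDegrees(const Vec3& chunkPos, const Vec3& cameraPos)
{
    float dx = cameraPos.x - chunkPos.x;
    float dz = cameraPos.z - chunkPos.z;
    // atan2 handles every quadrant, so chunks behind the camera still
    // receive a valid rotation instead of not rendering.
    return std::atan2(dx, dz) * 180.0f / 3.14159265f;
}
```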

5.2.2 Player
While functionality was designed for the ship craft, it was not possible to implement everything, and
some of the tools implemented could never be integrated. The ship view in third-person perspective
(see Figure 5.15) still has some issues with movement in various directions. It is immediately
noticeable that the craft shine was not implemented; this was simply due to lack of available time.

The ship was tested moving in the full range of its possible directions under every gear available.
While the functionality worked and rotating to see the universe on its side looked good, the fact that
the craft remained fixed in the centre of the screen did not. It would have been nice to allow the
ship to move slightly without affecting the camera, and then have the camera lock on before the ship
drifts too far from the centre. This would give a stronger sense of ship movement through space.
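The dead-zone idea could be sketched like this: the camera stays put while the ship is within a chosen radius of the view centre, and only starts tracking once the ship crosses the boundary. A minimal 2D sketch with hypothetical names:

```cpp
#include <cmath>
#include <cassert>

struct Vec2 { float x, y; };

// Hypothetical dead-zone follow: the ship may drift up to `radius` units
// from the view centre before the camera starts tracking it again.
Vec2 followCamera(Vec2 camera, Vec2 ship, float radius)
{
    float dx = ship.x - camera.x;
    float dy = ship.y - camera.y;
    float dist = std::sqrt(dx * dx + dy * dy);
    if (dist <= radius)
        return camera;                  // ship still inside the dead zone
    float excess = (dist - radius) / dist;
    return { camera.x + dx * excess,    // pull the camera just enough to
             camera.y + dy * excess };  // keep the ship on the boundary
}
```

Called once per frame, this leaves the ship free to wander slightly before the camera locks on, giving the stronger sense of movement described above.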

The player craft featured in Figure 5.15 uses a simple object built from a quadric surface and
texture-mapped with two merged textures. The ship is always rotating, which explains the
inconsistencies in the texture. A range of player movements was tested; however, while functionality
existed for moving the craft, only the keyboard controls were implemented. The camera framework
required work before any other features could be implemented, since otherwise it would not have been
possible to know whether the functions were executing properly. Visually, the first-person
perspective is more enjoyable to use, because the third-person view can look a bit static at times.
This could be improved by adding more animation to the craft to keep it looking interesting even
when it is not doing anything. In addition, the first-person view could have used some kind of frame
to give the illusion of being inside a craft, i.e. the interior of the ship visible for about an inch
each way into the screen, giving the illusion of looking through a craft window. This would have made
a world of difference and could have been implemented with relative ease.

Figure 5.15

A teleporting and hyper-speed feature was never integrated, along with a range of weapons and other
goodies whose prime purpose was to allow the user to interact with objects in space. Various items
intended to boost player statistics, in terms of health and craft, were not integrated either, due to
more pressing issues with the framework. The different ship types were not tested individually, as it
can be deduced from the way the player class sets itself up that the other types work. Using another
ship type simply involves passing different arguments to the initialisation function; however, only
one was ever used.

                                    CHAPTER 6

   Evaluation and Conclusion

6.1 Accomplishments
OpenGL and the Windows API were learned to a satisfactory degree, and much of the planning went
extremely well. A great deal was learned along the way about how large programs with highly
interconnected structures work, both in terms of the importance of abstraction and of watching out
for possible major problems that may arise.

Though the project didn't go entirely to the planned structure, as there was both need and desire for
some changes to take place dynamically, many things were achieved. A strong internal structure
exists which supports dynamic changes to the application made in real time. The level of abstraction
in the framework gives future coders the versatility to extend it any way they see fit. Aside from a
device and window handler, an engine and a carefully planned memory manager, an array of effects
exists both inside and outside the universe.

Although the universe structure is unfinished, the framework is in place and provides a powerful tool
for extensibility. The dynamic nature of the environment means an object simply has to be designed as
its own set of classes. Integration only requires that these classes inherit from the
Environment_Object class and that a routine be written in the Star_System class describing how the
item should be generated in the environment, e.g. the number of objects allowed per star system.
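The integration route described above might look roughly like this. The real Environment_Object and Star_System interfaces are richer; the member names used here (update, spawnAsteroids, objectCount) are purely illustrative:

```cpp
#include <cstddef>
#include <memory>
#include <vector>
#include <cassert>

// Minimal sketch of the extension mechanism: a new object type inherits
// from Environment_Object, and Star_System gains a generation routine
// stating how many of the item appear per star system.
class Environment_Object {
public:
    virtual ~Environment_Object() = default;
    virtual void update(float dt) = 0;
};

class Asteroid : public Environment_Object {    // the new object's own class
public:
    void update(float /*dt*/) override {}
};

class Star_System {
public:
    // Generation routine describing the item's population in this system.
    void spawnAsteroids(int maxPerSystem)
    {
        for (int i = 0; i < maxPerSystem; ++i)
            objects.push_back(std::make_unique<Asteroid>());
    }
    std::size_t objectCount() const { return objects.size(); }
private:
    std::vector<std::unique_ptr<Environment_Object>> objects;
};
```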

6.2 Evaluation
Given that this was the first solo project ever attempted, much of the project management process had
a desirable outcome despite the overall result. The project was approached with great enthusiasm, and
the time allocated to the first few stages was used very efficiently. A considerable amount of time
and thought went into planning for foreseeable problems, and a broad range of research was done for
both the initial and final ideas. This covered learning the API necessary to code the project;
studying how frameworks and engines work and how to design them; examining interface designs and
coding styles from professional software source code; and learning how to use professional animation
software to generate models, along with various libraries for importing those models. Even some
mathematics was researched, together with advanced algorithms and data-representation structures for
more in-depth functionality, i.e. fractal and recursive procedural modelling for terrain [18] and
planets [19].

If this kind of project was approached again, given the experience gained the following steps would
be taken:
     1. Initial research stages would remain the same.

    2. More time would have been spent in the design phase evaluating various issues that might
        occur. The project would still have been broken into incremental steps, as this was
        necessary to ease the learning curve for such a project, but all of the stages would have
        been designed sequentially in one go. This would allow clearer analysis of the entire
        project, making problem spots more easily identifiable.
    3. Implementation would continue as specified by the design and as any major problems came
        into play the design diagrams would be consulted. If the problem was severe enough all
        diagrams would be redesigned to ensure consistency throughout the project.
All in all, this would have allowed the major problems to be identified earlier on, and thus better
planning might have led to successful completion of the project.

6.3 Future Expansions
In addition to spending a bit more time completing the current code, the following could be added.

Some immediate additions would include more functionality for the player craft, such as the gravity
gun, which when aimed at an object either draws the object closer or, if the object is heavier, draws
the craft closer to the object. In general, textures could be worked on for longer in Photoshop to
provide a crisper appearance on screen. The pop-up window class could also be implemented and
interfaces designed for in-game functionality. The menu buttons designed could be implemented, with a
mouse-detection function written to allow mouse clicks to change the program state when buttons are
selected.
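The gravity-gun rule as described, whichever body is lighter gets pulled, could be sketched as below. This is a hypothetical illustration; the function and field names are not from the project:

```cpp
#include <cmath>
#include <cassert>

struct Vec3 { float x, y, z; };

// Hypothetical gravity-gun sketch: the lighter body receives the pull.
// Returns the acceleration applied to the CRAFT; a zero vector means the
// target (being lighter) is drawn in instead.
Vec3 gravityGunCraftPull(Vec3 craftPos, float craftMass,
                         Vec3 targetPos, float targetMass, float strength)
{
    if (targetMass <= craftMass)
        return {0.0f, 0.0f, 0.0f};      // target is lighter: it moves, not us
    float dx = targetPos.x - craftPos.x;
    float dy = targetPos.y - craftPos.y;
    float dz = targetPos.z - craftPos.z;
    float len = std::sqrt(dx * dx + dy * dy + dz * dz);
    if (len == 0.0f)
        return {0.0f, 0.0f, 0.0f};      // degenerate: already co-located
    return { strength * dx / len, strength * dy / len, strength * dz / len };
}
```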

A rendering engine could be devised with various rendering-efficiency techniques using some of the
tools already provided. This is a very heavy task and requires much restructuring: the OpenGL
routines would have to be rewritten using the Matrix class, then substituted into the existing code.
In addition, objects could be created and then described in terms of the polygon class, which would
then be broken into triangles, some of which would go to a routine that renders everything as
triangles and checks winding so as not to render any back-facing polygons. An object handler with
some kind of spatial enumeration technique could ensure that only objects directly in the scene are
rendered, which could then be passed to a routine to further enhance rendering efficiency. Rays
could be cast through the scene data, marking the first polygon surface encountered and ignoring
hidden polygons.
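The winding check mentioned above is typically a signed-area test on the projected triangle. A minimal sketch follows; the culling convention chosen here (keep counter-clockwise faces) is an assumption, and the real engine could equally cull the opposite winding:

```cpp
#include <cassert>

struct Vec2 { float x, y; };

// Screen-space winding test for the triangle-rendering routine described
// above: a clockwise-wound triangle (negative signed area here) is treated
// as back-facing and skipped.
bool isFrontFacing(Vec2 a, Vec2 b, Vec2 c)
{
    float signedArea = (b.x - a.x) * (c.y - a.y)
                     - (b.y - a.y) * (c.x - a.x);
    return signedArea > 0.0f;   // counter-clockwise: keep and render
}
```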

An image loader exists which could be extended to support other image formats. The device handler
could be extended to deal with gamepads and similar devices; one could even look into dynamic
plug-and-play detection, i.e. a device is plugged in, the new hardware is detected and, depending on
the features available, code is generated to handle the specifics of that device. This would require
further knowledge of the Windows API and its plug-and-play facilities, a class allowing the use of
regular expressions for code generation, and possibly joint work extending the window framework so a
pop-up window could let the user configure their device. Instead of dynamically generating the code,
the user could specify a new device in a pop-up window and these parameters could be used to set up
the necessary code; regular expressions would probably still be required to generate names for new
variables, or a list of pointers could be used and initialised upon request.
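The simpler alternative suggested above could be sketched as a registry of user-specified device parameters, initialised on request rather than generated as code. Hypothetical names throughout:

```cpp
#include <cassert>
#include <map>
#include <string>

// Sketch only: store user-supplied device parameters in a registry and
// look handlers up on demand, instead of generating per-device code.
class DeviceRegistry {
public:
    void registerDevice(const std::string& name, int buttonCount)
    {
        buttons[name] = buttonCount;    // parameters from the pop-up window
    }
    bool isRegistered(const std::string& name) const
    {
        return buttons.count(name) != 0;
    }
private:
    std::map<std::string, int> buttons;
};
```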


6.4 Conclusion
Given the resources available and the fact that this was a solo project, a phenomenal amount of work
was done in terms of researching, developing and evolving an idea. Planning allowed for the
eventuality of change in both major and minor aspects of the project, ranging from drastic changes in
the overall application to changes in the framework and the data-handling hierarchy, which involved a
lot of abstraction. Some of the less significant changes concerned how various effects look and
behave on screen and which classes they needed to relate to in order to produce the desired effects
more accurately. It is regrettable that the software could not be completed; however, it will become
a personal hobby, and who knows what else in years to come.

Writing this undoubtedly large piece of software provided real insight into how the software
development cycle works and how dynamic it must be in order to accommodate the various aspects of a
project. One model will not work from beginning to end: some components are vastly interlinked and
required careful planning, while others, thanks to the framework that had been provided, could
eventually just be thrown in with little planning required.



The above URL provides a link to the OpenGL manual for a vast number of OpenGL functions.

The above URL provides a link to the Microsoft reference library for windows specific API

The above URL provides a link to the Microsoft reference library for windows specific API
structures and constants.

The above URL provides a link to 3DS-MAX basics tutorials.

The above URL provides a link to 3DS-MAX modeling tutorials.

The above URL provides a link to 3DS-MAX modeling tutorials, it contains walkthroughs on how to
create relatively complex models with simplicity.

The above URL provides a link to the Qt online reference documentation.

The above URL provides a link to the image used in figure 2.3.

The above URL provides a link to the image used in figure 2.4.

The above URL provides a link to information on a particle engine.

[11] &
The above URLs provide links to research on space exploration game interfaces.

The above URL provides a link to Shneiderman's "Eight Golden Rules of Interface Design".

The above URL provides a link to Half Life 2.

The above URL provides a link to Maelstrom.

The above URL provides a link to the rotating desktop on Mandriva (Linux).

The above URL provides a link to Quake 3 source code.

The above URL provides a link to planet, moon, sun and star texture maps.

[18] &
The above URL provides a link to a guide on procedural modeling of terrain.

The above URL provides a link to a guide on procedural modeling of planets.

The above URL provides a link to a guide on terrain generation using fractals.


Ref: Initial Story Theme

Initially the project was to be an action adventure in which the main character would crash-land on
Earth; this follows on from a main-menu space theme. Earth is in turmoil due to neglect allowed and
planned by an elite group of the planet's inhabitants who happen to be its leaders. These leaders
knew about the spiritual duality of the universe and were driven, like most men, by desires of power
in its domain. Essentially from another kingdom, they bound their spirits to Earth, spending
lifetimes preparing Earth and its inhabitants for a new way of life by ensuring man never evolved to
know one of the many truths of the universe. Knowledge is power, and lack of it meant suppression of
wisdom and thus of the ability of better judgement; such are the rewards of ignorance. Just as man in
history has never learned to love and live in peace with his brothers, so this colony of space
pirates wanders the abyss trying to bring galaxies together to unite under one rule. An ambitious and
presumptuous plot, which stands only hand in hand with conflict and failure; violence only begets
more violence.

Aware of the duality of the universe, you just happen to be drawn onto the planet by strong forces,
which you happily accept, as only a being of light truly understands what it means to have purpose
and to live without coincidence. You soon understand your purpose upon arrival as you sense the
strong negative vibrational state of the planet (its aura is one of suffering) as it prepares to give
birth to something new.

The adventure begins upon crashing into Earth, where you explore a brief level above ground in which
you learn of the planet's situation and are armed by nature's various fruits of protection and
destruction. The story then continues to underwater depths, where a new physics system takes over,
allowing for all kinds of level design in which the player must learn to use the physics of the
environment, together with his abilities and the fruits of his new surroundings, to progress. This
part represents cleansing the blood of the planet, while the surface represented cleansing of the
flesh. The final part of this trinity sees the player warping into the hidden heart of the planet to
cleanse its spirit. This section focuses on space-like physics, in which the player must navigate
through a final and difficult environmental puzzle, after which he is left to battle "the infinity",
a source of endless power which seeks only to collect and consume, as it knows the ultimate truth.
Anything with a soul will eventually rejoin and become god at the end of its cycle; in the meantime,
those not willing to become part of the one, or created without a soul, have devised an economy where
souls equal money and money equals power, in dimensions of a grander illusion where foolish beings
still come to terms with the truth of their existence!

Ref: Project Timeline Chart

To organise the project process, milestones were set to mark its important stages.
    1. Full implementation/testing of arithmetic and data operation instructions – 21st October
        (week 3).
    2. Full implementation/testing of the main GUI layout – 18th November (week 7).
    3. Full implementation/testing of the primary objectives – 10th February (week 19).
    4. Full implementation/testing of the secondary objectives and any extras – 24th March (week

Ref: Particle Engine Research

Name                Description                                                                                                      Used? Why? Why not?
Particle Emitters   A Particle Emitter is a special object type that can spawn groups of other objects, called particles.            Yes, an emitter object is required
                    Particles have no collision (you cannot run into them) and are simply visual effects that are added to the
                    scene. They can be used to create countless special effects, including: rain, snow, blowing leaves,
                    fireflies, fire, smoke, ground fog, waterfalls, fireworks, and many other effects. Experiment and see what
                    you can do!
Particle Type       This controls what kind of particle will be released. There are four types:                                      Particle type was used to
                                                                                                                                     indicate the type of particle to
                        •    Sprite - This is the simplest form of particle. The particle looks the same regardless of where it is   setup, smoke, fire etc.
                             viewed. The texture is simply drawn to the screen, ignoring the angle the particle is facing. This
                             is similar to the way in which coronas are drawn.
                        •    Facer - The particle is a flat vertical panel that turns to face the user. This is the same way in
                             which facer objects are handled.
                        •    Flat panel - The particle is a flat panel that can be rotated into any position.
                        •    Model - The particle is an RWX or COB model.

Asset List          This defines what textures or models will be used to make the particles. This may be a list of textures          Particle textures were loaded in a
                    (separated by commas) for sprites, facers, and flat panels, or a list of model names (separated by               similar context
                    commas) for model type particles. If more than one texture or model is listed, then each particle will use
                    one from the list at random.

                    For texture lists, masks can also be specified. You can place the name of the mask after a colon as


                    If you omit the mask, the texture will "self mask" by making the pixel opacity based on pixel brightness.
                    (White pixels are opaque, black are transparent) If you do not want any mask, use a colon but supply no
                    mask name. So, if you had three textures you wanted to use, the first of which was not masked and the
                    second of which was self-masking, you would use the following:

Tag Name            This assigns a name to the emitter. If this field is NOT empty, the emitter will NOT start when it comes         Not used
                    into view, but will need to be triggered by another object.
Emitter Lifespan    This defines how long (in milliseconds) the emitter will release particles. When this time expires, the          Used to specify particle
                    emitter will no longer release new particles, although existing particles will continue to run their course.     sustenance.
                    If you want the emitter to release particles forever, enter zero.
Release Count       This defines how many particles will be released at one time.                                                    Not used; a maximum allowable
                                                                                                                                     number of particles is released,
                                                                                                                                     that’s it!
Release Time        This defines how often (in milliseconds) new particles will appear. If this range is set to 500 to 3000, then     Not practical for the application.
                    particles will be released at least once every three seconds, but not more often than once every half
                    second. The browser will emit particles at random intervals between these two values.
Particle Lifespan   This defines how long (in milliseconds) each particle will last once it is released. For example, if this is      Not practical for the application.
                    set to 2000 then each particle will last 2 seconds.
Fade In Time        This only affects texture-based particles (sprites, facers, and flat panels) and has no effect on "model"         Not practical for the application.
                    type particles. This defines how long (in milliseconds) it will take for each particle to "fade in" to view.
                    See "Opacity" below for more information.
Fade Out Time       This only affects texture-based particles (sprites, facers, and flat panels) and has no effect on "model"         Not practical for the application.
                    type particles. This defines how long (in milliseconds) it will take for each particle to "fade out" of view.
                    See "Opacity" below for more information.
Interpolate         If this option is checked, particles will begin at the minimum size and start colour. Each particle will then     This is an effect-specific
                    transition to the ending size and colour by the end of its lifespan. If this option is NOT checked, then each     property and depends on the type
                    particle will have a random size between min and max size, and a random colour between the given start            of variable.
                    and end colours.
Gravity             If this option is checked, particles "fall" according to gravity.                                                 No gravity in space.
Zone Collision      If this is checked, particles will check to see if they have run into zone boundary. If so, the particle will     Collisions not detected in
                    vanish instantly, without fading out. This is useful for keeping particles out of unwanted areas. For             application.
                    example, you wouldn't want rain or snow particles to come indoors. This will not keep particles from
                    appearing in other zones, but it will keep particles from moving into a different zone once they exist.

                    Note that this check consumes more CPU time, and may cause slowdowns, so you shouldn't enable this
                    option unless you need it. Also, particles can only collide with zones that have the "particle collision"
                    option enabled. That is, for a particle to run into a zone both the particle emitter and the zone must have
                    collision enabled.
Zone Exclusive      This will prevent particles from spawning in a zone different from the emitter.                                   Not necessary.

Use camera          Normally particles appear within a box as defined by the emitter’s volume (see below). However, you               Not necessary.
position            may enable this option to move the spawning area from the emitter to the camera. The effect will be that
                    particles will seem to come from all around the user, wherever they go. This is ideal for creating weather
Colour Start        If "Interpolate" is checked, then particles will begin at this colour and fade to "Colour End". If it is not      Algorithm specific.
                    checked, then particles will randomly choose a colour between "Colour Start" and "Colour End".
Colour End          If "Interpolate" is checked, then particles will begin at this "Colour Start" and fade to this colour. If it is   Algorithm specific.
                    not checked, then particles will randomly choose a colour between "Colour Start" and "Colour End".
Size                This defines how big (in meters) each particle is. The first number is the minimum size, and the last             Algorithm specific.
                    number is the maximum size. These numbers have different effects based on what type of particle this is:

                        •    For Sprites, Facers, and Flat Panels, the first two numbers define the width and height of the 2d
                             panel and the last number (Size Z) is not used. So, if Size X was set to 2 / 3 and Size Y was set
                             to 10 / 10, then the emitter would create particles 10 meters long, with a width between 2 and 3
                    •    For models, these numbers act as scaling values. Making all of the numbers 1 would cause
                         objects to appear normal size, while making all values 10 would cause the objects to appear 10
                         times their normal size.

Volume          This defines the size (in meters) of the cubic area in which new particles will appear. For example:             Algorithm specific.

                    •    If all members are set to zero, then all particles will appear right at the emitter.
                    •    If the X range is -5.00 to 5.00, and the other members are zero, then each particle will appear
                         randomly along a line that extends five meters east and west from the emitter origin.
                    •    If all ranges are 0.00 to 1.00, then each particle will appear randomly within a one cubic meter
                         area that extends up, North and East from the origin.

                Note that if you have a single particle emitter selected, it will show the cubic area as a purple wireframe,
                so you can see the area in which particles will appear.
Acceleration    This defines how fast (in meters per second) particles will accelerate. When a particle is created, it will      Algorithm specific.
                randomly choose an acceleration value from within the given range, and will accelerate at that rate for the
                duration of its lifespan.
Speed           This defines how fast (in meters per second) particles will be moving at the time of their creation. When a      Algorithm specific.
                particle is created, it will randomly choose a speed value from within the given range.
Start Angle     This defines the position (in degrees of rotation) of each particle. This has no effect on sprites and facers.   Algorithm specific.
Spin            This defines the rotation (in degrees of rotation per second) of each particle. This has no effect on sprites    Algorithm specific.
                and facers.
Render Style    This defines how the particle will be blended with the scene. This option has no effect on "model" type          Algorithm specific.
                particles.

                    •    Normal - The texture is simply drawn to the scene with no special effects.
                    •    Bright - The texture is drawn to the scene, blending with whatever is already visible so as to
                         brighten the scene.
                    •    Glow - The texture is added to the scene, making it significantly brighter. The difference
                         between "bright" and "glow" can be subtle in certain situations. It is best to simply experiment
                         with this option to achieve the desired effect.

Opacity         This is a value between 0.00 and 1.00 that defines how opaque the particles will be. This option has no          Algorithm specific.
                effect on "model" type particles. Particles begin at zero opacity (fully transparent) and fade to this value
                over the course of the "Fade In Time". They then fade back to fully transparent during the last segment of
                their lifespan according to "Fade Out Time".

                For example, if "Opacity" is set to 0.5 (half opacity), "Fade In Time" and "Fade Out Time" are both set to
                1000, and "Particle Lifespan" is set to 5000, then the particle would fade in over one second, hold at
                50% opacity for three seconds, then fade back out over the final second.
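The opacity timeline in the example above can be expressed as a small function. This is a sketch of the described behaviour only, with illustrative names:

```cpp
#include <algorithm>
#include <cassert>

// Sketch of the opacity timeline described above: fade in to `opacity`,
// hold, then fade back out during the last `fadeOutMs` of the lifespan.
float particleAlpha(float ageMs, float lifespanMs,
                    float fadeInMs, float fadeOutMs, float opacity)
{
    if (ageMs <= 0.0f || ageMs >= lifespanMs)
        return 0.0f;                    // not yet born, or already expired
    float a = opacity;
    if (ageMs < fadeInMs)
        a = opacity * (ageMs / fadeInMs);                   // fading in
    float remaining = lifespanMs - ageMs;
    if (remaining < fadeOutMs)
        a = std::min(a, opacity * (remaining / fadeOutMs)); // fading out
    return a;
}
```

With the worked example's values (opacity 0.5, fades of 1000 ms, lifespan 5000 ms) this yields 0.25 halfway through the fade-in, 0.5 during the hold, and 0.25 halfway through the fade-out.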

Ref: Design Stage Reference User Guide

The design stage references that follow come in two parts:
    1) A UML style class description.
    2) An entity relationship diagram.
The first aims to show the level of development at a particular stage of the project’s development and
the second shows how the classes were related to provide the behavioural structure of the program.

The basic layout is as follows:
1) Example…

Note: Inherited class properties will not be listed in the class to limit the size of the class description.

2) Example…

Note: If one class inherits from another, it also implicitly includes it; this is left implied to
avoid clutter. Also, only important ‘include’ relationships are shown.

REF: UML Class Design1

REF: UML Diagram Design1

REF: UML Class Design2

Interactive 3D Space Environment
Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment
Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment
Appendix                           Author: Chris Georgeiou
REF: UML Diagram Design2

Interactive 3D Space Environment                       164
Appendix                           Author: Chris Georgeiou
REF: UML Class Design3a

Interactive 3D Space Environment                   165
Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment                   166
Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment                   167
REF: UML Diagram Design3a
Appendix                                 Author: Chris Georgeiou

Interactive 3D Space Environment
Appendix                           Author: Chris Georgeiou
REF: UML Class Design3b

Interactive 3D Space Environment                        169
Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment                        170
Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment                        171
REF: UML Diagram Design3b
Appendix                                 Author: Chris Georgeiou

Interactive 3D Space Environment
Appendix                           Author: Chris Georgeiou
REF: UML Class Design4

Interactive 3D Space Environment                        173
Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment                        174
REF: UML Diagram Design4

Appendix                                 Author: Chris Georgeiou

Interactive 3D Space Environment
Appendix                           Author: Chris Georgeiou
REF: UML Class Design5

Interactive 3D Space Environment                             176
Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment                             177
Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment                             178
Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment                             179
  REF: UML Diagram Design5

Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment                     180
Appendix                           Author: Chris Georgeiou
REF: UML_Class_Design6

Interactive 3D Space Environment

Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment

Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment

Appendix                           Author: Chris Georgeiou

Interactive 3D Space Environment

Appendix          REF: UML Diagram Design6   Author: Chris Georgeiou

Interactive 3D Space Environment                                       185
Appendix                           Author: Chris Georgeiou
REF: AC3D_3D_Models_Ship 1

REF: AC3D_3D_Models_Ship 2

REF: AC3D_3D_Models_Ship 3

Interactive 3D Space Environment
Appendix                           Author: Chris Georgeiou
REF: AC3D_3D_Models_Ship_4

REF: AC3D_3D_Models_Ship 5

REF: AC3D_3D_Models_Tutorial

Interactive 3D Space Environment
Appendix                           Author: Chris Georgeiou

REF: AC3D_3D_Models_HiPolyBush

REF: AC3D_3D_Models_Plant&Trap

REF: AC3D_3D_Models_Tree

Interactive 3D Space Environment
Appendix                              Author: Chris Georgeiou

REF: AC3D_3D_Models_Teleport_Effect

REF: AC3D_3D_Models_HiPoly_Forest

Interactive 3D Space Environment
Appendix                                                    Author: Chris Georgeiou

REF: Logo_Design                               REF: Final_Logo_Design

REF: Application_Logo              REF: Application_Logo2

Interactive 3D Space Environment
Appendix                                                 Author: Chris

REF: Main_Menu_Design1

REF: Menu_Bg1                            REF: Menu_Bg2

Interactive 3D Space Environment

Appendix                                                     Author: Chris

REF: Loading_Bg1                         REF: Loading_Bg2

REF: Options_Menu_Bg1                    REF: Options_Menu_Bg2

Interactive 3D Space Environment

Appendix                                 Author: Chris

REF: StorySoFar_Bg

Interactive 3D Space Environment

