					Implementation




        V1.0.0

    Nebular Studios

      2011-06-20




Revision history
Version   Date         Changes                                              Author
0.0.1     2011-05-20   Initial version                                      Gudo Breukers
0.0.2     2011-05-20   Added the chapter about the Particle Generator       Gudo Breukers
0.0.3     2011-05-24   Added explanation of how to shoot in the direction   Niels Sondervan
                       of the crosshair
0.0.5     2011-05-27   Added explanation for the loading of maps            Daniel Jimenez Kwast
0.0.6     2011-05-30   Reviewed the maploader                               Patrick Slamp
0.0.7     2011-05-31   Added AI                                             Jelle de Jong
0.0.8     2011-06-01   Targeting enemy added                                Patrick Slamp
0.0.9     2011-06-14   Updated description for the maploader                Daniel Jimenez Kwast
0.0.10    2011-06-19   Added implementation explanation about collision     Maciej Czok
                       detection
0.1.0     2011-06-20   Removed unnecessary chapters, added                  Niels Sondervan
                       figurenumbers and captions.
0.1.1     2011-06-20   Added implementation explanation about frustum       Gudo Breukers
                       culling.
1.0.0     2011-06-20   Final version                                        Jelle de Jong




Table of contents
1.      Introduction .................................................................................................................................... 4
     1.1.       Target audience ...................................................................................................................... 4
  1.2. Document outline ................................................................................................................... 4
2. Definitions and abbreviations ......................................................................................................... 5
     2.1.       Definitions ............................................................................................................................... 5
  2.2. Abbreviations .......................................................................................................................... 5
3. AISubsystem .................................................................................................................................... 6
     3.1.       BrainSurgeon........................................................................................................................... 6
     3.2.       RandomAI................................................................................................................................ 6
     3.3.       ArtificialBrain........................................................................................................................... 6
        3.3.1.         increaseSpeed() ............................................................................................................................ 6
        3.3.2.         decreaseSpeed() ........................................................................................................................... 7
        3.3.3.         rotateRelativeXYZAxis(float x, float y, float z) .............................................................................. 7
        3.3.4.         fireToAbsolutePosition(Vector position) ...................................................................................... 7
        3.3.5.         rotateToAbsolutePosition(Vector position) ................................................................................. 8
        3.3.6.         rotateToRelativePosition(Vector position, bool maxRotation) .................................... 8
        3.3.7.         calculateRotation(float opposite, bool maxRotation, bool behind) ............................. 9
        3.3.8.         canFireToAbsolutePosition(Vector position)................................................................ 9
        3.3.9.         canFireToAbsoluteNormalizedDirection(Vector direction) ........................................ 10
4.      PhysicsSubsystem ......................................................................................................................... 11
     4.1.       Collision detection ................................................................................................................ 11
        4.1.1.         The shape of an entity ................................................................................................................ 11
        4.1.2.         High level collision detection ...................................................................................................... 11
        4.1.3.         Low level collision detection ...................................................................................................... 12
     4.2.       Translating user input ........................................................................................................... 12
     4.3. Frustum culling...................................................................................................................... 13
5.     GameLogicSubsystem ................................................................................................................... 15
     5.1. Targeting enemy fighters ...................................................................................................... 15
6.     GameEntities ................................................................................................................................. 17
     6.1.       Shooting in the direction of the crosshair ............................................................................ 17
     6.2.       MapLoader ............................................................................................................................ 18
        6.2.1.         Export Script ............................................................................................................................... 18
        6.2.2.         Output ........................................................................................................................................ 20
7.      UtilsSubsystem .............................................................................................................................. 21
     7.1.       Particle Generator ................................................................................................................. 21
        7.1.1.         Particle ........................................................................................................................................ 22
        7.1.2.         Laserbeams ................................................................................................................................. 23




    1. Introduction
This document describes the implementation of Space, a 3D third-person arcade-style space combat
simulator developed by Nebular Studios. For a more general explanation, read the technical design;
this document describes the problems in depth.

    1.1.    Target audience
The target audience for this document is current and future developers of Space.

    1.2.    Document outline
Some of the subsystems receive a bit of extra attention in this document. Not every approach
chosen by Nebular Studios warrants an in-depth discussion. This document should clarify the parts of
the system that do.




   2. Definitions and abbreviations
   2.1.      Definitions
Definition             Meaning




   2.2.      Abbreviations
Abbreviations          Meaning
AI                     Artificial Intelligence
GUI                    Graphical User Interface
HUD                    Heads Up Display




    3. AISubsystem
    3.1.    BrainSurgeon
The BrainSurgeon is the main class of the AI subsystem. This class controls all the brains (AIs) in the
game. The BrainSurgeon is used by the AISubsystem class to determine the AIs’ next steps.

    3.2.    RandomAI
To give the computer its first behavior we decided to implement a random behavior brain (AI).
This AI determines its course of action using probabilities, which are listed in the technical design.
After implementing the RandomBrain class we wanted to write a BrainFactory, so that we could
easily create many brains of the same or different types. However, RandomBrain and ArtificialBrain
are two different types with few or no matching properties, and future AI brains would not rely on
probabilities the way the RandomBrain does. Therefore we decided not to create a BrainFactory.

    3.3.    ArtificialBrain
The ArtificialBrain has several important methods that specific kinds of ArtificialBrains can use to
determine their next course of action. These methods are:

    -   whatsMyNextStep, this pure virtual method is called by the BrainSurgeon, and implemented
        for each Brain type. This method determines the Brain’s behaviour;
    -   increaseSpeed, this increases the Entity’s speed;
    -   decreaseSpeed, this decreases the Entity’s speed;
    -   rotateRelativeXYZAxis, rotates the Entity over its own current orientation;
    -   fireToAbsolutePosition, fires in the direction to the position in World coordinates;
    -   rotateToAbsolutePosition, rotates the Entity towards the point specified in World
        coordinates;
    -   rotateToRelativePosition, rotates the Entity towards the point relative to its own position.

Apart from these public methods, the ArtificialBrain also has some private helper methods:

    -   calculateRotation, calculates a value between -1 and 1, describing the rotation;
    -   canFireToAbsolutePosition, returns true when the Brain can fire at the absolute position;
    -   canFireToAbsoluteNormalizedDirection, returns true when the Brain can fire at the absolute
        direction, given in World coordinates.

All these methods are described below (except whatsMyNextStep):

     3.3.1. increaseSpeed()
Calling this method simply creates an increase speed Event and passes it to the Controller. The
implementation is given below:




Figure 1 - Increasing the speed of an AI controlled Entity

    3.3.2. decreaseSpeed()
The same as increaseSpeed, but a decrease speed Event is passed.




Figure 2 - Decreasing the speed of an AI controlled Entity
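The pattern both speed methods follow can be sketched as below. The Event and Controller types here are minimal stand-ins for the engine's real classes, and the event type names are assumptions for illustration:

```cpp
#include <string>
#include <vector>

// Hypothetical minimal stand-ins for the engine's Event and Controller types.
struct Event {
    std::string type;
};

struct Controller {
    std::vector<Event> received;
    void handleEvent(const Event& e) { received.push_back(e); }
};

// Sketch of the ArtificialBrain speed methods: each simply creates an Event
// and passes it to the Controller; the actual speed change happens elsewhere.
class ArtificialBrain {
public:
    explicit ArtificialBrain(Controller* controller) : mController(controller) {}
    void increaseSpeed() { mController->handleEvent(Event{"INCREASE_SPEED"}); }
    void decreaseSpeed() { mController->handleEvent(Event{"DECREASE_SPEED"}); }
private:
    Controller* mController;
};
```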

     3.3.3. rotateRelativeXYZAxis(float x, float y, float z)
Calling this method rotates the entity’s orientation about its axes by the given amounts for the next
update tick. Each value must be no greater than 1.0f and no smaller than -1.0f, because this is the
maximum rotation of a given Entity. Because this is a private method, there are no runtime checks
on these values, although they are asserted. Calling this method only fires a changeDirection Event;
the actual rotation algorithm is implemented in the MoveComponent.




Figure 3 - Rotating the direction of an AI controlled Entity

     3.3.4. fireToAbsolutePosition(Vector position)
Calling this method first checks whether the Entity can fire to the position; if so, a
shootPrimaryWeapon Event is fired. To determine whether the Entity can fire to a given position, the
normalized fire direction is calculated by subtracting the fire position from the Entity position and
then normalizing the result.




Figure 4 - Firing the weapons of an AI controlled Entity into a specific direction

    3.3.5. rotateToAbsolutePosition(Vector position)
A rotation towards an absolute position is done by simply calculating the relative position, and then
calling rotateToRelativePosition(relativePosition).




Figure 5 - Rotating the direction of an AI controlled Entity

    3.3.6. rotateToRelativePosition(Vector position, bool maxRotation)
To rotate to a relative position, it is only necessary to know whether the position is above or below,
left or right, and behind or in front. It is also useful (albeit not strictly necessary) to know the angle
in relation to the respective plane: the angle between the xz plane and the direction for the rotation
over the y axis, and the angle between the yz plane and the direction for the rotation over the x axis.


To determine the angle between a given plane and a position, a standard approach can be used:
determine the normal of the plane (with the cross product), use the dot product to determine the
distance to this plane, and finally extract an angle from this distance and the direction.




Figure 6 - Rotating the direction of an AI controlled Entity

However, the cross product need not be calculated, because the normal to the xz plane is the
Entity’s up vector, and the normal to the yz plane is its right vector. The dot product calculation
is also relatively simple because all vectors are already normalized (by contract, which is asserted).




Figure 7 - Determining whether the AI controlled Entity is in front of or behind the player
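Since the Entity's up and right vectors double as the plane normals, the classification reduces to three dot products. A self-contained sketch of this idea, with a plain vector type and function names standing in for the engine's:

```cpp
// Minimal vector type standing in for the engine's Vector class.
struct Vec3 {
    float x, y, z;
};

inline float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// On which side of each plane does the relative position lie?
struct RelativeSide {
    bool above;    // positive side of the xz plane (normal = up vector)
    bool right;    // positive side of the yz plane (normal = right vector)
    bool inFront;  // positive side of the xy plane (normal = direction)
};

// Classify a position relative to the Entity using its (normalized)
// direction, up and right vectors as the plane normals, as described above.
RelativeSide classify(const Vec3& relativePosition, const Vec3& up,
                      const Vec3& rightVec, const Vec3& forward) {
    RelativeSide side;
    side.above   = dot(relativePosition, up)       > 0.0f;
    side.right   = dot(relativePosition, rightVec) > 0.0f;
    side.inFront = dot(relativePosition, forward)  > 0.0f;
    return side;
}
```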

Finally the angle needs to be calculated; this is done in the helper method calculateRotation(float
opposite, bool maxRotation, bool behind), after which a rotate Event can be fired.




Figure 8 - Rotating the direction of an AI controlled Entity

    3.3.7. calculateRotation(float opposite, bool maxRotation, bool behind)
This method determines the angle between a given plane and the hypotenuse, given that the
hypotenuse’s length equals 1. The return value lies between -1.0f and 1.0f, relating linearly to -90
and 90 degrees, or is 1 when the angle is below -90 or above 90 degrees. If maxRotation or behind is
true, the return value is always -1.0f or 1.0f.




Figure 9 - Calculating the rotation needed for an AI controlled Entity
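The mapping just described can be sketched as follows. This is an assumed reconstruction, not the project's actual code: with a hypotenuse of length 1, the opposite side equals the sine of the angle, so asin recovers the angle, which is then scaled linearly onto [-1, 1]:

```cpp
#include <cmath>

// Sketch of calculateRotation as described above: given the length of the
// side opposite the angle (hypotenuse == 1 by contract), return a value in
// [-1, 1] relating linearly to an angle in [-90, 90] degrees. When the
// target is behind, or a maximum rotation is requested, saturate to +/-1.
float calculateRotation(float opposite, bool maxRotation, bool behind) {
    if (maxRotation || behind) {
        return (opposite < 0.0f) ? -1.0f : 1.0f;
    }
    // Angles beyond +/-90 degrees saturate as well.
    if (opposite >= 1.0f)  return 1.0f;
    if (opposite <= -1.0f) return -1.0f;
    // asin(opposite) is the angle in radians; dividing by pi/2 maps
    // [-90, 90] degrees linearly onto [-1, 1].
    const float halfPi = 1.57079632679f;
    return std::asin(opposite) / halfPi;
}
```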

    3.3.8. canFireToAbsolutePosition(Vector position)
This method determines the direction from the Brain’s position to the fire position, and returns the
result of canFireToAbsoluteNormalizedDirection. The direction is calculated by subtracting the fire
position from the Entity position and then normalizing the result.




Figure 10 - Determining whether an AI controlled Entity can fire in the desired direction




     3.3.9. canFireToAbsoluteNormalizedDirection(Vector direction)
Calling this method returns true if the Brain can shoot in the given direction. This is determined by
calculating the angle between the given direction (the parameter) and the Entity’s direction. Because
both directions are normalized (by contract), no division by length is needed; the lengths are
asserted, however.




Figure 11 - Determining whether an AI controlled Entity can fire in the desired direction
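Because both vectors are unit length, the cosine of the angle between them is simply their dot product. A sketch under that contract; the 10-degree firing cone is an assumption for illustration, not a value from the project:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

inline float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

inline float length(const Vec3& v) { return std::sqrt(dot(v, v)); }

// Sketch: both directions are normalized by contract (asserted), so the
// cosine of the angle between them is just their dot product. The firing
// cone half-angle of 10 degrees is a hypothetical threshold.
bool canFireToAbsoluteNormalizedDirection(const Vec3& fireDirection,
                                          const Vec3& entityDirection) {
    assert(std::fabs(length(fireDirection) - 1.0f) < 1e-4f);
    assert(std::fabs(length(entityDirection) - 1.0f) < 1e-4f);
    const float maxAngleRadians = 10.0f * 3.14159265f / 180.0f;
    return dot(fireDirection, entityDirection) > std::cos(maxAngleRadians);
}
```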




    4. PhysicsSubsystem
    4.1.     Collision detection
The collision detection in our engine is based on the following aspects:
    -   The shape of an entity
    -   The high level collision detection among all possible entities during each tick
    -   The low level collision detection between two basic geometrical volumes

    4.1.1. The shape of an entity
The shape of an entity is of course primarily determined by the mesh being used. As collision
detection based on the mesh itself is expensive and inefficient, a hierarchy of bounding volumes is
generally used to represent the shape of an entity. Logically, every mesh in use requires a suitable
bounding volume hierarchy. In our game a hierarchy of bounding volumes is implemented by the
class CollisionNode, which inherits from the recursive Node template class and uses
RelativeBoundingSpheres as volumes. It ultimately offers a method in which one CollisionNode is
compared with another recursively, considering the connected RelativeBoundingSpheres; the low
level collision detection methods are called from that method. A RelativeBoundingSphere is a
sphere whose radius is relative to the SCALE property and whose position is relative to the
CURRENT_POSITION property of the owner entity. By making the methods getPosition and
getRadius virtual in the super classes and overriding them in RelativeBoundingSphere to do the
extra calculations needed, the client can treat a RelativeBoundingSphere the same way as an
ordinary BoundingSphere, whose radius and position are always absolute. Unfortunately we realized
that we also needed to deal with the current rotation and orientation of the entity. Furthermore,
the creation of a complete hierarchy per mesh turned out to be very time consuming. That is why
we decided to use just a single bounding volume per mesh, smaller than the mesh itself. This is of
course less accurate, but it was the only way to manage this part of collision detection within the
available time.
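The virtual-override idea can be sketched as follows. The class and member names mirror those mentioned above, but the Owner struct is a stand-in for the entity's SCALE and CURRENT_POSITION properties, and the details are assumptions:

```cpp
struct Vec3 { float x, y, z; };

// Stand-in for the owner entity's SCALE and CURRENT_POSITION properties.
struct Owner {
    float scale;
    Vec3 position;
};

// An ordinary bounding sphere: radius and position are absolute.
class BoundingSphere {
public:
    BoundingSphere(Vec3 position, float radius)
        : mPosition(position), mRadius(radius) {}
    virtual ~BoundingSphere() {}
    virtual Vec3 getPosition() const { return mPosition; }
    virtual float getRadius() const { return mRadius; }
protected:
    Vec3 mPosition;  // absolute here; a relative offset in the subclass
    float mRadius;   // absolute here; relative to the owner's scale below
};

// A RelativeBoundingSphere reports absolute values by scaling and offsetting
// with its owner's properties, so clients can treat it like a BoundingSphere.
class RelativeBoundingSphere : public BoundingSphere {
public:
    RelativeBoundingSphere(Vec3 offset, float radius, const Owner* owner)
        : BoundingSphere(offset, radius), mOwner(owner) {}
    Vec3 getPosition() const override {
        return Vec3{mOwner->position.x + mPosition.x * mOwner->scale,
                    mOwner->position.y + mPosition.y * mOwner->scale,
                    mOwner->position.z + mPosition.z * mOwner->scale};
    }
    float getRadius() const override { return mRadius * mOwner->scale; }
private:
    const Owner* mOwner;
};
```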

   4.1.2. High level collision detection
We have placed the high level collision detection between updating all entities and rendering the
current frame. It takes place within EnginePhysics, in the class CollisionDetector, as a reaction to the
UPDATE_EVENT coming from GameLogic. The CollisionDetector checks all possible collisions and
stores the collisions it finds in a list of entity id pairs. This list is returned and then used by the
CollisionResponder to react to the different collisions. The engine thus separates model updating,
collision detection and collision response completely from each other.
The collision detection works in the following manner:




    1. Retrieve all currently existing entities from the scene graph
    2. Iterate through all entities and do:
            a. If the entity is not a movable object
                      i. then pick the next entity
            b. Retrieve all neighbors of the selected entity from the scene graph
            c. Iterate through all neighbors and do:
                      i. If the neighbor is an explosion or both entities are bullets
                               1. then pick the next neighbor
                     ii. If the first entity is a bullet
                               1. If the neighbor is in the first entity’s friend list
                                         a. then pick the next neighbor
                    iii. If the neighbor is a bullet
                               1. If the first entity is in the neighbor’s friend list
                                         a. then pick the next neighbor
                     iv. If the pair has been checked already
                               1. then pick the next neighbor
                      v. Check recursively for an intersection
                     vi. If there is an intersection
                               1. then mark both entities as collided
                    vii. Mark both entities as checked

For step 2.c.v the CollisionNodes of both entities are used to search recursively for a collision.
The collision detection performance depends heavily on the scene graph implementation. In step
2.c the neighbors of a certain entity are requested. With suitable space partitioning, the neighbors
list contains only possible candidates. For many entities the neighbors list is completely empty, so
that the inner loop is not executed at all. Another notable aspect is the number of exceptions: the
main loop of the method iterates only over the non-static entities, we do not do detection for two
bullets or for an explosion, and each bullet has a friend list in which the ids of its friends are stored.
All these exceptions result in faster detection because the actual low level collision detection does
not take place for them. Ultimately a list of all found collisions is returned to be handled by the
CollisionResponder.
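The filter rules of the steps above can be sketched as follows. This is a simplified reconstruction focused on the exceptions: the Entity struct is a stand-in, every entity is treated as everyone's neighbor, and every surviving pair is treated as intersecting (the real code queries the scene graph and the recursive CollisionNode check instead):

```cpp
#include <algorithm>
#include <cstdint>
#include <set>
#include <utility>
#include <vector>

// Simplified stand-in for an engine entity; field names are assumptions.
struct Entity {
    uint32_t id;
    bool movable;
    bool isBullet;
    bool isExplosion;
    std::set<uint32_t> friends;  // bullet friend list (owner and allies)
};

// Sketch of the high level pass described above, numbered per the steps.
std::vector<std::pair<uint32_t, uint32_t>> detectCollisions(
        const std::vector<Entity>& entities) {
    std::set<std::pair<uint32_t, uint32_t>> checked;
    std::vector<std::pair<uint32_t, uint32_t>> collisions;
    for (const Entity& e : entities) {
        if (!e.movable) continue;                                        // 2.a
        for (const Entity& n : entities) {                               // 2.b/2.c
            if (n.id == e.id) continue;
            if (n.isExplosion || (e.isBullet && n.isBullet)) continue;   // 2.c.i
            if (e.isBullet && e.friends.count(n.id)) continue;           // 2.c.ii
            if (n.isBullet && n.friends.count(e.id)) continue;           // 2.c.iii
            std::pair<uint32_t, uint32_t> key = std::minmax(e.id, n.id);
            if (!checked.insert(key).second) continue;                   // 2.c.iv
            collisions.push_back(key);                                   // 2.c.v/vi
        }
    }
    return collisions;
}
```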

   4.1.3. Low level collision detection
Both the high level collision detection and the space partitioning of the scene graph depend on low
level collision detection. EngineUtils contains such low level intersection methods, which compare
two bounding volumes with each other. Each method deals with different subtypes of the
BoundingVolume base class.
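A typical such method — sphere against sphere — can be sketched as below; the function name and signature are assumptions, not EngineUtils' actual API:

```cpp
struct Vec3 { float x, y, z; };

// Sketch of a low level intersection test between two bounding spheres, the
// kind of method EngineUtils provides: two spheres intersect when the
// distance between their centers is at most the sum of their radii.
bool intersects(const Vec3& centerA, float radiusA,
                const Vec3& centerB, float radiusB) {
    const float dx = centerA.x - centerB.x;
    const float dy = centerA.y - centerB.y;
    const float dz = centerA.z - centerB.z;
    const float radiusSum = radiusA + radiusB;
    // Compare squared distances to avoid a square root.
    return dx * dx + dy * dy + dz * dz <= radiusSum * radiusSum;
}
```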

    4.2.    Translating user input
As described in the technical design, the PhysicsSubsystem translates user input into usable data for
the rest of the application. Letting the player entity follow the mouse cursor has been described in
that document. Shooting in the direction of the cursor is described in chapter 6, because the actual
math is performed in the GameEntities ‘subsystem’.




    4.3.      Frustum culling
Frustum culling is partially done in the PhysicsSubsystem. The GraphicsSubsystem sends an event to
the PhysicsSubsystem before it renders the entities: it requires a list from the PhysicsSubsystem
containing all the entities to render. It gives the event the following properties: an empty list (which
is going to be filled with entities), the camera, and the frustum data. The frustum data consists of
the following for both the near and far planes: the distance to the plane, its height and its width.
After receiving the event, the PhysicsSubsystem updates the camera and then calculates the centers
of the near and far planes and the right direction of the camera. It does this in the following way:


vector3f cameraDirection = glm::normalize(camera->getOrigin() - camera->getFocus());
vector3f farCenter = camera->getOrigin() - cameraDirection * data->farDist;
vector3f nearCenter = camera->getOrigin() - cameraDirection * data->nearDist;
vector3f rightDirection = glm::normalize(glm::cross(cameraDirection, camera->getUpDirection()));


These vectors are then used to calculate all the planes of the frustum.




Figure 12: An example of all the points in the frustum.




These points are calculated in the following way:




Figure 13: The calculations for every point in the frustum.

The abbreviations in these calculations are:

Fc = the vector farCenter, the center of the far plane.

Nc = the vector nearCenter, the center of the near plane.

Hfar/Wfar/Hnear/Wnear = the heights and widths of the far and near planes.

Up = the up direction of the camera.

Right = the vector rightDirection, the right direction of the camera.
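Using the abbreviations above, the corner calculations of Figure 13 can be sketched as below. A plain vector type stands in for vector3f, and the corner names (ftl = far top-left, nbr = near bottom-right, and so on) are the conventional ones, assumed here:

```cpp
struct Vec3 { float x, y, z; };

Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator*(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

struct FrustumCorners {
    Vec3 ftl, ftr, fbl, fbr;  // far plane: top-left, top-right, bottom-left, bottom-right
    Vec3 ntl, ntr, nbl, nbr;  // near plane corners
};

// Each corner is the plane center (Fc or Nc) offset by half the plane's
// height along the up vector and half its width along the right vector.
FrustumCorners frustumCorners(Vec3 fc, Vec3 nc, Vec3 up, Vec3 right,
                              float hFar, float wFar, float hNear, float wNear) {
    FrustumCorners c;
    c.ftl = fc + up * (hFar * 0.5f) - right * (wFar * 0.5f);
    c.ftr = fc + up * (hFar * 0.5f) + right * (wFar * 0.5f);
    c.fbl = fc - up * (hFar * 0.5f) - right * (wFar * 0.5f);
    c.fbr = fc - up * (hFar * 0.5f) + right * (wFar * 0.5f);
    c.ntl = nc + up * (hNear * 0.5f) - right * (wNear * 0.5f);
    c.ntr = nc + up * (hNear * 0.5f) + right * (wNear * 0.5f);
    c.nbl = nc - up * (hNear * 0.5f) - right * (wNear * 0.5f);
    c.nbr = nc - up * (hNear * 0.5f) + right * (wNear * 0.5f);
    return c;
}
```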



The PhysicsSubsystem creates a frustum bounding volume from all the planes and then uses the
scenegraph’s method getEntitiesInBoundingVolume() to retrieve all the entities inside the frustum.
The scenegraph uses the low level collision detection in EngineUtils to determine which entities are
inside the frustum. The list, which was added as a property to the event, is filled with these entities,
and afterwards the GraphicsSubsystem renders all the entities in it.




    5. GameLogicSubsystem
    5.1.    Targeting enemy fighters
Targeting enemy fighters is very important for the user experience, because it allows the player to
follow and pick out the next enemy fighter more easily. To make the experience more user friendly,
the selected fighter is highlighted by a so-called targeting box. An example of the targeting box can
be seen on the right side of this paragraph.


Switching between enemies can be done by
pressing the D key (default), which highlights the
nearest enemy fighter. The steps taken to switch
between enemy fighters are as follows:

                                                       Figure 14 - A selected enemy Entity
    1. Determine the selected enemy (which is
        stored in the player entity)
uint previousSelectedEntityId = boost::any_cast<uint>(
entity->getProperty(SELECTED_ENEMY));
Entity* previousSelectedEntity = mSubsystem->getScenegraph()
->getEntity(previousSelectedEntityId);
    2. Deselect (un-highlight) the selected enemy
if(previousSelectedEntity != NULL) {
        previousSelectedEntity->setProperty(IS_SELECTED, false);
}
    3. Determine the nearest enemy using the scenegraph
Entity* selectedEnemyEntity = mSubsystem->getScenegraph()
->getNearestEntity(entityId, entityTypes);
    4. Highlight the nearest enemy
selectedEnemyEntity->setProperty(IS_SELECTED, true);
    5. Set the selected enemy property in the player entity
uint selectedEnemyEntityId = boost::any_cast<uint>(
selectedEnemyEntity->getProperty(ID));
entity->setProperty(SELECTED_ENEMY, selectedEnemyEntityId);




The following code in the GraphicsController (located in the GraphicsSubsystem) draws a targeting
box around the fighter. The size of the box is equal to the size of the bounding sphere.
//CHECK if the current rendered fighter is selected by the player
bool isSelected = boost::any_cast<bool>(
entity->getProperty(IS_SELECTED));
      if(isSelected) {
            //Get the bounding sphere of the fighter
            const BoundingSphere* sphere =
            &(boost::any_cast<shared_ptr<CollisionNode> >(
            entity->getProperty(BOUNDING_VOLUME))->getBoundingVolume());
            assert(sphere != NULL);
            float radius = sphere->getRadius();
            //WIREFRAME MODE is activated
            glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
            glDisable(GL_TEXTURE_2D);
            //Set the color to green and paint a skeleton cube, then set the
            //color back to white (glColor3f expects components in [0, 1])
            glColor3f(26.0f / 255.0f, 1.0f, 0.0f);
            drawCube(radius * 2.0f);
            glColor3f(1.0f, 1.0f, 1.0f);
            //WIREFRAME MODE is deactivated
            glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
            glEnable(GL_TEXTURE_2D);
      }




    6. GameEntities
    6.1.    Shooting in the direction of the crosshair
To shoot in the direction of the crosshair, the following data is needed:
    -   Mouse input with x and y coordinates;
    -   Screen resolution.
First the desired direction of the bullet should be calculated. The mouse coordinates generated by
Allegro are pixel positions in the screen, e.g. x = 789, y = 420. The OpenGL coordinate system works
differently: it ranges from -1 to 1 and is flipped in relation to Allegro’s.
The near plane is set to 1.0, and that is the plane the player shoots through. Therefore the mouse
coordinates are transformed to OpenGL coordinates and the z component of the direction is set to
the z of the near plane, like so:
int mouseX = boost::any_cast<int>(event->getProperty(MOUSE_X));
int mouseY = boost::any_cast<int>(event->getProperty(MOUSE_Y));
int screenWidth =
    boost::lexical_cast<int>(manager->getProperty(SCREEN_WIDTH));
int screenHeight =
    boost::lexical_cast<int>(manager->getProperty(SCREEN_HEIGHT));


float x = (float)mouseX / (screenWidth / 2) - 1;
float y = (float)mouseY / (screenHeight / 2) - 1;
glm::vec3 direction(-x, -y, 1.0f);


The direction vector is the desired direction of the bullet. However, it is only useful if the current
orientation of the player entity is along the z-axis. Unfortunately that is very unlikely to be the case,
so some further calculation is needed.
One approach would be to turn the direction of the player entity back onto the z-axis, so that the
desired bullet direction becomes valid, set it as the current direction of the bullet, and then turn it
back to the original orientation. Another approach is to calculate how the desired direction is
rotated in relation to the z-axis, apply that rotation to the current direction of the player, and set
the result as the current direction of the bullet. The latter approach is chosen:
glm::mat4 bulletOrientation =
calculateBulletOrientation(orientation, bulletDirection);




glm::mat4 calculateBulletOrientation(glm::mat4 orientation,
                                     glm::vec3 bulletDirection) {
    glm::vec3 zAxis(0.0f, 0.0f, 1.0f);
    glm::vec3 normalBulletZ = glm::cross(zAxis, bulletDirection);
    float angleBulletZ = calculateAngleWithZAxis(bulletDirection);
    return glm::rotate(orientation, angleBulletZ, normalBulletZ);
}


float calculateAngleWithZAxis(glm::vec3 vector) {
    glm::vec3 zAxis(0.0f, 0.0f, 1.0f);
    return acos(glm::dot(vector, zAxis) / glm::length(vector)) / PI * 180;
}
Now the player is able to shoot in the direction of the crosshair. This is somewhat of a shortcut,
however, as the camera position differs from the player entity position. Fortunately this is good
enough for the player experience.



    6.2.    MapLoader
The MapLoader – which loads maps that are created and exported with Blender and Python –
assumes a couple of conditions are met. This means that the following must be considered
whenever designing a map with Blender:

    1. The name that can be given to any object in Blender is used as a type name. It should be
       noted, however, that whenever there are duplicate names within a Blender scene, Blender
       appends a number to the name. This number can be ignored, as the export script removes
       it.
    2. The type name given to objects should be added as an if-clause in the createEntity(string
       type) method of the MapLoader.
    3. The scene must contain a cube mesh which describes the measurements of the entire map
       (or universe) and this cube must have ‘universe’ as its name.
    4. The scene must also contain a player mesh which describes the position and orientation of
       the player. This mesh must have ‘player’ as its name.


    6.2.1. Export Script
In order to export the Blender scene, a Python script has been written. It basically retrieves
Blender’s object data and writes it out to a file.



import bpy
import Blender
from Blender import *

def write_obj(filepath):
    mapfile = filepath + '.map'
    entitiesExtension = '.entities'
    spawnareasExtension = '.spawnareas'

    map = file(mapfile, 'w')
    entities = file(filepath + entitiesExtension, 'w')
    spawnareas = file(filepath + spawnareasExtension, 'w')

    universe = Object.Get('universe')
    size = universe.getSize()
    map.write( 'universeSize %f %f %f\n' % (size[0], size[1], size[2]) )
    map.write( 'entitiesExtension %s\n' % (entitiesExtension) )
    map.write( 'spawnareasExtension %s\n' % (spawnareasExtension) )

    player = Object.Get('player')
    playerLoc = player.getLocation()
    map.write( 'playerPosition %f %f %f\n' % (playerLoc[0], playerLoc[1], playerLoc[2]) )

    playerOrientation = player.getMatrix().rotationPart()
    map.write( 'playerOrientationR0 %f %f %f\n' % (playerOrientation[0][0], playerOrientation[0][1], playerOrientation[0][2]) )
    map.write( 'playerOrientationR1 %f %f %f\n' % (playerOrientation[1][0], playerOrientation[1][1], playerOrientation[1][2]) )
    map.write( 'playerOrientationR2 %f %f %f\n' % (playerOrientation[2][0], playerOrientation[2][1], playerOrientation[2][2]) )

    for obj in bpy.data.objects:
        if (obj != universe) and (obj != player):
            meshName = obj.getName()
            if meshName.find('.') > -1:
                meshName = meshName.split('.')[0]

            if meshName == 'spawnarea':
                spawnareas.write( 'type %s\n' % (meshName) )

                location = obj.getLocation()
                spawnareas.write( 'position %f %f %f\n' % (location[0], location[1], location[2]) )

                scaleVec = obj.getMatrix().scalePart()
                spawnareas.write( 'scale %f %f %f\n' % (scaleVec[0], scaleVec[1], scaleVec[2]) )
                spawnareas.write( '\n' )
            else:
                entities.write( 'type %s\n' % (meshName) )

                location = obj.getLocation()
                entities.write( 'position %f %f %f\n' % (location[0], location[1], location[2]) )

                rotationMat = obj.getMatrix().rotationPart()
                entities.write( 'rotationMatR0 %f %f %f\n' % (rotationMat[0][0], rotationMat[0][1], rotationMat[0][2]) )
                entities.write( 'rotationMatR1 %f %f %f\n' % (rotationMat[1][0], rotationMat[1][1], rotationMat[1][2]) )
                entities.write( 'rotationMatR2 %f %f %f\n' % (rotationMat[2][0], rotationMat[2][1], rotationMat[2][2]) )

                scaleVec = obj.getMatrix().scalePart()
                entities.write( 'scale %f %f %f\n' % (scaleVec[0], scaleVec[1], scaleVec[2]) )
                entities.write( '\n' )

    map.close()
    entities.close()
    spawnareas.close()

Blender.Window.FileSelector(write_obj, "Export")



              6.2.2. Output
When this script is executed within Blender, a file dialog pops up asking where to save the output.
Browse to the desired location and specify the desired file name without an extension. The
Python script then creates three files which describe the scene.

        The first file will have the ‘.map’ extension and may look like this:

universeSize 20754.804688 20754.804688 20754.804688
entitiesExtension .entities
spawnareasExtension .spawnareas
playerPosition 180.141998 492.327515 166.134705
playerOrientationR0 0.972149 -0.141604 -0.186749
playerOrientationR1 0.186443 0.950089 0.250140
playerOrientationR2 0.142007 -0.277991 0.950029




        The second file will have the ‘.spawnareas’ extension and may look like this:

type spawnarea
position 229.134354 -431.273376 67.730850
scale 71.310081 -71.310081 71.310081

type spawnarea
position 22.044502 -299.768860 -16.365784
scale 71.310081 -71.310081 71.310081




         The final file will have the ‘.entities’ extension and may look like this:

type asteroid
position -127.134827 55.397449 116.869675
rotationMatR0 0.410465 -1.338427 -0.964433
rotationMatR1 1.574334 0.614767 -0.183126
rotationMatR2 0.492943 -0.848925 1.387925
scale 1.700000 1.700000 1.700000

type asteroid
position 258.022095 13.904026 45.916626
rotationMatR0 -0.791331 0.954615 -1.162973
rotationMatR1 -0.831838 0.817348 1.236927
rotationMatR2 1.253731 1.144837 0.086643
scale 1.700000 1.700000 1.700000




The MapLoader reads all of these files and constructs a Map struct containing all the
information listed in them.
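Since every line in these files is simply a key followed by its values, the parsing step can be sketched in Python as follows. The function name parse_map_file is hypothetical; the actual loader is part of the engine and is not shown here.

```python
def parse_map_file(text):
    """Parse a '.map'-style file: each non-empty line is a key followed by
    whitespace-separated values. Numeric values become floats; anything
    else (e.g. the extension strings) stays a string."""
    data = {}
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        key, *values = line.split()
        parsed = []
        for v in values:
            try:
                parsed.append(float(v))
            except ValueError:
                parsed.append(v)
        # Single-value keys map to the value itself, others to a list.
        data[key] = parsed[0] if len(parsed) == 1 else parsed
    return data
```

Feeding it the example '.map' content above would yield a dictionary with 'universeSize' mapped to three floats and 'entitiesExtension' mapped to the string '.entities'.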




    7. UtilsSubsystem
    7.1.    Particle Generator
The spacefighters in the game Space shoot laser beams. A good technique for rendering these
laser beams is a particle system, which is commonly used in games for special effects such as
explosions and gives a nice result. A tutorial on nehe.gamedev.net explains how to implement a
particle generator, and that example was used as the basis for the generator in Space.




                                 Figure 15: An example of the particle effect




     7.1.1. Particle
The implementation of the generator started by defining a struct for the particles. A particle has the
following attributes:

   struct Particle {
       bool       active;        // Active (Yes/No)
       float      life;          // Particle life
       float      fadeSpeed;     // Fade speed
       float      red;           // Red color value
       float      green;         // Green color value
       float      blue;          // Blue color value
       float      size;          // Size
       float      textureId;     // Texture id
       Vector3f   position;      // Position
       Vector3f   direction;     // Direction
       Vector3f   gravity;       // Gravity
   };



A particle is basically a cube with a texture and a color. The life attribute is the value used for
the alpha parameter of the glColor4f() function, and the fadeSpeed attribute is the amount
subtracted from the life value each update to create the effect of a particle fading away.
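The fade mechanic can be sketched as follows. The field names follow the Particle struct above, but the update function itself is an illustrative sketch, not the engine code:

```python
def update_particle(life, fade_speed):
    """Subtract fadeSpeed from life; the remaining life doubles as the
    alpha value passed to glColor4f. A particle whose life reaches zero
    is marked inactive."""
    life -= fade_speed
    active = life > 0.0
    return life, active
```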




      Figure 16: Texture of a particle




    7.1.2. Laserbeams
To create a laser beam, four particles are placed at random offsets around a position on the
beam's ray. A laser beam contains multiple such positions, spaced at a constant interval along
the ray. The effect the particles produced, however, was not really the effect that was
wanted.
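The placement scheme described above could be sketched like this. The function and parameter names are hypothetical illustrations; vectors are represented as 3-tuples and the direction is assumed to be a unit vector.

```python
import random

def beam_particle_positions(origin, direction, length, interval,
                            spread, particles_per_point=4):
    """Scatter particles_per_point particles around each point placed at a
    constant interval along the ray origin + t * direction."""
    positions = []
    steps = int(length / interval)
    for i in range(steps):
        t = i * interval
        center = tuple(o + d * t for o, d in zip(origin, direction))
        for _ in range(particles_per_point):
            # Random offset around the point on the ray.
            jitter = tuple(random.uniform(-spread, spread) for _ in range(3))
            positions.append(tuple(c + j for c, j in zip(center, jitter)))
    return positions
```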




                         Figure 17: A screenshot of our laserbeam made with particles

The effect does not really resemble the effect shown in the example: firing this laser beam looks
the same as firing a single red cube. The difference between the laser beam consisting of
particles and the laser beam consisting of a single cube is that the particle-based one actually
consists of 500 little cubes. This means using the particles has a major impact on performance
while producing the same visual result as a single cube.

To improve the performance of the particles, rendering quads instead of cubes was tried, but this
gave an ugly effect when a particle beam was viewed from the side. Next, rendering each particle
as three mutually perpendicular quads was tried; however, this still did not give the desired
effect.

One option that was not tried is billboarding: rendering each particle as a single quad that
always faces the player's camera. It was decided not to pursue this option, and a single cube is
used instead of the particles. The reason is that implementing and fine-tuning the particles
would take too much time while there are features with higher priority that need to be
implemented first.
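For completeness, the untried billboarding option would amount to building each quad from the camera's right and up vectors so that it always faces the viewer. A minimal sketch, with vectors as 3-tuples and the camera vectors assumed to come from the view matrix:

```python
def billboard_corners(center, cam_right, cam_up, size):
    """Return the four corners of a camera-facing quad of the given edge
    size, built from the camera's right and up vectors."""
    half = size / 2.0
    def offset(p, a, sa, b, sb):
        # p + a * sa + b * sb, component-wise on 3-tuples
        return tuple(p[i] + a[i] * sa + b[i] * sb for i in range(3))
    return [
        offset(center, cam_right, -half, cam_up, -half),  # bottom-left
        offset(center, cam_right,  half, cam_up, -half),  # bottom-right
        offset(center, cam_right,  half, cam_up,  half),  # top-right
        offset(center, cam_right, -half, cam_up,  half),  # top-left
    ]
```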





				