                                                        Rory Kelly 0601896
                      DirectX: Terrain and Water Simulation

Introduction
        The original idea for this application was to create a 3D object which
resembles the appearance and behaviour of a lake, and to include it in a 3D
environment. This covers the water's interaction with light, including reflections, its
movement, and its interaction with objects dropped into the water.
        The lighting of the water should be updated in real time relative to the water's
movement and include diffuse lighting, ambient lighting and specular reflections,
each controlled separately by variables set within the code.
        The movement of the water should accurately resemble waves and ripples
caused by atmospheric conditions such as wind and tides, as well as movement
caused by objects falling through the surface of the water.
        The reflections in the water should accurately mirror any object facing the water
and be updated in real time relative to the user's position.
        The application will also create a 3D scene including a realistic terrain and
skybox, but these are mainly for aesthetic purposes, allowing the user to see how
the water effects could be applied in a realistic game environment; as a result they
are not the main focus of this task. The skybox should have an animated texture,
allowing the application to simulate the movement of clouds across the sky. The
application should also set up a camera and maintain appropriate projection
matrices, all controlled by the user, allowing them to move around the scene and view
it from many different angles.
        The application should be developed using Visual Studio and the DirectX
software development kit (SDK), making sure the application runs fast enough to be
included in a real-time application and manages any data structures appropriately,
ensuring no memory leaks. The code should also be organised in an object-oriented
manner and be appropriately commented and arranged to allow the code to be
easily read and understood. Where applicable the application will use pixel and
vertex shaders as much as possible, in order to achieve the most aesthetically
pleasing results.
        The project should create a framework which initialises DirectX, sets up a
window and creates a message handler to deal with the user's input from the window
(such as resizing, maximising, etc.), and use DirectInput to give the user control of
the camera through the mouse and keyboard.

Application
       The final application has most of the originally planned functionality,
but not all of it. The original specification (outlined above) had the water simulation
reacting to objects falling through its surface; this part of the application has been
dropped due to time constraints. Having said that, most of the necessary calculations
and data structures are present in the framework to allow this expansion to be
included easily at a later date.
       The application produced creates a skybox, a realistic terrain and a
water simulation, as well as a camera which moves along the terrain, simulating a
person walking. The skybox scrolls textures across its surface, creating the illusion of
moving clouds, and the terrain is generated from a height map and textured using
three textures blended according to a blend map.
       The water simulation is a mesh whose y co-ordinates are offset, within the vertex
shader, by a formula designed to simulate water. The water is lit as specified above and
its normals are calculated within the shader relative to the mesh's position (as it is
animated) to create more realistic lighting effects.

Controls
When the application is opened the user will find the camera positioned in the centre
of a terrain surrounded by mountains. To the left is a demonstration of the texture
blending and a small open plain; to the right is a lake with the aforementioned water
effects. The user controls the camera's orientation with the mouse and its position with
the 'w', 'a', 's' and 'd' keys, with 'w' and 's' moving forward and back, and 'a' and 'd'
moving left and right; the camera moves along fixed to the terrain. In order to view the
specular effect on the water the user will have to move around the water and observe
the different patterns of light and dark created.

General structure of code
The application is structured into header files and '.cpp' files, as well as some '.fx'
(effect) files. Most of these files are taken from the Frank Luna framework provided
with his book Introduction to 3D Game Programming with DirectX 9.0c: A Shader
Approach, which I have edited heavily in order to create my application. The functions
and contents of each header file and its associated .cpp file are outlined below.

Camera.h/.cpp
These files create the camera class within my application. They deal with creating the
projection matrix and ensuring the camera is fixed to the terrain's surface. This class is
provided with the Frank Luna framework, but has been edited slightly.
D3dApp.h/.cpp
These files create the D3DApp class, from which my main application class (see
Terrain_Water.h) derives. They deal with creating the main application window,
running the message loop, handling window messages, initialising Direct3D and
enabling full-screen mode. They also create the IDirect3DDevice9 object and set up
the framework functions (virtual functions which are overridden). This is from the
Frank Luna framework and has not been edited.
D3DUTIL.h/.cpp
These files contain utility functions used by other headers within the framework. This
includes creating global variables for convenient access and structures for materials
and lighting. These files are from the Frank Luna framework and have been slightly
edited to include a function which calculates spherical texture co-ordinates.
DirectInput.h/.cpp
These files bypass the message system and work directly with the input drivers to
allow mouse and keyboard input to be obtained quickly. This is from the Frank Luna
framework and has not been edited.
HeightMap.h/.cpp
These files allow us to load in and set variables for a .raw file; the values from this
height map are used to generate the heights of the terrain. This is also provided with
the Frank Luna framework and has not been edited.
SkyBox.h/.cpp
These files contain the "SkyDome" class, which contains the functions and variables
associated with creating, texturing, animating and rendering the skybox, as well as
returning values such as the number of vertices and triangles in the skybox's mesh.
Terrain.h/.cpp
 These files contain the class “Terrain” which deals with creating geometry for a
terrain, rendering it and texturing it.
Vertex.h/.cpp
These files deal with creating the vertex definitions used within the project and
creating the functions which initialise them.
WaterMesh.h/.cpp
These files deal with creating the "Water" class, which represents the water within
this application. The class creates a mesh of a set size, renders it and animates it.
Terrain_Water.h/.cpp
These files are the main body of my application. They contain the Terrain_WaterDemo
class, derived from D3DApp. This class contains all of the data structures and functions
used by my program to create the application, and it is where all of the previously
defined classes and functionality are tied together to allow the application to run.

Code organisation and operation

Framework
As mentioned previously, I am using the Frank Luna framework to create my project,
so I will only briefly explain its operation. The framework deals with setting up the
window and initialising the vertex definitions; it also creates the present parameters
structure, creates the IDirect3DDevice9 and gives it global access.
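
As a rough illustration of what that start-up step involves, the sketch below shows how
a Direct3D 9 device is typically created from a filled-in present parameters structure.
This is not the framework's exact code; the function name createDevice and the
particular parameter values are assumptions for illustration only.

#include <d3d9.h>

// Minimal sketch of device creation; hMainWnd is the window created by the framework.
IDirect3DDevice9* createDevice(HWND hMainWnd)
{
    IDirect3D9* d3d9 = Direct3DCreate9(D3D_SDK_VERSION);

    D3DPRESENT_PARAMETERS d3dpp = {};
    d3dpp.Windowed               = TRUE;
    d3dpp.SwapEffect             = D3DSWAPEFFECT_DISCARD;
    d3dpp.BackBufferFormat       = D3DFMT_UNKNOWN;
    d3dpp.EnableAutoDepthStencil = TRUE;             // depth/stencil surface, used later for reflections
    d3dpp.AutoDepthStencilFormat = D3DFMT_D24S8;

    IDirect3DDevice9* device = 0;
    d3d9->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hMainWnd,
                       D3DCREATE_HARDWARE_VERTEXPROCESSING, &d3dpp, &device);
    return device;
}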

Terrain
The terrain is not the main focus of my project and, as a result, I will only briefly
explain its operation. As mentioned previously, the terrain class is a heavily edited
version of the terrain generator in the Frank Luna book.
        The terrain is represented in the code by the "Terrain" class located in
Terrain.h. This class creates a triangle grid around the origin, translates it out to a
specified location and saves the output to a buffer. The class then fills the terrain's
vertex buffer with these values and sets each y co-ordinate to a value sampled from a
height map. The height map is a greyscale .raw file that represents the heights of my
terrain; its values are loaded in and represented by the HeightMap class (located in
the HeightMap.h file). The terrain class also builds the effect and sends the lighting
and textures to the effect file. The .fx file is reasonably simple and performs basic
diffuse lighting on the surface using the following line.

outVS.shade = saturate(max(0.0f, dot(normalW, gDirToSunW))) + 0.3f;

This calculates the cosine of the angle between the surface normal and a vector
pointing from the surface towards the sun; the 0.3 is the ambient component. The more
complicated aspect of this effect file is the texture blending carried out in the pixel
shader. The pixel shader samples three textures and then blends them according to a
fourth texture (the blend map) before writing the result to the pixel, i.e.

float totalInverse = 1.0f / (B.r + B.g + B.b);
// Scale the colour of each layer by the weight stored in the blend map.
c0 *= B.r * totalInverse;
c1 *= B.g * totalInverse;
c2 *= B.b * totalInverse;
// Sum the colours and modulate with the shade to brighten/darken.
float3 final = (c0 + c1 + c2) * shade;
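
Returning to the grid construction described at the start of this section, the fragment
below is only a rough sketch (with hypothetical names such as vertexHeights and
heightmap) of how each grid vertex's y co-ordinate can be set from the loaded height
map; the actual class stores full vertices rather than bare heights.

#include <vector>

// Sketch only: copy the height map samples into the y co-ordinate of every
// vertex of a numRows x numCols terrain grid.
void applyHeightMap(std::vector<float>& vertexHeights,
                    const std::vector<float>& heightmap,
                    int numRows, int numCols)
{
    for (int row = 0; row < numRows; ++row)
        for (int col = 0; col < numCols; ++col)
        {
            int i = row * numCols + col;
            vertexHeights[i] = heightmap[i];   // value loaded from the greyscale .raw file
        }
}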
Sky Box


The sky box in my scene is rendered by the "SkyDome" class. This creates a unit
sphere and renders it around the camera position. During rendering, depth testing is
turned off; this results in the rest of the scene being rendered over the sky, and the
sky representing a far plane which the player can never reach. The class also creates
texture co-ordinates for the sphere by taking into account the position of each vertex.
This class, like all of my classes, creates the geometry in the constructor, updates the
geometry in a separate function and draws it in a separate function. It also contains a
destructor to free up COM objects and functions to reset the effect file when the
device is lost.
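
As a rough illustration (not the exact class code), the sketch below shows the kind of
spherical texture co-ordinate calculation and depth-test handling described above; the
function name getSphericalUV is an assumption.

#include <d3dx9.h>
#include <math.h>

// Spherical texture co-ordinates derived from a unit-sphere vertex position.
void getSphericalUV(const D3DXVECTOR3& p, float& u, float& v)
{
    u = atan2f(p.z, p.x) / (2.0f * D3DX_PI) + 0.5f;   // longitude mapped to [0,1]
    v = acosf(p.y) / D3DX_PI;                         // latitude mapped to [0,1]
}

// In the draw call: switch depth testing off so later geometry draws over the sky.
// HR(gd3dDevice->SetRenderState(D3DRS_ZENABLE, false));
// ...draw the sky sphere centred on the camera position...
// HR(gd3dDevice->SetRenderState(D3DRS_ZENABLE, true));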
        The skybox's effect file simply animates the sky by passing two textures over
each other and blending their colour values with a blue value. The textures are moved
by offsetting the two sets of texture co-ordinates by a value that is updated each frame
and then passed into the shader. The vertex shader updates the texture co-ordinates
by adding these offset values to them. These are then passed on to the pixel shader,
where the textures are sampled and blended in the same way as before, i.e.

Vertex shader:
// Pass on texture coordinates to be interpolated in rasterization.
outVS.tex0 = tex0 + gTexOffset0;
outVS.tex1 = tex0 + gTexOffset1;


Pixel shader:
float3 c0   = tex2D(CloudS0, tex0).rgb;
float3 c1   = tex2D(CloudS1, tex1).rgb;
float3 blue = float3(0.0f, 0.0f, 1.0f);
return float4(c0 + c1 + blue, 1.0f);
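
On the CPU side, the per-frame update that feeds these offsets into the shader might
look roughly like the sketch below; the scroll speeds, the handle strings and the
function name updateCloudOffsets are assumptions rather than the actual code.

#include <d3dx9.h>

// Hypothetical per-frame update of the two cloud-layer texture offsets.
void updateCloudOffsets(ID3DXEffect* fx, float dt)
{
    static D3DXVECTOR2 offset0(0.0f, 0.0f);
    static D3DXVECTOR2 offset1(0.0f, 0.0f);

    offset0 += D3DXVECTOR2(0.011f, 0.0f) * dt;   // slow cloud layer
    offset1 += D3DXVECTOR2(0.021f, 0.0f) * dt;   // faster layer drifting over it

    HR(fx->SetValue("gTexOffset0", &offset0, sizeof(D3DXVECTOR2)));
    HR(fx->SetValue("gTexOffset1", &offset1, sizeof(D3DXVECTOR2)));
}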


Water

The water within my scene is represented by the "Water" class contained in the
"WaterMesh.h" file. This class creates a 100 x 100 mesh and then generates random
numbers within the range 0.1-0.8, which are saved to each y co-ordinate in the mesh.
These values are used later in the shader to "randomise" the wave effects. The class
also draws the water to the stencil buffer, which is used later to produce reflections
on its surface.
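
A minimal sketch of this set-up step is given below; the helper name seedWaveRandoms
and the way the values are stored are assumptions, but the idea is simply to give every
vertex a random value in the range 0.1-0.8 that the shader later reads as rand.

#include <cstdlib>

// Store a random value in the range 0.1 - 0.8 in each vertex's y co-ordinate.
void seedWaveRandoms(float* yCoords, int numVerts)
{
    for (int i = 0; i < numVerts; ++i)
    {
        float r = rand() / (float)RAND_MAX;   // 0.0 - 1.0
        yCoords[i] = 0.1f + 0.7f * r;         // 0.1 - 0.8, read as "rand" in the shader
    }
}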
The rippling of the water is again created within the vertex shader using the following
formula, with the y co-ordinate representing rand:

(rand*A)*sin((rand*K)*d - gTime*(rand*W) + (rand*P));

Here A is the amplitude of the wave, K is the wave number, W is the frequency of the
wave, P is the phase shift and d is the distance of the vertex from the start of the
wave. These variables allow me to control the shape of the wave along the mesh and
control its progression across the mesh. The rand value is a different value for every
vertex, within the range 0.1-0.8. This creates a semi-random height value for each y
co-ordinate, aiming to simulate the look and behaviour of real wave patterns. The
wave is further randomised by summing it with another wave that uses different
values for each of the constants in the formula. The values produced by this formula
are then used as the height of each vertex.
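
The sketch below (ordinary C++ rather than the shader itself, with purely illustrative
constants) shows how the two waves are summed; the per-vertex rand value scales
each constant exactly as in the formula above.

#include <math.h>

// Illustrative constants for the two summed waves (not the values used in the shader).
const float A1 = 0.8f, K1 = 1.0f, W1 = 1.5f, P1 = 0.0f;
const float A2 = 0.4f, K2 = 2.3f, W2 = 2.8f, P2 = 1.2f;

// Height of a vertex at distance d from the start of the wave at time gTime.
float waveHeight(float rand, float d, float gTime)
{
    float w1 = (rand*A1) * sinf((rand*K1)*d - gTime*(rand*W1) + (rand*P1));
    float w2 = (rand*A2) * sinf((rand*K2)*d - gTime*(rand*W2) + (rand*P2));
    return w1 + w2;   // summing the two waves breaks up the regular pattern
}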
       Like the terrain, the lighting of the water is calculated within the shader. Since
the shape of the mesh changes every frame, the normals are calculated within the
shader by taking the partial derivatives of the formulas that define the shape of the
mesh with respect to x and z. Since we sum the two waves together, the derivative of
the resulting wave is the sum of the two constituent waves' derivatives, i.e.

// for each wave
dx = (rand*A)*(rand*K)*x*cos((rand*K)*d - gTime*(rand*W) + (rand*P))/d;
dz = (rand*A)*(rand*K)*z*cos((rand*K)*d - gTime*(rand*W) + (rand*P))/d;


 This creates two tangent vectors which are then crossed to generate a normal at that
point on the mesh. I could just work out a tangent in the x direction and cross it with a
vector parallel to the z axis, but I have kept working out the z tangent because the
original design involved circular waves created by an object passing through the
water's surface. Since circular waves spread in every direction, I would need to
generate a second tangent, as the z axis would no longer be a tangent. This has
been left in to allow for future expansion.
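
In C++ terms (the actual calculation happens in the vertex shader, and the function
name here is illustrative), crossing the two tangents to obtain the normal looks roughly
like this, where dx and dz are the partial derivatives computed above:

#include <d3dx9.h>

// Build a surface normal from the two partial derivatives of the height function.
D3DXVECTOR3 surfaceNormal(float dx, float dz)
{
    D3DXVECTOR3 tangentX(1.0f, dx, 0.0f);     // tangent along the x direction
    D3DXVECTOR3 tangentZ(0.0f, dz, 1.0f);     // tangent along the z direction

    D3DXVECTOR3 n;
    D3DXVec3Cross(&n, &tangentZ, &tangentX);  // gives (-dx, 1, -dz), pointing up out of the water
    D3DXVec3Normalize(&n, &n);
    return n;
}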
        Now that we have the normals, the shader file calculates the amount of each
type of light the water reflects, i.e. specular, diffuse and ambient. The specular
component is calculated by first finding the reflection vector, by substituting the light
direction into the HLSL function reflect(). The dot product between this vector and the
to-camera vector is then calculated. This is raised to a specular power, which defines
the cone of reflectance ("shininess") of the material.

      float t    = pow(max(dot(r, toEye), 0.0f), gSpecularPower);

 This value is then multiplied by the product of the specular material and the specular
light component.
       The diffuse lighting is calculated using the same formula used to calculate
diffuse lighting for the terrain, although the water has a diffuse material specified in
the water class. Each type of lighting has a material associated with it; a material is an
RGB value that represents the colour of each type of light the surface reflects. This
material is passed from the class, along with the diffuse component of the light
source, into the vertex shader, which multiplies them together and substitutes the
result into the lighting calculation, i.e.

float3 diffuse = s*(gDiffuseMtrl*gDiffuseLight).rgb;


The specular and ambient values are calculated in the same way within the vertex
shader, i.e.

float3 spec = t*(gSpecularMtrl*gSpecularLight).rgb;
float3 ambient = gAmbientMtrl*gAmbientLight;


Reflections
The reflections in this application are handled on the CPU by first computing a
reflection matrix using the DirectX library function D3DXMatrixReflect, which produces
a matrix that reflects geometry in a plane (in our case the plane with normal
(0.0f, 1.0f, 0.0f)). This is calculated within the constructor of my application, as it only
needs to be calculated once and is not updated each frame. The reflection matrix is
then passed into the Terrain class function Terrain::drawReflected(), which draws an
inverted terrain.
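
A minimal sketch of this one-off set-up (variable names are assumptions) is:

#include <d3dx9.h>

// Computed once in the application's constructor: a matrix that reflects geometry
// about the y = 0 plane, whose normal is (0, 1, 0).
void buildReflectionMatrix(D3DXMATRIX& mReflect)
{
    D3DXPLANE waterPlane(0.0f, 1.0f, 0.0f, 0.0f);   // plane normal (a, b, c) and distance d
    D3DXMatrixReflect(&mReflect, &waterPlane);
    // At draw time the object's world matrix is combined with mReflect
    // (W * mReflect in D3DX's row-vector convention) before the reflected terrain is drawn.
}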
 How, then, is the inverted terrain only drawn on the reflective surface of the water? In
order to ensure that the inverted terrain is only drawn to the reflective surface of the
water, I perform a stencil test using the stencil buffer. In the drawScene function of the
Terrain_Water class you may notice that the water is also drawn to the stencil buffer.
This sets the positions in the stencil buffer that correspond to the water's location in
the scene to 1; then, when drawing the reflected terrain, I perform a stencil test that
only allows the terrain to be drawn if the value in the buffer is equal to 1. A second
problem encountered when the reflected terrain is rendered is double blending:
triangles hidden behind other triangles are blended twice at the same location,
resulting in darker patches in the reflection. I solved this problem by setting the
following render state.

HR(gd3dDevice->SetRenderState(D3DRS_STENCILPASS,             D3DSTENCILOP_ZERO));


This sets the stencil value at the location currently being tested to zero if the stencil
test passes. This means that the same location cannot be written to twice, alleviating
the problem of double blending.
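
Putting the whole stencil approach together, the render-state settings described in this
section look roughly like the sketch below; the reference value and exact ordering are
illustrative rather than the application's exact code.

// 1) While drawing the water mesh: write 1 into the stencil buffer wherever it covers.
HR(gd3dDevice->SetRenderState(D3DRS_STENCILENABLE, true));
HR(gd3dDevice->SetRenderState(D3DRS_STENCILFUNC,   D3DCMP_ALWAYS));
HR(gd3dDevice->SetRenderState(D3DRS_STENCILREF,    0x1));
HR(gd3dDevice->SetRenderState(D3DRS_STENCILPASS,   D3DSTENCILOP_REPLACE));

// 2) While drawing the reflected terrain: only draw where the stencil value equals 1,
//    and zero the entry when the test passes so the same pixel cannot blend twice.
HR(gd3dDevice->SetRenderState(D3DRS_STENCILFUNC,   D3DCMP_EQUAL));
HR(gd3dDevice->SetRenderState(D3DRS_STENCILREF,    0x1));
HR(gd3dDevice->SetRenderState(D3DRS_STENCILPASS,   D3DSTENCILOP_ZERO));

// 3) Afterwards: turn stencil testing back off.
HR(gd3dDevice->SetRenderState(D3DRS_STENCILENABLE, false));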

Appraisal

The application created meets most of the objectives that were set at its beginning.
Despite this, some functionality was still dropped, namely the interaction with objects
dropped through the water. Given more time this effect could easily be included, as
the code already has some of the functionality necessary to create it, such as working
out two tangents relative to the x and z co-ordinates.
         The application is also designed in an object-oriented manner, which makes
further expansion simple as it is written with as little interdependence as possible.
One of the main objectives of my project was to try to create an application in which
adding new functionality would not require other objects to be changed as well. A
second aim was to create modules that could be included in other applications,
possibly with different rendering systems, without much editing. For example, the sky
box is included in separate files and any variables it needs are passed in via
parameters, making it self-contained and, as a result, easy to include in a separate
project.
       The project runs at an adequate frame rate, allowing the user to move about
the scene without slow-down. Having said this, if any further functionality were added
it would probably slow down significantly. The frame rate of the application could be
improved if all of the shaders were included in a single .fx file with multiple passes for
each of the different objects in the scene, because a different effect file would not
have to be "sent" to the GPU every frame depending on the object being rendered.
The terrain could also be rendered as a collection of sub-grids, culling sections which
are not within view, as opposed to one giant mesh constantly rendered. I could also
write an HLSL function to generate a random number, or use the existing noise()
function, instead of inserting random values into the water mesh's y co-ordinates.
       Aesthetically, the look of the application could be improved by loading in more
textures for the terrain and blending them according to the height of the terrain. One
could also ensure that the camera can only walk up a slope if its angle is less than a
certain value, so that the player cannot walk up steep mountains, or make sure that
the player cannot go under the water mesh.
       When creating the application, several strategies were considered for animating
the water. One was just to give the mesh random heights, but this resulted in a messy
mesh with no resemblance to water; it also made it almost impossible to calculate
normals which accurately changed with the mesh. The second idea was summed sine
waves progressing along the x axis, but this created a mesh with a uniform structure
that did not resemble the actual behaviour of water. This idea was also implemented
on the CPU, which greatly affected performance. As a result the calculation was
moved to the shader, where a combination of the two previous models was used:
summing sine waves but giving each vertex a random number (within a set range)
with which to multiply values like the phase and amplitude of the wave. This resulted
in a nicer-looking animated mesh for which normals could also be calculated.
       The terrain could have been generated procedurally using many different
methods, and at the beginning of this project many were considered. They were
ultimately discarded as they gave the programmer very little control over the shape of
the terrain. For this project a certain degree of control was needed to create
mountains and a lake. As a result I decided to use a height map, as this allowed me to
create the necessary geographical structures.

Conclusion

During the course of creating this coursework I have learnt a lot about DirectX and
how to create a framework that would allow me to develop with DirectX 9.0c. I learned
how to set up a project, define my own vertex structure and program an entire 3D
application from the ground up, using the DirectX rendering system.
        Beyond this, I have learnt most about shaders and how they are a very
powerful graphics tool that allows you not only to free up some computing time on the
CPU but also to turn an otherwise simple scene into a complexly lit and animated
environment.
       Programming for DirectX has also greatly improved the structure and
organisation of my code. This is mainly because of the complexity necessary to carry
out simple tasks such as rendering a mesh. There is a lot of housekeeping that needs
to be carried out in a DirectX application, such as releasing COM interfaces and
handling lost and reset devices. This makes code organisation vital: as the application
grows in complexity, well-designed data structures and classes can make debugging
and understanding code much simpler.
       Since this project also had a set time limit, I learnt a lot about time management
and how vital planning is to the development process. More effective time
management would have allowed me to create a better application and probably
include some of the features that were dropped from the original design.
       Overall, I believe that I have created an application that meets most of the
original design goals and runs fast enough to be included in a larger real-time 3D
application.

References

Luna, Frank D. (2006) Introduction To 3D Game Programming With DirectX 9.0c: A
Shader Approach, Wordware Publishing.
