WATERFALL SIMULATION USING PARTICLES
ADVANCED COMPUTER GRAPHICS PROGRAMMING, PROJECT REPORT
Ahmed Tausif Aijazi – ahmai719
Tuesday, August 25, 2009
Fluid simulation is an increasingly popular tool in computer graphics for generating realistic animations of water, smoke, explosions, and related phenomena. These fluids can add substantial realism to interactive applications such as computer games. In this project I have restricted myself to simulating water; however, the same technique can be used to simulate wind and smoke.
The aim of this project was to become familiar with the OpenGL programming environment and to develop an understanding of high-performance, real-time computer graphics. I use my own particle system to simulate a simple waterfall, and inter-particle forces are taken into account in the simulation. Collision detection between these particles and other objects in the scene is also implemented. Additionally, a sky with moving clouds was created to enhance the look of the scene.
1 – Introduction

Fluids (i.e. liquids and gases) play an important role in everyday life. Examples of fluid phenomena are wind, weather, ocean waves, waves induced by ships, or simply the pouring of a glass of water. As simple and ordinary as these phenomena may seem, they are equally complex and difficult to simulate in computer-generated scenes.

The reason for the complexity of fluid behavior is the intricate interplay of various phenomena such as convection, diffusion, turbulence and surface tension. Fluid phenomena are typically simulated off-line and then visualized in a second step, e.g. in aerodynamics or in the optimization of turbines or pipes, with the goal of being as accurate as possible. Less accurate methods that allow the simulation of fluid effects in real time open up a variety of new applications such as medical simulators, computer games or any other type of virtual environment.

2 – Background and previous work

Computational Fluid Dynamics has a long history. In 1822 Claude Navier and in 1845 George Stokes formulated the famous Navier-Stokes equations that describe the dynamics of fluids. These equations are essentially the fluid-dynamic equivalent of Newton's second law, force equals mass times acceleration. With today's computational power, a large number of methods have been proposed to solve these equations in order to simulate fluids, but they are quite expensive in terms of computational power.

For about two decades, special-purpose fluid simulation techniques have been developed in the field of computer graphics. In 1983 W. T. Reeves introduced particle systems as a technique for modeling a class of fuzzy objects. Since then, particle-based approaches have been used to simulate fluids in computer graphics.

Many pieces of software and projects have used the particle-system technique to simulate fluids, and a few of them can be found on the internet. One such project was done by Fengdi Yang at the University of Saskatchewan, Canada, who simulated smoke using particle systems. Another can be found on the website of Bonzai Software, which also simulates smoke. Many others exist, but these two are quite nice, and I have used some of the basic techniques described by them to simulate the whole scene.
Figure 1: Sequence of screenshots showing the waterfall, particle-sphere collision and sphere-sphere collision
3 – Overview

This project implements a waterfall simulation using OpenGL, GLSL and the particle-system technique. The connection between OpenGL and the window system is made with the utility toolkit GLUT. The project is principally aimed at the rendering technique together with the complex physical processes that occur in the real world. Not all physical laws are incorporated, but enough to produce satisfactory results. A few of the elements that affect the motion of the particles are calculated from pre-defined functions chosen for the visual quality of the displayed image. Key techniques involved in the algorithm are texture mapping, image blending and the implementation of physical laws in the computer-generated scene.

To make the behavior of the particles as realistic as possible, collision detection is also implemented. To demonstrate this collision detection, user interaction was added: the user can throw a sphere into the scene to interact with the water particles. Much emphasis has been placed on making this interaction as close to physical reality as possible.

To further improve the look of the scene and give it an outdoor feel, a sky with moving clouds was created. The sky is rendered with shaders written in the GLSL shading language. For a simple simulation, the results are fairly satisfactory.

4 – Implementation

This project is implemented in the object-oriented programming language C++ using OpenGL. The windowing system is handled by the utility toolkit GLUT. The project also uses the OpenGL shading language to bypass the fixed functionality of the graphics pipeline.

4.1 – Collision Detection

In this project, detecting collisions was fairly difficult, as there are many particles in the scene and collisions have to be detected for every particle in every time step. There are some very general algorithms for this, which usually work with any kind of object, but they are expensive. I have used an algorithm that works fast for this application, is easy to understand and is, to some extent, also flexible. Furthermore, importance is given to what to do once a collision is detected and how to move the objects in accordance with the laws of physics.

4.1.1 – The Algorithm

For the collision detection I used the algorithm that is commonly used in ray tracing. A ray is defined by the equation

PointOnRay(x,y,z) = RayStart(x,y,z) + t × RayDirection(x,y,z)

PointOnRay, RayStart and RayDirection are 3D vectors with components (x,y,z), and t is a float that takes values from 0 to infinity. With t = 0 we get the start point, and by substituting other values we get the corresponding points along the ray. We can now use this ray representation to calculate intersections.

To determine the exact collision point we take the start position of the ray as the position of the particle and the direction of the ray as its velocity. This needs to be determined because from that point the velocity (speed and direction) of the object will change. In this project the water particles fall down onto the ground, which is a plane, and to define a plane we just need a 3D point and a normal from that point which is perpendicular to the plane.

For example, if we take the vector (0,0,0) for the 3D point and (0,1,0) for the normal, we essentially define a plane across the x and z axes. A point and a normal are therefore enough to compute the vector representation of a plane, which is given by

NormalOfPlane(x,y,z) • PointOnPlane(x,y,z) = Distance

NormalOfPlane is the normal of the (ground) plane, PointOnPlane is a 3D point from which the normal originates, and Distance is a float representing the distance of the plane along the normal from the origin of the coordinate system.

If a ray (particle) intersects the plane at some point, then there must be some point on the ray that satisfies the plane equation, and the equation above becomes

NormalOfPlane(x,y,z) • PointOnRay(x,y,z) = Distance

Substituting the value of PointOnRay from the ray equation we get

NormalOfPlane(x,y,z) • (RayStart(x,y,z) + t × RayDirection(x,y,z)) = Distance

Here t represents the distance from the start to the intersection point along the direction of the ray. Solving for t we get

t = NormalOfPlane • (PointOnPlane − RayStart) / (NormalOfPlane • RayDirection)

Now by substituting t back into the ray equation we can get the collision point. There are a few special cases, though. If NormalOfPlane • RayDirection = 0, the two vectors are perpendicular (the ray runs parallel to the plane) and there will be no collision. If t is negative, the collision takes place behind the starting point of the ray, along the opposite direction, and again there is no collision.

Using the same technique for spheres colliding with a plane or with other spheres is quite easy, with a little trick. Each sphere has a radius: take the center of the sphere as the particle and offset the surface of the colliding plane along its normal by that radius. As shown in Figure 2A, the collision is then detected at the dotted line instead of the original surface, which is represented by the continuous line. This is done to stop the ball from penetrating the plane or the other sphere. Otherwise we get a situation as in Figure 2B, where the sphere penetrates the surface.
Figure 2: (A) Collision of a sphere with a plane or another sphere is detected at the dotted line instead of the
original surface (B) Sphere penetrating the surface as the detection is done at the actual surface.
4.1.2 – Collision Response

Finding the point of collision is not enough; it is also important to determine how to respond after hitting the static (ground) plane, another sphere or other particles. To calculate an accurate response to a collision, the laws of physics have to be applied. When an object collides with a surface, its direction and speed change, i.e. it bounces off. The new direction (the reflection vector) makes the same angle with the normal at the collision point as the original direction vector. Figure 3 shows a collision of a sphere with a plane.

The reflection vector R should have the speed and direction of the reflected ray. Its direction is that of the incident ray mirrored about the normal, and the magnitude of the speed is supposed to decrease a bit, depending on the friction coefficient. In my case friction is neglected, so the speed remains the same. The same applies to the collision of a sphere with any object, including the water particles. This holds when a sphere strikes a static object; when the sphere strikes another moving sphere, the situation becomes a bit more complex: the velocity vectors after the collision are calculated for both spheres, and these vectors depend on the angle at which the spheres strike each other.

Figure 3: Collision of a sphere with a plane

4.2 – Simulating the waterfall

Fluids allow the simulation of liquids and gases using a particle system. A particle system is composed of one or more individual particles. Each of these particles has attributes that directly or indirectly affect its behavior, and ultimately how and where the particle is rendered. Often, particles are graphical primitives such as points, lines or even polygons. In this project I have simply used quads. The other common characteristic of all particle systems is the introduction of some type of random element. This random element can be used to control particle attributes such as position, velocity and color.

In this project, fluid particles are introduced into the scene by a waterfall; over a period of time they move within the system and then die from it. To compute each time step in a motion sequence, the following steps are performed:

• New particles are generated into the system
• Each new particle is assigned its individual attributes
• Any particles that have existed within the system past their prescribed lifetime are destroyed
• The remaining particles are moved and transformed according to their dynamic attributes
• Finally, an image of the living particles is rendered into a frame buffer

4.2.1 – Creating the fluid

The fluid is created by generating a set of particles in each time step. For this generation a controlled process is adopted which determines the number of particles entering the system during each frame. This number is important because it strongly influences the density of the fluid. The generated particles enter the system through a generation shape called an emitter, which defines a region around its origin into which newly born particles are randomly placed. Fluid emitters provide an alternative to directly creating fluid particles; they allow the fluid to fall with certain properties. I have implemented two basic modes of emission: constant pressure and constant flow rate. In the first case the water comes out under constant pressure, i.e. constant acceleration (the speed increases in every time step), while in the second the emitter keeps emitting the same number of particles in each time step. Each particle generated through the emitter has a set of attributes which affect its individual behavior. For each new particle the following attributes are stored:

• position – defines the three-dimensional position of the particle
• velocity – defines the speed of the particle in the direction of the vector field of the fluid
• accumulatedBodyForces – the result of all the combined forces acting on the particle, except those of other particles
• density – defines the density of the fluid that the particle belongs to
• radiusMultiplier – the radius of the sphere of influence for particle interaction
• viscosity – defines the fluid's viscous behavior
• lifeTime – the lifetime of the particle in seconds; defines after how many seconds the particle should fade from the scene

I have defined two classes for creating the fluid: FluidDesc, which contains the full description of the fluid, and MyFluid, which takes an object of the description class as input and creates the actual fluid. The MyFluid::update() method is used to move the particles in every frame.

4.2.2 – Fluid dynamics

In this project, inter-particle forces are considered while simulating the waterfall. This is quite expensive, as a number of force fields are associated with each particle. Influenced by these forces, the individual particles of the fluid move in three-dimensional space. Moving a particle from one time step to the next simply means adding its velocity vector and its accumulatedBodyForces vector to its position vector:

P_new = P_old + V + F_acc

For each particle, accumulatedBodyForces is calculated by adding all the external forces and the forces due to the neighboring particles that fall within the sphere of radius radiusMultiplier:

F_acc = F_external + F_neighbors

The density defines how much impact the neighboring particles have on an individual particle. It basically defines a force of attraction directed towards the center of the particle: the higher the defined density, the higher the force of attraction and the closer the neighboring particles will be.

F_neighbors = d (F_n1 + F_n2 + F_n3 + ...)

where d is the density and F_n1, F_n2, ... are the forces due to the neighboring particles within the sphere of radius radiusMultiplier. The velocity of each particle is also influenced by the viscosity defined for that particle, which describes the amount of resistance in the flow: the higher the viscosity, the smaller the change in velocity. The velocity term is therefore divided by the viscosity for each particle:

P_new = P_old + V/v + F_acc

When a particle is generated it is given a lifetime. As each frame is computed, this lifetime is decremented. A particle is killed when its lifetime reaches zero.

4.2.3 – Rendering the fluid

Once the position and appearance parameters of all particles have been calculated for a frame, the rendering algorithm makes a picture. The water particles in this project are rendered using the simple technique of textured billboards. Billboards are simple polygons that always face the camera. The MyFluid::draw() method is used to display the particles in the scene. First the coordinates of the billboard are computed, taking into account the defined particle size and the current camera position. Then simple textured GL_QUADS are drawn in such a way that the viewer cannot recognize the edges of the squares. One problem with particles is that they can obscure other particles behind them in screen depth. This is solved with the help of blending and the alpha channel. Blending in OpenGL is switched on and off and configured with a few predefined functions and values:

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDisable(GL_BLEND);

Another advantage of using blending is that when two or more particles overlap, their combined effect is also visible. With blending turned on, the alpha value defines how much the overlapping texture squares are mixed with each other.

4.3 – Making the sky

A good sky was certainly important to improve the feel and look of the waterfall. To make it more spectacular, moving clouds were added to this sky, with a changeable cloud cover to simulate the weather or any other special event.

4.3.1 – About the real sky

To make the sky as realistic as possible, the first thing I did was go outside and look at the real sky, and make a few observations about how real clouds actually behave, what effect the sun's light has on them, and so on. A few clues are also given in the references. All of these can be seen when you go out and look at the real sky:

• The actual sky is curved in shape.
• The sun is a very bright and large object, with a noticeable halo that surrounds it.
• The clouds are illuminated by the sun, and their brightness depends on the distance from the sun.
• The clouds are usually moving across the sky, and occasionally block out the sun.
• The clouds also change shape as they move across the sky.
• The appearance of the clouds also depends on the distance from the viewer; the furthest clouds are often obscured by haze.
• The sky color near the horizon changes due to scattering and haze.
• Some areas of the sky are completely free of clouds.
All of these things happen in a real-life cloudscape, and lots more besides. The more of them we can simulate, the more realistic our clouds will look.

4.3.2 – Putting the clouds on the sky

To model the curved shape of the sky, I made a very large sphere with a radius of 10,000 and placed all the remaining geometry not at the center of the sphere, as this would not be realistic, but a bit above the center. For the clouds I took a small square cloud texture and laid it over the sphere to represent the curvature of the earth. This is shown in Figure 4.

Figure 4: Putting the clouds on the sky (image courtesy of Hugo Elias)

4.3.3 – The sun

In this project the main source of light is the sun, which is positioned at a certain point on the sky hemisphere. This is simply light 0 of OpenGL. To render it as a realistic sun there are two options. The first is to just use a billboard, perhaps with blended halo and flare textures. This would certainly give a good result, but I wanted to do something better and faster. What I did was take the scalar product of the sun position with the normal of the sky vertex being rendered. The result is maximal when the normal and the sun position coincide, that is, at the center of the sun, and it gradually falls off as we diverge from it. I selected how this fall-off decreases simply by using a power function. This has two added advantages: first, a very realistic halo is rendered, and second, the sun's diameter changes depending on its distance from the horizon, simulating light scattering.

This is done in the GLSL shader language. Note that there is no need to pass any normal to the shader, as for a hemisphere the normal is nothing but the normalized position of the vertex. This position is stored in a variable passed from the vertex shader to the fragment shader.

4.3.4 – Creating the clouds

For creating the clouds I used a texture created with Perlin noise, which is great for cloudy textures. This noise can produce very realistic results, but the problem is that it is far too slow to be used directly in a real-time graphics application like this one. Therefore I look up two texture maps: one for the bump mapping, to give volume to the clouds (and to animate their shape), and another to move the clouds across the sky. Both textures are shown in Figure 5.

The animation is done using the noise texture, which is sampled at suitable intervals. Texture coordinates for the look-up are computed in the vertex shader, but most of the work is performed in the fragment shader. A few parameters, such as the size of the texture, the simulation time and the sharpness, are passed from the main program to the shader program.

Lighting is computed very simply with the diffuse equation, using the normal from the normal map. Before using this normal in the lighting equation, one thing to remember is that it comes from a bump map, which is in tangent space, while the light position is not; so the light position has to be transformed into tangent space. I could also have transformed the normal into the same coordinate system as the light, but since I am not using any tangents or bi-normals, I simply assume that the sky is a flat plane, so that tangent = (1, 0, 0), bi-normal = (0, -1, 0) and normal = (0, 0, 1). The product of the tangent matrix and the light vector is then simply the light vector with the z and y coordinates swapped and the sign of the y coordinate changed:

//Transform light vector to tangent space
light = normalize(vec3(lpos.x, lpos.z, lpos.y*2.0));
light.y = -light.y;

Figure 5: (Left) Texture used to form the clouds. (Right) Bump map of the texture.
But the sky is actually a hemisphere, not a plane, so I have cheated a bit and artificially boosted the height of the sun to make this work reasonably well. If the sun were moved close to the horizon this cheat would break down, but since in my case the sun remains stationary, it works for this application. Finally, the cloud density is computed by the following pseudo code, as given in the references:

c = v - CloudCover
if c < 0 then c = 0
CloudDensity = 255 - ((CloudSharpness^c) * 255)
return CloudDensity
end function

Then everything is blended together; note that the sunlight is multiplied by the cloud cover parameter. This allows us to reduce the sun image when the sky is well covered by clouds or when one of the clouds passes over the sun.

skyColor = mix( skyColor, white * cloudDiffuse, cloudDensity );
gl_FragColor = skyColor + cloudCover * sunlight;

4.3.5 – Scattering and haze

Scattering is a process in which some form of radiation (such as light) is deviated from a straight path. In my case the scattering is made proportional to the height of the atmosphere surrounding the camera's y-position. The scattering is minimal just above the camera and gradually becomes thicker until it reaches its maximum at the horizon. Scattering causes the light to diverge through multiple paths, thereby increasing the perceived luminosity. This is achieved by making the sky darker at the zenith and gradually lighter until the haze kicks in at the horizon.

Haze is similar to scattering, but is caused by dust or smoke, which not only scatters the light but also obscures the clarity of the sky. We can simply simulate scattering by linearly interpolating between two colors, with the interpolation based on the height on our hemisphere:

vec4 skyColor = mix( blue1, blue2, 0.1*pow(11.0, length(norm.xz)) - 0.1 );

The two colors are blue1 (cerulean), which we take as our sky color at the zenith, and blue2 (baby blue), which we take as our sky color at the horizon. For the height, I simply use the normal. A "horizontal" normal lies in the x-z plane, where its projection has a length of one; a "vertical" normal instead has a projection on the x-z plane of length zero. Again, I use a power function here, since I want a different spread than a linear one.

For points close to the horizon, the length of the normal's x-z projection is close to one; looking at the code above, 11^1 is 11 and 0.1*11 - 0.1 is 1, so blue2 is taken. For points close to the zenith, the length is close to zero, so 11^0 is 1 and 0.1*1 - 0.1 is zero, and blue1 is taken.

For the haze I have used something similar. This is what I have in my shader:

vec4 grey = vec4( 0.5, 0.5, 0.5, 0.05 );

//Add a horizon haze
float haze = pow( gl_FragCoord.z * (1.0-
haze = pow(2.71828, -10.0*haze);

skyColor = mix( grey, skyColor, haze );

This is basically a per-pixel fog, but the exponent again uses another power function, basically because I want to confine the haze zone to very close to the horizon.

4.3.6 – Putting it all together

There are a couple of things going on at every time step. Firstly, the cloud map must be animated. Animating the clouds was not difficult: all I did was animate the original noise-map texture. The simulation time is passed to the shader from the main program and the noise texture is mapped on the basis of that simulation time. Then the blue sky is drawn along with the sun glow and the cloud map, and they are all combined to create the final image.

5 – Results

After implementing the components of this project, they were integrated to form the output shown in Figure 6. The waterfall shown in the figure can have a maximum of 65535 particles in each time step. The first image (a) shows the waterfall generated with particles. For the second image (b), three spheres are thrown into the scene to interact with the water particles. For the first image, the animation runs quite stably at 19 frames per second on a 2.4 GHz Core 2 Duo PC with a GeForce 9200GS graphics card, but for the second image the frame rate drops to 11 fps, as there are now additional objects in the scene. Another thing to note is that when the application starts the frame rate is around 50 fps, but as time passes the number of particles in the scene increases, which causes the frame rate to decrease drastically.

The two images shown in Figure 7 demonstrate the clouds in the sky. Through mouse motion, the user can move around and look at different segments of the sky. The haze and scattering near the horizon can be seen clearly. The sun can also be seen, with the gradual fall-off in its light as we diverge from its center.

6 – Conclusion and further work

After doing this project it was learnt that a particle-based method for simulating fluids can give better results in terms of physical behavior. It was also concluded that it is much easier for particle systems to model objects that can explode, flow, splatter, puff up and billow. These kinds of dynamics are fairly difficult to produce with surface-based representations. The most important aspect of particle
systems is that they move, as good dynamics are quite often the key to making objects look real.

The results of this project are not as photorealistic as they could be; much better rendering could be done. Since the goal was to work with the physics of the particles and to make the fluid dynamics as real as possible, less focus was placed on the rendering part. While this project has covered the physical model, tracking and rendering the fluid surface in real time certainly remains an open research area.

7 – References

• Scalable real-time animation of rivers, Qizhi Yu, Fabrice Neyret, Eric Bruneton and Nicolas Holzschuch, EUROGRAPHICS 2009
• Particle-Based Fluid Simulation for Interactive Applications, Matthias Müller, David Charypar and Markus Gross, Eurographics/SIGGRAPH Symposium on Computer Animation (2003)
• Particle-Based Fluid-Fluid Interaction, Matthias Müller, Barbara Solenthaler, Richard Keiser, Markus Gross, Eurographics / ACM SIGGRAPH Symposium on Computer Animation (2005)
• Cloud Cover, Hugo Elias, freespace.virgin.net/hugo.elias/models/m_clouds.htm
• Procedural sky, C. Tirabassi, norsetto.890m.com/sky_tut.php
• Volumetric Smoke, Bonzai Software, www.bonzaisoftware.com/volsmoke.html
• Wikipedia, The free encyclopedia
• Particle systems — a technique for modeling a class of fuzzy objects, W. T. Reeves, 1983
• Smoke Simulation, Fengdi Yang, www.cs.usask.ca/grads/fey399/ProjectReport829.htm
Figure 6: (A) (Left) A waterfall generated with particles, running at 19 fps. (B) (Right) User interaction with the water particles, running at 11 fps.
Figure 7: (A) (Left) Clouds in the sky, with haze near the horizon. (B) (Right) The sun, with the gradual decrease in its light as we move away from its center.