Networking: Not Working
The Producer's Tale
We began by deciding what type of game we wanted to make. This comprised
several brainstorming sessions between Jemal Armstrong (JA), Doug DaSilva (DD),
Kevin Neece (KN), and myself. We had just finished developing Trick Shot Golf (TSG),
and were in favor of making a game that would be a true showpiece. We had tossed
around a few ideas: a racing game, a puzzle game, a space shooter, an updated version of
TSG, and a multiplayer scavenger hunt game.
A good deal of discussion centered around developing the concepts of these ideas.
We then individually rated each game idea by difficulty in implementation and how fun
each game was perceived to be. At the end, we had narrowed it down to the puzzle game,
the golf game, and the scavenger hunt game. We were looking to expand our team, and
had approached the Drop Drop team about a merger. They sat in during one of our team
meetings and we pitched our game ideas to them. The Drop Drop team then consisted of
Jackson Dunstan (JD) and Eric Smith (ES). They informed us that the game idea they
saw the most promise in was the scavenger hunt game. They also made suggestions about
the design of the game, primarily in moving it from an indoor setting to an outdoor one.
We decided to join forces and pursue the scavenger hunt design.
Early on in development, our class was told that we could enlist the help of a team
of art students from the 3D art program at DigiPen. We put together a presentation, which
included concept art and descriptions of how we envisioned the game to look. One thing
that we vastly underestimated was our polygon budget. We took an extremely
conservative approach, and this ultimately was heavily revised during negotiations with
the art team. The art team consisted of J. Ryan Hammond (RH), Shane McIntire (SM),
and Colin Turner (CT). They proved to be a very competent team, but we faced many challenges along the way.
We decided to use a 3D Studio MAX (3DS) plug-in called Flexporter to convert
the 3DS file format to a simple text format. This imposed a serious limitation on what we
could do with the animations. It was still the right choice, as it did some nice things with
consolidating the meshes, and organizing them in a very efficient manner. It also seemed
the only option since we wished to do animation based on quaternions. The other formats
we looked at only seemed to offer animations as mesh definitions, which is less efficient
and less useful.
A serious effort was made to optimize the game late in the development cycle. JD
and ES worked together to reduce the memory usage of the game, while I took charge of
optimizing the mesh drawing. The memory usage was reduced by nearly half thanks to
the efforts of JD and ES, and JD also added in the texture compression extension of
OpenGL. This further reduced the memory usage by a solid thirty percent. The efforts I
made to improve the rendering speed doubled our frame rate. The techniques employed
were OpenGL display lists, and preprocessing the mesh definitions into triangle strips.
The triangle strips were produced by another program written by the maker of
Flexporter, Pierre Terdiman.
The combination of all of these optimization techniques made our game very
playable on our minimum-requirements machine.
What went right…
1. The networking portion of our game came together in record time.
We had determined early on in development that the networking feature was the
single biggest contributor to the fun of Scavenger Hunt. Since development on the
networking engine was stalled the previous semester, an extra effort was devoted to
ensuring a clean and complete engine would be ready very early in the semester.
2. The game is playable on a minimum-system-requirements computer.
As the number of art assets in the game began to grow, the graphics engine
taxed even the super-fast machines in our student lab. A serious effort was undertaken to
bring the game down to a level where it would play on a system meeting the original
minimum system requirements specification.
What went wrong…
1. Unrealistically low estimates of graphics engine capabilities
Our early estimates on the limits of the graphics engine were unrealistically low,
and we nearly lost the art team because of it. We had set a polygon limit of 100-150 for
an animated character. The art team decided that this was a tenth of what they needed to
make professional character models. This number was revised to 1500 polygons for the
characters, with a low-detail version at 800 polygons. We also heavily revised the
texture-size limits at the pleading of our level design team.
A.I. for Scavenger Hunt
A Tale of Good Times and Great Frustration by the Technical Director
The approach to the A.I. for Scavenger Hunt was based around the idea of giving
each of the computer players a unique personality that would determine how often they
would make mistakes, how often they would try to pull “gags” on other players, etc. The
computer players were set up to interact with the game, also, by use of a “virtual
controller” which basically just means that each A.I. player can only move about in the
game by sending the same input signals to the game that players send through the
keyboard and mouse.
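The "virtual controller" idea can be sketched roughly as follows. This is an illustrative reconstruction, not the actual Scavenger Hunt code; the Command names and class interface are invented for the example.

```cpp
#include <queue>
#include <cassert>

// Hypothetical sketch of the "virtual controller": both human input and AI
// decisions are funneled through the same command queue, so an AI player can
// only do what a player at the keyboard could do.
enum class Command { MoveForward, TurnLeft, TurnRight, PickUpItem };

class VirtualController {
public:
    void press(Command c) { queue_.push(c); }  // AI "presses a key"
    bool poll(Command& out) {                  // game logic drains the input
        if (queue_.empty()) return false;
        out = queue_.front();
        queue_.pop();
        return true;
    }
private:
    std::queue<Command> queue_;
};
```

The game loop would drain every player's controller the same way, whether the commands were queued by keyboard and mouse events or by AI logic, which is what keeps the AI on equal footing with human players.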
Each A.I. player would also be given a certain visible range and a memory system
that allows the computer player to “memorize” a set number of items they need from
their scavenger list. Each item in their memory is given a timer, and when it expires, the
item is removed from their memory as if they had forgotten it.
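A minimal sketch of such a forgetting memory, with invented names and assuming the update runs once per game-loop iteration with the elapsed frame time:

```cpp
#include <vector>
#include <algorithm>
#include <cstddef>
#include <cassert>

// Hypothetical sketch of the AI memory system: each remembered scavenger-list
// item carries a countdown timer, and when the timer expires the item is
// "forgotten". All names are illustrative.
struct MemorizedItem {
    int itemId;
    float secondsLeft;  // time remaining until the AI forgets this item
};

class ItemMemory {
public:
    void memorize(int itemId, float duration) {
        items_.push_back({itemId, duration});
    }
    // Called once per game-loop iteration with the elapsed frame time.
    void update(float dt) {
        for (auto& m : items_) m.secondsLeft -= dt;
        items_.erase(
            std::remove_if(items_.begin(), items_.end(),
                           [](const MemorizedItem& m) { return m.secondsLeft <= 0.0f; }),
            items_.end());
    }
    std::size_t count() const { return items_.size(); }
private:
    std::vector<MemorizedItem> items_;
};
```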
Also, each A.I. player has a set of desires assigned to each of a series of possible
actions that you can perform in the game. As the game moves along, the A.I. players
frequently re-evaluate their feelings as events occur in the game, which can cause them to
change their mind and alter their behavior.
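The desire mechanism might look something like this in miniature. This is a hypothetical sketch; the real game's actions and scoring rules are not given in this postmortem, so the action names and values here are invented.

```cpp
#include <map>
#include <string>
#include <cassert>

// Hypothetical sketch of the desire system: each possible action has a desire
// score, game events adjust the scores, and the AI picks whichever action it
// currently desires most.
class DesireSet {
public:
    void setDesire(const std::string& action, float value) { desires_[action] = value; }
    void adjust(const std::string& action, float delta) { desires_[action] += delta; }
    // Re-evaluated frequently as the game moves along.
    std::string strongest() const {
        std::string best;
        float bestValue = -1e9f;
        for (const auto& [action, value] : desires_) {
            if (value > bestValue) { bestValue = value; best = action; }
        }
        return best;
    }
private:
    std::map<std::string, float> desires_;
};
```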
What went right…
1. Building a Good Testing Tool Helped With Testing AI And Initial Game Balance
Having secured a team of artists for the game, priorities were initially focused on
getting the graphics engine up and running to be able to show off the art assets created for
the game. Because of this, much of the rest of the game was slow in being constructed,
and with a lot of the game logic not present for the first quarter of production, just about
any code written for the computer players could not be tested for quite some time.
I took the initiative to program a Windows application with simple GDI rendering
and lots of output windows. The application allowed me to create and place bounding
rectangles for the world objects, place AI players represented by colored circles, and
place navigational waypoints for the AI players. The testing program contained some
simple logic to function much like the game would, so that the AI players could basically
play a slightly simpler version of the game. I could place the drop box for checking items
in anywhere I wanted in the level, so the AI testing tool basically became a model for
how a game round would work and how long it would take to play. The testing module
modeled the level in square meters and allowed the map to be scaled to just about any size.
Having all of these features in the testing module allowed me to directly test the
behavior of the AI in a variety of conditions, and observe how each aspect of their
personality worked as I added support for it. I was able to catch a lot of problems
early on, and it was, for the most part, very easy to identify and diagnose the
cause of problems as they arose.
The AI module for Scavenger Hunt is designed to talk to any program through a
set of interface functions. So moving the bulk of the AI code from the testing module to
the game involved no changes at all. Only the interface functions had to be altered to
communicate properly with the way the game's logic and data were structured, as opposed
to how they worked in the testing module. This proved very useful considering the technical
design for the game’s main logic was developing very slowly, and I was able to work on
the AI for quite a while in my stand-alone testing module. This helped keep a bottleneck
from forming because of the absence of main game logic.
Another unforeseen benefit to the AI testing module, since it basically had all the
logic to play a simple version of Scavenger Hunt as it currently existed in design on
paper, was that it allowed us to gauge how large a level should be used and helped us
figure out the scale for the objects and characters’ running speeds. This proved quite
valuable since this was critical information for the artists for building the objects to scale,
creating the right size level, and key-framing the characters’ different animations to the
correct speed so that they would look right in the game.
2. Staying Away From Cheating
Having played many games of Mario Kart on the GameCube against computer
opponents and watched them cheat their way to victory race after race, I felt hell-bent on
a mission to create an AI system for Scavenger Hunt that would not resort to any cheap
tricks like teleporting or unfair speed boosts to win a round of play. I felt this was worth
the time to work on since I was really feeling like Mario Kart was a far weaker game with
the obvious AI cheats rearing their ugly heads, especially at the higher difficulty levels.
In their current state, the AI players maneuver around the level without any
performance advantages over human players with the exception that they can see through
walls due to the fact that, for efficiency reasons, what each AI player can see is handled
by a circle describing their view range. However, the view circle they have has a shorter
radius, so human players can see a lot farther away than the AI does. I feel this more or
less helps balance things out a bit.
Having the AI set up this way for me is fun because it keeps the feeling in the
game that the AI players are trying to out-play you rather than just cheat to make up for
shortcomings in their performance. This does, however, give rise to a major difficulty in
creating challenging opponents for the player, which I discuss in the next section, “What went wrong.”
3. Navigation Highway For Fast Movement
Maneuvering around a large level with hundreds of solid objects to negotiate
around can be a daunting challenge, especially when having to have multiple AI players
able to move around simultaneously in an action game. Scavenger Hunt is essentially a
racing game, so having multiple opponents to play against is basically a necessity for
interesting game play. The game design called for support of up to eight players, which
meant with only one human player, seven AI players would have to be running around
the level and doing things in an efficient manner to keep the game’s frame rate
consistently running at a respectable rate.
I decided to use a “highway” of checkpoints for the AI to use to do most of their
navigating around the level. The game stores a look-up table for each point in the
highway that the AI can access to find a list of waypoints to use that provide the fastest
route to any other waypoint on the highway. Since this allows the AI to have pre-
computed paths for a lot of their movement, it was easy to have lots of AI players running
around a level without a noticeable effect to the game’s performance.
I was very happy with this aspect of the AI’s navigational system, and though it is
usually not frugal in its use of memory (depending on how dense the waypoint highway is),
it cuts down on lots of calculations and provides the reward of fast performance.
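One way to build such a look-up table, sketched here for a small waypoint graph, is an all-pairs shortest-path pass (Floyd-Warshall) at level-load time that records the next waypoint to take toward every destination. This is an illustration of the idea, not the game's actual code:

```cpp
#include <vector>
#include <cassert>

// Hypothetical sketch of the waypoint-highway look-up table: a Floyd-Warshall
// pass over the waypoint graph fills a "next hop" table, and at run time the
// AI walks the table instead of searching.
const float INF = 1e9f;

struct Highway {
    std::vector<std::vector<float>> dist;  // dist[i][j]: edge length, or INF
    std::vector<std::vector<int>> next;    // next[i][j]: next waypoint from i toward j

    explicit Highway(int n)
        : dist(n, std::vector<float>(n, INF)),
          next(n, std::vector<int>(n, -1)) {
        for (int i = 0; i < n; ++i) { dist[i][i] = 0.0f; next[i][i] = i; }
    }
    void addEdge(int a, int b, float w) {
        dist[a][b] = dist[b][a] = w;
        next[a][b] = b; next[b][a] = a;
    }
    void precompute() {  // done once, at level-load time
        int n = (int)dist.size();
        for (int k = 0; k < n; ++k)
            for (int i = 0; i < n; ++i)
                for (int j = 0; j < n; ++j)
                    if (dist[i][k] + dist[k][j] < dist[i][j]) {
                        dist[i][j] = dist[i][k] + dist[k][j];
                        next[i][j] = next[i][k];
                    }
    }
    std::vector<int> route(int from, int to) const {  // O(path length) at run time
        std::vector<int> path{from};
        while (from != to) { from = next[from][to]; path.push_back(from); }
        return path;
    }
};
```

The memory cost is quadratic in the number of waypoints, which matches the trade-off described above: memory is spent up front so that per-frame path queries are nearly free.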
4. Knowledge Base For Fast Look-Up
Since the AI module was originally written outside of the game’s framework
(being built and tested attached to a stand-alone testing module), interface functions were
needed to query the game for a lot of information. Since this often requires nested
function calls and calculations which can be costly, a knowledge base was set up for all
of the AI players to use.
The knowledge base holds values retrieved from the game that are needed
frequently throughout the AI system, especially values that each AI player checks on its
individual turn. So, when a call is made by the game for the AI to update all of the
computer players, the first thing it does is query the game for data like player positions
and their distances from each other and certain key locations in the game. So on each
game loop iteration these calculations are made once, and then the values are stored in the
knowledge base, which is passed to each AI player when it processes its logic.
Again, this is another situation like the navigational highway where some
additional memory is used up in the interest of performance efficiency. Though this
knowledge base isn’t very large, it came in quite handy in cutting down on the number of
calls the AI module has to make back to the game to query about the state of the world.
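In miniature, the knowledge-base pattern looks something like this (names invented; real entries such as key game locations are omitted):

```cpp
#include <vector>
#include <cmath>
#include <cstddef>
#include <cassert>

// Hypothetical sketch of the knowledge base: once per game-loop iteration the
// player positions and pairwise distances are computed and cached, then every
// AI player reads the cache instead of re-querying the game.
struct Vec2 { float x, y; };

struct KnowledgeBase {
    std::vector<Vec2> positions;                 // one entry per player
    std::vector<std::vector<float>> distances;   // distances[i][j]

    void rebuild(const std::vector<Vec2>& playerPositions) {
        positions = playerPositions;
        std::size_t n = positions.size();
        distances.assign(n, std::vector<float>(n, 0.0f));
        for (std::size_t i = 0; i < n; ++i)
            for (std::size_t j = i + 1; j < n; ++j) {
                float dx = positions[i].x - positions[j].x;
                float dy = positions[i].y - positions[j].y;
                float d = std::sqrt(dx * dx + dy * dy);
                distances[i][j] = distances[j][i] = d;  // computed once, read many times
            }
    }
};
```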
5. Focused Game Design Helps Keep Things On Track
A lot of energy was put in early on in the development of Scavenger Hunt to nail
down how the game would play. At the start of the first semester, we didn’t even know
what game we were making at first, so it took a lot of work to figure out what type of
game we wanted to build. Again, because we had an art team that was relying on us being
detailed in our descriptions of the characters, environments, and what types of actions
each player would be able to perform, it was imperative that we be meticulously detailed
in our design of the game.
I think this turned out to be a good thing, because since the game play was
thoroughly determined and detailed, I was able to easily build the AI testing module to function
as a prototype for the game and feel confident enough to delve into solving a lot of the
problems for how the AI would function in the game without fear that the design could
change radically at any moment. This helped save a lot of wasted time and energy on
changes due to guesswork.
6. Simplifying Much Of The Game To 2D Made Many Problems Easier To Solve
It was identified early on in Scavenger Hunt's technical development that, since
the players would not be able to jump or climb on top of or over objects in the
environment, they would be basically grounded, moving about on a complex
surface. So, although the environment in Scavenger Hunt is a 3D environment, a lot of the
collision detection and movement logic could be written to be solved in a 2D system
since the height of anything in the level would be dependent only on the height of the
ground at that location.
This was very helpful since it allowed us to perform a lot of calculations in 2D
rather than 3D, which simplifies the calculations and allows the game’s performance to
stay high. Being able to identify and exploit this simplification in the game's design
really helped us simplify our work for certain aspects of the game so that we could put
more focus on some of the more difficult ones.
For the AI, this was useful since I could attack the path-finding and do a lot of
calculations in 2D as though the players were moving along a plane. Again, this just helped
make things simpler and keep down the number and complexity of calculations that
would be needed for each iteration of the game loop.
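The 2D simplification can be illustrated with a sketch like the following, where collision tests ignore height and a terrain function supplies it afterwards (all names, and the flat-floor height function, are assumptions made for the example):

```cpp
#include <cassert>

// Hypothetical sketch of the 2D simplification: movement and collision work in
// the ground plane (x, z), and a height function supplies y from the terrain.
struct Vec3 { float x, y, z; };

// Circle-vs-circle overlap test in the ground plane; heights are ignored.
bool collide2D(const Vec3& a, float ra, const Vec3& b, float rb) {
    float dx = a.x - b.x;
    float dz = a.z - b.z;
    float r = ra + rb;
    return dx * dx + dz * dz <= r * r;   // compare squared values: no square root
}

// Assumed terrain-height callback; a flat floor here, for illustration only.
float groundHeight(float /*x*/, float /*z*/) { return 0.0f; }

// Players stay glued to the ground: y is derived from position, never simulated.
Vec3 placeOnGround(float x, float z) { return {x, groundHeight(x, z), z}; }
```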
What went wrong…
1. Constructing My Own Path-finding Without Considering Established Methods
Still being in school, I wanted to solve the path-finding problems for the AI
players myself without just blindly implementing something like an A* algorithm. I
wanted the chance to try to understand the complexity of the problem and try to find my
own way to solve it. Again, still being a student, I was interested in learning from trying
to tackle something difficult by hand rather than just use an existing solution to avoid
“reinventing the wheel.”
However, this obviously put me in a difficult position, since path-finding is not an
easy problem to solve. This cost me a lot of time in technical design creation and
revision. Also, my implementation of my own algorithms for pathing around the level
resulted in some bulky code due to unforeseen problems and trying to handle special
cases that I had not considered. This made debugging problems a lot harder than it
needed to be, and certainly emphasized that the problems could very possibly have been
avoided by simply sticking to existing published path-finding solutions.
In their current state, the AI players in Scavenger Hunt still have difficulty
navigating around certain points in the level, and I’ve had to resort to creating a denser
network of navigation points than I had intended in an attempt to smooth out the AI’s
performance when navigating.
2. Trying To Wall-Trace Around The Game World With Objects Arbitrarily
Positioned And Oriented
Being one of the people on the team who urged the designer to consider level
designs where objects were not all placed with their walls axis-aligned, I realize
I am partially to blame for creating an even more difficult path-finding problem
than if the walls had all been at right angles to each other.
When an AI player is attempting to navigate between the closest waypoint on the
highway and the actual goal point's position, the AI can no longer use the highway
waypoints, obviously, and resorts to a wall-tracing algorithm.
In retrospect, I believe it would have probably been better to have just moved
around the level with a navigational mesh rather than trying to trace along the walls.
When wall-tracing, I ran into lots of issues figuring out how to handle situations where
the AI player hits an object’s corner and then has to decide which way around the object
to go. Object corners and having tight places that are difficult to move through in a
densely populated area provided a lot of headaches to try to solve.
I was never able to adequately resolve all of the problems with wall-tracing
around objects and so I had to lean on the highway waypoint system more to make up for
these shortcomings. The wall-tracing search generally splits in two directions every
time it hits a wall, and since these calculations are not run in a separate thread, this
can slow the game down if it gets out of hand. So, I've so far had to allow the AI to use
wall tracing only for very short distances where it needs to walk around a wall
immediately in front of it.
Again, these problems are basically a result of my stubborn determination to solve
all of the problems for the AI myself by hand.
3. Path-finding Not Thoroughly Tested in Testing Module & Lack Of Debugging
Tools Built Into Game Cost Lots Of Time
With time spent having to build the AI testing module from scratch, I did not get
to the implementation of the wall tracing aspect of the AI navigation algorithm before I
had to merge the existing AI module logic into the game. Once that was done, in the
process of tying the AI module into the game, more functionality was added to have the
game ready for pre-alpha at the end of the first semester and beginning of the second
semester. Thus, moving the AI module back to the testing module was no longer
something that could be done quickly.
Also, the game itself had no built-in methods for debugging the performance of
the AI in the game aside from just using “printf” to print values to a console window.
With the lack of testing abilities in the game for me to observe the routes that the AI
players were generating graphically, it was very difficult to trace problems and involved a
lot of slow conversions where I would have to look at the print out of the position of
points on the route in world coordinates and find out where they fell in the level by
looking at the 3DS max model of the level on another computer when it was available.
It would have been advantageous to tie a few functions into the game early on to
make debugging the AI from the game much easier. The plan originally was always to
pull the AI module out and plug it back into the testing module when large problems
arose. However, due to the amount of functionality added to the AI once it was put into
the game, and the number of additional interface functions that would have to be added to
the testing module and additional logic added to the testing module to work like the
game, I just never found the time to prepare the testing module for receiving the updated AI module.
It's unfortunate that the need for built-in AI testing functionality in the game did not
become apparent earlier, because this would have saved me a lot of the time and
frustration that the path-finding cost me. And since the AI module got moved out of
the testing module earlier than I planned, I wound up facing the most
difficult aspect of the AI’s implementation fairly blindly from within the game. This
should not have happened, but it did.
4. Hard To Create Tough AI Opponents That Do Not Cheat
I have heard it said that toughening up an AI, once you've built it to be too easy to
beat, is incredibly difficult to do without making it cheat. As navigational problems
emerged and cost me lots of time in debugging how it worked, I had little to no time
remaining to tweak how the AI would search a level for objects. Thus, since I had
forbidden myself from resorting to allowing the AI to cheat, it started to become apparent
that the AI was not going to be able to offer the player a significant challenge in time for
the final build of the game for the second semester.
Again, Scavenger Hunt is a racing game, so as you can imagine, making the game
fun hinges largely on providing opponents to play against that offer a good challenge.
Thankfully, the networking aspect of the game was successfully implemented, so at least
players can play against each other. The AI, unfortunately, has a way to go yet before it is
able to play well enough to beat the player regularly. Originally the AI was tough but that
was when they could run through solid objects. The path-finding once again showed how
problematic it could be by greatly hindering how effectively the AI players can play. So,
it would have been a lot better to have gotten all of the path-finding implemented and
tested in the testing module so that a lot of the bugs could have been ironed out, and then
I would have been able to focus on making the AI play the game better.
Scripting, User Interface, Sound, Input, and Porting for Scavenger Hunt
A Tale of Miscellany by the Lead Programmer
Video games are made up of many parts and someone has to create them. On a
team of six members, the tasks are always divvied up in such a way that some team
members are required to tackle more than one of these many parts. While other team
members jumped on graphics, networking, AI, and game logic, I was left to complete the
game's user interface, scripting system, sound engine, and input system.
I had worked on the user interface for my last two games and became quite
enamored with them, so I naturally volunteered for this area of Scavenger Hunt's
development. I had also done sound in the past and had been researching the OpenAL
library, which tempted me with promises of an easy-to-use 3D sound system. Input was
really something I volunteered for as one of those “dirty” tasks that is not exciting, but
someone has to do. Lastly, over the summer before Scavenger Hunt's beginning, I
became very interested in the idea that scripting could be used as a way to increase the pace
of development of the game's logic and to increase flexibility; I definitely wanted to
tackle this task, so I signed up for it.
What went right…
1. Seamless integration of Lua scripting engine with C++
In order to ensure the acceptance of Lua as a scripting language for the game,
from both a technical standpoint and the standpoint of developer acceptance, the Lua integration
needed to be very tight with C++. Lua is by nature a scripting language meant to
extend C, which helps, but doesn't complete the job of full integration with C++. Our
team planned to make use of advanced C++ features such as template classes, abstract
classes, and singletons and it was important that the Lua integration could support
these concepts. Using the excellent language extension features of Lua such as
metatables, faking the aforementioned C++ techniques was relatively quick and
painless. In the end, Lua scripts were able to become a “first-class citizen” in our game
in the sense that any class properly exposed to Lua could be used by Lua in every way
that a C++ user could use it.
2. Input engine
The input engine made use of the Simple DirectMedia Layer (SDL)'s already
excellent representation of input to the program. From there, the input was further
abstracted by the usual keybinding layer, where the user of the input engine is not
concerned with the physical device being used for input, but only with the result of that
input action. The input engine made use of the existing user interface objects that
represent the signals/slots paradigm. This allowed a very natural representation of
input from the user's perspective as their input handling functions (or methods or Lua
functions) would simply be called when the input associated with them occurred.
While this method of representing input is good for, say, the user interface, which
reacts to input events such as a click or a keystroke, it is insufficient for the
main game's handling of input. A system for querying the input state was quite easily put
into place thanks to the internal representation of input events. That representation
was based on the realization that all input from any device is essentially just a form of
input event, and a small class hierarchy was derived so that input events could be
conveniently stored for later querying. This also, thanks to SDL's easy handling of
input, allowed for simple implementation of these input events.
For all of the minutiae that an input engine must provide, like access to the state of
the modifier keys on the keyboard, the ease-of-use of SDL really came in handy to
make that work go smoothly and painlessly. In the end, the input engine was set up
quickly and remained virtually bug-free for the duration of the project. This stability and
power are always appreciated in a large project.
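A toy version of the keybinding-plus-querying design might look like this. It is a sketch of the pattern described above, not the engine itself, and a plain keyEvent call stands in for SDL's event delivery; all names are invented.

```cpp
#include <map>
#include <string>
#include <functional>
#include <cassert>

// Hypothetical sketch of the keybinding layer plus queryable state: raw key
// events are translated to named actions, handlers fire on the event
// (signals/slots style), and the latest state is kept for polling.
class InputEngine {
public:
    void bind(int key, const std::string& action) { bindings_[key] = action; }
    void onAction(const std::string& action, std::function<void(bool)> handler) {
        handlers_[action] = std::move(handler);
    }
    // Called from the platform event loop (e.g. when a key event arrives).
    void keyEvent(int key, bool pressed) {
        auto it = bindings_.find(key);
        if (it == bindings_.end()) return;
        state_[it->second] = pressed;              // stored for later querying
        auto h = handlers_.find(it->second);
        if (h != handlers_.end()) h->second(pressed);
    }
    // Polled by the main game logic instead of reacting to events.
    bool isActive(const std::string& action) const {
        auto it = state_.find(action);
        return it != state_.end() && it->second;
    }
private:
    std::map<int, std::string> bindings_;
    std::map<std::string, bool> state_;
    std::map<std::string, std::function<void(bool)>> handlers_;
};
```

The event-driven side suits menus and widgets, while the polled side suits per-frame movement code, which mirrors the split described in the text.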
3. Reuse of an existing user interface engine
As stated in the introduction, I had been working on user interfaces for two years
prior to the beginning of this project and had become quite skilled at their creation. I
spent the vast majority of my time on my last project building a robust and efficient
user interface with advanced features such as XML menu generation and a
signals/slots system. I decided to carry over this user interface entirely and simply
adapt it to the new areas of the game where it was needed. Due to good design, this
was limited to just a few classes (image and text drawing) and was done easily. This
reuse provided us with another stable, proven area of code that we didn't have to worry about.
Due to the robustness of the existing user interface engine, the work needed to get
our menus up and running could go into the menus themselves rather than into
implementing essentially basic features such as a list box or radio button widget.
4. Cross-platform technology
From the very start of the project we planned to make the game available on three
platforms: Windows, MacOS X, and Linux. Two members of the team, on their last
game, Drop Drop, had already accomplished this task. As those two team members
use MacOS X and Linux as their primary operating systems on a daily basis, we were
able to develop the game simultaneously on the three platforms. Planning for this
began with two important realizations: we must choose only cross-platform libraries
and we must write our code to the standards.
At a certain level, cross-platform development is merely a matter of being conscious
of which platform-specific features you are using. Our libraries (Simple DirectMedia
Layer, OGG Vorbis, OpenAL, Freetype, OpenGL, GLEW) were well chosen to be
open-source and cross-platform and they provide the robust and stable feature sets we
demand in a modern video game. This made replacing platform-specific calls, such as
those to the Windows API, with the cross-platform equivalents provided by our
libraries straightforward.
The cross-platform development was such a success that there was never a time
during development that any platform was behind any other or did not work at all. In
the end, we have three fully-functional, nearly-identical versions of the game, one for
each platform. Such cross-platform development is essential in a game industry
striving to release for multiple consoles, Windows PCs, and alternative operating
systems on the PC. We gained a lot of experience in this area, not only from the
differences between operating systems, but also from the differences in hardware across a
large variety of computers. Lastly, we've found that development across multiple
platforms is a very good way of finding bugs in code that might otherwise work by
coincidence, but not in the general case.
As the Drop Drop developers did on that project, we used the Concurrent Versions
System (CVS) for version control, not only for our source code, but
for documentation, game resources, art assets, distribution/installer-related files, and
for areas of code that could be maintained as separate modules (such as our low-level
networking engine). CVS allowed us to develop easily off of the LAN at DigiPen,
which was very helpful for development on three platforms because two of those
platforms are not available on the LAN. The other very helpful feature of CVS that we
constantly exploited was the ability for many developers to simultaneously work on a
single file and have their changes to it merged seamlessly as they commit those
changes. For files that are sort of “common areas” of programming (such as game
logic functions) this is an incredible productivity boost.
What went wrong…
1. Non-existent adoption of Lua
Despite the excellent integration work done between Lua and C++, I never “sold”
Lua to the team to make sure that they exposed their classes to it so they could use it
for game logic and functionality. While I exposed my own classes, the exposure was
never pervasive enough to be able to accomplish anything meaningful in Lua. This
was realized late into the project and, upon realization, it was determined that to go
over all of our classes and expose them to Lua would take too much time. This led to
the wholesale removal of Lua, as it was never going to be utilized.
2. Underestimating the sound engine
Sound, I have learned, is a major area of development in a game and should not
be underestimated. Its nature lends itself to threading, streaming, and high
performance. The listener will instantly notice the slightest glitch in a playing sound,
such as a jitter or a skip, and the sound will be essentially broken and worthless to
their ears. This means that the sound engine must be virtually perfect as far as the
user's presentation is concerned. Sound also carries with it large file sizes for the raw
sound to be played to the sound card, yet is generally looked down upon as a small task
and therefore not allowed the kind of machine resources (such as memory) that, say, a
graphics engine is allotted. This necessitates a high degree of efficiency in the sound
engine, which makes its implementation even more difficult.
Scavenger Hunt's sound was planned from the start to be the marriage of
OpenAL's 3D sound capabilities and OGG Vorbis' natural streaming of high-quality
sound yet small sound data. While it is true that OGG Vorbis and OpenAL both
deliver on their promise, the demands of a sound engine are still great and even with
these two powerful libraries it was difficult to achieve a sound system that works well.
Firstly, the Windows version of OpenAL does not at the moment provide an OGG Vorbis
playing extension, as is provided on MacOS X and Linux, and therefore there needed
to be two versions of many functions. This necessitates the implementation of sound
streaming by hand as well as implementation of the OGG Vorbis extension. Doing
both of these tasks is a lot of extra work and clutters up the code and logic
considerably to accommodate the differences between having and not having the
extension. Sadly, this problem would have disappeared if the project were to have
come next year rather than this as the OGG Vorbis extension is currently in a beta
state for Windows.
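The hand-rolled streaming described above boils down to one pattern: decode small chunks on demand and keep a fixed number of buffers queued ahead of playback. The sketch below models that double-buffering logic in plain C++; the class and method names are illustrative, the "decoder" is an in-memory stand-in for OGG Vorbis decoding, and a plain queue stands in for OpenAL's buffer queue (alSourceQueueBuffers and friends), so this is a sketch of the technique rather than the game's actual sound engine.

```cpp
#include <algorithm>
#include <cstddef>
#include <queue>
#include <vector>

// Double-buffered audio streaming sketch: keep up to two chunks of
// decoded PCM queued at all times, refilling as the mixer drains them,
// the way a hand-rolled OpenAL/Vorbis streamer would.
class StreamingSource {
public:
    StreamingSource(std::vector<short> pcm, std::size_t chunkSamples)
        : pcm_(std::move(pcm)), chunk_(chunkSamples) {}

    // Refill until two buffers are queued (or the source is exhausted),
    // mirroring the "decode a chunk, queue the buffer" step.
    void update() {
        while (queued_.size() < 2 && cursor_ < pcm_.size()) {
            std::size_t n = std::min(chunk_, pcm_.size() - cursor_);
            queued_.push(std::vector<short>(pcm_.begin() + cursor_,
                                            pcm_.begin() + cursor_ + n));
            cursor_ += n;
        }
    }

    // Simulate the mixer consuming one buffer, as when OpenAL reports a
    // buffer processed and it is unqueued for reuse.
    bool consumeOne() {
        if (queued_.empty()) return false;
        played_ += queued_.front().size();
        queued_.pop();
        return true;
    }

    bool finished() const { return queued_.empty() && cursor_ >= pcm_.size(); }
    std::size_t samplesPlayed() const { return played_; }

private:
    std::vector<short> pcm_;          // stand-in for the decoded stream
    std::size_t chunk_;               // samples per queued buffer
    std::size_t cursor_ = 0;          // decode position in the stream
    std::size_t played_ = 0;          // total samples handed to the mixer
    std::queue<std::vector<short>> queued_;
};
```

Calling update() every frame keeps the queue full; as long as decoding a chunk is faster than playing one, the listener never hears a gap, which is the whole point of the scheme.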
Graphics, Networking, and Other Things
By Eric Smith
What went right…
The graphics engine and networking had multiple people working on them at
the same time, which helped to improve their implementation. By working on the
design of both the graphics and networking with Jonathan, we were able to create
a more polished design than if either of us had designed it alone. Since there were
two or more people working on each module, the work was also completed faster
than it would have been by a single person. After the first semester of development
was over, the networking module was basically nonexistent. However, the team then
had around three programmers working on the module in the second semester, and
we got it up and running.
What went wrong…
Both the time estimates and the breakdown of tasks were really off for most
aspects of the game. The original schedule had no more work planned for me
late in the first semester, so throughout the course of the project I
found myself looking for things to do. Since there was also a problem with
defining all the tasks that needed to be done, it was harder for me to pick up a
new task after I completed my last one. While some tasks took less time than
planned, others took more, and when tasks ran long they could cause that
person’s next tasks to be pushed back or even cut. Scheduling is a hard thing to do,
especially when the person hasn’t done the task before. However, if we had taken a
few hours at the beginning of each semester to brainstorm the tasks as a group,
we probably would have been better off.
O Boxes Where Art Thou?
A Designer’s tale of logic and collision
My first primary task was to set up the main game logic. Since I had never had any
formal experience with game logic before, I wanted to create a setup based on the game
design itself, yet still flexible enough to accommodate changes later should they arise. In
our previous game, Trick Shot Golf, we had several instances of duplicate code for the
stand-alone and networked versions of the game. I wanted to avoid that type of setup
completely. So, I devised a system of function pointers that would be called by
the game state independent of whether the game was stand-alone or networked. My goal
was to stick to this setup at all costs and abstract the notion of the network away from
the game as much as possible.
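The function-pointer arrangement can be sketched as follows, assuming hypothetical names (GameState, moveLocal, moveNetworked are illustrations, not the game's actual interfaces): the logic is bound once at startup, and the game state calls through the pointer without ever branching on whether it is networked.

```cpp
#include <string>

struct GameState {
    int score = 0;
    std::string lastMessage;
};

// Stand-alone version applies the move directly.
void moveLocal(GameState& s, int points) {
    s.score += points;
    s.lastMessage = "applied locally";
}

// Networked version would also send the move to the server; the send
// is simulated here by recording a message.
void moveNetworked(GameState& s, int points) {
    s.score += points;                 // apply locally as well
    s.lastMessage = "sent to server";  // stand-in for a real send
}

struct GameLogic {
    // The game always calls through this pointer, unaware of the mode.
    void (*applyMove)(GameState&, int) = nullptr;
    GameState state;

    // Bind once at startup; nothing downstream checks the mode again.
    void bind(bool networked) {
        applyMove = networked ? moveNetworked : moveLocal;
    }
};
```

Because both modes share one call site, there is no duplicated logic of the kind that plagued Trick Shot Golf; only the bound implementations differ.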
In addition to game logic, I also took charge of the collision for the game. Being
the designer, I was well aware of the game’s type of play and functionality. Since there
was no jumping allowed in the game, I made the choice early on to make the game 2D in
terms of collision and use a height-map to simulate height. The idea behind this was to
simplify collisions so that we could do more interesting things with the collision boxes
(such as rotate and shear them).
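The 2D-collision-plus-height-map idea reduces to testing overlap purely in the ground plane and reading the vertical position from a height grid rather than colliding in 3D. A minimal sketch, with all names, the axis-aligned boxes, and the grid layout being illustrative assumptions rather than the game's actual data format:

```cpp
#include <cstddef>
#include <vector>

// Collision is tested only in the XZ (ground) plane.
struct Box2D { float minX, minZ, maxX, maxZ; };

bool overlaps(const Box2D& a, const Box2D& b) {
    return a.minX < b.maxX && b.minX < a.maxX &&
           a.minZ < b.maxZ && b.minZ < a.maxZ;
}

// The Y coordinate comes from a height map instead of 3D collision.
// Nearest-cell lookup here; a real engine might interpolate.
struct HeightMap {
    std::size_t width, depth;
    std::vector<float> heights;  // row-major: depth rows of width cells
    float cellSize;              // world units per grid cell

    float heightAt(float x, float z) const {
        std::size_t i = static_cast<std::size_t>(x / cellSize);
        std::size_t j = static_cast<std::size_t>(z / cellSize);
        if (i >= width) i = width - 1;   // clamp to the grid edge
        if (j >= depth) j = depth - 1;
        return heights[j * width + i];
    }
};
```

Since the boxes live in a plane, extending them with rotation or shear only requires 2D math, which is what made the scheme attractive for a game with no jumping.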
Also, I took on the role of Main Art Exporter. I attribute this role primarily to the
fact that I was really one of the only team members with 3DS MAX installed on my
machine. I was put in charge of exporting each object in the max file into a separate
flexported file. This task was supposed to be relatively simple, but due to certain artist
concerns the task became much more involved and lengthy.
What went right…
1. Flawless collision code.
Although I fell behind on completing the collision code by about a week, the
end result was well worth it. From the moment I finished the collision code until the
moment we turned in the final deliverable, not a change was made to it (with the
exception of adding return values so the AI path-finding could work). No one
ever got stuck in a wall or crashed the game through bad collision handling.
2. Group design effort.
With one exception, everyone on the team was into the design idea from the get-go.
With each new element added to the game, new design ideas were always brought to
the table for discussion. I think this was a great way to handle the design process. With
the whole team’s input, the game was able to travel in directions that appealed to
everyone on the team. This, in turn, kept everyone’s interest level in the game
exceptionally high, and I’m very happy that it turned out so well.
What went wrong…
1. Logic was left unchecked and not updated
Although I developed the game logic to work in a fashion that I thought was solid,
as the game progressed certain modules ended up with different interfaces than the
game was expecting. This led to small bits of hackery here and there that eventually
became too big a problem to fix back to the original idea of how the game logic was
to work. So, the lesson learned is: keep an eye on the game logic, and fix hacks as
soon as you see them.
2. Art format was finalized late
One of the biggest problems with exporting the art came after we decided to
change our format. Originally each unique object was in a separate file, and the
game would rotate and place it in the correct positions. However, the artists felt that
they could make more detailed and interesting levels if they could make small changes to
each individual object. Thus, we decided to export each object in the level
individually, since they would most likely all be unique in some way. Although
exporting a single object is not too difficult a task, with over 500 objects in the level
this became a very massive one! Several objects in the MAX file were not properly
transformed, and so many objects had to be re-exported after a small fix.
The proof is in the pudding
The task that I was given was to build a networking engine that was able to run across
multiple platforms. This task proved to be a great challenge and eventually warranted
the assistance of my teammates. Given that two of them had programmed networking
for previous games, they took over the design and I was given specific tasks to
complete. The network was then finished quite quickly.
What went right…
The group was able to acquire an art team, and the final deliverable far exceeded
anything that I could have imagined. I was very skeptical that this product could be
fun; from the outset I was the biggest opponent of developing this title. I must admit,
though, that I am truly happy Scavenger Hunt was created. The final product will be
a great addition to the portfolio.
What went wrong…
I originally believed that the title would be a disaster, and this possibly affected the level
of intensity with which I attacked this project. I contributed, but could have assisted the
team more. Even so, I am very proud of the team and the product we created.