                        Changing Concepts of Time




It's the greatest and most spectacular battle in the Lord of the Rings
Trilogy. But during its production filmmakers faced one surprising challenge:
how to keep the computer-generated soldiers from fleeing the battlefield.
Director Peter Jackson had laid down his requirements for the Battle of
Pelennor Fields… Jackson wanted the computer-generated antagonists to
have absolute authenticity on the big screen and to be indistinguishable from
the real actors. "I want battles like nothing anyone has ever seen on screen. I
want every soldier fighting for himself… you have to come up with
something."… Special effects designer Richard Taylor says Jackson's order
led to the writing of a "massive" principal code for the battle in order to give
the more than 200,000 digitized soldiers and some 6,000 horses
distinctiveness and individuality… "It was the fact that you could get a
computer to think for itself, and that you could also get 200,000 agents
within the computer to think for themselves. So each of these computerized
soldiers is assessing the environment around them, drawing on a repertoire
of military moves that have been taught them through motion-capture,
determining how they will combat the enemy, step over the terrain, deal with
obstacles in front of them through their own intelligence, and there's 200,000
of them doing that." Basically, all the necessary information for decision-
making was fed into this network of computers without determining for
them whether they would win or lose. But this aim to ensure that they acted
spontaneously almost sabotaged the battleground sequences. "For the first
two years, the biggest problem we had was soldiers fleeing the field of
battle," Taylor laughs. "We could not make their computers stupid enough to
not run away." So some extra computer tinkering was required to ensure that
the trilogy's climactic battle worked the way Jackson wanted.

                                              Globe and Mail – March 2004


                   _______________________________
                      (Pat Crawley Lughnasia 2005)
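An aside for the technically minded: the behaviour Taylor describes reduces
to each agent scoring a handful of possible actions against its local situation
and picking the best one. The toy Python below is purely illustrative, not the
actual production code, and every name and number in it is invented; but it
shows how, if the self-preservation score is allowed to dominate, the whole
army routs, exactly the problem Taylor recalls.

import random

# Minimal sketch of an agent-based crowd decision loop, in the spirit of the
# digital soldiers described above. All names and numbers are hypothetical
# illustrations, not the production system.

ACTIONS = ["attack", "defend", "step_over_obstacle", "flee"]

def utility(action, threat, bravery):
    """Score an action given the local threat level (0..1) and a bravery trait."""
    if action == "attack":
        return bravery * (1.0 - threat)
    if action == "defend":
        return 0.5
    if action == "step_over_obstacle":
        return 0.4
    # Self-preservation: if fleeing is allowed to dominate, the army routs,
    # which is exactly the "soldiers fleeing the field" problem quoted above.
    return threat * (1.0 - bravery)

def decide(threat, bravery):
    """Each agent independently picks the highest-scoring action."""
    return max(ACTIONS, key=lambda a: utility(a, threat, bravery))

# Ten agents, each assessing its own environment and deciding for itself.
for i in range(10):
    threat = random.random()
    bravery = random.uniform(0.5, 1.0)  # raising this floor "fixes" the rout
    print(f"agent {i}: threat={threat:.2f} -> {decide(threat, bravery)}")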




                    Production Network Experiment



                                  Contents

The 2k Production Pipeline

Servers and Network Infrastructure

9/11 and Network based Production

E-cinema on Location

Motion-control and Performance-capture

Virtual Art-direction

Cost-control and Non-linear Production

Institute for Creative Technologies

Machinima

Computer-aided-design or CAD

Flattening the Production Unit

Rendering Farms and Grid-computing

DreamWorks and HP

IBM, Disney and Others

Super-computers, Animation and Special-effects

Arrested Development and Runaway Production




                     Production Network Experiment




      If we were to visit Sam Goldwyn's Metro-Goldwyn-Mayer lot in the
      1930s and watch a movie being made, there would be very, very little
      difference in the rhythms, in the work, in the activity done on that lot
      than there is in doing a modern television movie. In spite of the fact that
      the machines are different, the rhythms and patterns and the time it
      takes, and the number of people, have basically not changed in 60 years.
      60 years in which technology has reduced the cost, and increased the
      diversity of every other consumer product, from cars to recorded music
      to computers, making those products objects of easy access and great
      variety. When we look at the way satellites, microwaves, and fibre
      optics have conquered space and conquered time, and then we look at
      how television production is done, we see that we cannot conquer
      weight and we cannot conquer light. Why is it that so much of
      television's cost comes in simply setting up lights? Why is it that so
      much of journalism's cost consists of schlepping heavy stuff onto
      airplanes and off airplanes?
      Clearly the main problem we have with television production is its cost.
      The most important part of our industrial strategy must be to reduce the
      cost of production. And I look to technology as the great unexplored
      frontier. I am convinced that there is a technological way, of challenging
      volume, of challenging weight, and challenging the idea that television
      production has to take such a long time and take so many people, and be
      so heavy! One way we will get the cost of production down is to look
      towards the technical solution and that technical solution is not beyond
      our capacity.

                                                      Trina McQueen – 1992

                        The 2k Production Pipeline

     Digital technology has been impacting the film and television
production, transmission and display systems in serious ways since Silicon
Graphics deployed powerful workstations, applied to 2 and 3d computer-
graphics and high-resolution electronic image processing, into the
operational components of these strategic industries on a worldwide basis.
These technologies nudged the operational side of these strategic industries
into what would turn out to be an inevitable transition from analog and non-
interactive towards digital and interactive formats, following a path already
well traveled by the military in the US during the 60s, 70s and 80s, when,
via the space program and the various other surveillance, intelligence-
gathering, mapping and communications activities of the Defense
Department, National Reconnaissance Office, National Security Agency and
Defense Advanced Research Projects Agency, the key concepts and chipsets
necessary to the digital technological revolution in general were generated.

       The peace dividend shifted these disruptive technologies into the
civilian side of the IT and consumer-electronics sectors, sending them
spinning off in search of fresh markets and cash flows within the
entertainment industry in general and the media sector in particular. This
placed the components of the media sector best able to exploit the incoming
technological base at the centre of a revolution in electronic information-
processing, transmission and display; a transformation fueled by the super-
profits generated by the commercial aspects of the soft-power strategy,
accelerated by the dot.com investment bubble and globalization, and
stimulated after 9/11 by the Bush Administration's expanded propaganda
and disinformation requirements resulting from the so-called "War on
Terror."

       The powerful chipsets, when combined with the emerging 3d
applications, enabled the computer-graphics industry to transcend its
scientific and military origins, to overcome barriers of feasibility, cost-
effectiveness and productivity, and to become pervasive in all aspects of
narrative and interactive media design, production and distribution, while
simultaneously enabling the field of computer-aided-design in general.

       The box-office revenues generated by feature films incorporating
ever more computer-graphics, plus the sales of DVDs, video-games and
digital, HD and now IP based television signals, expanded the cash flows
available to the already digital sides of the sector to the point where, in the
late 90s, they surpassed those of the still analogue elements of the
marketplace. At the same time the interactive products such as video games
and IP based television signals were setting the tone within the film,
television, animation, gaming and multi-media marketplaces on a worldwide
basis from an aesthetic point-of-view, as evidenced in the growing
marketplace for, and apparent cultural importance of, videogame hardware
and software, as well as the various peer-to-peer file transfer systems which
disrupted the music industry to the point of forcing it to adopt an interactive
and transactional business plan connected with the arrival of the iPod and
Podcasting.

      Provocatively, in the whole area of software for 2 and 3d computer-
graphics, the core technologies necessary to both the videogame and the
computer-generated animation and special-effects industries, it was
Canadian companies leading the pack on a worldwide basis. For example
Softimage and Discreet Logic, both of Montreal, and Alias Wavefront of
Toronto, by now all owned by important American hardware or software
interests but still head-officed in Canada, engineered the software-
environments necessary to the computer-graphic side of a high-resolution
electronic-image-processing revolution worldwide.

        Alias Wavefront was formed in Toronto after a 1982 visit to Industrial
Light & Magic (ILM) inspired Ottawa's Stephen Bingham to create an easy-
to-use 3d software package for use within the productive components of the
film and broadcast industries. In 1984 Alias suggested to SGI that their mini-
computer would form an excellent platform for computer-aided-design
based on Alias's emerging 3d software package. In 1986 Softimage was
founded in Montreal by an ex-NFB filmmaker on the idea of creating a 3d
animation system for media production, for use by artists and filmmakers.
Like Bingham, Daniel Langlois was dissatisfied with the existing 2 and 3d
graphics packages, designed primarily with military or scientific
applications in mind.

       Since off-the-shelf software for the production of films, television
programmes and videogames was not available, the founders of these
Canadian companies, along with Wavefront Technologies in California,
since merged with Alias, adapted their business plans to develop and market
their own 2 and 3d computer-graphics packages, designed primarily for use
by artists working within animation, video-gaming, film production,
broadcast and the computer-aided-design or CAD sector in general.

      In 1987 the Belgian government provided the capital to enable Alias
to focus on the computer-aided-design side of the business, and the company
soon developed an impressive customer list including Timex, Reebok,
BMW, GM, Honda, Volvo, Apple, GE, Motorola and Sony, as well as a
client list in the entertainment industry which grew to include Industrial
Light & Magic, Broadway Video, Moving Picture Company and ultimately
DreamWorks and Pixar. In 1989 Steve Williams (ex-Alias) went to ILM to
help another Canadian, James Cameron, create the pseudopod in The
Abyss. Alias was chosen because of its patch- instead of polygon-based
modeling system. The software ran on SGI workstations and The Abyss
garnered an Oscar. In 1990 Alias went public and introduced a third
generation of software, branded Studio for industrial design and
PowerAnimator for the entertainment industry in general.

        In 1991 Softimage released its second-generation 3d-graphics
package featuring an "actor module" based on inverse kinematics and
enveloping constraints, enabling animators to combine conventional
techniques such as editing and keyframing with advanced computer-graphic
tools of various kinds. The package heralded a revolution in creating
realistic character motion and the assembly of the very first digital actors. In
1991 Wavefront launched Composer, which quickly became the standard for
professional 2 and 3d compositing and special effects for the film industry,
while heading the optical-printers, scanners and film-recorders off in digital
directions, as well as setting the stage for the whole idea of digital
intermediates (DI) in general.
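
To give a flavour of what inverse kinematics buys an animator: instead of
keyframing every joint angle, the artist places an end-effector (a hand, a
foot) and the software solves for the joint angles. The sketch below is the
textbook two-bone, law-of-cosines solution in Python, offered only as an
illustration of the principle; it is not Softimage's actor module.

import math

# Minimal two-bone inverse kinematics in 2D: given a target for the hand or
# foot, solve for the joint angles instead of keyframing them directly.
# A textbook law-of-cosines sketch, not any particular product's solver.

def two_bone_ik(target_x, target_y, l1, l2):
    """Return (shoulder, elbow) angles in radians reaching the target."""
    d = math.hypot(target_x, target_y)
    d = min(d, l1 + l2 - 1e-9)  # clamp targets that are out of reach
    # Law of cosines gives the elbow's interior angle; the joint rotation
    # relative to the upper bone is pi minus that interior angle.
    cos_elbow = (l1**2 + l2**2 - d**2) / (2 * l1 * l2)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder: aim at the target, then back off by the triangle's offset.
    cos_offset = (l1**2 + d**2 - l2**2) / (2 * l1 * d)
    shoulder = math.atan2(target_y, target_x) - math.acos(
        max(-1.0, min(1.0, cos_offset)))
    return shoulder, elbow

s, e = two_bone_ik(1.2, 0.8, 1.0, 1.0)
print(f"shoulder={math.degrees(s):.1f} deg, elbow={math.degrees(e):.1f} deg")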

       Concurrently, Alias announced an alliance with SGI, purchased
Spacemaker Technology and launched UpFront, a low-cost 3d Mac and
Windows based package for architects. Alias also broadened its product
range with the acquisition of Sonata, a high-end 3d architectural design and
presentation system from T2 Solutions in the UK. In spring of 1992 Alias
also introduced an upgraded version of PowerAnimator including
Kinemation and SmartSkin, a "character animation" system for natural
motion and muscle behavior, and Dynamation, a 3d tool for interactively
creating and modifying realistic, natural images of dynamic events. The
system allowed the seamless blending of behavioral data and user-specified
information describing shape, color and motion. The tool set needed to first
create digital performers and then to release them into digital so-called
"virtual realities" was thus at hand.
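
The Dynamation idea, simulated behaviour blended with user-specified
inputs, reduces at its simplest to something like the toy integrator below.
This is a hedged sketch only: the constants and the wind control are
invented for illustration, and the actual product exposed far richer
behavioural controls.

# A toy dynamics step: simulated behaviour (gravity, drag) blended with a
# user-specified input (a wind vector). Purely illustrative numbers.

GRAVITY = (0.0, -9.8)
DT = 1.0 / 24.0  # one film frame at 24 fps

def step(pos, vel, wind=(0.0, 0.0), drag=0.1):
    """Advance one particle by one frame with simple Euler integration."""
    ax = GRAVITY[0] + wind[0] - drag * vel[0]
    ay = GRAVITY[1] + wind[1] - drag * vel[1]
    vel = (vel[0] + ax * DT, vel[1] + ay * DT)
    pos = (pos[0] + vel[0] * DT, pos[1] + vel[1] * DT)
    return pos, vel

pos, vel = (0.0, 2.0), (1.0, 0.0)
for frame in range(5):
    pos, vel = step(pos, vel, wind=(0.5, 0.0))
    print(f"frame {frame}: x={pos[0]:.3f} y={pos[1]:.3f}")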

       In 1992 Softimage went public and acquired EDDIE software and
Painterly Effects, providing a complete effects-generation toolkit with
advanced color correction, filtering, rotoscoping, morphing and painting
capacities. Creative Toonz debuted a 2d animation package that automated
the more tedious tasks involved in cel animation. Softimage then opened its
product line to third-party developers, a key step towards the integration,
from an operating-systems point-of-view, of the by now fast emerging
computer-graphic-to-film image processing system. Softimage and Mental
Images announced a rendering technology agreement and released an
upgraded 3d package featuring file management, metaclay, clusters, flock
animation, weighted envelopes and channels, all based on an "open system"
policy and third-party development.

       In 1993 Steven Spielberg chose Industrial Light & Magic to provide
the visual effects for Jurassic Park and another major tipping point in media
production based on computer-graphics was passed. With computers now
operating in networked formats, interconnected to servers and transmission
systems and able to ingest and output both audio signals and cinema-grade
high-resolution imagery, the so-called "2k production-pipeline" came into
existence. Softimage deployed Digital Studio (DS), integrating the 3-to-2d
software environment around the 2k processing formats then stabilizing
within the computer-graphics sector in general. DS supported most elements
of the 2 and 3d environments including expressions, dopesheet, ghost mode
and shape interpolation.

       The push toward "open systems" ironically took another giant leap
forward when, to exploit the power of Intel's Pentium processor, Softimage,
by then owned by Microsoft, deployed the first high-end modeling and
rendering package able to run under both Irix and Windows NT. Softimage
also upgraded to NURBS, relational modeling, trimming, instantiation,
polygon reduction, tangent-to-path constraint, Q-stretch, expressions, motion
control, actor, particle, mental ray rendering and metaclay. So the palette and
toolkit available to those assembling the virtual realities within which the
scenes were played continued to expand with each successive leading-edge
production.

       In 1994 Softimage merged with the Microsoft Corporation, and in
1995 Wavefront Technologies, Silicon Graphics and Alias Research also
entered into a merger agreement, activities stimulated by the dot.com
investment boom then reaching fever pitch. Both companies' missions were
to focus on developing the world's most advanced 2 and 3d tools for the
creation of "digital-content." At this point the 2k-production system, after
about ten years of evolution, broke into the corporate exploitation stages of
the process, resulting in the "capitalization" of the computer-graphics
industry and its partial vertical integration into either the hardware or
software sides of the IT sector in the US. At that point the elements of the
film industry in the US involved in computer-graphics and special effects,
like ILM, Pixar or DreamWorks, or in Canada, Core, Mainframe, Rainmaker
Digital, Toybox West, AlphaCine and Eyes Post Group, emerged as the
leading players in what was fast becoming the media-related components of
the enhanced-computer-services sector.

       The 2k software environment stabilized the "workflows" moving
within the production "pipeline" and enabled a downscale move out of the
very upscale film industry into the more backward broadcast and DVD
sectors, assisted by the relatively low bandwidth requirements of even
HDTV television signals; 2k outputs play really well on even very large
video-based display units. So as the cash flows generated worldwide by
DVDs and videogames surpassed those of the still analog and non-
interactive components of the sector, such as the exhibition and broadcast
sectors, the dominant components of the film industry reached yet another
critical tipping point in terms of the overall transition towards a digital
environment within the productive components of the media sector on a
continental basis.

        But the 2k pipeline still wasn't up to speed in terms of being able to
service the image-processing requirements for screens of 36 feet and up, the
standard within the exhibition sector. True cinema-grade photo-realistic
image processing can squeak by with 2k, but 4-to-8 or better yet 10k is
really necessary to play on the larger screens deployed in the exhibition
sector, as filling a 36-foot, let alone a 60 x 80 foot Imax-sized, screen with
clean photo-realistic electronic imagery requires more than 2k. So by the
mid 90s the 2k production pipeline was already operating with 2k proxies
"front-ending" a 4k processing-system, based on assembling on a file server
a so-called digital intermediate (DI) able to generate the 2k outputs
necessary to television signals, as well as the 4k-and-up formats necessary to
the large-screen-based components of the marketplace.
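
The economics of that proxy arrangement show up in a little arithmetic. The
sketch below assumes full-aperture scan dimensions of 2048 x 1556 and
4096 x 3112 and a simplified four bytes per pixel (real DPX packing
differs), but the fourfold gap it prints is the point: artists could work
interactively against 2k proxies while the full 4k digital intermediate sat on
the server.

# Back-of-envelope frame sizes behind the 2k-proxy / 4k-DI split.
# Assumes ~4 bytes/pixel for simplicity; actual 10-bit DPX packing differs.

BYTES_PER_PIXEL = 4
FORMATS = {"2k": (2048, 1556), "4k": (4096, 3112)}

for name, (w, h) in FORMATS.items():
    frame_mb = w * h * BYTES_PER_PIXEL / 1e6
    reel_gb = frame_mb * 24 * 60 * 20 / 1e3  # a 20-minute reel at 24 fps
    print(f"{name}: {frame_mb:6.1f} MB/frame, {reel_gb:8.1f} GB per 20-min reel")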

                   Servers and Network Infrastructure

       In 1996 SGI assembled the first server-based "non-linear" editing
network by attaching eight Avid workstations to a central file-server able to
ingest, distribute and play back-to-air up to HDTV-compatible video signals
or broadcast-quality 2k computer-graphics. This was merely the application
of the 2k-production pipeline to the broadcast sector. Operational versions of
this first video-compatible electronic image processing network were
installed at CNN's Financial and Headline News Networks as well as other
high-profile broadcast newsrooms in both the US and Western Europe. And
the corporate exploitation of the 2k-production environment by the broadcast
sector, a process greatly accelerated by the dot.com investment boom,
picked up speed on a worldwide basis.

       Also in 1996 Softimage DS became the world's most comprehensive
non-linear production system for creating, editing and finishing films,
television programmes and video games. The system's open architecture
enabled users to seamlessly integrate picture and audio-editing, compositing,
painting, image treatment, special effects, character generation and project
management into one 2 and 3d operational environment. Digital audio
editing and non-compressed images met painting, compositing, titling,
image treatment and special effects. Softimage, then owned by Microsoft,
not surprisingly offered this high-end non-compressed post-production
system on Windows NT.

        Jurassic Park's non-animatronic dinosaurs were assembled on SGI
workstations costing tens of thousands of dollars. Due to the high cost, only
large facilities with deep pockets or connections to major technology
providers could afford what at the time were the apex creative workstations.
However this began to change with the migration of the key 2 and 3d
applications to the environment provided by Intel's Pentium processor and
Windows NT, so that by the late 90s a completely new set of visual effects
and animation houses emerged based on the Pentium, while at the same time
the always insightful Steve Jobs moved Apple aggressively into the growing
marketplace for image processing workstations based on mass-produced and
thus cheaper consumer technologies. So while there are still effects facilities
working on a variety of platforms, SGI included, most have moved onto
either Mac or PC based workstations, as these mass-produced consumer-
electronics are much cheaper than the specialized mini-computers
specifically designed for computer-graphics such as SGI's.

       In 1998 Avid Technology acquired Softimage from Microsoft and
"Sumatra," the world's first nonlinear animation system, was introduced,
while Alias Wavefront introduced a new 3d flagship product branded as
Maya Unlimited, incorporating Maya Cloth, Fur, Live, and Power
Modeling. The upgraded software environment was used by ILM in Star
Wars: Episode I, and ILM also invested in Maya "seats" for all the technical
directors, designers and computer-graphic artists working in its production
department. Also in 1998 Advanced Visualizer was acknowledged by the
Academy as the first commercial software package for modeling, animating
and rendering adopted into widespread use to create digital images of
sufficient quality for the film industry. PowerAnimator became the
benchmark for modeling tools, with a major influence on the visual effects,
animation and video game industries in general.

       Discreet Logic, also based in Montreal, was established in its present
format in 1999 when Autodesk merged it with Kinetix, another important
American developer of CAD software. Autodesk was already a major source
of computer-aided-design and digital content creation systems that "enable
our customers to drive their particular production operations forward by way
of improving their power to design." Discreet's range of cost-effective
systems and software for digital media creation, management and delivery
also crossed all disciplines and platforms, from visual effects and editing to
animation, game development, web/interactive design, and computer-aided
design in general. Discreet is a division of Autodesk, one of the world's
leading suppliers of digital media creation and management systems, with
six million customers in 160 countries.

       By now the corporate exploitation stage of the computer-graphics
revolution was well underway, and the providers of the main technology
groups (workstations, servers and network infrastructure) were organized in
varying degrees of vertical or horizontal integration. Meanwhile within the
broadcast sector the server-based approach to radio and television
production was spreading. For instance in 2001 Sveriges Television selected
an SGI system for the Swedish public broadcasting company's news-
production facility in Stockholm. The facility was designed around a video-
server functioning as a central file storage system and acting as a hub
feeding servers located in the other operational components of the network.
SVT quickly expanded the network to include its news, sports and current-
affairs departments, and the network was extended to 10 newsrooms in
Stockholm and 10 regional production centers, all inter-connected by either
fibre-optic or microwave transmission systems.

       The server-based approach to radio and television production involves
equipping each editing room with a PC-based non-linear editing system or
image and audio processing workstation, allowing the import and export of
video and audio files to and from network central, for archival storage and
playback, by way of a network of file-servers operating in a distributed
format. Each operational location had pairs of mirrored servers, one for
storage and another for image and audio processing, both backed up by a
super-server for archival purposes and playback-to-air at the network level
of the system. SVT also upgraded and automated its IBM robotic tape
archive by integrating a "media asset management system" allowing users to
search for meta-data and keyframes in feeds, browse the archive and
ingested material, and perform fast edits on all network-compatible
desktops, with capacity for 6,000 hours of high- and lower-resolution video
on a network-wide basis.
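
In miniature, a media-asset-management layer of this kind is a metadata
index over the archive: editors query keywords, browse low-resolution
copies, then pull the full-resolution masters they need. The sketch below
invents its own tiny schema (the clip IDs, keywords and fields are
hypothetical, not SVT's) simply to show the shape of the idea.

from dataclasses import dataclass, field

# Toy media-asset-management index: clips carry metadata, editors search
# it, then conform against the full-resolution masters. Hypothetical schema.

@dataclass
class Clip:
    clip_id: str
    keywords: set = field(default_factory=set)
    resolution: str = "low"   # browse copies vs. full-resolution masters
    duration_s: int = 0

ARCHIVE = [
    Clip("news-0412", {"election", "stockholm"}, "high", 95),
    Clip("sport-0413", {"hockey"}, "low", 120),
    Clip("news-0414", {"election", "debate"}, "low", 60),
]

def search(archive, keyword):
    """Return clips whose metadata matches; editors browse these first."""
    return [c for c in archive if keyword in c.keywords]

for clip in search(ARCHIVE, "election"):
    print(clip.clip_id, clip.resolution, f"{clip.duration_s}s")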

        In 2002 Alias Wavefront upgraded Maya for the Windows, Irix,
Linux and Mac OS X operating systems. Maya 4.5 boasted improvements
such as Maya fluid effects, allowing users to create atmospheric effects such
as billowing clouds and cigarette smoke; thick viscous liquids such as mud
and lava; pyrotechnics such as fire, explosions and nuclear blasts; and space
effects like comets, nebulae and meteors. The addition of a new ocean-
shader, developed to produce The Perfect Storm, allowed the creation of
realistic oceans through a displacement and shading technique, and the
ability to create buoyant objects that react to the simulated water movement.
The pattern of the digital production system being expanded by specific
leading-edge production units engaged in breakthrough productions, from
both aesthetic and economic points-of-view, continued to unfold.
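
Displacement-based ocean shading of this general kind can be boiled down
to a height field summed from travelling sine waves, with buoyant objects
sampling the surface height beneath them frame by frame. The sketch below
is a toy version built on that assumption; the wave table is invented, and the
actual Perfect Storm shader was far more sophisticated.

import math

# Toy displacement ocean: a height field summed from travelling sine waves,
# with a "buoyant" object riding the surface. Invented wave parameters.

WAVES = [  # (amplitude, wavelength, speed, direction in radians)
    (0.8, 12.0, 1.2, 0.0),
    (0.3, 5.0, 2.0, 0.7),
    (0.1, 2.0, 3.5, 1.9),
]

def height(x, y, t):
    """Displaced ocean height at point (x, y) and time t."""
    h = 0.0
    for amp, wavelength, speed, angle in WAVES:
        k = 2 * math.pi / wavelength
        phase = k * (x * math.cos(angle) + y * math.sin(angle)) - speed * t
        h += amp * math.sin(phase)
    return h

# A buoyant object simply inherits the surface height under it each frame.
for frame in range(5):
    t = frame / 24.0
    print(f"frame {frame}: buoy z = {height(10.0, 4.0, t):+.3f}")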

       Maya Complete included the ability to convert from subdivision
surfaces to NURBS. Smooth proxy tools were added, allowing rapid
building of high-resolution geometry with a low-resolution polygon proxy
cage. For the first time Maya was offered on all supported hardware,
allowing users running on different platforms to exchange files. In 2002
Softimage released its XSI digital-production-environment with over 2,500
new features and enhancements, providing artists with the fastest tools and
workflows available for producing high-quality 2 and 3d as well as 2 and 4k
imagery, as the 2k production pipeline reached operational status on a
network and worldwide basis.

       Interactive rendering, still a bottleneck in the 4k-production process,
took a massive leap forward with the employment of the Mental Ray
rendering powerhouse that underlies the XSI animation system. As the push
for open systems and inter-operability increased, Softimage announced tool
sets for transporting data between various different 3d applications. This
added support for Alias Wavefront's Maya, 2d3's Boujou, Kaydara's
Filmbox and Motionbuilder, as well as the Half-Life, Unreal and Lithtech
game platforms, mainly by way of MilkShape. These developments were all
aspects of Softimage's "connect" initiative, designed to streamline the 2 and
3d image processing and 2 and 4k data transmission systems via the dotXSI
file format; or, put another way, to allow users to move 2 and 3d data sets
into or out of the various software environments and operating systems
comprising the 2k production pipeline, the 2k format compatible with the
broadcast sector while the underlying 4k format was still necessary to the
higher-resolution environment required by the film industry.
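
The essence of an interchange format like dotXSI is that a scene graph
(hierarchy, transforms, camera) survives a round trip between packages. The
sketch below uses JSON as a stand-in for the real dotXSI template syntax,
with invented field names, purely to demonstrate that round-trip shape.

import json

# Toy scene interchange: flatten a scene graph into a neutral file another
# package can re-import. JSON stands in for the actual dotXSI templates,
# and all field names here are hypothetical.

scene = {
    "camera": {"position": [0, 2, -10], "fov": 35.0},
    "root": {
        "name": "soldier",
        "transform": [0, 0, 0],
        "children": [
            {"name": "torso", "transform": [0, 1, 0], "children": []},
            {"name": "head", "transform": [0, 1.6, 0], "children": []},
        ],
    },
}

def export_scene(scene, path):
    with open(path, "w") as f:
        json.dump(scene, f, indent=2)

def import_scene(path):
    with open(path) as f:
        return json.load(f)

export_scene(scene, "shot01.interchange.json")
roundtrip = import_scene("shot01.interchange.json")
assert roundtrip == scene  # hierarchy and camera survive the round trip
print("re-imported", roundtrip["root"]["name"],
      "with", len(roundtrip["root"]["children"]), "children")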

       Kaydara, another important Montreal-based software-provider,
introduced a free tool that connected XSI to their products. This provided
users with a path to map animation from any "motion capture" source to
characters and environments created in XSI. In July of 2002, Softimage
announced a new portfolio of third-party plug-ins from key industry
providers including Psyclone and Psunami, RealFlow fluid dynamics and
software from Digital Video, Okino, Right Hemisphere and Vicon. Maya
users also got access to a public beta of dotXSI xchange, allowing users to
export content created in Maya to the XSI production environment. The
dotXSI xchange format was created in response to customer requests and
addresses select classes of 2 and 4k pipeline conversion needs.

       The dotXSI file format was so flexible it could also export a scene
with the camera data already calculated and all of the scene geometry set up
in a hierarchy, ready for the animators to get on with their work. Softimage
announced tools to allow XSI users to quickly and accurately re-construct
lighting conditions from live-action sets into 3d computer graphics. The
process provides for more realistic simulation of natural and outdoor lighting
in 3d scenes and reduces production times and costs within the 2 and 4k
sides of the system. Here some of the last bottlenecks in relation to
assembling perfect photo-realistic background “plates” were being engaged.
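
One common way to reconstruct set lighting for computer graphics is to
photograph a light probe on set and reduce it to a handful of directional CG
lights. The Python below sketches only that final reduction step over an
invented table of probe samples; real pipelines work from HDR probe
images, and the technique shown is a generic one rather than Softimage's
specific tool.

# Toy set-lighting reconstruction: keep the N brightest directions measured
# by an on-set light probe as directional CG lights. Invented sample data.

PROBE = [  # (elevation_deg, azimuth_deg, brightness) samples from the set
    (60, az, 0.2 + (1.5 if az == 135 else 0.0)) for az in range(0, 360, 45)
] + [
    (15, az, 0.4 if az < 180 else 0.1) for az in range(0, 360, 45)
]

def dominant_lights(probe, n=3):
    """Pick the n brightest probe samples as directional light sources."""
    return sorted(probe, key=lambda s: s[2], reverse=True)[:n]

for elev, az, intensity in dominant_lights(PROBE):
    print(f"light: elev={elev:3d} deg az={az:3d} deg intensity={intensity:.2f}")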

       In 2002 Avtoma, the largest producer of visual effects in Italy, chose
to standardize exclusively on the XSI software as its production
environment. With the opening of a consulting office in Los Angeles, a
production research and development group in Montreal, and a creative
production team in Milan, Avtoma and Softimage were positioned to
continue the application of XSI as the standard high-end electronic-image-
processing system. Softimage and Avtoma were clearly positioned to apply
their creativity and expertise, and the mature 2k-production pipeline, across
their Los Angeles, Montreal and Milan operations by way of a network-
based production strategy of some kind.

       Continuing the pattern of establishing research facilities to support the
further growth of electronic-image-processing systems in general, in July of
2002 Alias Wavefront announced a custom software development service
inspired by its long-term relationship with DreamWorks Animation. The
research facility involved Alias Wavefront in engineering custom Maya
extensions and new workflow tools for the high-end computer graphics
industry, a for-hire facility available to service west coast film and game
production interests. The mandate "was to provide long-term custom
engineering support for those production facilities where Maya plays a
central role in the computer-graphics pipeline. Through the direct
modification of the Maya source code the design team will engineer new
features and extensions to Maya, assist with production pipeline integration,
and provide custom technology innovations for clients in the gaming sector
and film industry," as the facility's executive director pointed out at the
time. "But engineering services are just one of the products available from
the research centre; others include training, dedicated support, and workflow
process development targeted to both entertainment and gaming industries
and design customers in general."

        What was happening was that the elements of the production sector
involved in computer-graphics and special effects were themselves
"morphing" into key components of the enhanced computer-services sector
in general and the media production sector in particular. So as we moved
towards the 21st century the 2k image processing systems began migrating
into the production and distribution systems in general. What was perhaps
most interesting about the migration of the chipsets and applications out of
the computer-graphic sector and into the production sector in general is that
it involved a transfer of digital technologies from a small but upscale part of
the production sector into the main elements of the media production and
broadcasting sectors, a very large but at that point downscale and backward
component from an engineering point-of-view. So in making this transition
the chipsets were leaping from the high-cost ghetto of the enhanced-
computer-services sector into the components of the production system
involved in post-production, packaging and distribution, and thus
broadcasting and e-publishing in general.



       This was important from an economic point-of-view, as it meant that
the relatively limited market for high-end image processing systems
provided by the computer-graphics industry during the opening stages of the
process was being expanded to include the market comprised by the film,
television and video game production, distribution and e-publishing sectors,
a much larger market for these technologies than the computer-graphics
industry. Consider, for instance, the fastest-growing component of the
transactional marketplace established on the Internet: it was the application
of file-servers, which enabled the "streaming" of video signals on a mass-
market basis, that allowed the estimated 5-billion-dollar-a-year worldwide
market for streamed pornography.

       Not surprisingly perhaps, given the true nature of the soft-power
strategy and how it attaches itself to culture in general, pornography has
always played a leading role in the evolution of the entertainment industry
from both economic and engineering points-of-view. Since the printing
press, almost every new invention has been embraced, improved and/or
downright created by pornography. If necessity is the mother of invention,
porn is its sleazy uncle.
this as an engine of technological growth," says Jonathan Coopersmith, an
associate professor of history at Texas A&M and an expert on porn's effect
on innovation. And where would we be without porn? The new gadgets and
technologies still would be around, says Coopersmith, but fewer people
would have them, fewer people would be willing to try them, and they'd be a
lot more expensive. That means you probably wouldn't be enjoying high-
speed Internet, a digital camcorder and that $40 DVD player.

        Here's how "adult entertainment" over time has fanned the flames of
innovation. The printing press - Gutenberg probably didn't use his press to
print a Middle Ages version of Penthouse, but not long after movable type
arrived, folks did begin using it for less-than-holy texts. And what exactly
were they? "Who could read at the time? You're talking about mostly literate
people," says Coopersmith. "It's fairly refined. It would be what we call
erotica." Photography - How long after the invention of picture-taking
around 1840 did it take for naked people to get in front of a camera? "Right
away," says Coopersmith. Dirty pictures widened porn's audience, but just
as significant, says the professor, is that pornographers bought lots of
equipment and supplies -- glass plates, cameras, chemicals. That sped up the
profitability of the photo industry, accelerated innovation and widened the
market.



       Motion pictures - The first porno movie was made in 1896, just two
years after moving pictures were introduced to the world. Much like still
photography, "stag films," while on the fringes of society, fueled the market
for photographic materials such as cameras, projectors and film stock. They
also helped lead to the development of 8 mm film, which folks could use
affordably for home movies. The Polaroid camera - Polaroid meant your
nosy druggist no longer had to see your pictures, and that meant you could
take shots of anything (wink, wink). Although it named its first low-cost
model "The Swinger," Polaroid didn't advertise the obvious naughty uses of
its product; it didn't have to. As with many new technologies, early Polaroid
cameras and film were pricey, but do-it-yourself pornographers were eager
to pay, paving the way for cheaper versions.

       The VCR and camcorder - Starting with home video, pornography
began leading the innovation, not following, says Coopersmith. Early VCRs
and video cameras were expensive, but porn users and people who wanted to
make their own "art" were willing to pay. Through the 1980s, half the
videocassettes on the market were adult, according to the professor. That
helped turn the $1,000 VCR in 1978 into the $200 VCR in 1988. You also
can thank pornography for lots of the cool features on your camcorder,
including low-level lighting capabilities. The Internet - Welcome to porn
paradise, where hundreds of thousands of porn sites cater to every taste you
can imagine -- and plenty you can't. Porn has driven almost every innovation
on the Net, says Coopersmith, from online payment services for streamed
video to Web casting. It's also responsible for much of the rush to high-
speed connections.

       So as goes pornography, so goes technology. The concept may seem
odd, but history has proven the adult entertainment industry to be one of the
key drivers of any new technology in home entertainment. Pornography
customers have been some of the first to buy home video machines and
DVD players and to subscribe to high-speed Internet. One of the next big
issues in which pornographers could play a deciding role is the future of
high-definition DVDs. The multi-billion-dollar adult film industry releases
about 11,000 titles on DVD each year (mainstream Hollywood, eat your
heart out), giving it tremendous power to sway the battle between the two
groups of studios and technology companies competing to set standards for
the next generation of HD-DVDs.




       "It's sort of like the buzz around the campfire," said Peter Warren,
DVD editor at industry bible Adult Video News. One side of the divide is a
standard called Blu-ray backed by consumer electronics heavyweights like
Sony Corp. Philips Electronics, Thomson and movie studios Fox and
Disney. Blu-ray offers storage up to 50 gigabytes, enough for nine hours of
high-definition content. On the other side of the fight is HD-DVD, which
has much the same structure as current DVDs and, backers say, is cheaper
and easier to manufacture as a result. Supporters of the disc format and its 30
gigabyte capacity include NEC, Toshiba Corp. Warner Home Video and
very recently Microsoft.

       And long before the Hollywood majors took positions on the coming
HD format the pornographers weighed in. At last January's Adult
Entertainment Expo, which runs parallel with the Consumer Electronics
Show, the pornographers were demonstrating the same technologies, usually
at least a generation ahead of the mainstream producers. Sentiment about the
format rivalry varies, depending largely on the size of the porn producer. But
like all straight filmmakers, adult film producers want the higher-quality
picture as well as extra space for "creative expression" (wink, wink, nudge,
nudge), perhaps by giving viewers a fetish-based choice of camera angles.

       What's next? "Goodness knows," says Coopersmith. After years of
studying pornography's influence on technology, he is certain about one
thing: "Human behavior is stranger, far, far broader, than I would have
imagined before I started this research." For instance late in 2004 the adult
film company New Frontier Media and wireless content provider Brickhouse
Mobile started offering cell-phone "moan tones." That means you'll be able
to replace your boring ring tone with the actual steamy sounds of a
hyperventilating adult-film star (sure to be a hit when your phone goes off at
the library). Porn on cell phones is hardly a surprise. And clearly the whole
idea of "stroke TV" takes the concept of interactivity to a new level.

       It was also the server-based approach to interactive broadcasting or e-
publishing, when combined with audio compression technologies, that
enabled the various peer-to-peer downloading strategies causing the music
industry so much grief due to piracy. So the impact of servers and network
infrastructure within the transactional sides of the electronic-distribution
sector by the turn of the millennium was already leading the evolution of
both the production and distribution sides of the sector from an economic,
and perhaps even more shockingly, from an aesthetic point-of-view. And
this trend is going to accelerate as the cost of these various image, audio and
data-processing and display systems continues to decline, while their power
and processing capacities, apparently subject only to the limitations of
Moore's Law, continue increasing. It's important to acknowledge at this
point that we're still only in the opening stages of this process from
engineering and economic, and perhaps more frighteningly even from
aesthetic and the even more deeply rooted cultural, points-of-view.

       All of this suggests that the trend towards digital workstations and
network-based production systems visible within CNN in the mid 90s was
spreading within the production sector on a worldwide basis, as the
economies of scope and scale available to the providers of the strategic
chipsets and applications impacted the production and distribution sectors in
general. By now the 2k-production pipeline was being marketed around the
world by the technology-providers, apparently astride the broadcast sector's
inevitable push towards HDTV and IP based production and distribution
systems of all kinds; a process that will accelerate as the analogue
components of the media sector in the film industry and broadcast sector,
with eyes still firmly clamped shut and with panic rising in their throats, bite
the bullet and start opportunistically groping for the interactive and
transactional formats now coming on stream worldwide via the Internet.




                     Production Network Experiment




      The deployment of broadband networks and high-capacity servers will
      permit a degree of coordinated production management that has the
      capacity to erase the distinction between local and remote activity.
      High-speed links will permit the instantaneous sharing of digital
      files… For NFB productions taking place outside Montreal, this
      means that the time it will take to get comments or approval will be
      the same as it is for people who can just walk down the hall. It means
      that personnel in different locations will actually be able to work on
      the same production... The converging and consolidating media
      environment will change the way audiovisual works are made and
      distributed, not just in Canada but around the world. Small companies
      and individual artists will face even greater odds in dealing with large,
      internationally oriented production and distribution companies.

                                          NFB Forward Looking Plan 1999


                   9/11 and Network based Production

       Video transmission for review and approval, digital dailies and real-
time collaboration are subjects that have been suggested, debated and under
discussion within the film and television production sectors since the early
'90s. The promise was always there, but a combination of technical and
economic factors kept these concepts from realizing their full potential. Over
the past few years, though, the technologies improved and became more
economical as Moore's law continued to apply within the hardware sector,
and since the 9/11 attack the dialogue has heated up considerably, with some
companies such as MidNet and Picture Pipeline, both located in Vancouver,
landing key contracts with runaway American film and television producers.
The idea was to assemble network infrastructure able to interconnect the
various necessary components of the production sector on a worldwide
basis, mimicking the operations of the Internet, backed up where need be by
high-capacity proprietary networks; what correctly should be understood as
a production network of some kind.

       Following 9/11 all the Internet-based video delivery services reported
a sharp increase in customer demand. People were simply less willing to
travel, wanting to stay cocooned closer to home, and the ability to work from
anywhere (including from home) while reducing always expensive and
stressful travel was a key selling point for these services. Some new
customers were merely trying to solve immediate production problems. For
instance, when FedEx was still not operating, Vancouver's MidNet set up a
method for digital file transfers from runaway productions then underway in
Vancouver to their head-offices and post-production facilities in Los
Angeles. Similarly, Picture Pipeline delivered motion picture dailies from
Vancouver to Los Angeles. So once again another leap forward in terms of
the deployment of network infrastructure into the productive and services-
related sides of the film and television industries was being guided, directed
and its key engineering solutions made operational by corporate interests
involved primarily in the production-related services industries in Canada.

       So it turns out that a lot more than just movies and television
programs have been created in Hollywood North. In 1999 the services
components of the film industry in Vancouver gave birth to the Middle
Network, which its founders hoped would become a sort of commercial
alternative to the Internet. The idea for the Middle Network was actually
launched through a consulting contract granted by Technicolor and a number
of Hollywood-based studios working in Vancouver wanting to "digitize"
their workflows. Tilo Kunz, co-founder of MidNet, said the production
studios told him that while they liked the neutrality of the Internet, and that
it allowed them to talk to each other and was affordable, they felt it lacked
security and still couldn't give them the on-demand reliability they needed.

       Studios can have $150,000 a day tied up in production for a single
project, so the process needs to work smoothly. With a single film weighing
in at several terabytes uncompressed, Kunz said that at that point the Internet
just wasn't feasible, nor was a private point-to-point connection affordable.
"If the Internet wasn't working and they were relying on it that would cost
them heavily," said Kunz. "They wanted things the Internet could do, and
things it couldn't do, but that some networks should be able to do, namely
deliver advanced communications services between Vancouver and LA
reliably, affordably and securely." Working with the necessary hardware and
software providers and telecommunications interests, MidNet arranged for
access to the local loops and inter-exchange bandwidth. Then they installed
equipment in the customers' premises and in a neutral co-location facility.
Minimum bandwidth started at OC3, or 155 Mbps, and went up from there
in multiples of OC3, dictated by demand. "It was our equipment inside and
out, and it's the bandwidth that is private to us but still owned by a telco,"
said Kunz. "We're not the whole solution, we're just the data transport. We
work with companies that provide applications services."
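
Kunz's point is easy to sanity-check with back-of-envelope arithmetic. The
sketch below assumes the full 155 Mbps of an OC3 is usable (real protocol
overhead would make matters worse): an uncompressed feature is a multi-
day transfer, while an hour of compressed dailies moves in minutes, which
is exactly the split between what these networks carried and what still
shipped on film or tape.

# Back-of-envelope transfer times over one OC3 (155 Mbps line rate,
# ignoring protocol overhead; real throughput would be lower).

OC3_MBPS = 155
film_tb = 2                                   # "several terabytes uncompressed"
bits = film_tb * 1e12 * 8
hours = bits / (OC3_MBPS * 1e6) / 3600
print(f"{film_tb} TB over one OC3: about {hours:.0f} hours")   # ~29 hours

# Dailies are far smaller: an hour of MPEG-2 at ~25 Mbps is ~11 GB,
# which one OC3 moves in under ten minutes.
dailies_gb = 25e6 * 3600 / 8 / 1e9
minutes = (dailies_gb * 1e9 * 8) / (OC3_MBPS * 1e6) / 60
print(f"one hour of 25 Mbps dailies: {dailies_gb:.1f} GB, "
      f"about {minutes:.0f} minutes to send")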

       MidNet launched in August of 1999 with a contract from the
Vancouver branch of Technicolor Creative Services, a major services and
post-production company, to transmit data between TCS and a Vancouver
motion picture production complex. "The film industry is a key for us
because it represents probably the world‟s most demanding user of data
services," said Kunz. "There are very high expectations, very little tolerance
for failure, and they move an awful lot of data." Another Vancouver post-
production company, Rainmaker, also started using the Middle Network on
television shows like Stargate SG-1 and Smallville. Rainmaker president
Barry Chambers said the firm used the network to transmit data between
Vancouver and Los Angeles. "We‟re always looking at new technology,"
said Chambers. "It‟s something new out there, we thought we‟d give it a go
and see how it works."

       Chambers said Rainmaker had used the Internet for data transmission
in the past, as well as private networks that require an application on each
end. MidNet's use of fibre meant that wasn't necessary on the Middle
Network. "We were attracted to the faster speed, and the security of a
protected network," said Chambers. "If we're sending hour-long dailies to
Los Angeles we don't want to lose any information, and we need high-
speed." In addition to the film industry, Kunz said MidNet was also soon
working with the oil and gas sector and the Provincial government, not
surprisingly focusing on medical-imaging and the health-care field in
general.

       MidNet started out mainly servicing runaway American productions
on location in Vancouver but controlled and post-produced from LA, but the
network-based approach to production management and digitized workflows
was soon popping up all over the continent and in fact around the world. For
example MidNet was soon offering its networks to connect post facilities,
edit suites and executive offices in New York, Los Angeles and Canada, and
offered production tools and services of various kinds including high-speed
delivery of pre-digitized files, nonlinear viewing of digital dailies, live video
collaboration and the ability to digitize and share edited work in progress. To
enable these applications, MidNet connected the Canadian-based facilities
Rainmaker Digital, Toybox, Toybox West, Medallion/PFA, Manta Sound,
Manta DSP, AlphaCine and Eyes Post Group to the production network on a
per-production and continental basis, and via the Internet on a worldwide
basis.

       In the case of the Smallville television series, for instance, the
production was based in Vancouver, while the writing and review of visual
effects and postproduction took place at multiple locations in Los Angeles.
The telecined dailies were digitized as both Avid OMF files and MPEG-2
files and, transmitted from Rainmaker over MidNet's high-speed connection,
were accessed by editors at Warner Bros. in Burbank, the WB dub room,
and the production offices of production company Tollin Robbins in North
Hollywood. The material was also received by Blake McCormick, VP of
postproduction, Warner Bros. "In Smallville, we had an obvious location
challenge," said McCormick. "By shortening the distance between
Vancouver and Los Angeles, MidNet saved us much needed time throughout
the production and postproduction processes."

       Similarly, Gilmore Girls and Maybe It's Me were digitized and e-
published by Kodak's Laser Pacific, and sent to producers and editors of the
WB series. For production of The Education of Max Bickford, MidNet
linked Silver Cup East in New York with the editing rooms and production
executives on the Fox lot in Los Angeles. The connections enabled the
editors to send rough cuts from the editing suite to MidNet's server, where
executive producer Rod Holcomb (in New York) and executive producers
Nicole York and Don Prestwich (in Los Angeles) could review and approve,
and collaborate online with the editors in real time as they worked through
the creation of the final version of the show. Max Bickford producers also
used Media.net video conferencing to conduct table reads with actors in
New York. Rich explained that Media.net's private carrier typically operates
at OC3 speeds (155 Mbps) or higher.

       Meanwhile Picture PipeLine was also making strides in the motion
picture arena, signing deals to provide secure broadband networks and
digital production tools to the 20th Century Fox features Behind Enemy
Lines and The First $20 Million. Both installations, while vastly different in
set-up, reduced unnecessary production costs and turn-around times.
Additionally, Charlie Mitchell, VP of sales and marketing, reported the
company installed a connection for use on Matrix II, enabling the production
to send digital dailies from the shoot in Australia to Los Angeles for the
visual effects team. With core compression technologies developed by
TRW, America's foremost manufacturer of spy satellites, in collaboration
with Warner Bros., Picture PipeLine now provides a high-speed digital
network to securely stream video for real-time collaboration, file transfer
and annotation to selected locations all around the world.

       For Behind Enemy Lines, Picture PipeLine assembled a broadband
network that connected Fox Films' visual effects senior VP Rich Thorne
with seven Los Angeles-area locations including Pacific Title, Encore
Hollywood, Reality Check, Digital Filmworks, Churchill and the Fox studio
lot. Thorne used Picture PipeLine's file transfer, synchronized playback and
annotation features. Said Thorne, "I was set up to monitor the visual effects
for the two separate films. The effects houses were situated in Los Angeles
and Canada with daily downloads from each facility. We have been able to
increase the speed at which shots are finalized. This allows us to save the
studios serious money." Production of The First $20 Million also employed
Picture PipeLine's encryption and file transfer technologies to connect
production staff at DKP in Toronto with the Fox studio lot in Los Angeles.
Mitchell noted that the system is carrier-agnostic, and users can rent the
necessary bandwidth by the month along with the use of the various
applications.

       Curtis Clark, ASC, who is also CEO of NeTune Communications,
sees production teams "reassessing how to have collaborative arrangements,
without having to be in [one] place." NeTune offers a variety of long-
distance applications, including real-time collaboration, video conferencing,
digital dailies and review and approval. Real-time collaboration also
occurred during the production of MGM's Hart's War, when NeTune
established a link between MGM in Los Angeles and scoring and ADR
stages at Goldcrest and Sony in London. Clark reported that the link was
frequently online for at least eight hours a day. Director John
Frankenheimer and DOP Stephen Goldblatt used NeTune for review and
approval of dailies for the HBO film Path to War. And Ridley Scott's team
on Black Hawk Down used NeTune for visual effects collaboration between
The Mill in London and Revolution in Los Angeles.



        These various production network related experiments soon generated
structural adjustments within the services industries in Vancouver, aimed at
expanding transmission capacity and extending wireless connectivity within
and around the city. For instance Universco Broadband and AirCelerator
Wireless recently merged to form MetroBridge Networks and, along with
the purchase of ZooLink, the merger consolidated three previous
competitors with more than 400 combined customers. MetroBridge's
redundant metropolitan-area microwave network delivers Internet
connections with speeds up to 100 Mbps and, in the urban core, up to 1
Gbps. The new company's consolidated coverage now includes most areas
of the lower mainland, extending from West Vancouver to Maple Ridge and
totalling more than 2,000 square kilometres.

       And as other production-related interests saw what was happening
between L.A. and Vancouver they began copying the approach and initiated
similar services all over the US. For instance DDB, a large Chicago-based
ad agency, relied on Telestream's Clipmail service to distribute production-
related signals on a worldwide basis. Harold Smith, digital systems manager
at the agency, reports that immediately following the 9/11 attack, when
production teams were unable to travel, the agency relied on Clipmail to
keep their activities on track and on budget. The network was used to send
and receive elements for a JC Penney campaign between the Chicago
agency, the Dallas-based production company and, via the Internet, between
the agency and the shoot in South Africa. Smith added that DDB also
increased its use of video conferencing in the wake of the tragedy.

       Steven Rich, senior VP of business development, MidNet, observed
another factor in the rise of interest in network or IP based production in
general. "Studios have beefed up their security substantially, making it more
difficult to get on and off lots, reducing the efficiency of delivering content
on tapes. One way to handle or reduce the amount of couriers and delivery
people on studio lots is to move content over networks between
workgroups... If [studios] want to eliminate tape from their life, they can
begin to," he said. "Sometimes it takes a significant event to get people to
consider new technologies or processes," Rich commented. "The fact that it
continues to get more difficult to physically move media around, certainly
will open up the marketplace for moving media on networks."

       While the events of Sept. 11 have certainly spiked interest in video
delivery services and video conferencing in general, there were already a
number of drivers which were impacting the sector, indicating that this may
be the start of wider acceptance. The first was the new economies of content
creation, explains Matt Peterson, president of information research firm
Scenic Wonders. "Factors ranging from increased international competition
in production and postproduction dollars, to economic pressures on content
distributors are driving content creators to be more cost-effective. Remote
delivery and collaboration can reduce expenses and enable local creators to
compete in a global market," he said. Second is the reengineering of the
content distribution business. "Television ownership groups are taking
advantage of the transition to digital to move away from the model of each
station being a self-contained replica of a fully-functioning broadcaster. Remote
operation and sharing of local/regional news, sports and other programming
will drive more video delivery. For similar reasons, video delivery is
emerging as a key component of digital cinema, the theatrical film industry's
version of DTV," Peterson said.

       Third is efficiency in the business world. "The corporate world has
long wrestled with the growing costs and wasted productivity of business
travel. Recent events (in relation to fraudulent and even criminal styles of
corporate governance within the media and telecommunications industries
within both Canada and the US) have only served to place more scrutiny on
business travel. Video delivery and conferencing will increasingly replace
travel while providing a richer experience than traditional audio conference
calls," said Peterson. These developments in combination have allowed a
number of production-network services providers to emerge as players on a
continental basis, including, needless to say, some operating within Canada.

        So the marketplace for production networks was soon becoming
crowded with new players. Editvu is a newer American online digital video
delivery application designed for reviewing, storing and distributing
streamed digital video, launched early in 2003. Current uses of Editvu
include: Uploading and reviewing short-term, low-memory, encoded digital
dailies and editing cuts; uploading, storing and organizing long-term, larger
memory digital video content as an asset management base; and delivering
vaulted digital video content on a subscription and/or pay-per-view basis.
The system was originally developed as a value-added service for clients of
Editvu's originator, Chick, Inc., a Hollywood based advertising and
marketing company that primarily served postproduction accounts.

       Once Editvu completed the system's beta testing, many Internet-related
companies, including Anystream, Archion, Broadwing, Callisto, Cisco,
Encode One, IntelliSpace, Qwest, Sprint and Terran, approached the
company concerning possible market alignments and product bundling. By
now Sony Pictures Imageworks, L.A. Digital Post, Keep Me Posted,
Moviola Digital Education Center and many services and post-production
interests are using or have used the system. Chick then allowed IntelliSpace
to be the default provider of the Editvu interface. The goal was to give
Editvu users greater flexibility and control over uploads and downloads of
digital content, reported Ciccarelli. Users could increase their bandwidth
from 64 kbps (fractional T1) to as high as 1.244 Gbps (OC-24) when they
needed it. Users were billed for the length of time they used it, and rates
varied with the level of bandwidth they required. IntelliSpace
operates in 15 markets around the world and has the largest networks in New
York and London, as well as southern California.

       Meanwhile, back where it all started in 1999, Middle Network in
Vancouver continues to go from strength to strength. For example, in June
of 2005 they announced that Digital Film Group utilized their network to move
sensitive film material it required to fulfill specialized services on behalf of
Intel Corporation. The footage was sourced from high definition master
tapes that could not be moved by courier due to piracy concerns. Instead,
Digital Film Group chose to transport the material over The Middle
Network, utilizing a private circuit between Lightning Media in Hollywood
and Rainmaker in Vancouver.

       According to Joe Alaniz, DFG's manager of technical services, "This
job could not have happened without MidNet. The content owner wouldn't
allow the high definition masters to be shipped anywhere because of piracy
concerns. Instead, we transferred the footage from Lightning Media's
Hollywood facility to a secure DFG server at Rainmaker in Vancouver.
After the high-speed transfer, our customer in Los Angeles departed with his
tapes, and we had the material we needed in Vancouver to finish our job for
Intel." Tilo Kunz, MidNet's president and CEO added, "This session
provided a good perspective on the ways a secure, private network like The
Middle Network can complement the public Internet. We're gratified by the
trust DFG placed in our network and the support of our partners at
Rainmaker and Lightning Media. It's good to see the network
commercialized in a cooperative effort like this." MidNet is the creator,
owner, and operator of The Middle Network, which provides high-speed
digital connectivity to companies and individuals who wish to create,
manage, and distribute valuable intellectual properties on a worldwide basis
in a secure and cost-effective manner.

       And not surprisingly SGI was also an early visionary of network
operability and supplier of video delivery systems, partnering, for example,
with Sprint's Drums collaborative system in the early '90s. SGI remains
committed to this vision. It offers customized systems, and continues to
work with video delivery application companies. For instance, SGI is in an
engineering partnership with Toronto-based JCI Corp., provider of the
Fireline digital collaboration tools. Together, the companies are developing
scalable broadband tools using the SGI platform. JCI and its technology
partners plan to leverage modular computing technology and SAN systems
from SGI to build uncompressed high-definition and real-time interactive
networking. The aim is to facilitate the transport of and collaboration on
resolution-independent files for the production and postproduction of visual
effects, animation and compositing. SGI leverages JCI Extranet, JCI's digital
collaboration platform, used by companies such as Kodak's Laser Pacific. In
other instances, SGI creates customized systems using its server and
workstation technology, and StudioCentral Library digital asset management
system.

       And Montreal is not without major players in the marketplace for
network based production systems and services. For example Miranda
Technologies recently launched Miranda Media Networks, a fully-owned
subsidiary, following a joint investment of $19 million by SGF, a division of
the Société générale de financement du Québec, BDC Venture Capital, a
division of the Business Development Bank of Canada and Investissement
Desjardins. "This placed Miranda in the interactive broadband delivery
markets for both services and technologies," says Miranda president
Christian Tremblay. "This includes applications such as broadcast, program
production, video conferencing, telemedicine, distance learning, and
telejustice."

      The company is already shipping its MAC 500IP multimedia access
concentrator designed to allow businesses to use their existing IP networks
for multimedia delivery in interactive applications such as
videoconferencing, distance learning, and telemedicine. The MAC 500IP
conforms to the latest LAN, WAN, and video standards to enable low-
latency video transport of MPEG-2 video, audio, and data over broadband
networks. It provides industry-standard MPEG-2 video signal compression
supporting all video frame-coding modes (I, IP, IBP, and IBBP), and
features a video bit-rate range from 800 kbps to 15 Mbps. Miranda recently
announced a joint agreement with Streaming 21, a Silicon Valley-based
developer of media streaming technology. The two companies have
expanded the reach of interactive MPEG-2 video communications by
streaming to personal computers and laptops, with additional applications
for archiving and video-on-demand.
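
       As a rough illustration of what those frame-coding modes mean in
practice (a generic Python sketch, not Miranda's firmware; the helper name
is invented), each mode simply fixes how many bidirectionally predicted
B-frames sit between the anchor I- and P-frames, which is what trades
bit-rate against end-to-end latency:

# Generic sketch only -- not Miranda's code. Expands the MPEG-2
# frame-coding mode names mentioned above (I, IP, IBP, IBBP) into the
# repeating display-order pattern of a group of pictures (GOP). More
# B-frames between anchors lower the bit-rate but add reordering delay,
# which is why a low-latency box lets the operator choose the mode.

def gop_sequence(mode, gop_len=12):
    """Return a display-order GOP for "I", "IP", "IBP" or "IBBP"."""
    if mode == "I":                       # intra-only: every frame stands alone
        return ["I"] * gop_len
    n_b = len(mode) - 2                   # B-frames between anchor frames
    seq = ["I"]
    while len(seq) < gop_len:
        seq.extend(["B"] * n_b + ["P"])
    return seq[:gop_len]

print(gop_sequence("IBBP"))   # ['I', 'B', 'B', 'P', 'B', 'B', 'P', ...]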

       The agreement features integration of Miranda multimedia multipoint
server and MAC 500 platforms for delivery of broadband multimedia with
Streaming 21's Media Platform, a technology for scheduled and live video
streaming. "Our joint technologies provide clients with real-time, broadcast-
quality videoconferencing at interactive participant locations, and enable a
much larger observer audience via desktop, portable computers, mobile
communications devices, and HD television monitors," said Charles
Voelkel, business development manager at Streaming 21.

       In May of 2005 Miranda announced its iControl monitoring and
control-over-IP system and Kaleido-Alto-HD multi-image processor. The
system allows complete monitoring and control over IP based distribution
systems for the network operations centers of cable, satellite, and IPTV
operators. Many new features have been added, transforming the system into
a high-level media asset management system specifically tuned to television
and radio or electronic-publishing operations of all kinds. The system
leverages industry-standard SNMP protocols for monitoring and control of
video, as well as all of the surrounding processing, transmission,
networking, and location specific technologies. A new feature allows third-
party applications to be hosted directly within iControl. Another innovation
is scripted macros that react automatically to system failures and guide
operators through complex diagnostic and repair procedures.
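
       The general shape of such a scripted macro is easy to sketch.
Everything below is invented for illustration (the device names, the stubbed
status poll standing in for a real SNMP GET, the canned repair steps); it is
not iControl's actual API:

# Illustrative sketch only -- not iControl's real API. A monitoring loop
# polls each device's health (in a real system, via an SNMP GET against a
# status OID) and, on failure, walks the operator through a canned
# diagnostic sequence of the kind the scripted macros automate.

import time

def poll_device_status(device):
    """Stub standing in for an SNMP query; always 'fails' our demo device."""
    return "fail" if device == "encoder-03" else "ok"

DIAGNOSTIC_STEPS = {
    "encoder-03": [
        "Check that input video is present on SDI port 1",
        "Verify transport-stream bit-rate is within the configured range",
        "Fail over to the backup encoder and notify maintenance",
    ],
}

def monitor(devices, cycles=1, interval_s=1):
    for _ in range(cycles):
        for device in devices:
            if poll_device_status(device) != "ok":
                print("ALARM:", device, "reported failure")
                for step in DIAGNOSTIC_STEPS.get(device, []):
                    print("  operator action:", step)   # guided repair
        time.sleep(interval_s)

monitor(["encoder-03", "router-01"])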

       All these features enable IP based broadcasters or electronic-
publishers to diagnose and repair complex systems on a production network
wide basis more effectively, dramatically reducing downtime. This is really
just the production network concept moving into a distributed-computing or
grid format and down-market into the broadcast sector in preparation for the
coming explosion of HD, IP and server-based broadcasting or electronic
publishing that is clearly just over the horizon.

       Miranda's Kaleido-Alto-HD 10-input processor is an HD version of
their successful existing multi-image display processor that accepts any HD
format and 525/625 SDI, as well as analog composite inputs. The system
provides a high quality DVI output with up to 1280x1024 pixels. It offers a
feature-rich display with embedded, AES and analog audio-level metering,
along with source IDs, tallies, aspect ratio markers, and clocks/timers. The
easy-to-use processor offers off-line layout editing with a full choice of
window resolutions, sizes, positions, and aspect ratios.

       In May of 2005 Miranda announced that its “channel branding” and
monitoring systems were key parts of a centralized and consolidated TV
presentation solution implemented by la Télévision de Radio-Canada. The
network has brought together automation, playout, master control, and
channel branding for all of its 10 regional stations and one specialty channel
at its Montreal headquarters. And once again the components of the media
production and broadcast industries located in Montreal were leading the
way on a national basis. "Centralizing operations in Montreal has brought
greater control to our playout operations and greater consistency to
individual channel broadcasts," said Daniel Guévin, CBC/Radio-Canada's
manager, Broadcast Engineering, Montreal. "With the introduction of this
new solution, we are making dramatic changes to la Télévision de Radio-
Canada's on-air presentation, with improvements including more
sophisticated graphics in program junctions and more engaging channel
branding."

       The network's centralized presentation operations integrate Encoda
station automation with Miranda's Imagestore branding processor and
VertigoXmedia's CG, which together enable just three operators to control
as many as 10 channels. The Imagestore processor features capabilities
including video and multi-group audio mixing for radio broadcasting, dual
3D-DVE effects, and the integration of multiple layers of computer-
graphics. Another major challenge for the network was effective monitoring
of the 11 independent channels generated from Montreal, as well as the off-
air signals from all of the network's remote facilities. Miranda Kaleido-K2
multi-image display processors perform local video and audio monitoring in
the Montreal control center. Close integration between these processors, the
automation, and the Omneon Spectrum media servers allows the operators to
monitor the current and next clips along with associated clip timing
information. The advanced monitoring wall also provides monitoring of
closed captioning, and extensive video and audio alarm reporting.

       Advanced remote monitoring of outgoing and off-air feeds from all of
Radio-Canada's French TV stations from Moncton to Vancouver is now
provided by Miranda's iControl system. Each regional transmitter's off-air
output signal is probed, and a thumbnail image is streamed back to
centralized operations in Montreal; if necessary, a higher-quality
full-frame-rate streaming signal can be pulled in on demand via the existing
network infrastructure. The iControl monitoring environment provides a "geographic"
graphical view of the system with streaming video presented on demand
from each regional transmission site.

      So Miranda Technologies is clearly another Canadian world-leading
manufacturer of video and audio equipment for the broadcast, production,
and post-production and electronic-publishing markets, with a product range
spanning interfacing and distribution, routing, master control switching and
channel branding, multi-image display processors, and monitoring over IP
systems. Headquartered in Montreal, Miranda also has major offices in
Springfield (N.J., U.S.), Wallingford (Oxon, U.K.), Paris, Hong Kong,
Beijing, and Tokyo. The company's products are also represented worldwide
by a network of professional distributors and dealers.

       And Miranda is not the only Montreal-based company with a position in
the international marketplace for film-production, broadcast and multi-media
related production and distribution technologies. In June of 2005 Avid
Technology, which owns Softimage, announced that Kraft Sports
Productions, the production company for the New England Patriots, had
deployed an Avid digital postproduction workflow to produce weekly
televised football programming. The new digital production environment
will be used to create a range of programs, including Patriots: All Access - a
weekly series broadcast on ABC-affiliate WCVB-TV.

       The end-to-end postproduction workflow centers around the
revolutionary Avid Unity MediaNetwork, a shared-storage system, and
connects an Avid AirSpeed system for ingesting media with Xpress Pro and
Media Composer Adrenaline systems for video editing and finishing. "We're
producing programs that showcase the extraordinary teamwork behind the
New England Patriots organization. Avid's tools allow us to cut production
cycles significantly and invest more creative time on delivering shows that
are sure to keep viewers on the edge of their seats," said Dave Mondillo of
Kraft Sports Productions.

        "Moving to a digital postproduction workflow can be challenging. To
do it right, we knew it would require a reliable technology infrastructure and
a well-trained staff of editors and producers who are up-to-date on the latest
production capabilities. We chose Avid systems because they give us the
performance and flexibility to do more than we ever imagined." With footage
and clips stored on the Avid network and server system, the new
postproduction workflow enables the team to take advantage of a shared-media
environment, allowing multiple editors in different locations to work
simultaneously and in real time. The new environment will eliminate the
need to shuttle hundreds of tapes back and forth to users, thereby increasing
efficiency and reducing the time it takes to complete projects.

      Dana Ruzicka, vice president of post solutions for Avid, said, "Kraft
Sports Productions is only the latest among a growing group of media
professionals who are migrating from standalone editing workstations to
collaborative workflows. With a new end-to-end postproduction
environment in place, Kraft Sports Productions is empowering creative
teams to easily share media between systems and locations, and to
simultaneously work on various elements of any program that they currently
have in production.”

       Since launching its DigiDelivery system, Avid Technology has seen
major music, post production, broadcast, film, and video game developers
around the world incorporate the network and server approach for fast,
secure exchange of digital files. Facilities that have adopted the Avid system
include Warner Brothers, Fox Film, Technicolor, Ascent Media, Universal
Music, SDI Media, and Skywalker Sound. "Harry Potter, Polar Express, and
the Matrix series all benefited greatly from the time we saved using the
Avid system," says foreign-language dubbing supervisor Ray Gillon.
"We're all doing better work - and getting more sleep."

       The system combines an ultra-simple software utility with an
industrial-grade server, giving users the ability to exchange any kind of
digital file of virtually any size with anyone in the world. The network is
faster, easier to use, and more secure than FTP, and it provides a more
affordable alternative to overnight delivery services. The recently released
software upgrade includes server-to-server relay, which accelerates the
delivery process by automatically transferring deliveries from the sender's
server to the recipient's. In practical terms this frees up workstations for
more creative tasks and extends the sender's deadlines.

       For example, a delivery sent late at night is still ready and waiting on
the recipient's network when they arrive in the morning. The software
enables users to create and manage multiple deliveries, add them to a queue,
prioritize them, and even quit and re-launch; the system will pick up exactly
where it left off. Users can name each delivery, add billing information,
view a history of recent deliveries, and save login information for their
frequently used accounts. "The system addresses the growing need for high
performance that lets you easily exchange any kind of file with anyone in the
world, but it is still secure enough for the major studios," says Gordon Lyon,
product manager for Avid. "By eliminating the need to ship disks and tapes,
it removes the analog bottleneck from the digital workflow."
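
       A toy sketch makes the queue-and-resume behaviour described above
concrete. None of this is Avid's code; the state-file format and names are
invented, and the "transfer" just advances a persisted byte counter:

# Toy sketch (not Avid's implementation) of a prioritized, resumable
# delivery queue: progress is checkpointed to disk after every chunk, so
# quitting and re-launching picks up exactly where the transfer left off.

import json

class DeliveryQueue:
    def __init__(self, state_file="deliveries.json"):
        self.state_file = state_file
        try:
            with open(state_file) as f:
                self.jobs = json.load(f)      # resume any unfinished jobs
        except FileNotFoundError:
            self.jobs = []

    def add(self, name, size_bytes, priority=5):
        self.jobs.append({"name": name, "size": size_bytes,
                          "sent": 0, "priority": priority})
        self._save()

    def _save(self):
        with open(self.state_file, "w") as f:
            json.dump(self.jobs, f)

    def run(self, chunk=1 << 20):
        # Lowest priority number first; checkpoint after each chunk so a
        # crash or quit never loses more than one chunk of progress.
        for job in sorted(self.jobs, key=lambda j: j["priority"]):
            while job["sent"] < job["size"]:
                job["sent"] = min(job["sent"] + chunk, job["size"])
                self._save()

queue = DeliveryQueue()
queue.add("reel-3-dailies", size_bytes=64 * (1 << 20), priority=1)
queue.run()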

       Digidesign, a division of Avid Technology, is perhaps the world's
leading manufacturer of digital audio production systems, providing the
professional music, film, video, multimedia, and radio broadcast industries
with tools for digital recording, MIDI sequencing, editing, signal processing,
and surround mixing. M-Audio, a leading provider of digital audio and
MIDI solutions for electronic musicians and audio professionals, is also a
business unit of Digidesign. Their products are marketed in over 40
countries worldwide through a distribution network of value-added dealers,
distributors, and OEM relationships.

       And Miranda and Avid/Softimage are not the only Canadian
companies with a position in the international marketplace for film-
production, broadcast and multi-media related production and distribution
technologies. For instance Leitch Electronics, founded in Toronto in 1971,
is more involved on the hardware side of things, like Miranda, and focused
more on the traditional elements of the broadcast sector. Leitch designs and
manufactures equipment to store, distribute, process, switch and route high-
quality video and audio signals by way of both wired and wireless
distribution networks. Leitch's product line is applicable to all aspects
of 2k electronic media production, distribution and content creation, from
desktop video to full-scale HDTV-compatible television studios. Leitch's
core product line includes non-linear editors and file servers unmatched in
terms of either price or performance, economical enough for corporate,
educational and industrial video production, while powerful enough for film
and video post-production facilities, major television networks and cable
companies.

       And due to the breadth of Leitch's product lines and solid market
position in Canada they have played a role as a provider of entire broadcast
environments rather than merely of specific lists of equipment or boxes of
various kinds. This is because Leitch has always worked closely with both
customers and suppliers to assemble complete packages of technologies in
integrated formats matched to the specific needs of their particular clients.
"We are one of the largest companies, globally, totally dedicated to the
professional television industry," indicated Margaret Craig, Leitch's
president and CEO, in 2001. "We expect to participate in a greater portion of
our customers' capital budgets as they continue to look for ways to
streamline their operations and enhance their quality and creative output…
Leitch has been providing products to the professional video industry for
over 30 years based on a reputation for stability, customer service and
innovative, dependable technology solutions. Leitch's customer base spans
the professional video market and more recently has been expanding to
include cable, satellite, telco and post-production companies.”

        For instance in 2001 Leitch was chosen by Panasonic to play a role in
their involvement in the Olympic Winter Games at Salt Lake City. Leitch
provided equipment to Panasonic, the games' "host" broadcast equipment
supplier. And Leitch also has important Canadian customers; for instance, in
June of 2002 TVOntario chose Leitch to provide the bulk of the
infrastructure needed for two new master control rooms to support TVO and
TFO. As TVO surveyed the marketplace for digital
servers, switchers, routers and other infrastructure products, they found
Leitch had products to fit into each of the necessary core technology groups.
TVO pulled heavily from Leitch's portfolio of solutions, including audio and
video distribution amplifiers, interface and conversion equipment, integrated
and multi-format routing systems and time reference products. At the core of
TVOntario's new master control infrastructure was Leitch's state-of-the-art
master control switcher used to control all the routers and servers
underpinning the production infrastructure in general. "We are pleased that
when successful broadcasters like TVOntario consider undertaking major
facility upgrades they look to Leitch as not just a supplier of boxes but as a
total project partner," said David Brown regional sales manager for Leitch.

       Leitch is able to take on the management side of technologically
upgrading the broadcast sector because they have for a long time been
actively involved in pre-competitive research and development projects of
various kinds with many of the world's main suppliers of broadcast
equipment in all the core technology groups. For instance Leitch's VR400
video server, co-developed with Sony, was the first broadcast server to
incorporate high-bandwidth centralized fibre channel storage, integrated
software and advanced multi-format video compression technologies. Or in
1999 Sony and Leitch collaborated to develop a comprehensive MPEG-2
based solution for all-digital acquisition, storage, and playback. This means
that from the very beginning Leitch has been forced by market conditions
and de-regulation in the US to be dedicated, perhaps at first unwillingly, to
open systems and multi-vendor inter-operability, which is also one reason
for Leitch's broad portfolio of engineering solutions and the various system
design and engineering services offered to their clients.

       And Leitch's commitment to pre-competitive research led to an even
more intriguing aspect of involvement with the public sector in Ontario, as
evident in their 2002 partnership in establishing "a multi-media and
communications laboratory" at the University of Waterloo, for which Leitch
provided the computing and test facilities needed to support research in
"multi-media communications and interactive production and distribution".
It should be remembered that the transition to digital is still only in its
opening stages, so the problem of seamlessly integrating these until now
discrete media from an engineering point-of-view, implicit to the vision
underlying the converged media sector, will involve solving many
engineering challenges, mostly resulting from the need to deploy a
production network based on open systems and distributed computing on a
network-wide basis. "The Leitch-Waterloo lab will be able to address some
of the key engineering issues by combining the best university and
industry based researchers in pre-competitive consortia working in a
coordinated manner on all of the core technology groups necessary to
network based production in general."

       During the industry-wide upswing fueled by the dot.com boom Leitch
expanded its investment in research and development, and after the crash,
like most technology providers they streamlined and outsourced their
manufacturing side for cost-effectiveness, a painful process that's still
underway. In June of 2005 Leitch announced it was outsourcing more of its
manufacturing work to SMTC Corp in a deal worth $20 million a year to
SMTC, which will provide complete manufacturing services to Leitch from
their new plant in Markham, north of Toronto, and their operation in
Chihuahua, Mexico. The deal comes only a month after Leitch had
outsourced to SMTC the majority of its Toronto-based manufacturing, a
move expected to trim Leitch's workforce by 75 to 100 people. The
outsourcing is part of a cost-reduction strategy at Leitch, which has been
looking to trim its manufacturing costs so it can concentrate more on product
development, sales and marketing. "Making the right decisions about which
processes we can do ourselves and which can be better done outside the
company (and the country) is a critical part of our drive to become a cost
leader," Tim Thorsteinson, president and CEO of Leitch Technology, said in
a statement. "SMTC demonstrated a solid understanding of our industry, our
business model and our business goals, and acted as a true partner in helping
us develop a plan to enhance the effectiveness of our supply chain and
achieve our gross margin improvement objectives."

       The contracts were welcome news to SMTC, a global electronics
manufacturing services provider which last month reported its 2005 first-
quarter loss widened to $2.6 million US. "We are delighted to be partnering
with a leading technology company like Leitch and to be supporting their
operational and financial goals," said John Caldwell, SMTC's president and
CEO. "Both companies are optimistic on the mutually beneficial aspects of
the partnership in the areas of Leitch's gross margin improvement goals and
in the opportunities for revenue growth for SMTC as the customer looks to
expand its outsourcing strategy beyond the scope of the initial phases,"
Caldwell added.

        And needless to say it is not only Canadian companies that are
interested in selling, installing and maintaining operational production
networks as the worldwide television industry, threatened with market
failure, jumps into IP-based and interactive projects of all kinds. For
instance in the US, on top of Technicolor's comprehensive
involvement with the film, broadcast and telecommunications sectors,
Telestream, a leading provider of IP-based media encoding and delivery
solutions, arrived at this year's NAB convention with its strongest presence
ever. They announced three major new product lines that bridge platform
and format gaps for broadcasters, media companies and content creators.
These exciting new products combined with the growing number of new
partnerships firmly establish the company's leadership position in the media
and entertainment industry.

       Telestream introduced FlipFactory HD, which simplifies the
conversion of multi-format source media to the new HD standard for
broadcasters and other content owners. Recognized for its depth and breadth
of broadcast format and system support, the system facilitates automated,
file-based exchange between leading broadcast media servers and
production systems, solving the same challenge for HD workflows.

       For remote news submission, they introduced a personal media
delivery application for Mac- and PC-based workstations. Priced under
US$500, the system provides videographers and journalists with a flexible,
affordable solution for transmitting news cuts or other edited material from
Mac or PC-based laptop or desktop video editing systems over any standard
or wireless IP network. On top of this, a new family of personal productivity
tools enables Macintosh OS X users to make, edit, and play Windows Media
from within QuickTime-based media applications. These products bridge
gaps between the Apple Macintosh platform and Microsoft Windows Media
format.

       These new products and partnerships demonstrate the important role
of bridging digital islands for digital media companies, content creators and
owners. Telestream products have set the standard for the encoding,
organizing, and delivery of digital media. Customers rely on their products
for convenient, cost-effective, and robust digital media access and exchange
over IP networks. The company's automation applications and smart media
organization solutions streamline workflows for broadcasters and media
companies, as well as content owners and content creators in government,
higher education, and corporate markets.

       So TV newsrooms around the world and across Canada are already
shunning traditional satellite and fibre-line feeds in favour of Internet
protocol (or IP) technologies to distribute video signals within their
production apparatus. "The advent of digital transmission to the newsroom is
as important as Telstar, the first satellite to transmit live across the Atlantic
Ocean, was in the 1960s,” says Robert Hurst, the president of CTV News.
“Now when we're ready to feed, we just feed. We don't need to stand in line
behind anybody,” he added. “And we're moving 10 times more material
across our various platforms than we were before." While chances are the
viewers sitting at home on their couches haven't noticed the switch, the bean
counters working at the threatened, still-analog networks, impacted by market
failure and facing a collapse in their advertising revenues, sure have.

       At CTV alone, its Content Gateway System saves millions of dollars a
year. Gateway, as it's affectionately called in affiliated newsrooms and
bureaus around the world, is a multipoint-to-multipoint IP-based
audio/visual file exchange system connected through a network of land-
based 10-megabit data lines, not surprisingly provided by CTV's owner
Bell Canada Enterprises. The system lets newsrooms send reporters' stories,
voiceover material and production graphics to servers as encoded broadcast-
quality MPEG-2 files often faster than real time. With a traditional file
transfer protocol, sending a two-minute news story or a 120-megabyte file
could take an average of eight to 11 minutes. But CTV's proprietary system
transmits it in just 90 seconds.
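
       The arithmetic behind those figures is easy to check; the sketch
below uses only the numbers quoted above:

# Back-of-envelope check of the figures quoted above.
size_mb = 120                   # a two-minute story as broadcast MPEG-2
story_seconds = 120             # running time of the material

gateway_seconds = 90            # CTV's quoted transfer time
ftp_seconds = 9.5 * 60          # mid-point of the 8-to-11-minute FTP range

print(size_mb / gateway_seconds)        # ~1.3 MB/s, roughly 10.7 Mbit/s
print(size_mb / ftp_seconds)            # ~0.2 MB/s over traditional FTP
print(story_seconds / gateway_seconds)  # ~1.3x: faster than real time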

        Producers access the system through an Internet browser and preview
a lower-resolution MPEG-1 version of the items before deciding which ones
are worth putting on videotape for their newscasts. “It's like moving your
little MP3 files around on Napster," explains Albert Faust, CTV's director of
media technology systems and the brains behind Gateway. "Except our
quality has to be amazingly superior; our stuff is a minimum of a megabyte a
second." Hardened newsroom veterans may have greeted Gateway with
suspicion when it was introduced, but now they love it. Fellow broadcasters
have even taken notice, awarding the system a Gemini in 2002 for
outstanding technical achievement.

       Today, CTV's digital delivery system is being used at 24 locations
across the country and around the world, including bureaus in New York,
Washington, Los Angeles and London. During the continuing war in Iraq,
stories were filed from Baghdad by satellite to London, but routed through
Gateway's data lines for the final hop home to the network's Toronto
newsroom. That alone, Mr. Faust says, saved $1,200 a feed. “We're now
averaging 16,000 files a month. That's about 600 hours of video.” “We used
to pay $250 or more for a [domestic] feed before. Now, we're paying about
five bucks an item,” Mr. Faust adds. “But with increased usage, that price
will continue to drop because our network's already in place.”

       As with many companies that adopt IP technology for voice, data and
video communications, the cost-saving benefits can be persuasive for
broadcasters.
show to be delivered,” explains Sudy Shen, chief executive officer of
Masstech Group Inc., another Toronto-based company that develops
hardware and software for broadcasters around the world. But delivering
content digitally over broadband saves more than money, he notes. It can
dramatically improve a TV station's work flow. Whatever advances the
future holds for news-gathering technology, CTV's Mr. Hurst says viewers
aren't likely to notice. “They don't care as long as the picture looks clean and
it's fresh and it's absolutely as current as possible or live," he said. "That
means we can't be waiting an hour or two until a satellite comes up or a
microwave [transmitter] becomes available to feed.”

                            E-cinema on Location

       When the Sunningdale Country Club in Scarsdale, N.Y., opened its
gates last week to a location shoot for "The Sopranos," a new fixture was on
display in the mobile dressing rooms - a roving Wi-Fi hot spot. With a
device called the Junxion Box, the production company can set up a mobile
multiuser Internet connection anywhere it gets cellphone service. The box,
about the size of a shoebox cover, uses a cellular modem card from a
wireless phone carrier to create a Wi-Fi hot spot that lets dozens of people
connect to the Internet.

       The staff members of "The Sopranos," squeezed into two trailer
dressing rooms, needed only the Junxion Box and their laptops to exchange
messages and documents with the production offices at Silvercup Studios in
Queens. "We used to fax everything," said Henry J. Bronchtein, the show's
co-executive producer. "The paper would jam; it was messy. This is much
more reliable." Junxion Boxes have also been spotted on Google's commuter
buses for employees and along Willie Nelson's latest tour. But what may be
a boon for wandering Web surfers could quickly become a threat to wireless
providers.

       "The premise is one person buys an air card and one person uses the
service, not an entire neighborhood," said Jeffrey Nelson, executive director
for corporate communications at Verizon Wireless. "Giving things away for
free doesn't work anymore. It never did." Unlimited service on cellular
modem cards for PCs costs about $80 (U.S.) a month. The carriers are
clearly worried about a technology that could destroy that business, but they
have not formed a united front against Junxion. The makers of the Junxion
Box, based in Seattle, seem eager to head off any battle by forming
partnerships with the wireless companies. "We're not trying to build a radar
detector," said John Daly, 42, co-founder of Junxion Inc. and vice-president
for business development. "We believe we're creating an opportunity for the
carriers. It may not be entirely comfortable for them right now, but we hope
we can get to a point where we can collaborate with them." John Kampfe,
director of media and industry analyst relations for Cingular Wireless, said
the Junxion Box was being evaluated and certified by Cingular and could
eventually be sold in conjunction with Cingular's wireless service for wide-
area networks.

       "There is a whole pricing model that has to take place with the
Junxion Box," Mr. Kampfe said. So far Junxion has about 200 customers,
many of whom are testing the product. The company went around the
wireless companies by making Trio Teknologies, a wireless services reseller,
its exclusive distributor. Peter Schneider, a partner at Gotham Sound, the
communications equipment company in New York that supplied Junxion
Boxes to the sets of both "The Sopranos" and the rapper 50 Cent's upcoming
movie, "Get Rich or Die Tryin'," said his customers would not be interested
in wireless modem cards were it not for the possibility of sharing the
connection through the Junxion Box.

       "That's the exact appeal of it" for his customers, he said. "That you
can rent it to a group. As word gets out, it will become part of the
communication equipment they rent." But for carriers like Verizon Wireless,
which spent $1-billion on its broadband network, it is difficult to let users
piggyback on that service. "We're not surprised that people are building
services like this and trying to attach them to our network," Mr. Nelson of
Verizon said. "It verifies how cool and how important our network is. We're
going to protect that investment." That may prove to be an uphill battle as
new technologies like Junxion alter the wireless carriers' control over the use
of their networks.

       “That's just something they have got to live with because that's the
technology now," said David Anderson, Willie Nelson's tour manager of 31
years. "Most people wouldn't or couldn't afford to have that many cards.
They weren't going to get 22 customers, but now they got 6." There are two
Junxion Boxes in each of the two tour buses and each has three wireless
modem cards so they can switch to the cellular provider network with the
best local coverage. It allows Mr. Nelson, whom Mr. Anderson describes as
a computer geek, to check his e-mail and surf the Web while on the road.
"The Junxion Box is good for going down the highway," Mr. Anderson said
from Hillsboro, Tex., where Mr. Nelson was performing earlier this month.
"It was frustrating in the older days. It's finally the way it should be."

       So by now the trend towards network-based production and
distribution of both films and television programmes, originally deployed by
services elements of the film industry in Vancouver on a contract to
Technicolor in response to the 9/11 crisis, has, like the server-based and
interactive approach to the production of computer graphics developed by
the Canadian software providers during the 80s, been opportunistically
adopted and is in the process of being deployed by the dominant players in
the telecommunications, IT, consumer-electronics and media production and
distribution interests on a worldwide basis. While somewhat
paradoxically Ottawa's departments, agencies and crown corporations, with
the notable exception of Radio-Canada, are apparently just snoozing with
heads tilted slightly sideways, quietly asleep at the switch.

        The deployment of these network based production systems of various
kinds, costs and degrees of technical sophistication is being triggered by a
cost-control and outsourcing revolution now being forced onto the
productive components of the film and broadcast industries in the US as 200
million dollar feature films and 5 million dollar hours for television become
the North American norm. And all this is happening as the serious market
failures at the box office and the collapse of advertising revenue are forcing
the still analogue components of the broadcast sector the major traditional
networks kicking and screaming towards the HD, digital, interactive and IP
based television formats now coming on line via the Internet. And these
trends in combination are really just the digitalization of the information
technology and media sectors in a coordinated manner that has actually been
in the cards since AT&T was first taken apart in 1982.

       These trends will clearly accelerate as the still-analogue
components of these American industries scramble in panic to effect the
inevitable transition towards digital, HD and IP-based television formats.
Accelerating this process is also the prime objective behind the re-regulation
program the FCC is now targeting at the clearly obsolete components of
media infrastructure in the US. The still-analogue as opposed to digital
broadcasters are now clearly viewed by the FCC as inhibiting the further
evolution of the American political economy in general, as well as, and
perhaps more importantly from a strategic point-of-view, frustrating the
smooth functioning of the more and more necessary soft-power strategy as
the political economy in the US makes its transition to an interactive and
transactional format while engaging in the war-on-terror on a worldwide
basis.

        Needless to say the American re-regulation strategy as yet has no
corresponding Canadian components. In fact the de-regulation process
which opened the entire convergence period in the US in 1982 still has no
Canadian equivalent! The Heritage Department, Industry Canada and the
still patronage-driven CRTC, clearly "captured" by the dominant carriers and
broadcasters and acting like a moose caught in the headlight of an oncoming
locomotive, are clearly headed for a major crash of some kind in the
transition to HD, digital and IP-based television signals in general.

       And all as yet more of Ottawa's various sector-related departments,
agencies and crown corporations become embroiled in failures of corporate
governance, lack of accountability and transparency and charges of
patronage, as is so evident, for instance, in both the "sponsorship" and
upcoming "technology-partnership" scandals; the latter will be even larger
than the former due to the larger sums involved. It now appears that Industry
Canada has pissed away certainly hundreds of millions and possibly multiple
billions of dollars during the 90s by applying more or less the same
patronage-based approach applied by Heritage to the media sector to the
entire technological overhaul of the federal government, which extends to the
operations of all of Ottawa's various departments, agencies and crown
corporations, including needless to say those involved in the media sector in
particular and the cultural sector in general.

       Whereby, with heads fully extended into the sand due for the most part
to the incoherence and confusion generated by the unconscious pattern of
development, the key federal and for that matter provincial departments,
agencies and crown corporations, with clear jurisdiction over and
responsibility for the evolution of the media sector in particular and the
cultural sector in general, continue spouting complete nonsense about their
recent achievements, which are in fact failures in relation to the
evolution of the Canadian political and cultural economy in general, while
somehow believing that the approach which official Ottawa has had in place
since 1968 is still functioning in an appropriate manner. And all as the
American monolith prepares to swallow the technological infrastructure
which underpins media production and distribution in Canada by way of
upgraded and even more virulent versions of the trade and direct investment
programs and strategies that have kept us more or less on our cultural knees
since these industries were first vertically integrated in the twenties.

                     Production Network Experiment




      Alias and Sony Pictures Imageworks announced that Alias' Maya was
      the core 3d special effects software utilized in realizing the summer
      blockbuster, Spider-Man 2. Not only does the film mark an
      achievement in terms of the quantity of visual effects shots, it sets new
      standards for computer-generated human characters. “Our Maya
      technology allowed the production team to take computer-generated
      visual effects to new levels with regards to the creation of digital
      humans." states Sony Pictures Imageworks CG supervisor, Peter
      Nofz. "It has all the major components (modeling, character animation,
      dynamics, cloth, a scripting language) that you need to do a project on
      this scale, and each component is well integrated with the others, which
      is very important to the way we work."… While CG humans have
      been one of the "holy grails" of the 3d animation industry, matching a
      computer-generated character to a live-action actor is more difficult
      yet. In order to help make their characters more convincing, the team
      "motion captured" the faces of their live actors and then brought that
      information into Maya. The other Maya technology that enabled the
      digital doubles to be indistinguishable from their live counterparts is
      the API, an "embedded scripting language", a key component of the facial
      muscle system, the animation of arms and the "hair" pipeline. In
      addition to dynamic effects, Maya's advanced modeling tools were
      used to make the visually stunning recreation of downtown
      Manhattan. "We had a couple blocks of Manhattan built in Maya from
      the first movie that we were able to re-use, plus we built another
      whole section for this movie," states Nofz.

                                                Business Wire Aug. 3, 2004


                Motion-control and Performance-capture

       By 2004 it was clear that Hollywood loved computer-generated
animation and special effects, and that everyone was returning that love in
the form of serious cash flows from the box office, sales and rentals of video
tapes, DVDs and growing off-shore market share. This was in large part due
to the fact that the recently produced computer-generated feature films
ranked among the top-grossing motion pictures of all time: Shrek 2 at $437
million; Finding Nemo at $340 million; Shrek 1 at $268 million; Monsters,
Inc. at $260 million; Toy Story 2 at $246 million. Meanwhile Peter Jackson's
largely effects-based Rings Trilogy clocked in at a massive $4 billion
worldwide, with hundreds of millions more yet to come in DVD rentals and
purchases
around the world.

       This winning combination of technology, art and commerce was also
displayed on Wall Street when DreamWorks spun off its animation unit,
raising $812 million in a deal that saw their shares soar from $28 to $38.75
by market close. That IPO, along with the one from Google, highlighted the
growing importance of technology in the animation, special-effects and
media industries in general. Where breakthroughs in software, processing
power and data storage can be as important as raw artistic ability. And where
on the journalistic sides of the system recognition of the importance of
certain internet oriented idea-processing systems, as represented by rise of
Google styled search engines based on artificial-intelligence, expert-systems
indicated that an important “industrial” and perhaps even a “cultural
revolution” of some kind was just over the horizon. And all being driven
from south of the border, by way of a process that‟s been well underway
since AT&T was taken apart in 1982.

        Since the ground breaking “Toy Story” the holy grail among computer
animators has been the realistic rendering of human beings. And while
achieving that objective may still take some time, riding on the back of
recent CG-related technical developments, the output of the special-effects
and animation industries keeps getting better and better and audiences are
becoming more and more discerning. "No matter how much faster computers
get, it still takes the same amount of time to render computer animated
movies, because the effects keep getting more sophisticated," said Scott
Owen, a professor of computer science at the University of Georgia.
Computer-generated films are drawing raves for their stunning visual effects
and clever writing. But the industry and the technology behind it are still in
the opening stages of an explosive period of development. And that ethos is
fuelling a cocky you-haven't-seen-anything-yet attitude within the
productive sides of these industries with predictions of future breakthroughs
in mimicking human actors perfectly.


        If you look closely at the "cast" in any of this year's blockbuster
photo-realistic CG products you will notice how realistic their skin looks. By
now it's getting harder and harder to tell if it's a real or a computer-generated
face, which also means we are at last beginning to address some of the
major problems involved in creating digital performers. One reason for this
particular improvement is the creation of an application that works out how
light affects translucent surfaces, now being employed to create computer-
generated characters with faces that look more believable. And this software
is fast becoming a staple behind the blockbusters packed with photo-realistic
visual effects, as well as, being widely employed by the developers of
videogames.

       The man behind the technique is Dr Henrik Jensen of the University
of California. The secret in making “virtual skin” seem real has to do with
how light and skin interact in nature. Dr Jensen found that light does not just
bounce from surfaces such as marble and skin. Instead light beams penetrate
below the surface and scatter at different points. The breakthrough came
while at Stanford University when he came up with a mathematical formula
that calculated how light was absorbed and dispersed within translucent
objects. "The development of the mathematical model was the most difficult
aspect of the project. It required a number of new algorithms and techniques
not previously seen in computer-graphics." Once deployed however the
software offered the visual effects artists the means to move away from
computer-generated faces that looked plastic and unconvincing on the big
screen. By now the technique is used in nearly all visual effects striving for
photo-realism and is also widely adopted within the video-gaming industry.
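
       The heart of the published model is a closed-form "dipole"
approximation to that sub-surface diffusion. The sketch below follows the
formulas from Jensen's paper; the material constants in the example call are
arbitrary illustrations, not measured skin values:

# Sketch of the dipole diffusion approximation from Jensen et al.'s
# published model: the diffuse reflectance R_d(r) at distance r from the
# point where a beam of light enters a translucent material.

from math import sqrt, exp, pi

def diffuse_reflectance(r, sigma_sp, sigma_a, eta=1.3):
    """r in mm; sigma_sp, sigma_a are reduced scattering and absorption."""
    sigma_tp = sigma_sp + sigma_a              # reduced extinction
    alpha_p = sigma_sp / sigma_tp              # reduced albedo
    sigma_tr = sqrt(3.0 * sigma_a * sigma_tp)  # effective transport coeff.

    # Internal reflection caused by the refractive-index mismatch (eta).
    f_dr = -1.440 / eta**2 + 0.710 / eta + 0.668 + 0.0636 * eta
    a = (1.0 + f_dr) / (1.0 - f_dr)

    z_r = 1.0 / sigma_tp                       # depth of the real source
    z_v = z_r * (1.0 + 4.0 * a / 3.0)          # depth of the mirror source
    d_r = sqrt(r * r + z_r * z_r)              # distances to each source
    d_v = sqrt(r * r + z_v * z_v)

    def contribution(z, d):
        return z * (sigma_tr * d + 1.0) * exp(-sigma_tr * d) / d**3

    return alpha_p / (4.0 * pi) * (contribution(z_r, d_r) +
                                   contribution(z_v, d_v))

# Light entering 1 mm away still glows through: the soft look of skin.
print(diffuse_reflectance(r=1.0, sigma_sp=1.0, sigma_a=0.05))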

       But the creation of for instance the realistic Gollum in Lord of the
Rings Trilogy also involved some technologies and techniques developed
within the more traditional sides of the film industry, like motion-control and
compositing, especially when used in combination. In 1988, Lily Tomlin and
Bette Midler played dual roles in “Big Business” as two sets of twins,
leaving audiences guessing how the filmmakers juggled the foursome so
seamlessly. In 1996, Michael Keaton was amazingly cloned over and over
again in Harold Ramis' “Multiplicity”. The epic fight scenes in the original
Matrix were also a result of the sophisticated use of motion-control and
compositing technologies used in a carefully coordinated manner. For
example during production of Lord of the Rings director Peter Jackson used
a stand-in actor for the computer-generated Gollum, able to interact with the
Hobbits in the foreground on one strand of film, while a second strand of
film held the background shot or “plate” that was fully coordinated with the
foreground in terms of its camera moves, lens dynamics and other camera
related variables. Then a digital artist painted over the background, leaving a
place to insert the computer-generated Gollum. The motion-control software
records even the slightest camera movement and lens or lighting change on
the initial background plate and allows any subsequent foreground material,
animated, computer-generated or otherwise, to include the same camera
moves, dynamics and qualities of light.
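
       A conceptual sketch of that record-and-replay loop (invented names,
no real rig protocol) shows why the passes line up: the first pass samples
every camera parameter once per frame, and every later pass pushes back
exactly the same values:

# Conceptual sketch only -- not a real motion-control protocol. The rig
# samples the full camera state once per frame on the first pass, then
# replays the identical track on later passes, so foreground and
# background plates share exactly the same moves and lens dynamics.

from dataclasses import dataclass

@dataclass
class CameraState:
    pan: float      # degrees
    tilt: float     # degrees
    focus: float    # metres
    zoom: float     # focal length in mm

class MotionControlRig:
    def __init__(self):
        self.track = []                        # one CameraState per frame

    def record_pass(self, states):
        """First pass: store the sampled state for every frame."""
        self.track = list(states)

    def replay_pass(self, drive_rig):
        """Later passes: push the identical state back out, frame by frame."""
        for frame_no, state in enumerate(self.track):
            drive_rig(frame_no, state)         # same move on every take

rig = MotionControlRig()
rig.record_pass(CameraState(0.1 * f, 0.0, 3.5, 50.0) for f in range(48))
rig.replay_pass(lambda n, s: None)             # stand-in for the rig I/O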

       It's the motion-control software that perfectly coordinates the
photographic characteristics of the actors and/or computer-graphic
characters with those built into the virtual realities within which the scenes
are played. It's the combined use of motion-control and compositing that
convinces the audience all elements of the composite shot are really all parts
of the same thing. And therefore it is these technologies and techniques used
in a carefully coordinated and skillful manner that enable the CG inserts to
be credible from an aesthetic point-of-view. At this point, throughout the
elements of the production sector involved in both animation and special
effects, new techniques and the applications falling out of them are being
created, usually within the context of a ground-breaking production project of
some kind.

       For example a technique pioneered by DreamWorks on Shark Tale
was global illumination, an effect that shows the natural way light reflects in
a room or across surfaces in a given setting. The production team used a tool
called a bounce shader, which gauges where and how light will bounce from
surface to surface. The visual effects team used the application to create the
illusion of natural light and shadows for undersea scenes. But these more-
sophisticated algorithms call for greater computing power as more than
300,000 such frames were created during the production, and each frame
required more than 40 hours to render.
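
       The textbook idea such a shader implements is a one-bounce "gather":
sample directions over the hemisphere above a point, find what each sample
hits, and add the light that surface reflects back. The sketch below is that
generic technique, not DreamWorks' actual tool, and the stub scene at the
end is invented:

# Generic one-bounce indirect-light gather -- the textbook idea behind a
# bounce shader, not DreamWorks' tool. Directions are sampled with a
# cosine-weighted distribution, so for a diffuse surface the cosine and
# 1/pi terms cancel and the estimate is just an average.

import random
from math import sqrt, sin, cos, pi

def sample_cosine_hemisphere():
    """Random direction in the local frame, z pointing along the normal."""
    u1, u2 = random.random(), random.random()
    r, phi = sqrt(u1), 2.0 * pi * u2
    return (r * cos(phi), r * sin(phi), sqrt(max(0.0, 1.0 - u1)))

def one_bounce(point, to_world, trace, direct_light, albedo, n=256):
    """Monte Carlo estimate of single-bounce indirect light at `point`."""
    total = 0.0
    for _ in range(n):
        d = to_world(sample_cosine_hemisphere())   # world-space direction
        hit = trace(point, d)                      # nearest surface, or None
        if hit is not None:
            total += albedo(hit) * direct_light(hit)
    return total / n

# Stub scene: every upward ray from the floor hits a uniformly lit ceiling.
print(one_bounce(
    point=(0.0, 0.0, 0.0),
    to_world=lambda d: d,                          # normal already along +z
    trace=lambda p, d: "ceiling" if d[2] > 0.0 else None,
    direct_light=lambda hit: 1.0,
    albedo=lambda hit: 0.5,
))   # converges to 0.5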

       The production used more than 30 terabytes of disk space, the
equivalent of 54,000 CD-ROM discs. So not surprisingly the biggest
technological advance for which animators pray is more speed and thus
more computer-power. DreamWorks continually updates its processors so
animators can get instant feedback on changes to a scene. More than 2,000
processors and more than 6 million central processing unit hours were used
to render "Shark Tale." Increasingly, the computers used to produce rich
animated graphics are becoming generic, industry watchers say. Some


                                                                             44
animation studios used to use Silicon Graphics machines powered by a
proprietary Unix system. But now more studios, including industry leaders
like DreamWorks and Industrial Light and Magic, are moving to Linux-
powered machines. For "Shrek 2," DreamWorks took the novel step of
licensing computational power from Hewlett-Packard. The company
effectively rented HP computers for the final three months of production,
when it needed more rendering muscle. Still, the dominant cost of making a
feature-length animated film is labour, not computing power.

        Beyond clearing the technical hurdles, animators must ensure that the
design of their characters appeals to viewers. Rendering human beings is so
complex, and moviegoers watch the effects with such unconscious scrutiny,
that many creators who lack the ability to render photo-realistic characters
opt for stylized or cartoon-like characters such as monsters or animals. "The
computer graphics explosion is ideally yet to come, when we will be able to
make human characteristics more organic," says Nick Foster, DreamWorks'
head of animation software. "In 1990, visual effects were still done photo-
chemically. Back then computers were only used to create some hard-
surface objects such as spaceships in the original Star Wars trilogy.

       "Pixar's 1995 film 'Toy Story' was a milestone in the computer
animation field because it was the first all-digital feature film." It also
underscored the challenges of creating realistic human characters: the toys
were believable because viewers had no real reference point, but the adults
and child in the film were poor substitutes. Still, the film struck box-office
gold, spurring competitors to join the race to perfect computer-generated
animation technology. Computer-graphics professionals, though, walk a fine
line when they de-stylize their characters. Japanese scientist Masahiro Mori
has described people's emotional response to humanlike robots as the
"uncanny valley," because fondness for the robots often falls off a cliff when
they become too real. At this point most animators and visual effects experts
agree that bringing realistic human characters to life on the big screen is
now perhaps the central problem facing the creative sides of these
industries. "At some point in the future, we will have true human characters,
but it still may take a few years," says Scott Owen of the University of
Georgia. But to do that, "we need a lot more understanding of how humans
move and act, and more understanding of our perception to figure out what
we do when we take in that information. It's important to creating these
effects."



                                                                            45
       “The human face has so many subtleties that a slight muscle or eye
movement can dramatically change the meaning of an expression. To study
facial expression and capture it is one of the hardest things to do,” says
DreamWorks supervising animator Tim Cheung. For this reason,
DreamWorks' animators until now have only tried to achieve "stylized
realism" in their films, in which they give the characters the complexity of
human appearance and emotion but don't try to replicate it too closely.
Making characters too realistic can turn off the audience. In Shrek, for
example, many viewers felt the character of the talking donkey was more
"real" than the human princess Fiona.

       Like other studios working on these problems, DreamWorks has
developed technology to make the animation processes simpler and thus
cheaper. This software contains information on the human physical anatomy
and its traits, allowing an animator to program, with one control, movement
that reverberates throughout the body. For example, when the enchantress in
Shrek breathes, the movement travels from her shoulders to her belly. The
animator uses one command to create that effect, so that a single action
carries through the entire structure of the body. "We're looking to
push the envelope on each film, and that requires us to invent new tools,"
Cheung said. "We write all our own software."
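       A toy version of that one-control idea, sketched in Python with
invented joint names and weights (DreamWorks' actual rigging software is
proprietary), shows how a single keyed value can drive motion through a
whole chain of joints:

    import math

    # Hypothetical joint chain with per-joint influence weights.
    CHAIN = [("shoulders", 0.4), ("chest", 1.0), ("belly", 0.7)]

    def breathe(time_sec, rate_hz=0.25, amplitude=1.0):
        """One 'breath' control value fans out to every joint in the chain."""
        control = amplitude * math.sin(2 * math.pi * rate_hz * time_sec)
        return {joint: control * weight for joint, weight in CHAIN}

    for t in (0.0, 1.0, 2.0):
        print(t, breathe(t))

The animator keys only the single control; the rig derives every joint's
motion from it, which is the sense in which one action carries through the
entire body.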

       The Incredibles was written and directed by Brad Bird, who joined
Pixar in 2000 after directing The Iron Giant for Warner Bros. "The story is
based on an idea I had in the mid-'90s," he says. "It's about a guy who gives
up what he loves and then resents it to the point that he doesn't see what's
around him. It seems like a big goofy Hollywood movie, but I feel like it's a
personal film in pop clothing." That guy in the film is Bob Parr, aka Mr.
Incredible, a former superhero who, along with his wife Helen (Elastigirl)
and their super-children, Dash and Violet, has been reduced to living a
normal suburban life, unable to use his superpowers - at least in public - for
fear of lawsuits. One day, Bob gets a chance to don his super suit, and the
story cranks up. But with 12 human characters in starring roles in "The
Incredibles," the production team faced several "super" problems: they had
to rig 3d characters, which had bizarre cartoon shapes and superhero
powers, so that they could act like 2d characters.

      "[Director] Brad [Bird] wanted the puppets to move in hand-drawn
ways, like traditional animation," says producer John Walker. They needed
to make the characters look like believable humans despite their designs,


                                                                            46
which meant giving them realistic skin and muscles. They had to dress the
characters - some even had costume changes - and give them hair. Think
Sully and Boo of Monsters, Inc. multiplied by 12, and then extend that to fill
a 107-minute film. Many of the technical advances, then, were prompted by
the need for efficiency and by art direction. For hair, the advances were
largely the addition of internal and external forces that helped technical
directors control the "simulation engine" originally developed by Pixar's
tools group for Monsters, Inc. The team could attach hair to a keyframe,
sculpt rest behavior into the hair, and change its dynamics. In addition, the
technical gurus in the tools group worked on keeping hair coherent while it
was moving. To
help speed character rigging, the tools group developed new technology that
made it possible for the character group, which handled designs, models,
rigging, clothes, hair, shaders, and textures, to build one re-usable rig for all
the characters.

        Director Brad Bird developed storyboards with help from a team that
used Adobe After Effects and, like visual effects animatics but unlike
traditional animation, the initial storyboards included camera moves. This
made it possible for the crew to adopt other live-action techniques - building
only what the camera could see, for example. They also used live-action
elements - splashes, shadows and so forth - and 2 1/2d matte paintings. The
basic models were built in Maya, and the rigging, or the articulation, was
added with Pixar's proprietary software enhanced to allow squashing and
super-stretching (Helen is Elastigirl, after all). To simplify the rigging, the
group had the characters all reference one template. "It could be applied
quickly," says Bill Wise, character supervisor, "and a change to the template
would propagate to all the characters. It has basic rotations at major joints,
squash and stretch, and a muscle system on top."
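       The pay-off of the shared template can be illustrated with a small
Python sketch; the structure and names below are assumptions for
illustration, not Pixar's system. Every character references one template and
layers local overrides on top, so a change to the template reaches all of
them at once.

    # One shared rig template that all characters reference.
    BASE_RIG = {"spine_joints": 5, "squash_stretch": True, "muscle_layer": "v1"}

    class CharacterRig:
        def __init__(self, name, **overrides):
            self.name = name
            self.overrides = overrides

        def setting(self, key):
            # Fall through to the shared template unless overridden locally.
            return self.overrides.get(key, BASE_RIG[key])

    bob = CharacterRig("Bob", spine_joints=6)     # per-character override
    helen = CharacterRig("Helen")                 # pure template reference
    BASE_RIG["muscle_layer"] = "v2"               # one change to the template...
    print(bob.setting("muscle_layer"), helen.setting("muscle_layer"))  # ...reaches all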

       "For the skin," Wise adds, "we had a sculpted shape that was sucked
down onto the muscles. The bones moved the muscles and the skin dragged
along, but we could let the skin slide or stick it down tightly." The crew
applied subsurface scattering, but not detailed texture maps. "We had
stylized, deformed people," says Bird. "We didn't want them to look plastic,
but too much detail made them look creepy." For this Pixar relied on several
commercial tools: Pixar's PRMan, Alias Maya, Adobe After Effects and
Photoshop, Apple Shake, and, for matte paintings, Discreet's 3ds max and
Splutterfish's Brazil. A new, proprietary interactive lighting tool helped
lighters balance the look. And, once again, the hardest technical problems


                                                                               47
the crew solved were those in which the technology is least evident. "When
Helen pushes Violet's hair behind her ear, when Bob puts his hand through a
rip in his super suit, those are our triumphs," says Wise.

       As in traditional animation, animators working on The Incredibles
wanted to see the shape of the character, not just the character's
performance. "In our world, it's important to see what the line of the
character looks like," says John Anderson, a senior scientist in the studio
tools group who joined Pixar after developing simulation engines at ILM. To
be able to build skin over dynamic muscles quickly enough for the
animators, Anderson and the tools group developed statistical models.
"There has been a lot of research in statistical dynamics," Anderson says,
"and our work fits into that body of research."

       Essentially, they put a character through a set of exercises - a
representative sample of poses - and then used the muscle positions for those
poses to train a mathematical representation of the internal coefficients;
they'd create a compressed memory of what the muscles looked like and
implement it as an algorithm. "The system would know that when a
character looks like this, the muscles look like that," says Rick Sayre,
supervising technical director. Thus, the animators could see the result of a
dynamic muscle-and-skin system without running a simulation. Then the
tools group spun that idea into clothing. "We trained a statistical model of
the cloth," says Anderson. An evolution of the simulation engine developed
for Monsters, Inc. moved the cloth during the training exercise. Once
trained, the statistical model took over. That made it possible not only to do
cloth simulation engines for multiple characters who appear throughout the
film, but also to do so while giving animators control of each major
character's silhouette.
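       In outline, the statistical trick works like the following Python
sketch, which uses a plain least-squares fit on synthetic data purely for
illustration; Pixar's actual models and training exercises are of course far
more elaborate. The expensive simulation is run once over sample poses, a
map from pose to shape is fitted, and thereafter the cheap fitted model
stands in for the simulator.

    import numpy as np

    rng = np.random.default_rng(0)
    n_poses, pose_dim, shape_dim = 200, 12, 300   # hypothetical sizes

    # Training exercises: sample poses and the simulated shapes they produced
    # (here faked with a random linear map plus noise).
    poses = rng.normal(size=(n_poses, pose_dim))
    true_map = rng.normal(size=(pose_dim, shape_dim))
    shapes = poses @ true_map + 0.01 * rng.normal(size=(n_poses, shape_dim))

    # The "compressed memory" of what the muscles looked like: a least-squares fit.
    coeffs, *_ = np.linalg.lstsq(poses, shapes, rcond=None)

    def predict_shape(pose):
        # When a character looks like this, the muscles look like that.
        return pose @ coeffs

    new_pose = rng.normal(size=pose_dim)
    print(predict_shape(new_pose)[:5])   # evaluated instantly, no simulation run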

       The use of statistical models is likely to be adopted by visual effects
crews faced with similar problems, another way in which technology proven
at Pixar might find broader use. At the same time, Tim Sarnoff, president of
another industry-leading cg production studio, ImageWorks, said the
difference between computer graphics imagery and visual effects is
decreasing every day. Visual effects are photo-realistic, live-action plates
with digital elements imposed in places. Visual effects, such as those used in
the battle scenes in the "Lord of the Rings" trilogy, have merged with
computer graphics in the last 10 years, thanks to a greater ability to impose
animated characters in those plates, he said. "Suddenly you have a style of

                                                                            48
animation side by side with the real world, and that's blurring the lines
between what was originally considered visual effects and what is now
considered digital effects or computer-graphics imagery." That melding of
worlds is making it more difficult for people to discern what elements in
films are computer generated. For example, in the film "Seabiscuit" there
were 150 shots that were altered with a computer. "You never want any
effect that pulls people away from the movie. The goal is to deliver the
technology that makes the film," Sarnoff said. (Barbara Robertson
CNET.com)

        And DreamWorks and Pixar are not alone in taking this approach of
developing specific software extensions for the Maya or XSI modeling
packages while pushing the envelope during leading-edge productions. For
instance, ImageWorks made important additions to the animator's toolkit
while working on "Polar Express," a groundbreaking project because it was
the first to be created based solely on motion-capture technology. The
system captures the motion of real-life actors, such as the film's star Tom
Hanks, and reproduces it digitally.

       From a distance "The Polar Express" might look like just another
holiday movie. Adapted from a much-loved 1985 children's book it was the
story of a nameless boy living in a 1950's American suburb whose
crumbling belief in Santa Claus is bolstered when, one snowy Christmas
Eve, a phantom steam train pulls up in front of his house and a kindly
conductor invites him to take a ride to the North Pole. With Tom Hanks
again starring under the direction of Robert Zemeckis, as he did in "Forrest
Gump" and "Cast Away," "The Polar Express" sounds as safe as safe can be,
guaranteed to warm hearts and sell tickets. But there was a revolution in
production technique hiding inside this seemingly innocuous family film. As
it was the first star-driven film to cross completely over to the digital
domain, it might change the way movies are made and seen. Whatever
critics and audiences make of this movie, from a technical perspective it
could mark a turning point in the gradual transition from an analog to a
digital cinema. And though the transition may not be as dramatic as the shift
from silent to sound prompted by "The Jazz Singer" in 1927, it may have
equally significant consequences.

      Neither a traditional live-action film nor a computer animation of the
kind Pixar has perfected with "Finding Nemo" and "The Incredibles," "The
Polar Express" is something in between, a film that brings a true human


                                                                            49
presence into a virtual world by digitizing flesh-and-blood actors as well as
the environments they inhabit. In the process it does away with many of the
most basic elements of filmmaking: there are no expensive sets to be built,
no elaborate lighting to be rigged, no bulky camera to be painstakingly
hauled into place. In fact, there is no film. "The Polar Express" will touch
celluloid only at the final stage of production, when the completed feature is
transferred, by laser printer, from computer hard drive to film stock.

        "To everyone's credit at Warner Brothers, they were taking a giant
leap of faith on this," Mr. Zemeckis said during a recent trip to New York.
"It's really tough for anybody to get their head wrapped around it." For one
thing, Mr. Hanks plays five roles, ranging from a 7-year-old boy to Santa
Claus. Not all of these characters look like Mr. Hanks, but they all contain
the spark of his individuality. And on the economic side, the perhaps $20
million advanced for Mr. Hanks's involvement covers the cost of the core of
the entire cast, instead of just the bankable star driving the project from an
economic point-of-view. The world of the film is manifestly the world of
Mr. Van Allsburg's beautiful oil and pastel illustrations, a nocturnal
landscape punctuated by soft forms and warm lights. But Mr. Hanks, along
with the other members of the "Polar" cast, has not been simply
superimposed on a painted background. He was absorbed into the lushly
detailed images, almost as if his DNA had been digitized along with the
landscapes. It is a cliché to say that a film looks like a painting come to life,
but applied to "The Polar Express" that old phrase contains a new truth.

       Whether the audience can wrap its collective head around this
approach to filmmaking is only one of the many questions posed by a new
technology that turns the director into the god of his own virtual universe.
Will the new techniques finally make it possible for directors to be the sole
authors of their films, in the way painters control their paintings or novelists
their novels? Or will the unprecedented control eliminate the creative
turmoil of what has often been called the most collaborative of art forms?
Will the revolution serve the goals of storytelling and personal expression,
or will it lead to an obsession with trivial detail and pointless perfectionism?
Will actors embrace the challenge of playing against themselves in multiple
roles, as Mr. Hanks does in "Polar," or will they become digital puppets,
manipulated by unseen others? Will the new powers liberate visionary
filmmakers, or will they make movies even more vulnerable to the whims of
studio executives, who will be able to endlessly second-guess directors?



                                                                                50
        Long a pioneer in special-effects technology (in films like "Back to
the Future," "Who Framed Roger Rabbit" and "Death Becomes Her"), Mr.
Zemeckis sees promise and danger. "It was wonderfully freeing," Mr.
Zemeckis said. "This was like, well, I think we'll have the train come off the
tracks and skid across a frozen lake. All right, great. Let's write that." But, he
emphasized, the technology must take the back seat. "It's still about story, as
wild as it can be, or as simple as it can be," he said. "To write a screenplay
under these conditions takes some getting used to," added William Broyles
Jr., who wrote the adaptation with Mr. Zemeckis. "Usually, as the time for
filming approaches, you adjust your screenplay to what is possible. But with
this, as the time approached, it became clear that anything was possible. At
first, there was an incredible exhilaration, but then that was followed by the
realization that anything you imagine has to be in the service of the story."

       To watch "The Polar Express" in process and onscreen is to begin to
understand the promise of the new technology and its potential drawbacks.
There was little that resembled a traditional shoot at a warehouse in Culver
City, where the film was made. In place of a soundstage, there was a
domelike structure built of scaffolding that surrounded a playing area
roughly 10 feet square. Attached to the scaffolding were several dozen
infrared sensors, which could pick up and digitally record the light bounced
back by the dozens of small reflectors on Mr. Hanks's black bodysuit, as
well as by the 150 smaller reflectors attached to his facial muscles.

        With his face dotted by the tiny jewels, as the crew called the
reflectors, Mr. Hanks looked like the pincushion man from the "Hellraiser"
series. But after a few days of working with the reflectors attached, he said,
he no longer noticed them. "All I really miss are the costumes," he said. "I
have to remember to touch the brim of a hat that isn't there." When Mr.
Hanks entered the playing space - "the volume," as Mr. Zemeckis likes to
call it - his movements were recorded by a computer as points of light
floating in a dark three-dimensional space. Even in this raw form, the
connect-the-dots figure moving on the computer monitor was recognizably
Mr. Hanks. It walked like him, gestured like him and, most important,
crinkled and smiled and frowned like him.
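       The geometric core of that point-cloud capture is classical
triangulation: each camera sees a reflector as a 2d dot, and a 3d position is
recovered by intersecting the rays from two or more calibrated cameras. A
minimal Python sketch of the standard direct-linear-transform method
follows, with toy camera matrices; the production system at Imageworks is,
needless to say, far more sophisticated.

    import numpy as np

    def triangulate(P1, P2, x1, x2):
        """Recover a 3d point from two 3x4 camera matrices and the 2d image
        positions of the same marker in each view (DLT method)."""
        A = np.array([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        _, _, vt = np.linalg.svd(A)
        X = vt[-1]
        return X[:3] / X[3]

    # Two hypothetical calibrated cameras, the second shifted along x.
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

    marker = np.array([0.3, 1.2, 4.0, 1.0])        # true marker position
    x1 = (P1 @ marker)[:2] / (P1 @ marker)[2]      # dot seen by camera 1
    x2 = (P2 @ marker)[:2] / (P2 @ marker)[2]      # dot seen by camera 2
    print(triangulate(P1, P2, x1, x2))             # recovers (0.3, 1.2, 4.0)

Repeated for every reflector on every frame, this is what turns the jewels on
the bodysuit into the moving constellation on the monitor.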

      Filmmakers have been able to capture full-body motion for some
years using a process called motion-capture, in which a computer scans
sensors attached to a performer's limbs and records the broad outlines of
movements. "It's been around a long time from video games," Mr. Zemeckis


                                                                               51
explained. "They put sensors on the athletes for sports games and things like
that." The great leap of "The Polar Express" came in the ability to capture
facial expressions: "When we did the first tests," Mr. Zemeckis said, "we
had Tom do the body acting, and then we put him into a space where he sat
in a chair and had to re-act everything from the neck up. I said, 'You can't do
a movie like this.' So they went back and were able to figure out how to get
both sets of sensors working at the same time. And once we started the
movie, the technology kept getting better."

       Steve Starkey, a producer of the film, calls the process "performance
capture," because it records more than simple movement. There is
individuality and emotion in that blur of swirling dots, and it becomes the
job of the computer animators - in this case, people at Imageworks, led by
Mr. Zemeckis's longtime visual-effects supervisor, Ken Ralston - to retain
that individuality while shaping the clouds of dots into the Boy, the
Conductor, and the other characters. The process falters, however, when it
comes to the characters' eyes. Because of the impossibility of attaching
sensors to the actors' pupils, eye movements must be animated
independently, and they aren't always as convincing as one would like. The
greatest commercial risk of "The Polar Express" lies in betting a reported
$160 million budget that audiences will still be able to make an emotional
connection through these somewhat glassy orbs, windows to the soul that
seem slightly veiled.

       "Polar Express" was produced using a proprietary motion-capture
system, but there is a growing number of service providers jumping into the
market for the enhanced computer services connected to the production of
photo-realistic computer graphics. For example, in June of 2005 VICON,
developer of Academy Award-winning motion-capture technology released
the newest version of its software. The application provides an intuitive
interface with all of the tools needed to manage, automate, capture and
process an entire production. VICON offers a complete real-time
environment for setup, calibration and capture with ultra high-resolution
systems. The system also takes full advantage of the unique grayscale power
of their MX cameras, with all-new processing algorithms to deliver the best
accuracy, quality and processing times.

       The system dramatically streamlines motion capture workflow,
offering unprecedented precision when tracking complex interaction
between multiple actors and significantly reducing the efforts previously


                                                                             52
required in editing the captured information with other systems and
solutions. It will automatically process the most difficult multiple-character
capture scenarios with rapidity and ease. Using all-new algorithms and a
calibrated biomechanical and kinematic model of the actors and props being
captured, the system solves most of the ambiguities that typically exist with
optical motion and provides the user with quality assurance tools to ensure
an intuitive and efficient workflow.

       The system's key features include: support for real-time capture with
MX3, MX13 and the world-leading MX40 cameras; real-time set-up tools,
camera optimization, calibration and diagnostics; support for unlimited
camera counts to meet production demands, with simultaneous full-body,
facial and hand capture for multiple actors; support for timecode, genlock
and full synchronization for studio integration; support for 10-bit grayscale;
more accurate marker identification; ground-breaking calibration, 3d
reconstruction, kinematic labeling and processing algorithms; and automated
batch processing. Together these revolutionize the quality, flexibility and
ease with which motion-capture data can be applied to real-time and off-line
applications in film, television, video games, virtual prototyping and
more…

       "With our clients continuing to astound us with their production feats,
we are driven to push our technologies farther," commented Gary Roberts,
Vice President of VICON‟s feature film unit. "Unique grayscale processing
algorithms for the first time, the ability for true performance capture the
ability to efficiently capture and process hundreds of markers on the face
and body simultaneously across multiple actors. Further speeding up and
easing the workflow. The system also allows real-time visualisation of
captured and processed data alongside all new algorithms which are already
tripling throughput.” VICON is headquartered in Oxford, UK, and since
1984 has been providing with the latest tools to accurately capture the
subtleties of three-dimensional human motion for research, medicine, sport,
engineering, game development, broadcast and film.

                           Virtual Art-direction

       And building up the special effects and animation toolkit is not
limited to North America. For example, at the recent Australian Effects and
Animation Festival (AEAF), local success story Rising Sun Pictures gave a
presentation on its work on the forthcoming ambitious retro-epic Sky


                                                                             53
Captain and the World of Tomorrow, a film in which everything except the
actors is a digitally generated illusion. With more than 2,000 digitally
generated effects shots, this beautifully realised vision of a world-that-
never-was also represents something of a landmark in digital filmmaking.

        "Sky Captain and the World of Tomorrow" is full of visions of the
dark, soaring New York cityscape, dogfights in the sky and the majestic
Himalayan mountains. But what was the movie set like for the actors? A
whole lotta blue. That's because "Sky Captain," despite its grandiose
appearance, was filmed entirely against a blue screen with digital effects
filled in. Though real actors star in it, almost everything else is fake. Think
"Roger Rabbit" in reverse. While computer-generated imagery has for years
been a large presence in movies, "Sky Captain" is the first major motion
picture made entirely digitally with living, breathing actors. Only what they
actually touch is tangible. The movie is set in a late 1930s New York beset
by hundred-foot-tall robots. Soon reporter Polly Perkins (Gwyneth Paltrow)
along with ex-beau Sky Captain (Jude Law) set off on a journey to stop a
mad scientist's plot to destroy the world. Drawing from old 1940s serials,
such as "Flash Gordon," noirish pulp fiction and the classic futurism of H.G.
Wells, first-time director Kerry Conran has created a vintage-looking sci-fi
film using the most modern of technology.

        The steps to this film of more than 2,000 effects shots, though, started
in a California apartment with pen and paper. Conran began working on
"Sky Captain" in 1994 by designing stylish illustrations with his brother as
traditional storyboards, which he then converted to digital images on his
laptop. Toiling away on his home computer, Conran eventually produced a
six-minute short (essentially the first minutes of the feature film), which
attracted producer Jon Avnet and, in turn, Law and Paltrow. When
production began, this transfer of drawings to computer graphics was done
even more expertly. Those still graphics were then animated to produce a
rough, 3d video (with digital actors) called an animatic. After many
processes of refining the lighting, depth and composition, a grid was created
for each shot, and mapped out on the blue screen stage floor.

       Conran then shot the movie with doubles in a London studio, as a kind
of dress rehearsal. "We effectively shot the film twice," Conran told The
Associated Press. "I actually shot the film before I shot the film." Once the
adjustments of angles and shadows were made to make a cohesive, if


                                                                              54
cartoonish rough cut, the primary actors were brought in for just a month of
shooting in the studio. Not bad work compared to the normal months on
location. The animatic also served as a useful blueprint (no pun intended) for
Paltrow, Law, Angelina Jolie and Giovanni Ribisi. "What was valuable with
the animatics is that it showed the actors what the shot was," Avnet says.
"There was enough detail for (Paltrow) to understand exactly what was
happening and that when she's stepping on dots, it's actually an entire plane."
And so the entire movie was shot in one room -- one very blue room.

       "It was a sea of blue, literally," says Paltrow. "It was blue everywhere,
blue floors, blue walls with various dots in places to orient the computer."
The dots also oriented the actors, signifying things like a killer robot or an
airplane cockpit. But the blue screen and dots did not exactly facilitate
emotional performances. "It was difficult at times, but at the same time you
also kind of free your mind and your art follows," says Law. "And the fact
is, we were really in this kind of make-believe world, anyhow. You could
say it was kind of similar to doing theater in empty spaces."

       Shot on high-definition digital video, Paltrow and the others in "Sky
Captain" are muted in a soft focus that gives them the textured look of old
black-and-white movies. Layers and layers of composites were done to each
frame to add greater detail to each shot. The total amount of digital painting
for the hundred minute movie is staggering. Avnet tried to do the math: "24
frames, times 60 seconds, times 100 minutes, and how many dots in each
frame? And then do seven, eight, 15, 20 iterations of each frame?"
Calculator-less, he answers appropriately with a whistle.
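       Carrying that arithmetic through (a rough illustration only, using
Avnet's own figures):

    frames = 24 * 60 * 100           # 24 fps x 60 sec/min x 100 min = 144,000 frames
    print(frames)                    # 144000
    print(frames * 7, frames * 20)   # roughly 1 to 3 million frame-iterations

So even before counting the "dots" within each frame, the digital paintwork
runs to well over a million frame-iterations.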

       The result of all of these steps is a unique creation of futuristic retro in
which Jolie captains an air station and the New York streets almost look
real. The experience of watching the film is a little jarring given the new
melding of reality and fantasy. "Sky Captain" also has an odd, somewhat
unsettling tempo because of the preordained, digital production. "It's so
difficult to get the effects right, that you tend to celebrate every shot you get
done," Avnet says. "But you have to look at the movie and see if it's any
good. That was very challenging just to make sure that we didn't have a
technically proficient movie that nobody would be able to follow." Conran
sees computer generated images as simply another step in a long line of film
technology advancement: sound to color to digital. "I use digital effects




                                                                                 55
more as a tool, no different from any other kind of production tool," he says.
It's potentially a "world of tomorrow" unto itself. (Jake Coyle, Associated
Press, May 2005)

        The Australian Effects and Animation Festival is part of the larger
festival, which includes digital video and digital imaging programs. Several
other companies showcased new technology for the creative sector. For
example there was a great deal of interest in the Disney team's development
of a unique new style of 3d animation for Chicken Little, its first attempt to
go it alone in this field without the expertise of Pixar. Disney's recent
traditional 2d animated films have been box-office disappointments, so a lot
is riding on the talents of these young animators.

       New Dawn also showed off its range of full-body and half-body
InSpeck 3d scanners that digitise the human body (or any other object) to
create extremely accurate models. The scanners pick up surface textures in
photographic detail. Then texture maps are wrapped on to the model to
create lifelike replicas in a fraction of the time it would take to build one by
hand in a 3d modelling application. This technology is finding favour with
games developers says Angus Blackburn, managing director of New Dawn.
"A typical request from a games developer might be for 100 characters in
costume to be scanned, and these scanners can do that very rapidly."

       Blackburn cites a recent project in the US for which 100 US baseball
players had their heads scanned in photo-realistic detail. The heads were
then placed on low-polygon 3d bodies. “Take a full-body scan of an actor
and match it to motion capture data and you have your digital thespian right
there," Blackburn says. Online gamers can now get their personal avatars
created using this technology, he says. These versatile machines have
medical applications - the Princess Margaret Hospital, in Toronto, Canada, is
using one to research spinal deformities. Enterprising plastic surgeons are
making unclothed full-body scans of clients and then calling up a library of
implant models to show how a breast enlargement operation would actually
look - the ultimate try-before-you-buy.

       The rest of the production sector was well catered for at AEAF, with
Sony demonstrating its new high-definition HVR-Z1P camcorder - aimed at
broadcasters and documentary makers. Discreet unveiled Lustre, a high-end
software package for film colourists and graders. Changing the colour range
of celluloid film can achieve the muted 1940s palettes of Saving Private


                                                                               56
Ryan or the otherworldly harsh blue light seen in the science-fiction film
Pitch Black. Traditionally these effects were created using a chemical
process but as film gives way to digital recording, the grading of scenes, or
entire films, is becoming more widespread. Lustre offers unprecedented
control over all aspects of the colour grading process and will no doubt
become a favourite tool of directors everywhere.

       The raw human figures - so bulky and balloon-like before they are
refined that the animators call them Michelin Men - are dropped into 3d
spaces that have also been created in the computer by groups of designers
and programmers. On Polar Express director Zemeckis relied on longtime
members of his team, including the production designer Rick Carter
("Forrest Gump") and the costume designer Joanna Johnston ("Who Framed
Roger Rabbit") to do their usual work, though this time, instead of being
built or sewn, their designs were constructed in the virtual environment.

       "For example, for the big square at Santa's village, the art department
designed it, and they did it just like they do a movie, except it goes one step
further," Mr. Zemeckis said. "They conceptualize it, they draw blueprints,
they build models, and then, once we sign off on it, they go and they build it
virtually in the computer, just like architects do now for previews and that
sort of thing." Now, the real fun begins. With his 3d characters inside a 3d
environment, the filmmaker has a literally infinite choice of camera angles.
He can place his virtual camera at any point in the 3d space, much as players
of video games, like the newly released "Sims 2," can do, though the games
have a restricted range of positions and much less detail. Then he can move
the virtual camera in any direction, simulating pans and tracking shots, and
even a jittery, hand-held effect. He can simulate the look of any known lens
(as well as some unknown ones, as the extraordinary deep-focus effects in
"Polar" attest).

       He can alter the lighting at will, dropping in shadows and highlights
that would take hours to reset on a traditional shoot. Instead of having actors
sitting in their trailers, waiting for the crews to set up the next shot, they can
stay on stage and in character and go straight from one scene to the next.
And as Mr. Zemeckis pointed out, he also eliminated the risk and bother of
working with child actors, substituting the skill of one consummate
professional who has the only acting credit in the film's advertisements. The
process also promises a new level of exhibition. In a first for a major studio
feature, "The Polar Express" opened both in conventional theaters and in a


                                                                                57
3d Imax format in some cities, including New York. "And all we had to do
was run it through the computer again, just flip a switch and put the parallax
in," Mr. Zemeckis said. "It's all 3d already, so boom, it's just there."

        "I found in my big effects movies where I had to do a lot of major
blue-screen work, like in 'Contact' and in 'Forrest Gump,' it's really hard to
keep the energy from flattening out, because the first thing that happens is
the actor now becomes a prop if you're not careful," he said. "It takes a lot of
discipline for the director and the actor to rise above the tedium of doing this
blue-screen work, but there's none of that here. Performance capture is
different because it's all about the acting. Without the tyranny of hitting
marks and leading the lights and worrying about the boom shadow and your
makeup and your wig and the line on your wig and all that horrendous stuff
that stifles an actor's performance. Or when they do the greatest take ever
and they miss the focus.''

       "You could actually hire a director just to go out and work with the
actors,'' he speculated, "and then you'd take the raw material back. But if he
didn't do it exactly the way you wanted it, you could change it. And you
could get it the way you wanted without actually having to stand there in
front of the actor and convince him to do it your way.'' But that digital dream
could also have a nightmarish underside. Imagine a movie without any
physical reality, without any human presence or warmth. Imagine, in effect,
even less personal versions of the coldly corporate digital-effects
blockbusters that now dominate the summer.

       Mr. Zemeckis acknowledged that danger but argued that he had
already learned to overcome it: "What we did with 'Polar Express' was what
we do now in music without anyone ever thinking about it. We have
sophisticated digital sound equipment that can create any sound, and you can
manipulate a note, sustain it, shorten it, change it. "And we haven't replaced
any musicians, because the musician comes in and puts the emotional
warmth into the performance. He sits at a keyboard and he just basically
plays, but he puts that human emotional warmth in there. Then that goes into
the booth and it's expanded and layered and all those textures are put on it.
But basically what's there is there." Mr. Zemeckis concluded:

      "So now what we're doing is the same thing, but for visual
performance. The actor lays it down and then you just add things onto it or
take away from it - whatever fits the story. That's how I look at it in a


                                                                              58
positive way." All of these wonders come, inevitably, at a price - and not
just the financial cost, which at the moment exceeds even the $1 million-a-
minute average of Pixar-style animation. "In the next couple of years," Mr.
Zemeckis said, "part of every film's process is going to be to adjust the
images. And it'll be to change the color of an actor's tie or change the little
smirky thing he's doing with his mouth. Or you can put in more clouds or
move the tree a little bit. And that will be part of your normal film finishing
process, where your image will be perfected."

       Sometimes, though, it is the imperfections that make a movie come
alive. Back in the 1910's, when movies were beginning to be filmed in
studios rather than in streets and parks, D. W. Griffith famously worried
about losing "the wind in the trees" - the sense of the aleatory that makes
film seem spontaneous and real. Is faultlessness really a goal? Mr. Zemeckis
said that he believed that the technology's promise outweighed the risks of
any early excesses. It is so radical that it has actually solved problems that
came with the last breakthroughs in digital magic, he said. (Dave Kehr, New
York Times, October 24, 2004)

       And once again there are important Canadian players in the game. "I,
Robot," the 20th Century Fox sci-fi thriller starring Will Smith, shattered all
prior conceptions of a futuristic robot-integrated society to capture an
Academy Award nomination for Best Visual Effects. "I, Robot"
is a seamless, believable work of genius combining highly complex visual
effects and live action shots at a ratio of approximately one-to-one. The
“Lanning House” sequence created by Vancouver post-production and
special-effects house Rainmaker is one of the most talked-about scenes in
the film.
sense of reality permeates "I, Robot" yet all the action takes place inside
completely digital photo-real environments. There is continual tactile contact
between the actors and the robots, the actors and the CGI environments and
the actors and CGI props - all of it seamless. The first intense action
sequence is the destruction of Dr. Lanning's mansion. Thousands of man
hours went into creating this highly interactive sequence that yielded only a
few minutes of film but leaves an indelible impression.

      "This sequence is important because it is the first action sequence in
the movie." explains Rainmaker visual effects supervisor, Dale Fay, "It
involved very detailed photo-realistic CGI, live action and high-speed
miniature photography . The sequence introduces one of the film main
characters the demo-bot which was entirely CGI when seen in the early


                                                                               59
exterior shots. These shots were particularly demanding due to the large
screen presence they had and because the CGI demo-bot had a few
interactive moments with Will Smith. This combination required tremendous
attention to photographic detail, textures, animation and lighting."

       As the sequence plays out, a demo-bot hammers and claws at the
house while Smith desperately tries to escape. The result is an absolute
razing of the structure, leaving no stick of wood untouched, no stone
standing. Rainmaker's director of visual effects, Brian Moylan, explains the
process: "We accomplished the sequence using a combination of green
screen set footage of Will Smith, a scale miniature model of the mansion and
demo-bot, and a full CG demo-bot and debris. Making all these elements
work together in the same space was the big challenge. Using this method
meant we could put the debris and destruction literally right on top of Will."

       Model builders spent several months constructing 1/4 and 1/6-scale
miniatures of Lanning's house. Approximately 30,000 individual bricks were
cast in Toronto, which were exact color-matched to the actual house built on
location in Toronto. A 1/4-scale demo-bot was constructed to interact with
the Lanning House miniatures. "Here, the challenge was to create the correct
scale, weight and choreography for the robotic vehicle," continues Fay, "We
pre-visualized the entire sequence to develop dramatic action for the demo-
bot as well as ensure that the mechanical puppet would be able to achieve
the desired performance. Motion control was also used to match the high
speed 1/4-scale miniature action to the live-action photography. This gave
us the ability to supplement the basic performance with additional passes for
special lighting effects, dust and additional debris." Finally, the artists at
Rainmaker added selective 3d animation of architectural components and
debris to further enhance the drama and threat to a fleeing Will Smith.

       "Rainmaker did a colossal job as our specialty house for our models
and miniatures," remarks visual effects supervisor John Nelson, "They also
provided brilliant CGI work - scans, volumetric CGI effects and a CGI robot
- very complicated stuff." Rainmaker is a post production and visual effects
company with a twenty-five year history of technical excellence and
outstanding service. From its origins in the pioneering west coast post house
Gastown Productions, Rainmaker has grown into one of North America's
pre-eminent post production service providers. Rainmaker's facility in
Vancouver provides producers with an array of post production services
ranging from traditional film developing to the very latest digital image


                                                                            60
processing techniques. Rainmaker was one of several major visual effects
vendors on I Robot, including Digital Domain, WETA and Modern
VideoFilm. (Vancouver Sun, February 21, 2005)

                 Cost-control and Non-linear Production

       But as motion-control, capture and compositing become more widely
and cost-effectively available, as the various technical systems and toolsets
comprising the electronic-production-system continue to evolve, and as they
are backed up by ever more computer power, they are also being used to
deal with more prosaic production-related problems, like shooting around
absent actors. Several times on Lord of the Rings there were scenes in
which an actor missed a shot because he was needed elsewhere. But the
crew was able to shoot the background plate without the actor, using motion
control to record the camera-related variables, and then later, when the actor
was available, collect the foreground material including the actor and
combine the scenes while assembling the DI by way of the compositing
process.
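       The combination step itself is the compositing "over" operation. A
minimal Python sketch for a single pixel follows, assuming premultiplied
alpha (the convention most compositing packages use); the foreground
sample would come from the later actor pass and the background from the
motion-controlled plate.

    def over(fg_rgba, bg_rgb):
        """Porter-Duff 'over': foreground covers background by its alpha."""
        r, g, b, a = fg_rgba
        return tuple(f + (1.0 - a) * bgc for f, bgc in zip((r, g, b), bg_rgb))

    actor_pixel = (0.30, 0.22, 0.18, 0.9)   # premultiplied foreground sample
    plate_pixel = (0.05, 0.10, 0.30)        # background plate sample
    print(over(actor_pixel, plate_pixel))

Because the motion-control rig guarantees that both passes share identical
camera moves, the same pixel coordinates line up in both images and the
operation can be applied across the whole frame.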

       So motion control can be used to impact particular production
processes in differing ways. Since the foreground, background and other
components of a composite are produced at different times, the order of
their collection can be varied depending on the needs of the particular
production process. For instance, a background plate can be inserted behind
a performance delivered by an actor, as in a feature film. Or actors can be
inserted into a background plate, as in the case of a theatrical documentary.
Whether the result is a feature film, theatrical documentary or docudrama is
merely dependent on which component or components of the composite are
collected first or are actually driving the completed presentation from an
aesthetic point-of-view.

       These developments in combination are now impacting the film
industry from an economic point-of-view in very fundamental ways. For
instance the fact that the various components of the composite shots can be
produced at different times means that the production value contained in the
background plates or virtual realities within which the scenes are played can
now be disconnected from the scripted and dialog based performances
driving the film or television programme from an emotional point-of-view.
This is important, as it is the coordination needed to build the production
values into the backgrounds and the foreground performance, including the
dialog, simultaneously that generates the chaos, as well as much of the
expense, of the traditional linear production process. In the traditional
production process, all elements of the scene must be combined in real time
during the photographic process, while in the non-linear production process
the components of the shots and the elements of production value can all be
produced independently of each other and then combined in the compositing
process while assembling the digital intermediate or DI. The ability to
produce the elements of production value independently of each other has
important economic implications for production in general.

       It's the 80s and you're producing a Hollywood-style epic with a cast
of thousands, and you're having constant budgetary headaches about hiring,
paying, feeding, clothing and maintaining the safety of all those actors. But
now it's 2004, and yet another Montreal-based software provider,
BioGraphic Technologies, is marketing a software package called AI-implant
that can provide you with those thousands of extras, as well as giving them
the "direction" they need to execute their various individual tasks or
performances in a flawless manner, take after take after take.

       BioGraphic developed its application using the same artificial
intelligence (AI) and 3d modeling techniques common to military simulators
- perhaps the most important technological result of the peace dividend so
far to impact popular culture in general from an aesthetic point-of-view, in
the form of the arrival of videogames. Simulated events driven by simple
forms of artificial intelligence in combination with 3d modeling were first
made practical in military simulators during the 60s, which, in conjunction
with electronic projection, enabled combat moves to be practiced without
endangering personnel or expending expensive ordnance. The techniques
and technologies enabling the simulators in the first place then migrated
into the videogame industry during the 80s and 90s, as well as into various
other interactive products deploying into the entertainment marketplace on a
worldwide basis by way of the Internet.

        "AI, motion-capture and 3d modeling used in combination gave us the
ability to create the illusion of realistic motion in backgrounds, extras and
by now even principal performers. Each character, including such non-
humans as animals, vehicles or projectiles, is basically given a series of
if/then statements by the software that allows it to make decisions, creating
a rudimentary simulation of intelligence," says Kruszewski.


                                                                             62
BioGraphic‟s technology is designed to enable “software-driven animated
characters” to inhabit specific sets of virtual realities allowing the producer
to replace real extras in droves. "We use artificial intelligence to make a
brain for digital humans, to allow animated characters to make independent
choices," said Paul Kruszewski, president of BioGraphic Technologies. "It's
great not only because human actors are expensive, but [also] because you
can't blow them up." And while BioGraphic started out mainly drawing
techniques from and then servicing the gaming industry by now they are in
the process of migrating their technology into the parts of the production
sector involved in computer graphics, and special effects it would appear
once again pushed by the spiralling costs of production.
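       The flavour of those "series of if/then statements" can be conveyed
with a short Python sketch; the rules and thresholds below are invented for
illustration and are not AI-implant's actual behaviours. Each extra evaluates
simple rules against its surroundings every tick and picks an action for
itself.

    def decide(agent, world):
        # Hypothetical rule set: each condition is checked in priority order.
        if agent["health"] < 0.2:
            return "retreat"
        if world["nearest_enemy_dist"] < 1.5:
            return "attack"
        if world["nearest_enemy_dist"] < 10.0:
            return "advance"
        return "hold_formation"

    extras = [{"id": i, "health": 1.0 - 0.3 * i} for i in range(4)]
    world = {"nearest_enemy_dist": 1.2}
    for agent in extras:
        print(agent["id"], decide(agent, world))

Run over thousands of agents per frame, rule sets like this are what let a
digital crowd perform take after take without an animator touching each
extra.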

       For instance Stargate Films, a special effects house based in Pasadena,
used AI-Implant to create the long-shot "cast" necessary for the recent
miniseries Spartacus. "We had to make ancient Rome and populate it with
thousands of characters each doing their own thing," said Joseph Meier,
Stargate's chief technology officer. "Since computer-generated characters
seen from a distance now look real enough, it made sense." "Obviously
on something like Spartacus high-end artists are at work making the finished
products," Mr. Kruszewski adds. "We only make the brain and nervous
system, artists can add the rest." What BioGraphic provides is the AI based
component of the game-engine that drives the rest of the animation system
and allows all the digital actors to “perform” without any further work from
the animators.

        And the graphics cards and software packages developed for the
videogame industry continue to evolve, and gaming consoles are becoming
ever more powerful and capable of rendering increasingly realistic
environments ever more cost-effectively. So the more downscale elements of
the production sector, still pumping out product for the economically
challenged analogue broadcasters, are turning more and more to the game
developers for help in upgrading television production processes
aesthetically while also reducing the cost of production in general. These
already stressed elements of the production sector are being forced, by the
market failure of analogue and non-interactive products, to grope inexorably
towards the new cash flows becoming available by exploiting 3d imagery
and interactivity on television in a more evolved manner than has apparently
been possible until now.




                                                                              63
        For instance the History Channel in the US, perhaps the most evolved
of the still-analogue American television networks in terms of online and
interactive presence, has utilized videogame technology to visually re-create
epic battles and tell the story of some of the key confrontations in Roman
history. Decisive Battles, a 13-episode History Channel series, was produced
using a videogame developed by The Creative Assembly in the UK as an
integral part of the television series' production. Rome: Total War allows
players to create epic battles based on meticulous research by the game
developers at Creative Assembly. Players are able to micromanage every
aspect of warfare, as well as rule the empire. The centerpiece of the game is
its real-time battles, which occur on lush battlefields in vibrant 3d. Players
are able to assign orders, formations and even give pre-attack speeches to
rally the virtual soldiers. There are more than 100 different types of soldiers
from 20 different factions, all based on real warriors. Artificial intelligence
features in the game allow combatants to behave in a lifelike manner, even
the ability to run when overpowered by the opposition.

       Margaret Kim, director of programming and executive producer for
the History Channel, hopes that the fresh perspective the video game brings
to history will attract a new audience to the channel. "The mission of the
History Channel is to make the past come alive," Kim said. "To that end,
we're always looking for ways to demonstrate the power and the excitement
of history, so that our viewers can experience it on a more personal level.
These graphics are a great way to bring the ancient battles to life." In
particular, the History Channel is hoping the use of video-game graphics
"will score points with younger viewers. While TV can't yet compete with
the visual spectacles created by big-budget, computer-generated scenes in
movies like Troy, we can put viewers in the middle of battles with thousands
of men hacking and slashing each other to death." This also appears to be an
important characteristic of the Imperial soft-power strategy in general.

       Games can also allow television producers access to visuals that even
Hollywood couldn't stage. While Decisive Battles will tell its stories through
the traditional interviews, drawings, paintings and re-enactments with actors,
the focal point of this series is Rome: Total War. "We've been heavily
involved from the beginning," said Mike Simpson, development director at
the Creative Assembly, noting that all the footage seen in the series was
created following a script written by David Paradine Television, the
production studio that created the show. The game developers, Simpson
said, created all the battles in the battle editor (a tool for customizing game


                                                                             64
scenarios) and choreographed the confrontations to happen exactly as they
did historically. "The benefit of this particular computer technology is that
you can see a huge overview of the battlefield, the vast numbers of troops
and their formations," said the History Channel's Kim.

       "Other techniques such as re-enactments give a different, somewhat
limited perspective, though those techniques have their value as well." In
cinematic terms this means the shots used to “establish” the overall context
of the television series are generated by the battle editor. For instance, in one
episode, Decisive Battles puts viewers in the middle of a battlefield where
Hannibal and his vastly outnumbered troops completely surround the Roman
army in the Battle of Cannae. The computer animation provides an overview
of the battlefield and illustrates the contrast between the classic Roman
column formation and Hannibal's crescent-shaped defense. Viewers can
visualize just how that unusual formation enabled Hannibal to trap and
defeat the Romans in the classic battle of double envelopment. Once
encircled, the Roman legions were pressed so tightly against each other that
they were helpless against Hannibal's infantry and his Libyan spearmen. As
developers continue to create games that allow players to relive and rewrite
history, Decisive Battles is only the first of many creative partnerships
between game developers and TV networks.

        "We're pioneering something new in this series, and we'll see how it
evolves," said Kim. "The gaming industry is one of the fastest-growing, and
it's likely that we'll see more convergence between video games and
programming in the future." Mike De Plater, creative director of the
Creative Assembly, said that the Total War engine can also provide sets to
use as backdrops for productions besides historical military re-enactments.
Many of the things that only the highest-budget films can currently afford
will become available to any production company with a small budget. "No
need to worry about locations and sets: the whole world exists and is there to
pick and choose from," De Plater said. "The possibilities are endless, and
this isn't wild conjecture. It's inevitable, and not that far away." And once
again motion-control, compositing, computer-graphics and all of the other
elements of the 2k and 4k production pipeline are setting the stage for the
creation of new levels of production value at an ever decreasing price,
including the key conceptual leap now facing the film and television
production sectors: the by now necessary and inevitable jump into both 3d
and, more importantly, interactivity on a media specific and sector wide basis.



       And since the work of the animators “animating” the characters
consumes perhaps 80 to 90 percent of the budget of a CG-generated epic,
cutting down or getting rid altogether of the costs of the animation process
would be, to quote the original Lord Thomson, “like having a licence to
print money”. The economic implications of these developments become
obvious when you realize that the real cost of designing a character is
bound up in the artists/animators designing and animating the characters in
the first place. But once you have the character in the form of a 3d model
resident in a computer, as well as the virtual reality within which the
scenes will be played, you can allow the game-engine to control what the
characters actually “do” based on AI techniques. The characters are thus
able to “animate” themselves, allowing the cast to move and so to “act”,
and the costs of the basic design work can be amortized over the much larger
amounts of screen time generated by the animation system operating in an
automated mode based on AI.
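
       The underlying loop is simple to sketch. Here is a minimal,
hypothetical illustration (the clip names, perception values and decision
rule are all invented for the example) of how a character designed once can
generate unlimited screen time by choosing its own motion-capture clips:

    import random

    # Motion-capture clips the character has been "taught", with durations
    # in seconds (names and numbers are invented for this sketch).
    CLIPS = {"idle": 2.0, "walk": 1.2, "attack": 0.8, "step_over": 1.0}

    class Agent:
        """A toy AI-driven extra that picks its own mo-cap clip each tick."""
        def __init__(self, name):
            self.name = name

        def sense(self, world):
            # A real engine would query nearby enemies, terrain and obstacles.
            return world.get(self.name, "clear")

        def decide(self, perception):
            if perception == "enemy":
                return "attack"
            if perception == "obstacle":
                return "step_over"
            return random.choice(["idle", "walk"])

    def simulate(agents, world, ticks):
        for t in range(ticks):
            for a in agents:
                clip = a.decide(a.sense(world))
                print(f"t={t} {a.name}: plays '{clip}' for {CLIPS[clip]}s")

    # Design the cast once; the loop then generates screen time for free.
    simulate([Agent(f"soldier_{i}") for i in range(3)],
             {"soldier_1": "enemy"}, ticks=2)

Because each agent's decision stays cheap, the same loop scales to crowds of
thousands without any additional design work.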

       This is also what makes it more cost effective to use CG technologies
to produce a hundred or more episodes of an animated television series
rather than just one blockbuster theatrical film. And also what makes it more
cost effective to produce Lord of the Rings in three parts allowing the
producers to amortize the investments in CG over perhaps six or eight hours
of screen time, and more importantly over three admission payments or DVD
purchases, rather than merely 120 minutes and one admission or DVD
purchase. This is also a reason, beyond sheer opportunism and greed, for the
tendency towards sequels and spin-offs, even in the form of video games, in
the CG-based sides of the production sector. For instance the
technological base originally assembled at great cost to produce Jurassic
Park is still at work and has by now populated literally hundreds of hours of
television programming with cost-effectively produced dinosaurs.

        So obviously the more programming you produce using a common set
of characters and matched virtual realities the cheaper and cheaper this
programming will become. So not surprisingly Discreet Logic, Softimage,
BioGraphic Technologies and everyone else involved in computer animation
is concentrating on increasing the cost-effectiveness of their particular
components of the 2 and 4k toolsets and production pipelines. And so are
constantly working to simplify, rationalize and automate the animation
process to allow it to expand into the other and less upscale markets beyond
the expensive CG based blockbusters driving the evolution of the system in
the first place. "What we want to do is make it easier to get results faster;
make it so that users can drag and drop simple things like idling, walking
and punching,” Mr. Kruszewski said. "People don't want to know how to
create animations, they just want them to work. We're doing what we can to
make it simpler," he said, "so anyone can make good, convincing animations
and films." And clearly more importantly at least from a producers point-of-
view to make production more cost-effective whether its an upscale CG
generated 4k feature film or a more down scale 2k television programme.

       All of which suggests that while a lot has been achieved using CG
technology to create digital actors and even principal performers, we are
still a long way from replacing human performers in close-up material. In
fact most of the techniques developed to speed up the animation process are
based on various motion and performance-capture systems designed to allow
us to capture and build real actor-based elements of performance into
animated characters. For instance we are still only in the opening stages of
capturing the movement of the face and lips for employment on an animated
character, generally attached to a completely realistic rendition of a voice
performance often created by that same actor in the first place. To look
realistic, lip and mouth movements, which affect the quality of the voice,
must needless to say be matched to movements evident in the rest of the
face, and at this level of engagement with the audience the term body
language has real meaning.

        So in fact as the CG animation systems improve and become more
cost-effective they are doing so by allowing the CG generated characters to
emulate their human exemplars in the first place. This is clear in the fact that
most of the dialog scenes even within the most CG intensive but photo
realistic productions are still played by human actors, as it's still much
cheaper to capture a real live “performance” on film than to assemble it
from a de- and then re-constructed CG-based version of itself. That this is
the case is also clear in the big bucks still paid to the principal
performers merely for their voice performances, which needless to say still
anchor the performance from an emotional point-of-view. Really the only
place where the CG-generated principal performers are coming into their own
is when the characters are required to do things that real actors cannot or
will not do. So at this point even the most CG-intensive productions are
generally comprised of a long series of traditionally collected dialog
scenes, with relatively short CG-generated action scenes, the amounts
dependent on budget, inserted at the appropriate points, an approach adopted
more for economic than aesthetic reasons. But where the CG-related
techniques and systems are coming into their own, primarily for economic
reasons, is in creating the backgrounds, including the non-principal
performers or extras necessary to street scenes set in the past, the future
or now.

       And interestingly it appears the techniques and technologies are
continuing to bounce back and forth between the simulators and game-
engines as the characters generated by AI-Implant aren‟t just turning up in
video games, feature films and television programmes. In an interesting
form of feedback, the US military is also now employing AI-Implant to
create realistic simulations of crowd scenes. "We used AI-Implant to model
the crowd at Mogadishu," said Rick McKenzie of Old Dominion University's
modeling, analysis and simulation center, referring to the 1993 incident in
Somalia made famous by the movie Black Hawk Down. "We developed crowd
simulations that interact with the military's own simulations of the
technologies necessary to both attack and defend, to provide realistic
training scenarios." By combining the outputs of the AI-
based software “with what the military has learned from satellite imagery,
first-hand accounts and psychologists who study crowd reactions,” Mr.
McKenzie said “our team has made a reasonably accurate re-creation of
what happened in Somalia.”

       “We're going to go a bit farther with it now and are modeling the
interactions between crowds and non-lethal weapons, and how the military
can perform in peace time, like distributing food or going to get a bad guy,"
Mr. McKenzie then pointed out.

       You'd hardly expect to find dozens of
defense strategists setting aside two weeks at a time to play a video game.
But then, Urban Resolve is no ordinary video game. Developed by the U.S.
Joint Forces Command, or JFCom, a division of the Department of Defense,
the $195,000 program is a combat simulation on a massive scale. It pits two
opposing teams of soldiers against one another in a fight for control over a
city under siege, and it's capable of modeling the behavior of the nearly 1
million entities (the soldiers, civilians, cars, tanks and so on) that might
exist in such a conflict. In other words, it's one part Risk, one part The
Sims and
one part raw supercomputing power. It's also the tool that could one day give
the US military the upper hand in urban conflicts akin to the ones currently
taking place in Iraq.

      "The old crusade days when you go into a city with catapults and
rubble everything are over," said JFCom's Jim Blank, modeling and
simulation division chief for the project. "We know now that you can take
down a city by isolating different nodes in the city.... This lets us look at
each one of the nodes and decide how best to go after the adversary." For
instance, military leaders could use Urban Resolve to predict what would
happen if they destroyed the electricity source in a particular city. Such a
tactic might have the desired effect of preventing the rebels from
communicating with one another. Or it could backfire and harm hundreds or
thousands of civilians, something the military would like to minimize. "If
you take down a sewer plant, you're going to cause a great deal of
discomfort to the city's inhabitants," said Blank. "A lot of these things have
gone on in previous conflicts but the result has been collateral damage that's
not acceptable."

        War games that consider these scenarios are not new for the military,
but they have never been attempted on such a grand scale, according to
Blank. For instance, the simulation that JFCom is currently testing allows
enemy forces (the "red team") to hide up to 3,000 operatives in any of
65,000 buildings. The opposing "blue team," meanwhile, controls about 300
agents who use various tools to track the enemies. The trick to keeping all
this in motion is running the program on two Linux-based supercomputers,
one at the Maui High Performance Computing Center in Hawaii and the
other at Wright-Patterson Air Force Base in Ohio, and using concepts
borrowed from artificial intelligence research to allow many of the
characters in the simulation to make their own decisions without human
input. This allows JFCom to run the simulations with only 30 or so human
players at a time. These players consist mainly of retired military leaders and
contractors who consult for the Department of Defense.

       "This technology has not really been used for immediate battle
planning before," said Bob Lucas, a division director at the University of
Southern California's Information Sciences Institute which helped port the
Urban Resolve software to the Linux supercomputers. "The vast majority of
people are computer-generated. Some are very complicated and consume a
whole Pentium by themselves. Some are so simple, you can run a few
hundred on a computer." Lucas admitted that the simulation focuses so
intensely on these calculations that it skimps on graphics. However, that
doesn't seem to have a negative effect on how players experience the game,
he said. "The graphics make it look like a video game from a previous
generation," said Lucas. "But the scenarios are realistic enough that my
blood pressure goes up." JFCom's Blank agreed. "These guys, over the
course of the last couple months, have become increasingly much better at
doing their jobs and reacting to what (the red team) is doing," he said. "Their
success rate is very good. As you pull things away from them and you pull
sensors away from them, they figure out a way to gather the information
they need to figure out what's going on. They're into it."
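
       A rough sketch of that mixed-fidelity budgeting (all names and
numbers invented for illustration): a handful of expensive agents get a full
per-tick reasoning pass while the long tail shares a cheap batched update,
which is what lets entity counts approach a million on fixed hardware.

    # Hypothetical mixed-fidelity scheduler: a few "complex" agents do full
    # reasoning every tick; many "simple" ones share a cheap batched update.

    def full_reasoning(agent):
        # Placeholder for an expensive per-agent AI pass.
        agent["state"] = "planned"

    def batched_update(agents):
        # Placeholder for a cheap, shared update over many simple entities.
        for a in agents:
            a["state"] = "moved"

    def tick(entities):
        complex_agents = [e for e in entities if e["fidelity"] == "complex"]
        simple_agents = [e for e in entities if e["fidelity"] == "simple"]
        for agent in complex_agents:
            full_reasoning(agent)
        batched_update(simple_agents)

    entities = [{"id": i, "fidelity": "complex" if i < 3 else "simple",
                 "state": "idle"} for i in range(1000)]
    tick(entities)
    print(sum(e["state"] == "planned" for e in entities),
          "complex agents planned;", len(entities) - 3, "simple agents moved")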

       But realism isn't the only reason military leaders like the simulator. It
also gives them a chance to peek into the future by introducing weapons and
tools that don't yet exist into their battle scenarios. "You can say, 'Let's
create a sensor that detects people by the glare of their bald heads,'" said Dan
Davis, a project director who worked with Lucas on the project. "If you can
describe it, you can put it in." Taking this versatility one step further, the
soldiers and the buildings in the system could be replaced with almost any
characters in any scene, according to the researchers. Doing so could help
law enforcement agencies determine the best possible ways to deal with
crowds. For instance, police in the United Kingdom might use it to
anticipate the sort of behavior to expect from fans after a particularly tense
soccer match. For now, however, Lucas and Davis are staying focused on
the tests for the military -- a project that the researchers are very serious
about. "This is something that is good for the defense of this country," said
Davis. "It allows us to optimize the way our military is used so we don't
have to destroy our young men. We're saving young men's lives."

       JFCom plans to finish up its current series of tests focused mainly on
helping military leaders determine which types of sensors -- CIA agents, spy
planes, listening devices and so on -- are best for tracking enemy forces that
are hiding in a modern city. Two future testing phases, scheduled for 2005
and 2006, will focus on confining the enemy forces in a part of the city, and
on directly battling the enemy, respectively. This suggests that at some
point in the relatively near future it may not be uncommon for the AI-based
“scripts” created within the military's simulators to start turning up,
re-cycled, in the productive sides of the film, television and gaming
industries, no doubt with the complete approval of the Pentagon, always too
happy to support the entertainment-industry-based soft-power strategy as
long as the dominant components of the media sector in the US are willing to
go along with the party line.




                    Institute for Creative Technologies

       POP quiz: Which organisation has evolved into one of the most
innovative video game designers today? Electronic Arts? Sony? Bungie?
Wrong. The answer to that question would be the United States Army. In
recent years, it has employed Hollywood's and Silicon Valley's best and
brightest to deliver a slew of games to help its soldiers become some of the
best trained in the world. This suggests that a sort of reverse peace
dividend is also already functioning in interesting and even provocative
ways. For
instance the US Army riding the success of its action video game America's
Army has set up a video-game “studio” inhabited by videogame industry
veterans to write other kinds of software to simulate training for a variety of
armed forces and government projects. The Army got into the videogame
business when it released America's Army in July 2002, essentially an
interactive Army recruitment ad. The game is available for download free
and so far 4.4 million gamers have registered to play it.

      To build on that success the America's Army Government
Applications Office was opened in January of 2004 with a team of 15 video-
game creators, simulation specialists and ex-Army personnel. Many of the
“studio's” employees came from local video-game companies like Interactive
Magic, Timeline, Vertis, SouthPeak Interactive, Vicious Cycle Software and
Red Storm Entertainment. The new studio is headed by Jerry Heneghan who
spent 13 years as an attack helicopter pilot and was a “producer” at video-
game developer Red Storm Entertainment, best known for its Tom Clancy-
branded military simulations.

       The North Carolina location allows easy access to a variety of locales,
including Fort Bragg, where the programmers spend time getting “up close
and personal” with new Army vehicles. "The Research Triangle's
universities offer a steady flow of fresh young talent, and we're in close
proximity to developer Epic Games, which provided the game engine for
America's Army," said Heneghan. The studio
also works closely with other Washington agencies, West Point and
Picatinny Arsenal in New Jersey, where the Future Applications Team is
located. It also has close ties to Orlando, Florida, the epicenter of military
simulation and training technologies. In addition, the office is working with
a team of 24 video-game creators in Monterey, California, on the latest
addition to its main franchise, America's Army: Overmatch, which will be
released in March 2005.


       The office was born when other government agencies, including the
Navy and the Secret Service, expressed interest. The game's realistic 3d
environments, which cost roughly $12 million a piece to develop, are
opening new avenues for training. For example, there is a classified virtual
White House simulation for training Secret Service agents. Special
operations forces also practice adaptive training and “leadership negotiation
with indigenous cultures” through a research project with Sandia National
Laboratories. One recent success story is the Talon Robot System, a treaded
titanium robot that searches for enemies and takes pictures of caves and
terrain. First, the Cary studio worked with the Future Applications Team to
allow the robot to be tested virtually before being built. Then, once the robot
was approved for deployment in Iraq and Afghanistan, the team worked on
creating a training kit that allowed soldiers stationed in those countries to
practice navigating the vehicles before they arrived. Like most of the other
research projects, Talon will be in the consumer version of the game, just as
human AI technology is being implemented into the game.

       The gaming studio's executives say the office isn't in danger of getting
axed for lack of revenue, although they won't reveal how much it makes.
They can develop simulations in-house that once would have gone, at a much
steeper price, to outside contractors. "The positive response for this type of
training content has been overwhelming," said Heneghan. "We are having a
difficult time keeping up with the many opportunities presented to us." The
result of these activities is a virtual flood of video games designed to train
soldiers in everything from urban warfare to speaking Arabic. Among the
best additions to its virtual arsenal are Full Spectrum Warrior, a game
designed for the Microsoft Xbox console, and America's Army, a PC game
that doubles as a recruiting tool. Full Spectrum Warrior was highly praised
by the gaming press when it was released this year. It was described as
'distressingly realistic' in a review in The New York Times Magazine.

      Created by the Institute for Creative Technologies with help from the
US Army, the game is used to teach soldiers realistic strategies for fighting
an urban war: Don't hide behind cars, for example, because they provide
poor cover. Using virtual reality to teach combat techniques has been around
for some time. It has proven a boon to armed forces the world over,
especially those from small countries like Singapore, which lack the space to
conduct large-scale exercises.




       The shrinking cost of using virtual reality for training is another boon.
Early simulators, though useful, were behemoths: large, hulking machines
that required their own buildings and cost millions of dollars. Developing
games like Full Spectrum Warrior may involve significant costs, but when
done, thousands of copies can be made and each can be played on
inexpensive TV sets and machines by many at the same time. That adds a
whole new meaning to the term 'force multiplier'. However, there is a
downside to relying too heavily on video games for combat training. Critics
are now pointing out that while games can teach soldiers craft skills such as
marksmanship, using them to shape behaviour on the battlefield, or worse,
making them the gold standard for how an enemy may react, is asking for
trouble. And how well can a video game simulate the thought processes of
people who are intent on killing each other? The answer is that it can't,
but by teaching soldiers proper techniques, such as how to approach
buildings, avoid open spaces and provide cover fire, games can help.

       Another danger, the critics point out, is that games can inure a
soldier to killing and being killed (no kidding, that's why they are useful
in the first place). After all, if the enemy blows your team away, you just
hit the reset button. That's what sceptics call negative training: games
that are programmed with unrealistic physics and behaviour, which soldiers
pick up, with potentially deadly implications on a real battlefield. Another
problem is that real-life physical limitations are not programmed into
games. In Full
is real-life physical limitations are not programmed into games. In Full
Spectrum Warrior, you can run seemingly forever without breaking a sweat.
And you never get bored because something is always happening. You don't
need to sleep, eat or drink, either.

        That can also be construed as negative training. Take this cautionary
tale from the pages of the Los Angeles Times. Second Lieutenant Mark
Goins and Specialist Mark Zapata, part of the crew of a US Army Abrams
tank, were operating in Najaf in Iraq, the scene of fierce fighting between
cleric Moqtada Al-Sadr's militia and US troops. The story did not make clear
what they were doing at the time, except it was quiet enough that neither
soldier saw what was coming: In broad daylight, a rebel quietly scaled the
back of the tank and shot them both dead at point-blank range. It's the oldest
trick in the book. But you won't find out how to guard against it in Full
Spectrum Warrior, because the enemy in the game doesn't move the way a
real-life one would. On the Xbox, the enemy is always in front of you,
because somehow, the gamer is always in with a better chance of winning.
No one buys a game to get blown away time and time again. And even in a
great game like Full Spectrum Warrior, the enemy is limited by artificial
intelligence. He can lay an ambush, plant an improvised explosive device,
and even melt away when he senses the firepower he faces is overwhelming
and live to fight another day. But he never ever takes a soldier as a hostage.
Or creeps up behind your tank and shoots you in the back of the head. In
Najaf, and other combat zones, that's not the case. (Carl Skadian Washington
Post October 10 2004)

       It's a sweltering 90 degrees and soldiers Kevin Messmer and Kroften
Owen are hunched in a rubble-strewn apartment. Peering from a window to
avoid sniper fire, they see a bustling Iraqi city. Binoculars pressed to his
face, Messmer surveys the view and finds what he's looking for just across
the river, an insurgent stronghold near a mosque's towering minarets. He
whispers coordinates to Owen, who in turn calls them into a radio. A
crackling streak of artillery fire arrives seconds later, shaking the room as
the bomb annihilates the target in a thunderous cloud of thick, black smoke.
The mission is a success. Except the mission doesn't really exist.

       2nd Lts. Messmer and Owen are among the first troops to use a new
breed of military simulator that's part video game/part Hollywood sound
stage with a serious dose of theme park thrill. The apartment setting is all
about creating the illusion of urban warfare - in a way that stimulates the
senses. Littered with chunks of brown plaster and other debris, the room is
decorated in a decidedly Middle-Eastern manner. A picture hangs sideways
on one wall, the smashed remnants of a small vase lie on a small circular
table near the kitchen area. Like a Broadway show, walls and other set
pieces can be swapped out as the training merits.

        Hidden speakers envelop “the set”, located in a shopping center-sized
building, with sound effects both subtle (barking dogs) and earsplitting
(bombs). And the window? It's really an oversized display screen showing
an artificial cityscape with high-resolution computer graphics. The so-called
Urban Terrain Module where Messmer and Owen had their multimedia
immersion training is a one-of-a-kind facility, part of the Army's Joint Fires
and Effects Trainer System, or JFETS. Across a darkened hallway is the
Outdoor Terrain Module. It's a room with a sandy floor on which a parked
Humvee faces an oversized movie screen. Soldiers see a computerized desert
landscape. In this environment, too, the training is in how to precisely call in
artillery strikes. Since the center went live in September of 2004, more than
300 officers have trained at the compound, whose evolution is key to a
larger Defense Department strategy to give future members of all military
branches the ability to better synchronize artillery, air support and other
weaponry on the battlefield.

       The multi-million dollar system's origins go back to 1999, when the
Army first partnered with a unique consortium of educators, video game
makers and entertainment companies called the Institute for Creative
Technologies (ICT). The goal: combine the expertise of these seemingly
disparate fields to create synthetic environments that mimic actual wartime
situations. "It's really all about cognitive training, decision-making under
stress," says Randy Hill, ICT's director of applied research. The Army saves
money on live-fire training, and also economizes by tapping outside
expertise instead of developing everything internally. Traditional outdoor
exercises are still part of the training, but they often don't convey the
chaotic, complex nature of battles as well, said Col. Gary S. Kinne, JFETS
director at Fort Sill.

       Back at the urban-training stage, Rick Bleau directs the action from a
control room hidden behind a sliding door. Sitting at his office chair behind a
bank of flat-screen computer monitors, Bleau can tweak environmental
factors, such as level of sunlight, wind speed and temperature (between 50
and 100 degrees Fahrenheit). He can track soldiers' movements (their
helmets have built-in motion sensing cameras) and invoke more malevolent
commands, too. Anyone who keeps their head in the window for too long
can expect to hear the whiz-pop of a sniper's incoming bullet. "We've had a
lot of soldiers coming back from Iraq who say it's too real. The only thing
we don't have is the smell," says Bleau, a civilian government subcontractor
whose company manages the computer systems. "We're working on that."
They'll certainly have the resources. Last month, the Army extended its
contract with the ICT in a five-year deal worth $100 million. The ICT,
located at the University of Southern California (USC) in Marina del Rey,
Calif., has collaborated with the Army on other projects. The most well-
known is the squad-based training program, Full Spectrum Warrior.

       A commercial version of the program, based on training from the
infantry school at Fort Benning, Ga., was released to critical acclaim as a
video game for the Xbox console in 2004. In the past, the Army has used
computer games for recruiting, too. The taxpayer-funded first-person shooter
America's Army, for example, teaches Army dogma while offering online
battle modes where groups of gamers can shoot each other for free. And at
Fort Lewis, Wash., soldiers use an online computer game to learn how to
react to the ambush of a convoy. The bulk of today's soldiers have been
exposed to video games their whole lives, so few have trouble adjusting to
the Fort Sill multimedia immersion.

       "Anybody's who's played games, they have learned how to learn," said
James Korris, ICT's creative director. "They come to this with a body of
knowledge that the Army can take advantage of to make their training more
effective and more efficient." Many U.S. soldiers in Iraq already play shoot
'em ups like Halo and Battlefield: 1942 when they're off duty, so it makes
sense to use video games as training aids, said Lt. Col. Tony Schmitz, a Fort
Lewis instructor. Back at Fort Sill, Maj. Jim Singer says the artillery training
he took in 1993 was downright primitive by today's standards, consisting
largely of slide shows on a projector. "It's as close to the real thing as we can
make it," he said. "In 10 years we've come this far. I can't imagine what it'll
be like in another 10." (Mat Slagle Associated Press - Dec. 19, 2004)

                                  Machinima

       Paul Marino vividly recalls the first time he watched an animated film
made from a video game. It was 1996, and Mr Marino, an Emmy award-
winning computer animator and self-described video-game addict, was
playing “Quake” a popular shoot-'em-up on the internet with a handful of
friends. They heard that a rival group of Quake players, known as the
Rangers, had posted a film online. Nasty, brutish and short, the 90-second
clip, “Diary of a Camper”, was a watershed. It made ingenious use of
Quake's “demo-record” feature, which enabled users to capture games and
then e-mail them to their friends. (That way, gamers could share their
fiercest battles, or show how they had successfully completed a level.) The
Rangers took things a step further by choreographing the action: they had
plotted out a game, recorded it, and keyed in dialogue that appeared as
running text. Pretty soon, Mr Marino and others began posting their own
“Quake movies”, and a new medium was born.

       Eight years on, this new medium known by its practitioners as
“machinima” (“machine” crossed with “cinema”) could be on the verge of
revolutionising animation. Around the world, growing legions of would-be
digital Disneys are using the powerful graphical capabilities of popular
video games such as Quake, Half-Life and Unreal Tournament to create
films at a fraction of the cost of Shrek or Finding Nemo. There already is an
annual machinima film festival in New York, and the genre has seen its first
full-length feature, Anachronox. Spike TV, an American cable channel,
hired machinima artists to create shorts for its 2003 video game awards, and
Steven Spielberg used the technique to storyboard parts of his film A.I.

        All of this is possible because of the compact way in which multi-
player games encode information about different players' movements and
actions. Without an efficient means of transmitting this information to other
players (or animators with network compatible workstations) via the
internet, multi-player games would suffer from jerky motion and time lags.
Machinima exploits the same notation to describe and manipulate the
movements of characters and camera viewpoints. The same games also
allow virtual environments to be created quickly and easily, which allows
for elaborate sets and props. Machinima is hot. It's got its own film
festivals; it's being used to produce pilots for TV shows and games; and
it's being fronted and sponsored by first-rank animation businesses such as
graphics card maker NVIDIA.
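
       The sort of compact notation involved is easy to sketch (the record
layout below is invented for illustration, not Quake's actual demo format):
a handful of bytes per entity per tick describing position and actions,
which a replay tool, or a machinima-maker, can re-stage inside the engine.

    import struct

    # Hypothetical per-tick record: tick number, entity id, x/y/z position,
    # yaw, and a bitmask of actions (fire, jump, ...). About 21 bytes per
    # entity, versus megabytes per frame of rendered video.
    FRAME = struct.Struct("<IHfffHB")  # tick, entity, x, y, z, yaw, actions

    def encode(tick, entity, pos, yaw, actions):
        return FRAME.pack(tick, entity, *pos, yaw, actions)

    def decode(blob):
        tick, entity, x, y, z, yaw, actions = FRAME.unpack(blob)
        return {"tick": tick, "entity": entity, "pos": (x, y, z),
                "yaw": yaw, "actions": actions}

    rec = encode(42, 7, (1.0, 2.0, 0.0), 90, 0b01)  # entity 7 fires at tick 42
    print(len(rec), "bytes:", decode(rec))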

       Machinima is important because it allows the creation of films within
a realtime, 3d virtual environment such as a videogame engine. It is the
convergence of filmmaking, animation and gaming. This is storytelling
within an interactive virtual space where characters can be controlled by
humans, scripts or even artificial intelligence. By skillful re-use of gaming
assets and techniques, machinima producers can create short or long stories
for a fraction of the time and cost of a conventional 3d key-framed
animated production. Unlike normal 3d animation, which is created frame
by frame and then rendered, machinima can be created and rendered
simultaneously within the hardware. This can of course put some real stress
onto the graphics processor unit and compromises in resolution and depth of
detail must be made to keep the frame rate at a filmic 24 frames per second.
Nevertheless, with the quantum increases in graphics power afforded by
newer graphics cards these days, it's easy to see that this will get better and
better with time - and machinima films such as Anna are already very good,
and can tell a moving story within what some call a "microbudget.”
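
       That trade-off can be sketched as a simple control loop (the
thresholds and cost model below are invented): drop the level of detail
whenever the frame time blows the 1/24-second budget, and restore it when
there is headroom.

    TARGET = 1.0 / 24.0          # filmic frame budget, in seconds

    def render(detail):
        # Placeholder cost model: pretend frame time grows with detail level.
        return 0.01 * detail     # simulated frame time in seconds

    detail = 10                  # arbitrary starting level of detail
    for frame in range(5):
        frame_time = render(detail)
        if frame_time > TARGET and detail > 1:
            detail -= 1          # over budget: reduce resolution/detail
        elif frame_time < 0.8 * TARGET:
            detail += 1          # headroom: restore quality
        print(f"frame {frame}: detail={detail}, time={frame_time:.3f}s")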

       Machinima was first used to create cinematics, the story-telling parts
of videogames (also called "cutscenes") that set up a game sequence, or
transition from one level or area of the game to another. It made sense to
create these sequences using the basic tools that the game was created with.
Cinematics, which were once limited to short, low resolution sequences due
to the memory constraints of gaming cartridges, are now a major production
item on games that have DVD-sized memory space to play with, and can be
short, artfully created stories in their own right.

       In fact, the game Anachronox, produced by experienced
cinematographer Jake "Strider" Hughes, had so many cinematic sequences
that the decision was made to edit them together, and Anachronox: The
Movie was born. "When I added up the total running time of all the
cutscenes (comprising the game), it came to two hours and 30 minutes, so I
thought it might be fun to string them together to see if it would work as a
straight narrative," Hughes notes. The movie won a number of awards.
Although the film shows its gaming roots, the compelling story of Sylvester
"Sly" Boots, a down-at-heel private investigator in a futuristic world, comes
through loud and clear.

       Another place you'll find machinima is as a game tie-in with a new
television show, Game Over, produced by Carsey-Werner-Mandabach and
currently airing on the UPN Network. The game, Game Over in
Machinimation, was produced by Fountainhead Entertainment and tells a
story about the main protagonist from the TV show, Raquel Smashenburn
(voiced by Lucy Liu), a modern working woman juggling family and her job
as a monster-fighting agent. Players can choose to play in either first- or
third-person mode. There are six levels to the game. Players can also use the
built-in game tools (based on the Quake III engine) to produce their own
take-offs on the storylines.

       "The ability to play a game as a character from a television show, and
then to take that character and make your own short film, is just plain cool,"
says Katherine Anna Kang, Fountainhead's CEO. The game was sponsored
by GameFly. "We are the leader in the game rental space, making games
easily accessible to the masses," says Jung Suh, co-founder of GameFly.
"Combining our strong following of dedicated gamers with UPN's loyal
viewers, both the Game Over show and the Game Over in Machinimation
game are teed up to be an extremely popular union."

       Another form of machinima was presented at the NextArt portion of
the Florida Film Festival in March, 2004, where the zany ILL Clan, a New
York-based animation studio, presented a live 3d animation show, On the
Campaign Trail with Lenny & Larry Lumberjack, which featured two
characters as they held a "Town Hall Meeting" as part of their campaign for
the presidency. The animated characters were controlled and voiced by the
ILL Clan performers (who have a background in improv comedy), and
interacted with the audience. "And unlike the actual candidates, you can ask
them anything you want, and they'll give you an honest answer!" noted Matt
Dominianni, animator and voice of Lenny the Lumberjack. The ILL Clan
produced a similar machinima show, Common Sense Cooking, where their
pioneering work was documented by the Discovery Channel.

       The ILL Clan was also responsible for machinima productions for
Spike TV, a division of MTV Networks. The company created short
animated vignettes, co-directed by David Kaplan and Dan Torop, to
introduce The Video Game Awards. Each animated intro was created by
different game designers around the world. "By tapping the resources of the
public gaming community it allowed us not only to produce animation
quickly and effectively, but also to support the efforts of a lot of talented 3d
designers," says Frank Dellario, president of the ILL Clan.

      The use of machinima shorts as interstitials is reminiscent of how the
animated show The Simpsons started out - as interstitials for the Tracy
Ullman Show, where these short clips proved so popular that they were
greenlit as a full-length primetime show. This type of usage for machinima -
generating short clips to try out in one venue, then using the resulting
successful feedback to get approval for a full-out show or game - may
become a prime application for this genre, especially in an age where TV
shows, videogames and films represent huge gambles for studios, which are
increasingly reluctant to try out new concepts that do not already have a
proven track record in some other format and thus seem hell-bent on
flooding us with either unending sequels or re-workings of last year's TV
shows.

      Machinima production for TV does not stop with shorts. An animated
drama made for broadcast television has just been commissioned by Scottish
media groups. Called Rogue Farm and based on a story by sci-fi author
Charles Stross, the machinima-based film tells the story of a near-future
couple that is hidden in a technologically advanced but deserted area, and
whose marriage is threatened by their reactions to a strange new threat - a
"rogue farm." "Machinima is a powerful technology, which is why we have
been developing and shouting about it for the past few years!" says Hugh
Hancock, director of the film and head of Strange Company, the animation
company that is producing it. Strange Co. also produced the popular
machinima shorts Ozymandias and Matrix: 4x1. Rogue Farm was financed
by the Newfoundland funding scheme, run jointly by Scottish Screen,
Scottish TV and Grampian TV.

       There are two basic ways to produce machinima. It can be script-
driven, where characters, effects and cameras are directed by scripts for
playback in realtime. Unlike normal animation, the action is driven by
events, rather than keyframes. For instance, a storm effect could come on at
the moment a character exits a building. The second method (which is
predominantly used) is to record the action in realtime within the virtual
environment, with virtual cameras and lights positioned much as in normal
filmmaking. It is this ability to shoot live that gives machinima its advantage
for short production times. A live director should feel right at home in this
environment, and an animation director will enjoy the ability to give a
direction and have it executed immediately, rather than in weeks or months
down the road. Multiple takes can be done in realtime and then edited in post
for the final product. Post-production can take advantage of the fact that all
the data relevant to each scene has been captured - the location of the
characters and set pieces, the camera angles and light values - and these can
then be modified in a "what if" interactive process. It's as if the director can
retroactively adjust his camera angles or lights without having to call back
the cast and crew.
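
       A minimal sketch of what the script-driven approach looks like (the
tiny event system below is invented for illustration; real engines expose
their own trigger mechanisms): the storm and a camera move are bound to the
exit event rather than to a keyframe number.

    # Hypothetical event-driven machinima script: actions fire on events,
    # not on keyframe numbers.

    handlers = {}

    def on(event):
        def register(fn):
            handlers.setdefault(event, []).append(fn)
            return fn
        return register

    def emit(event):
        for fn in handlers.get(event, []):
            fn()

    @on("hero_exits_building")
    def cue_storm():
        print("fx: storm begins")      # stand-in for an engine effect call

    @on("hero_exits_building")
    def cue_camera():
        print("camera: crane up to wide shot")

    emit("hero_exits_building")        # playback reaches the exit: cues fire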

       The first step in generating a machinima project is choosing which
game engine to use. Popular game engines include Unreal Tournament
(either the original or 2003 edition), Half-Life, Warcraft, and the real
genesis of the whole industry, Quake (now in versions I, II or III), created
by id Software (www.idsoftware.com) of Mesquite, Texas, the studio behind
the classics Doom, Wolfenstein and Quake.

        As a beginning machinimator, you can work with an existing game as
it is, and simply do a run-through of your favorite game, having your
characters interact soulfully and meaningfully instead of hacking and
slashing each other, and then record the output of the game to a video source
such as a DV camcorder, after which you edit the video. After you get your
toes wet, you can then go onto the professional track and create detailed
scripts and storyboards, generate entirely new characters and sets (or buy
them from sources such as Turbo Squid), and use the game engine toolkit to
fine-tune the movements, lighting, camera angles and other features, and add
precise lip-sync and appropriate audio, with more extensive post-production
to align with your directorial vision. Professional-quality production
involves recording at the data level, instead of as simple video. Recording
the data that describes the details of each scene - the exact locations and
motions of each character and model, the camera movements, the locations
of light and other physical properties - generates the most flexible form of
machinima, as it creates a product that can play back within the game engine
itself, and is amenable to future modifications by gamers.
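
       Why data-level capture is so flexible is easy to see in miniature
(the record fields below are invented): because each tick stores scene state
rather than pixels, the director can re-stage the identical performance with
a different camera after the fact.

    # Hypothetical data-level "take": per-tick scene state, not video frames.
    take = [
        {"tick": 0, "hero": (0, 0), "villain": (5, 0), "camera": (2, 10)},
        {"tick": 1, "hero": (1, 0), "villain": (4, 0), "camera": (2, 10)},
    ]

    def rerender(take, new_camera):
        # Re-stage the recorded action with a different camera; the
        # performances are untouched, so no reshoot is needed.
        for frame in take:
            staged = dict(frame, camera=new_camera)
            print(f"tick {staged['tick']}: hero at {staged['hero']}, "
                  f"camera at {staged['camera']}")

    rerender(take, new_camera=(0, 3))   # retroactive camera move, no reshoot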

       The obvious advantage of this medium is that it can be produced
quickly, even in realtime. The use of game controllers to move characters
around could mean that even animation non-pros such as businessmen or
teachers could produce animated sequences to illustrate their marketing
presentations or classroom lessons. An educator could move an avatar
through a 3d scene to explore the environment, for instance, and let the
underlying game engine take care of such details of physics as lighting and
gravity. Repurposing gaming assets can lead to production costs of a few
thousand dollars for a rough pilot or storyline demonstration for a TV show
or videogame. With the costs of a videogame hovering around $5 million
(not including promotion costs of several million bucks), it takes more than a
storyboard to pitch a concept these days; an inexpensive machinima
production could lead a possible sponsor to the "Ah, ha!" moment of
greenlighting a project.

       Furthermore, since the medium is Internet-compatible, a machinima
producer can use the Doom marketing model to gather revenue and get
market recognition. Doom was revolutionary in its day not only because of
the content of the game itself but because of its distribution and marketing
model - rather than sell the game to a distributor in return for paltry royalty
payments, id Software gave the game away free as shareware, and then
made a profit by selling both a packaged version of the game with printed
instructions and successive upgrades and new levels to gamers that had
become addicted to the gameplay and had to have more. A low-cost
machinima game released as shareware that catches on and creates buzz
could become a likely candidate for a production pitch.

       Although machinima may be too low-res for some artistic tastes
(companies such as Blur prefer to create cinematics with more detailed
professional toolsets such as Maya), there is no doubt that machinima use
will grow for certain applications such as low-cost short animations, live
animation productions like the improv comedies produced by the ILL Clan,
and non-entertainment uses such as virtual worlds that enable educators to
get their points across by 3d graphics instead of text. There may also be apps
that are still nascent, such as machinima games built around product tie-ins
for the interactive television (iTV) market, which would allow the viewer to
interact with and possibly purchase products within a televised show, via
either an enabled set-top box or synchronized webcast.

       Games publishers have now begun to incorporate machinima into
their products. Epic Games has built a movie-making tool into its
spectacularly successful Unreal Tournament series and many games include
level-design software that both gamers and machinima artists can exploit. In
2004, Valve Software released Half-Life 2, a long-awaited game that
included tools specifically geared toward machinima: in-game characters
will have realistic facial expressions with 40 different controllable muscles,
and eyes that glint. Conversely, machinima creators have built movie-
making tools on the foundations of games. Fountainhead Entertainment
licensed Quake III to create a point-and-click software package called
Machinimation, which it used to produce In the Waiting Line by the British
band Zero 7. It became the first machinima music video to be widely shown
on MTV last year.

        This is not to say that machinima is ready for prime time just yet. The
production quality is good, and will only get better with the next generation
of video games, such as Doom 3. But it still has a long way to go to match
Pixar's Monsters, Inc., some frames of which (there are 24 per second) took
90 hours to generate using over 400 computers. The technologies will also
have to improve before the tool kit can impact the production system in
general. At the moment, machinima-makers are using a patchwork of
utilities developed by fellow enthusiasts. Quake, for example, has its own
programming language that can be used to build movie-making tools. This
in turn enabled Uwe Girlich, a German programmer, to create a program
called LMPC (Little Movie Processing Centre), which translated a particular
sequence of in-game actions into text. David Wright, an American
programmer, then released a program called “KeyGrip” to convert this text
back into visual scenes, and to allow simple editing. Other programs allowed
machinima-makers to add dialogue and special effects. As the games have
advanced over the years, so have their associated tools.




       But the machinima-making process is as yet nowhere near as slick as
desktop video-editing, for example, which with the rise of digital video
cameras has placed live-action film-making tools in the hands of everyday
computer users. And because most machinima movie-makers have been
video-game nerds, their productions have historically lacked two crucial
elements: story and character. “There are as yet no Ingmar Bergmans,” says
Graham Leggat of the Film Society at Lincoln
Centre. “Last year's machinima festival winner, Red-vs-Blue, was mostly
based on sketch comedy. Most other efforts are of the standard video-game
shoot-'em-up variety.” It is, in short, a situation akin to the earliest days of
cinema. Another problem is that if a machinima-maker were to score a hit,
there might be legal trouble. So far, makers of video games have looked the
other way as their games were used in ways they never intended. But if
someone were to make money from a film that relied on one of its games, a
game-maker might be tempted to get the lawyers involved. For now, this
does not concern Mr Marino, who believes that machinima is here to stay.
“Five years ago, the games were not nearly as vivid as they are today,” he
says. “The same goes with Machinima. We may not be on the level of
„Shrek‟, but that will change. It's inevitable.” (Economist Feb 2005)

       So finding themselves without an abundance of spare cash for actors'
salaries and expensive recording and editing equipment, some independent
filmmakers are turning to a new medium: machinima. Put simply, they're
using video-game technology to make movies. "Machinima is using
puppetry techniques to tell stories in really fantastic (digital) worlds," said
Hugh Hancock, artistic director for machinima production house Strange
Company. Hancock and his Edinburgh, Scotland, company have
manipulated computer-game technology to create movies that have been
praised at film festivals around the world.

       "Game technology lets users tell stories in these really fantastic
worlds," Hancock said in a recent phone interview. "For the first time, indie
filmmakers can tell incredible stories without the constraints of budget." Far
from the pixilated sprites and flashing lights of games from the 1980s,
today's video and computer games are robust and malleable 3d affairs. Like
location scouts before them, machinima producers often scout for games
with open-ended play, games that let gamers go anywhere and do anything
in enormous worlds. Armed with sufficient freedom of movement and
plentiful in-game vehicles, weapons and characters, a good producer can
record just about any scene. It's simply a matter of setting up a scene and
playing in-game characters so that they act it out. Voices can be edited in
later or recorded along with the action.

        Often, PC games are involved because creators can save recorded
scenes to hard drives or other digital storage devices. Strange Company has
produced 16 movies using PC games such as Quake, Quake 2, Half-Life and
NeverWinter Nights. "Blood Spell," the company's current project, is a
fantasy piece produced with the NeverWinter Nights role-playing game.
Strange Company's treatment of "Ozymandias" has had the most critical success,
Hancock said. Film reviewer Roger Ebert has compared the short to
important anime works. "It's been shown at film festivals around the world,"
Hancock said. "I've had professors of literature with absolutely no video
game background calling us and telling us how fantastic it is and how they
use it for their students." One of the most prolific machinima projects is the
"Red vs. Blue" series, a periodic comedy show acted out within the popular
Halo games for Xbox and PC. That series has become so popular on the
Internet that producer Rooster Teeth has issued the first three seasons on
DVD. A random check of South Sound game retailers showed that several
were out of season 1 and low on stock of seasons 2 and 3.

       At first glance, the word "machinima" looks like it might be a
smashing together of the words "machine" and "cinema." While that would
capture some of the art's essence, the true origin of the term has more to do
with the Latin "machina" (machine) and "anima" (life) - with a pinch of
"anime" (Japanese animation) tossed in. You'll Google with success by
typing "machinema" or "machinima." In late June, MTV2 began airing a
new season of "Video Mods," a show that has video-game characters
starring in music videos. "I love the punk do-it-yourself attitude of
machinima," said Alex Coletti, executive producer of "Video Mods."
"Anything that lets the audience create is cool by us. The whole mash-up of
games and music that is 'Video Mods' can only get better and cooler when
we can let the audience play, too." The Academy of Machinima Arts &
Sciences is preparing for the 2005 Machinima Film Festival, slated for Nov.
12 at the Museum of the Moving Image in New York. (Bill Hutchens,
Tacoma News Tribune Aug 1 2005 -for more on Machinima see Appendix-
A)




              Production Network Experiment




US movie director Robert Rodriguez made his latest movie
completely without film, using commercially available digital
technology. Rodriguez, who directed Spy Kids and Once Upon a
Time in Mexico, used AMD Opteron based workstations and servers
throughout the production pipeline for his latest movie, Sin City.
Digital artists at his Troublemaker Studios in Austin, Texas, used
workstations and servers running Avid's XSI software to pre-visualize
scenes, create and render high definition digital composites, add
visual effects, edit and produce a final HD digital master. "The
process used to be a linear write-shoot-edit and fix-it-in-post. That's
gone. This technology gives me the power to be in the moment, to
explore ideas and create at the speed of thought as we produce. No
actual film was used in the process. It's a non-linear, organic process
that allows me to tap the creative potential of the entire production
team from the beginning to the end of the process" says Rodriguez.
AMD digital media and entertainment director Charlie Boswell says,
"Access to state-of-the-art production power is no longer the domain
of large studios and big budgets. Now anybody editing their home
videos or photos can use what the pros use," says Boswell.

                                 Staff Writer ITWEB April 11 2005

In December of 2004 PowerProduction started shipping StoryBoard
Artist version 4. This professional visualization software can assist
creatives in turning ideas into blueprints and storyboards for film,
video, animation, commercials, games, DVD, interactive television or
multi-media products of all kinds. The application's features are
designed to assist directors, filmmakers and other media visionaries
explore, plan and prototype their own media production ideas with
added motion graphics capabilities. Pre-production ideas can be
presented using a combination of imaging techniques. Draw, import
photos and/or composite artwork and photos with hundreds of pre-
drawn characters/props/locations included in the application's
integrated content database. Draw tools, alpha channel support and
color transparency enable digital photos and artwork to be layered into
storyboard frame along with the pre-rendered characters and props.
Library palettes of new pre-drawn images are incorporated to make 2d
and pre-rendered 3d artwork available quickly.

With the new Pan and Zoom feature, users can enliven their boards by
adding intra-frame pans, tilts and zooms. The Pan and Zoom feature
mimics the camera-stand setup to bring still pictures to life. Motion
graphics can enhance storyboards by presenting ideas in animatic
style with interframe timing and transitions, sound effects and music
in the visually intuitive timeline. For live action filmmakers, the script
import from screenplay software and word processing applications
gives a quick start to storyboarding scripted projects and can integrate
script breakdown information.
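
Under the hood, an intra-frame move of this kind is essentially an
interpolation of a crop rectangle over the still image (the framing
numbers in this sketch are arbitrary):

    def lerp(a, b, t):
        return a + (b - a) * t

    def pan_and_zoom(start, end, frames):
        # start/end are crop rectangles (x, y, width, height) over a still;
        # interpolating them per frame yields a combined pan/tilt/zoom move.
        for f in range(frames):
            t = f / (frames - 1)
            yield tuple(lerp(s, e, t) for s, e in zip(start, end))

    # Slow push-in from a wide framing to a tight close-up over 24 frames.
    for rect in pan_and_zoom((0, 0, 1920, 1080), (600, 300, 640, 360), 24):
        pass  # each rect would be cropped from the still and output as a frame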

For Animation and other large digital projects, the batch import, fit-to-
frame/aspect commands, and import Adobe Photoshop layer features
streamline large-scale image processing and handling. For branching
interactive projects, such as DVD and videogames, the linking feature
allows multiple path media projects to be prototyped and tested by
jumping between frames and exporting the multi-linear concept into
QuickTime and Web formats. For paper-based presentations, print
storyboards with captions and scheduling data using StoryBoard
Artist's elegant layout interface. Storyboarding digitally expedites the
production process enabling time-saving coordination between the
various stages of media development and creation. The ease of image
creation and collection, along with the numerous export formats, offers
the possibility of information distribution, from data sheets for
production crews to movie files for import into digital editing
applications and non-linear editing systems. Finally there is a place to collect
all the data and a cohesive way to present it.

                                    http://www.powerproduction.com




                     Computer-aided-design or CAD

       When the video gamers first utilized machinima technologies to
produce low-to-no-budget films in a real-time virtual 3d space, they also
demonstrated to the more upscale and mainstream -- but still forward
looking -- producer-directors just how easy and cheap it could be to use a
gaming console to assemble a sort of “electronic story board” that
simulates, during the early stages of pre-production, the audio and visual
components of a particular production project. So when Steven Spielberg
used a PlayStation to story board “A.I.” he clearly understood that a
videogame console and a suitable storyboarding and scripting application
made it easy for a director, even one with little or no computer-graphic
skills, to assemble a virtual world; release a number of AI-enabled actors
into that world; and then follow the action and block out all the camera
moves and the various other inputs necessary to the operational aspects of
production on a media specific basis.

        Once again it's the de-construction of the production process into
its constituent media-specific components and their re-assembly into a
non-linear design and manufacturing process focused on the DI that enables
the existing digital production units to generate the upgraded production
values already evident in their outputs. And this computer-aided-design or
CAD based approach to pre-production is about to explode as more and more
pre-visualization tools are borrowed from the game and simulator designers
and deployed into the production sector, ultimately no doubt to be
downloaded to the "creatives" by way of the 2k production pipeline, with
the general idea of expanding, rationalizing and upgrading the
pre-production and production process in general. These various trends
will continue and in fact accelerate as the costs of the 2k production
pipeline continue to decrease, as the consumer-electronic workstations,
servers and wireless infrastructure come on line and continue taking over
and enhancing more and more of the traditional job functions associated
with media production and distribution in general.

       So what began as a sort of non-budget and pirate art form may well
end up being a way of producing a sort of full-length demo reel that
completely "simulates" the aesthetic, logistic and economic aspects of the
particular production starting on the very first day of pre-production. A
simulation or electronic storyboard that can be used first to sell either
studios or networks on the particular production project, and then, once
pre-production is expanded to include the departmental infrastructure,
used to get the producer-director, production-designer, director of
photography and all of the other media-specific components of the
production unit on the same technological, economic and aesthetic "pages"
so to speak. The simulation assembled on the DI becomes the ultimate
reference for everything from camera choreography or where to place the
dolly track, to determining the focal lengths of lenses, and all the other
details of production technique and technology to be used during the
execution of the project. Starting on the first day of pre-production and
extending all the way through to the director's final cut at the end of
the production process.

       So in the long run the most radical impact of digital technology on
the existing film, television and multi-media production sectors could be
how the use of the 2k production pipeline, when applied in a systematic
manner to pre-production, will revolutionize the production process in
general from aesthetic and economic points-of-view, as well as radicalize
and energize the various crafts and trades still necessary to the
production unit and production process in general. As to survive they are
forced, by economic factors over which they have little or no control,
onto the network-compatible and interactive media-specific workstations
such as those now flooding into the marketplace for consumer electronics
on a worldwide basis.

        At this point the traditional "linear" production processes involve
collecting and combining large amounts of information in a wide variety of
formats: electronic, paper-based and otherwise. But the paper-based
management and production systems generally in use within the still-
analogue elements of these industries limit our ability to display this
information to the script, budget and management documents of the
traditional paper-based production system. In other words, during pre-
production on a conventionally "scripted" film or television program the
design team does not work with images and audio signals but rather on
paper, as the office technologies in place within the still-analogue
elements of the production sector limit the building of abstract
representations of that being "produced" to the traditional script,
planning and budget documents.

       But introducing images and audio signals into the design process at
an earlier stage than traditionally possible, by extending the use of the
2k production pipeline forward into pre-production, will allow production
units to identify and solve aesthetic, technical and logistic problems to
a greater degree during pre-production than is presently the case. They
will be able to fix things not usually evident until the production or
post-production stages of the process. They would also be able to
integrate the individual media-specific components into each other in a
more evolved manner. Moving the pre-production process off the traditional
paper and script based approach and onto a set of interactive, network-
compatible and media-specific workstations loaded with a suitable software
environment will allow the digital production team to both upgrade and
de-centralize the pre-production process in general. At the core of the
modernized production unit will be a fully network-compatible computer-
aided-design (CAD) package operating in a distributed format on a
production network wide basis.

       In the case, for instance, of a feature film the cad system would
generate a computer-graphic based real-time model or electronic storyboard
of the programme's various media paths, based on newly collected materials
and materials drawn from the inventory assembled on the DI, compiled on
the production's network of workstations starting on the very first day of
pre-production. Attached to the design graphic would be a detailed
production plan, timetable and budget model allowing the continuous
monitoring of the cash flowing within the production system on a
scene-for-scene, hour-by-hour, week-by-week basis, as sketched below.
Lower absolute levels of expenditure would be achieved in most budget
categories as technological and organizational factors combine to result
in major savings.

       For instance, during the "step-outline" stage of scripting a
detailed budget model, production plan and timetable could be attached to
the design graphic and audio track, allowing the producer-directors and
department heads to connect the technical and logistical aspects of the
production to the structure of the narrative at an earlier stage than
traditionally possible. Connecting the production plan and budgets to the
design graphic will allow the narrative to become its own cost-control
mechanism. As the programming materials were collected during principal
photography they would then be dropped into the design graphic, enabling
continuous analysis and programme redesign for both thematic and budgetary
reasons.

        From an operational point-of-view this would mean that the imagery
and audio information collected during principal photography would be
dropped into the original design graphic assembled within the DI,
including the audio tracks produced during the rationalized pre-production
process. So the actual imagery and audio signals collected during
production would replace the sketches and scene blockings produced during
the electronic storyboard process. The general idea is to allow the
producers and department heads controlling the pre-production and
production processes the ability to review the various aspects of the
production from aesthetic, economic and logistic points-of-view, as the
production proceeds, in a more sophisticated manner than is presently
possible.

        The high degree of integration between the cad, audio and image
processing systems will massively improve speed and confidence in
decision-making on a departmental basis. The budgetary and logistic
implications of creative choices will be much more explicit. So in
traditional scripted formats the cad system replaces the script, budget
and management components of the traditional non-interactive production
system with the script, budget, timetable, management and audio, video and
graphic components of an interactive production system running in a
distributed format on all the various workstations and servers comprising
the production network. The idea is to upgrade our program design and
production management capacity -- in a coordinated manner and on a
departmental basis -- by using artificial intelligence and function-
specific expert-systems to model the programme's economic, logistic and
aesthetic elements in a constantly updated simulation of the programme's
audio and visual designs connected to its budgets, production plan and
timetable -- all operating on a production network wide basis in a
distributed-computing format.

        So the interactive approach as built into the network-based
storyboarding process would replace the script, budget and management
documents of the traditional non-interactive production system with the
script, budget, timetable, management and audio, video and computer-
graphic components of an interactive production system. This would also
integrate the production and design sides of the unit, allowing management
and the departmental infrastructure to monitor activities within the
particular production projects in a more evolved manner. The more
integrated approach would allow and encourage the various, at this point
unarticulated, components of the budget to be balanced against each other,
ensuring costs are not incurred in one part of the production at the
expense of another. The chronic under-funding of audio and photographic
production would be corrected by targeting a more reasonable proportion of
the overall budget "below-the-line" and into the departmental
infrastructure.




        This will soon lead to better targeting of production budgets at
specific creative, technical and organizational problems, rather than just
reflecting the costs of an un-rationalized production process as is now
generally the case. The audio side of the CAD system would present the
sound effects, sketches of the musical score, pre-recordings of the
program's dialogue components, outputs for any necessary playback units,
and control data for location mixing, recording and re-recording, matched
to the output formats of the units actually undertaking the various
prototype productions. Audio-cad will stimulate the creation of a more
evolved audio pre-production process allowing a more sophisticated primary
collection strategy able to exploit the multi-channel recording, mixing
and microphone systems now coming on stream. Production units would come
off location with audio materials in more highly "produced" formats in
terms of both synchronization and spatial continuity, the most
capital-intensive aspects of multi-channel audio production at this point.
These systems will also allow the production units to integrate the audio
components of the programme into its visual sides in a more evolved manner
and to exploit the various smart-playback systems now coming on line.

      The special-purpose hardware licensed into the audio workstations
and network infrastructure would communicate with the super-servers
backing up the production network through dedicated fibre-optic and/or
wireless links, depending on the nature of the particular production
processes, as with the other elements of the multi-media database
maintained on the servers enabling production in the first place. The
audio tracks would be ordered by the slating convention, with the
macro-structure displayed via the workstation's graphics package. Digital
recorders would store the bulk audio information and feed into signal
processors capable of mixing, equalizing and processing the sound streams
to create all manner of effects, enhancements and output formats. The
workstation would look after mapping the locations of the audio cues,
synchronization, spatial continuity and generating a cue-sheet display.
The architectures of the audio sub-systems would be production-unit
specific and designed so that standard processing blocks capable of a wide
variety of effects could be cascaded, as in the sketch below, to provide
numerous channels of total audio control matched to the actual output
requirements of the specific production units.
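
       The cascading of standard processing blocks mentioned above is easy
to make concrete: each block is a function from samples to samples, and a
channel is just their composition. A minimal sketch, with invented block
names:

    from typing import Callable, List

    Block = Callable[[List[float]], List[float]]

    def gain(db: float) -> Block:
        """A standard block: amplify or attenuate by db decibels."""
        factor = 10 ** (db / 20)
        return lambda samples: [s * factor for s in samples]

    def clip(limit: float = 1.0) -> Block:
        """A standard block: hard-limit the signal to +/- limit."""
        return lambda samples: [max(-limit, min(limit, s)) for s in samples]

    def cascade(*blocks: Block) -> Block:
        """Chain processing blocks left to right into one channel."""
        def channel(samples: List[float]) -> List[float]:
            for block in blocks:
                samples = block(samples)
            return samples
        return channel

    dialogue_channel = cascade(gain(+6.0), clip())
    print(dialogue_channel([0.1, 0.4, 0.7]))  # boosted, then limited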

      By these means the audio workstation and network infrastructure
would automate the editing, mixing, cueing and packaging stages of the
production, transmission, display and playback systems in general. There
would be a dramatic shift of budget resources out of post-production into
pre-production, with more of the actual audio production budget being
heard rather than just poured into frustrating post-production man-hours
as now tends to be the case in both film and television production.
Interestingly, connecting program design to logistics earlier in the
process enables the narrative form itself to become a cost-control
mechanism. Ongoing cost-benefit analysis and programme redesign will
stimulate steep decreases in cost-to-airtime ratios, with the biggest
savings being possible in the most capital-intensive forms of production
such as television series, feature films and television commercials, as
well as everywhere else the overblown and over-the-top Hollywood style of
production is now deployed.

      Theatrical documentary techniques in combination with the new
compositing and image-processing potentials would be used to keep the
percentage of budgets spent on staging large and complex events to a
minimum, mostly by getting rid of the extras necessary for the
establishing shots. This would combine the use of actors and scripted
dialogue with theatrical documentary techniques and real-time multi-camera
coverage, as well as improvisational and performance techniques drawn from
cinema direct.

        By these means the budget's components would be balanced against
each other, and the chronic under-funding of audio and photographic
production would be corrected by targeting a more reasonable proportion of
the overall budget below-the-line, or into the departmental
infrastructure. Lower absolute levels of expenditure would be possible in
most budget categories on a per-programme and per-departmental basis as
the cost-control aspects of the cad-strategy stimulate a general decrease
in cost-to-airtime ratios. All made possible through the cad-system's
ability to manage the production of information and entertainment
programme types with higher levels of production value, produced by small
but highly equipped production teams functioning in a more evolved manner
and using a uniquely Canadian network-based production strategy. The idea
is to upgrade our production management capacity by using function-
specific expert systems to model the programme's economic and logistic
elements in a constantly updated simulation of the programme's audio and
visual designs connected to its budgets, production plan and timetable.

       For instance, in relation to the material costs of film production:
during the transitional stages of the process the storage medium would
remain negative film, but pre- and post-production would be workstation
and DI based and thus electronic. The cad process would tend to keep
shooting ratios down but the multi-camera coverage would bring them back
up. A normal consumption of negative is to be expected in traditional
formats, but no printing, with the exception of tests, would be done until
the final print run, assuming there even is one. The costs of the
photographic side of the process would thus be reduced by 60-to-80 percent
when compared to a completely film-based traditional production process.
The reusable characteristic of the magnetic materials would also keep the
storage ratios down. Of course, as the camera-to-screen electronic
image-processing system reaches operational status the traditional
material costs of film production would disappear altogether.

                       Flattening the Production Unit

       One significant effect of these new technological potentials on the
productive components of the system lies in how the job functions
currently comprising the production unit will be changed as a result of
their impact. The divisions-of-labour enshrined in the sector's existing
contract base still match the by now obsolete industrial or studio-based
approach imported from the US. It's this obsolete organizational matrix,
as built into the above-and-below-the-line budget model and in Canada into
the Media Guild, CEP and IATSE contract mechanisms, that's more than
anything else responsible for the spiraling costs of film and television
production, to the point where two-million-dollar hours for television and
50-to-100-million-dollar feature films are fast becoming the norm.

       Exploiting the flattening effects of convergence within the
production system will allow us to re-integrate the roles played by the
key technical and creative personnel within the production unit, whereby
the traditional job functions will be merged or blended and placed in
fewer but more highly skilled hands. The merger of job functions is both
possible and necessary due to the fact that convergence integrates the
individual media into each other as it transposes them into digits. So the
individual media-specific job-functions can be combined and re-balanced
between the necessary creative and technical job functions, resulting in a
more evolved and cost-effective production unit in general. But this is
not a reduction in job descriptions, as is the case with the trend towards
further specialization evident, for instance, within the film, television
and multi-media industries in the US. Rather it's an expansion of the
responsibilities and skill-sets necessary to the principal creative and
technical elements of the reorganized production unit in general.


       This approach is possible for the most part because the control-
functions of the camera, lighting, sound, grip and post-production
technologies are being automated, lessening the need for hierarchy within
the production unit on a departmental basis. For instance, within the
camera department one or two people using robotic and wireless control
technologies can operate any number of cameras during action-sequence or
special-effects coverage, and be functioning in a more evolved manner than
via the industrial method involving the endless list of specialized
assistants to assemble, maintain and above all manually operate the
various components of the system, as evidenced for instance in the IATSE
contract.

       The impact of camera, lighting and staging robotics and of wireless
control technologies provides a graphic illustration of how an industrial
hierarchy can be flattened to create a more cost-effective production unit
on a departmental basis and in general. In the camera, lighting, staging
and transportation departments the job functions can be combined and
placed in the hands of the department heads. Where higher levels of
production value are required, units would consist of department heads
backed up by additional personnel functioning in cross-departmental modes.
The assimilation of these technologies is also forcing and enabling a
rationalization of the production process, a readjustment of the chronic
under-funding of audio and photographic production, and an alteration in
the relationship between pre-production and the rest of the process,
leading to better overall targeting of financial, technical and human
resources at specific creative, technical and organizational problems.

        The most radically affected department will be production itself,
as the cad-system is based on extending multi-media processing capacity to
everyone involved in the production and pre-production processes.
Combining the technical and creative job functions in this way will
enhance the decision-making and communications capacity of the producers
and department heads while wiping out the job functions of most now
operating the still paper-driven components of the system, including
managing the existing obsolete contract base. The production network wide
electronic-data-interchange system will eliminate most clerical,
production assistant, assistant director, script, continuity, secretarial,
production manager, accountant and assistant-everything job functions by
concentrating the decision-making, design and production management
capacities in the offices of the producers and department heads, resulting
in the collapse of the middle management and a strengthening of the
departmental infrastructure.

       Here we are seeing the production unit impacted by convergence as
the vertical hierarchies necessary to the still manually operated systems
are replaced by a much flatter organizational structure based on expert
systems, network connections and horizontal integration on a network wide
basis, as well as a major upgrade of the pre-production process in
general. The smaller production staff operating almost entirely
above-the-line would control a more evolved programme-design,
production-management and cost-control system than the old above-the-line
bureaucracy ever did, while at the same time resolving the various
inappropriate splits between the creative and technical job-functions
imposed on the existing system by the industrial approach also rooted in
the above-and-below-the-line budget model. For instance, the long-term
split within the Directors Guild between the production-managers and the
department heads will be resolved to the advantage of the department
heads.

       The traditional trades or crafts facing the most traumatic
transitions will be the screenwriters, story editors, journalists,
associate producers and the other general hangers-on associated with the
above-and-below-the-line budget model currently burdening the system from
an economic point-of-view. This is because as the media sector upgrades,
new information and entertainment product types and production formats are
emerging to dominate the subsequent period from economic and aesthetic
points-of-view. The importance of this to journalists, screenwriters,
story-editors and the various other above-the-line hangers-on is that they
centre their jurisdictional claims on feature films and television series.
But the newer and more cost-effective programming formats like news,
current-affairs, documentaries, reality-based programming and the various
interactive formats now coming on line did not fall into the jurisdiction
of the screenwriters or any of the other above-the-line jurisdictions.
They fell rather into jurisdictions like broadcast journalism, by way of
jurisdiction over newsroom technologies rather than an equity position of
any kind; as is the case with the writers and screenwriters, copyright
protection is merely another form of above-the-line arrangement.

       This suggests that the writers' and story-editors' historical
alliances with executive producers, rooted in copyright protection, split
them from a closer relationship with the elements of the production unit
based on jurisdiction over the specific media technologies. From the
screenwriters' point of view the counter-revolution has been in the cards
ever since B.P. Schulberg, the only one of the original moguls who was
himself a screenwriter, ran Paramount Pictures from a rolling series of
"story meetings" in his famous sunken office. But it was Canadian
actor-producer Mack Sennett who, along with D.W. Griffith and Thomas Ince,
invented the shooting script and ultimately pulled the trigger on the
present-day scribes.

       So for reasons not entirely of their own making the screenwriters
and story-editors (and other above-the-line hangers-on) have been more
profoundly affected by industrialization than the trades with jurisdiction
over particular media-specific components of the technological base. A
clue to why this might be so also rests in the fact that the writers
generally don't consider themselves members of the production team at all.
The most damaging aspect of this, from the point-of-view of our existing
production capacity, has been the progressive disconnection of the
traditional scripting and narrative design processes from the production
and performance sides of the medium, perhaps one reason for the seemingly
never-ending shortage of good scripts.

       The production network experiment will also include more evolved
hardware and software environments enabling text-database reading on a
network wide basis as well as global idea-processing strategies of various
kinds. These new research, analytical, writing and editorial tools, now
manifest mainly in relation to how the Internet is used within the
print-journalism and broadcast sectors, will allow journalists, editors
and story producers on both the broadcast and print journalism sides of
the system to monitor alphanumeric traffic on a production network and
media sector wide basis. The producers-workstation connected to the
network would allow any system users with access to the network to monitor
traffic on a network wide basis from the vantage point of their particular
"beat". In the mature system any question asked and answered on any
subject during the period of history encompassed by the servers backing up
the production network and the appended archival systems would be
instantly retrievable by anyone with the need to know. This massive
read-and-retrieve capacity has already been extended into the image and
audio storage and processing components of the archival sector, thanks in
large part to the software engineering activities of the various
components of the security apparatus in the US.

       The writers and screenwriters face these difficult adjustments as
their disconnection from the parts of the production unit with
jurisdiction over the specific media technologies, caused by their
above-the-line status, leaves them unable to understand what's going on as
the technologies impact primarily within the below-the-line components of
the budget at the level of the workstations. The components of the
production unit with equity-positions of any kind, excluding the
producers, are thus less able to protect themselves from jurisdictional
encroachment in regards to all areas connected to narrative design and
production planning, the current basis for their jurisdiction. As
producers and department heads make the cad-system operational they will
usurp the programme design and production management functions from the
grab bag of above-the-line functionaries currently burdening the system
from an economic point-of-view, as a side effect of re-arranging the
production unit for increased cost-effectiveness in general.

       Many new technology-related production processes are actually
cheaper than achieving the same objectives by traditional means. And since
the technological base will be drawn from an upgraded line of consumer
products, the workstations providing the perfect example, rather than the
more expensive lines of professional equipment now in place, the costs of
equipping the productive components of the system will tend to be greatly
reduced. These various trends in combination would allow a made-for-TV
movie, for instance, to be produced by a very small, highly equipped
production team working on a very compressed timetable. The relatively
lower gross payrolls would be more than compensated for by better and more
regular working conditions, access to the latest production technologies
and the compensation built into revised property-relations arrangements.
And since pay scales would not be based on the inflated
above-and-below-the-line industrial approach, lower cost-to-screen-time
ratios will be established.

       The integration of the design and the technical aspects of
production would enable the productive components of the Canadian system
to produce cheaper forms of programming with superior technical qualities
and unique Canadian characteristics from both cost and content
points-of-view. Of course, implementation will necessitate the
restructuring of all elements of the existing production,
production-services and technology-provision sectors around the production
network, interactive broadcasting and distributed-computing experiments.
On top of this the property-rights arrangements surrounding the production
network will need to be upgraded to make any of this possible from an
economic point-of-view.

      When push comes to shove the entire process is no more than the
federal and provincial crown-corporations and agencies responsible for the
general evolution of the sector coming to terms with our distorted pattern
of development, and then acting, by way of both the top-down and bottom-up
aspects of our traditional policy-driven pattern of development, to
redress our currently exposed situation by adopting a sectoral strategy
based on systematic technological change and the de-centralization of the
necessary productive components of the media sector in Canada. An approach
that would both allow and enable the various federal and provincial
crown-corporations to "regulate" the evolution of the productive
components of the Canadian system in a more evolved manner.

         Digital technology converts the outputs from and inputs to the
necessary sub-components of the production system into digits. The
potential for jurisdictional complication as a result of this merger of
media formats is obvious. A significant element of the proposed sectoral
strategy, from the point-of-view of the productive components of the
system, is a plan to create an integrated set of union, talent and
production-services contracts to allow the producers, performers, artists,
technicians, service and technology providers and carriers comprising the
productive components of the upgraded system to "interact" with each other
to improve production values and reduce cost-to-screen ratios on a
production network wide basis.

         The new contract mechanisms are necessary as the divisions-of-
labour built into the existing contract-base still match our presently
obsolete technological-base. In order to equip our production units for
cost-effectiveness it will be necessary to develop a re-ordered set of
contract mechanisms that the bargaining units necessary to production in
general will sign off on. Needless to say, the objective of these various
adjustments is to end up with a fully "networked" series of prototype
production units supported by the traditional developmental cash flows
targeted into the sector by both the federal and provincial components of
the system. Each would be equipped with an appropriate technological base
and given access to the production network as a result of the programs
targeted into the sector by both the federal and the provincial levels of
the system. The individual prototype production units would be mandated to
test-produce media and information products of all kinds, interactive and
otherwise, aimed into the currently underdeveloped components of the
emerging marketplace for new media products and forms of programming.

       The reorganization of the productive components of the system as
outlined above is critical to the viability of the interactive broadcast,
e-publishing and distributed-computing experiments, as the
cost-effectiveness of the new and adapted forms of production, and thus
the productivity increases possible in the entire sector, are rooted in
technological and organizational change within the productive components
of the system. Viable e-publishing or interactive broadcasting strategies
can only be based on viable media-specific production strategies. And
while technological and organizational change will improve the aesthetic
and technical qualities of the programming, both important to product
differentiation, it will be the downward alteration in cost-to-airtime
ratios that will be the most important effect of the proposed cultural
strategy from an economic point-of-view.

        The objective of these various adjustments is to end up with a series
of prototype production units supported in part by the cash flows
traditionally targeted by the federal and provincial departments, agencies
and crown-corporations into the productive components of the Canadian
media sector. Each equipped with an appropriate technological base,
provided with access to the production network and mandated to develop
and test-produce new media products of some kind, interactive or otherwise,
aimed into the media marketplace comprised at this point by the Internet.







                  Grid-computing and Rendering Farms

        Writing in the Chicago Sun-Times, science reporter Jim Ritter
recently informed us that Fermilab, home to "the world's most powerful
subatomic particle accelerator", has 250 trillion reasons to outsource
data. The lab "smashes together subatomic particles at nearly the speed of
light in order to better understand the structure of matter and laws of
nature," and Dzero, a research team in the lab, has been recording about 4
million particle collisions a day. In about the last three years they have
accumulated "250 trillion bytes of data", equivalent to about 500,000
Britannica sets, enough "to fill a shelf stretching from Chicago to
Pittsburgh", describes Ritter.

       To crunch this data Fermilab uses grid computing, enlisting the
services of labs and universities around the world, possibly including
Brazil and India. The promise is that grid computing will analyse the
whole pile within six months, and even help find "the elusive Higgs boson,
a subatomic particle known as the 'holy grail of physics,'" explains the
report. "Grid computing works somewhat like an electric power grid.
Computers in different locations are connected in such a way that they act
as one large computing system," is the simple explanation provided by
Ritter. "Government labs, scientific groups and private industries are
using grid computing in fields as diverse as medicine, engineering, drug
development, and investment portfolios."




       Grid computing is not a new concept but actually just a new name
applied to an evolved set of concepts. "In the early 1970s, computer
scientists had an idea to connect several computers together to act as one
computer. They called this field 'network operating systems.' In the late
1970s, it was reborn as 'distributed operating systems,' with the goal of
seamlessly connecting computers to look like one big computer." What
happened thereafter, in the 1980s and 1990s, is that we had networks of
workstations and meta-computing. Now, however, when you talk of the
large-scale connection of computers "into a larger whole" it would most
probably be called grid-computing.

       How is this different from its predecessors? In two significant
ways. One, grids explicitly target computers in separate administrative
domains. "Each participant in a grid has complete control over its own
resources (use policy, security policy, what software to install, and so
on)," unlike in the distributed operating systems of the past, where all
computers had to run identical software. And two, grids go beyond
computing and "provide a seamless interface combining computers, networks,
information (databases and storage), sensors (e.g. real-time data
collection) and people (collaboration)."

       A sobering thought is that the true promise of grid computing
probably won't be available on a widespread basis for at least another
5-10 years. Yet the concept is already behind a few nascent commercial
applications aimed at workload management by connecting geographically
distributed enterprises. Such as, for instance, Altair's PBS Pro, which is
"specifically designed for the improved utilisation of IT resources and
skills within grid computing networks," informs Altair in a recent press
release. The product was "originally developed to manage aerospace
computing resources at the NASA Ames Research Center," and it can
intelligently queue and schedule "computation workload across complex
networks to optimise hardware and software utilisation while minimising
job turn-around times." It's middleware, sitting "between compute
intensive applications and networked hardware operating systems".
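
       To make the queueing-and-scheduling idea concrete, here is what a
minimal PBS-style batch job can look like. PBS scans the "#PBS" directive
lines for resource requests; the node counts, queue name and rendering
command below are invented for illustration and are not drawn from
Altair's documentation:

    #!/usr/bin/env python
    #PBS -N render_shot_042
    #PBS -l nodes=4:ppn=2,walltime=02:00:00
    #PBS -q render
    # Submitted with "qsub render_job.py". To Python the "#PBS" lines
    # above are ordinary comments; the scheduler reads them to decide
    # when and where the job runs.
    import os

    workdir = os.environ.get("PBS_O_WORKDIR", ".")
    nodefile = os.environ.get("PBS_NODEFILE")  # hosts granted to this job
    if nodefile:
        with open(nodefile) as f:
            print("allocated hosts:", f.read().split())
    # A real job would now launch the (hypothetical) renderer, e.g.:
    # subprocess.run(["render", "--scene", "shot_042", "--out", workdir])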

       Business writer Nicholas G. Carr raised many hackles in the
information technology industry when he published a piece in 2003 titled
"IT Doesn't Matter". As a provocateur he's very effective; as a
prognosticator, people are less convinced. His latest piece, with a
similarly extreme headline, "The End of Corporate Computing," reopens the
discussion of utility computing: the notion that corporations subscribe to
computing services over the Internet much as they purchase electricity.
Framed in the essay is the analogy of the electricity industry and its
development on a municipal basis over a century ago. Carr argues that
corporate computing data centers are analogous to private generators,
which were used in the early days of electricity. These power sources
burned fuel to generate electricity for a single site, such as a
department store or a wealthy person's home. (Tycoon J.P. Morgan was the
first residential customer in New York City in the late 1800s.) But
private and small-scale power generators, which created direct current,
were eventually displaced entirely by alternating-current technology,
which allowed utilities to send electricity over long distances, obviating
the need for a local power plant and the people to run it.

       To Carr, today's corporate data centers are the private power
generators of old: inefficient, underutilized and too costly in the face of the
network model of delivering IT services. "As the technology matures and
central distribution becomes possible, large-scale utility suppliers arise to
displace the private providers. Although companies may take years to
abandon their proprietary supply operations and all the sunk costs they
represent, the savings offered by utilities eventually become too compelling
to resist, even for the largest enterprises. Abandoning the old model becomes
a competitive necessity," Carr wrote.

        But while utility computing is an enticing idea, holding up the
electricity industry as the model for how computing should evolve doesn't sit
right with all IT executives. Peter Lee, CEO of grid software company
DataSynapse, said that Carr's conclusion that the combination of
virtualization, grid computing and Web services will result in utility
computing is "100% spot-on." But he said the electricity industry analogy
doesn't hold up entirely. "We do not think the computing industry will
eventually resemble the electricity industry as an exact parallel, because
unlike electricity, there are many more variables in terms of computing
power that would need to be standardized," Lee said. "Computing will,
however, become much more utility-like, both in terms of pricing and in
terms of on-demand power." In his piece, Carr theorizes how the shift to
utility computing could reshape the competitive forces in today's computing
industry. He argues that the leading "utility suppliers" of the future
will be either today's large hardware providers, specialized hosting
companies such as Digex, Internet outfits such as Google and Amazon, or
as-yet-undiscovered start-ups.


       Longtime computing industry executive Kim Polese, who is now CEO
of open-source start-up SpikeSource, said that Carr's competitive analysis
should factor in the effects of open source and of offshore development
from emerging markets, both of which are causing "huge disruptions." "This
means to me that we can't assume that competition will come from the usual
places," Polese said. "The leaders of tomorrow may not even exist today,
but they could grow offshore from start-up into sizable companies quickly
given the strong demand for their services. The computing utility services
may be arbitraged across a network of service providers, of various sizes,
with pricing developed via dynamic price discovery."

       When queried, IT industry executives agreed that the computer
industry will move to more hosted services over time. However, they see
limitations to hosting and disagreed with the notion that the computing
industry will evolve much as electricity did a century ago. They agreed
that corporations will take advantage of new technologies, such as Web
services, grid computing and virtualization, to lower computing costs.
However, few executives envision a wholesale transition to utility
computing, even in the far-off future. None appeared to buy into Carr's
assertion that the balance of power in the computing world could shift
dramatically from technology infrastructure providers to Internet
companies, such as Google, or to hosting companies.

       For example, Charles Giancarlo, the chief technology officer of Cisco,
downplayed the importance of utility computing scenarios. Like many others
selling proprietary systems, Giancarlo says hosted services will become
more important in certain situations but utility computing services will not
be the norm in three to five years. "We think (utility computing) makes
sense for some small and medium-size businesses (like the productive
components of the media sector in Canada). But for large businesses, the
decision to host applications outside or inside of the network depends on
many different factors, including cost and network efficiency," said
Giancarlo. "Some of the largest companies can run their own applications
much cheaper and more efficiently than any utility computing provider."

      Other executives said that Carr's prediction that utility computing will
become the industry norm is predicated on improper assumptions about the
complexity of computing or blind spots in his knowledge. In particular, Carr
downplays the competitive advantage that custom-built software
applications can bring, compared to hosted offerings, said Eric Newcomer,
chief technology officer at software maker Iona Technologies. "Computers
do not work without software. And unlike electricity or other raw
technology, software is designed for direct human interaction," Newcomer
said. "Overall, Carr has taken a very interesting analogy with some truth to it
to an implausible extreme."

       Meanwhile, other readers who responded to Carr's "End of Corporate
Computing" piece, voiced a mix of opinions. "The notion of the computer as
computing device has been obsoleted by the Internet. All of the real action
these days is in using the computer as a *communications* device," wrote
one reader. Others said that utility computing has yet to prove indispensable
to corporate customers. "I'd suspect that IBM's old mainframe philosophy is
behind this drive to utility computing. It's very easy to bill by the month and
provide premium services by the hour," wrote another reader. However, he
questioned the need for these services: "I'm not aware of any pressing need
that can only be met by Utility Computing."

        When responding to these various comments and criticisms Carr
suggests that "what we don't know is the ultimate shape of the IT utility
model or the course of its development. That's what makes it so
interesting and so dangerous to current suppliers of the proprietary
approaches. What we do know is that the current model of private IT
supply, where every company has to build and maintain its own IT power
plant, is profoundly inefficient, requiring massively redundant
investments in hardware, software and labor. Centralizing IT supply
provides much more attractive economics, and as the necessary technologies
for utility computing continue their rapid advance, the utility model will
also advance. Smaller companies that lack economies of scale in their
internal IT operations are currently the early adopters of the utility
model, as they were for electric utilities...."

        "There are certainly tough challenges ahead for utility suppliers.
Probably the biggest is establishing ironclad security for each individual
client's data as hardware and software assets become shared. The security
issue will require technological breakthroughs, and I have faith that the IT
industry will achieve them, probably pretty quickly."

      But despite these considerations, two initiatives in the spring of
2005 were targeted at making computing grids more widespread in the
business world. In May, a consortium of vendors called the Enterprise Grid
Alliance (EGA) released a set of recommendations for making grids more
palatable to businesses. The guidelines addressed a range of technical
issues, from security to a utility-like pricing system for buying
computing power in industry-standard increments. And in April, Globus 4,
an open-source toolkit for writing grid applications, was released. These
efforts are attempts to create industry standards, which experts believe
are important steps to making the hazy notion of grid computing more
accessible and widespread. A consortium of grid computing researchers and
interested corporations is using the Globus toolkit for writing
applications that run on several disparate machines. These tools and the
EGA technical recommendations aim to address computing tasks suited to the
business realm rather than academia, where computing grids have been used
for years. Perhaps more significantly, these efforts seek to establish
industry wide grid standards, something experts say is still lacking.

       The grid computing industry today is roughly at the same stage the
Internet was about 10 years ago, experts say. Before commercial customers
can share their computing resources more effectively across widespread
networks, they need a wide variety of standardized products. Today, most
examples of grid-computing are done using vendor-specific tools within a
single company, said Jonathan Eunice, an analyst at Illuminata. "We're still
in the stage of development where you build a grid, you don't buy it, because
these are all tools," Eunice said. "It's commercially viable, but you still have
to put a lot of things together yourself."

       The Globus toolkit makes it easier to build an application that taps
into computing resources such as servers, storage and databases that are
spread out across a network. The open-source software uses a number of
existing specifications, notably Web services. Using the software, corporate
customers in both the public and private sides of these industries will be able
to make more cost-effective use of their existing computing resources.
Often, servers or databases are substantially underused because these
resources are usually purchased to serve one specific application, rather than
be shared by many. The EGA's multiyear plan is to accelerate usage of grid
computing, help define where it is effective, and promote standards.

       "There aren't many people taking the big picture on grid," said Peter
Ffoulkes, director of marketing of high-performance and technical
computing at Sun Microsystems. "This isn't some academic group that's
trying to boil the ocean." Early examples show that grids are a compelling
way to save money on hardware. Financial services company Wachovia, for
instance, used grid software from specialized provider DataSynapse to host a
new set of corporate banking applications. Rather than have a dedicated set
of servers, the applications seek out unused computing power from financial
traders' workstations. If a machine is not used for a certain amount of time,
the grid server software will offload a job to it. Once the workstation is used
again, the job is moved to another free machine.
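
       The scavenging policy described here reduces to two rules: evict
work the moment a workstation's owner returns, and only place work on
machines that have been idle past some threshold. A minimal sketch with
invented names (DataSynapse's actual software is of course far more
elaborate):

    from dataclasses import dataclass
    from typing import List, Optional

    IDLE_THRESHOLD = 600  # seconds idle before a machine may take work

    @dataclass
    class Workstation:
        name: str
        idle_seconds: int
        job: Optional[str] = None

    def schedule(stations: List[Workstation], pending: List[str]) -> None:
        # Evict jobs from machines whose owners have come back ...
        for ws in stations:
            if ws.job and ws.idle_seconds == 0:
                pending.append(ws.job)
                ws.job = None
        # ... then place pending jobs on sufficiently idle machines.
        for ws in stations:
            if pending and ws.job is None and ws.idle_seconds >= IDLE_THRESHOLD:
                ws.job = pending.pop(0)

    stations = [Workstation("trader-01", 900), Workstation("trader-02", 0)]
    queue = ["price_risk_batch"]
    schedule(stations, queue)
    print([(w.name, w.job) for w in stations])  # job lands on trader-01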

       The set-up allowed Wachovia to avoid buying costly new hardware
for these services, said Robert Ortega, vice president of architecture and
engineering at Wachovia. The company was able to avoid buying eight Sun
Fire 15K servers, which cost hundreds of thousands of dollars each and
require dedicated staff to maintain. "We are now leveraging the grid
platform in scenarios that would have been considered traditional transaction
processing, such as creating a trade or retrieving market data," Ortega said.
Ortega noted a few challenges for grid computing, such as software
licensing schemes designed for software that runs on a single machine.

       Other challenges include the lack of grid-ready packaged applications
and the lack of common charge-back methods for pricing computing
services. Assurances of security and reliability of computing grids are also
required before people will be willing to share the servers and storage owned
by an individual company department. Indeed, some of the biggest
challenges facing the adoption of grids have more to do with people. Unlike
academia, departments within large corporations are not accustomed to
sharing their hardware resources or data with other groups. "People don't
want to share," said Wolfgang Gentzsch, managing director of MCNC, a
nonprofit that's built a large grid serving governments, universities and
others in North Carolina. Gentzsch led Sun's grid engineering efforts until
last year. "It's almost like we know how to handle the technology," said
Gentzsch. "But the cultural issues, that's a big change."

        If technology and marketing investments are any indicator, many
computing companies firmly agree that utility computing will become "too
compelling to resist." Starting in 2002 with the launch of IBM's On-Demand
vision of more flexible computing, several vendors have gotten on the utility
computing bandwagon. IBM offers hosted processing power and
applications to companies, while Sun earlier this year launched its Sun Grid
initiative where customers pay a flat-rate of $1 per hour per CPU, in a fee-
for-service structure similar to those used by utility companies. Microsoft,
meanwhile, is also well positioned to take advantage of any move to hosted
services, said Bob Muglia, senior vice president of Microsoft's Windows
Server division. "I think there will be a split. Companies will outsource
things that can be very effectively run for an inexpensive price by
others...On the other hand, I do think there will always be areas where
people are putting in investment to drive business advantage that will either
remain in-sourced or under very tight control of outsourcing not purely
hosted. There's a mixture of all these things," Muglia said. "We'll work well
in both environments."

       IBM's Ambuj Goyal, the general manager of IBM's Lotus division and
former strategy executive in Big Blue's software group, fully buys into the
notion of utility computing: He wrote a paper for IBM on the subject 10
years ago and offers hosted services for some Lotus products. However, as
with many discussions about the future, the reality will likely lie somewhere
between extreme positions. "Rather than take a 50,000-foot view...you need
to get down to earth and look at individual cases," Goyal said. "A
standardized utility model has a role, but what a business should do depends
on each particular case." (Martin LaMonica and Marguerite Reardon
contributed to this article April 27, 2005 CNET News.com)

       In February of 2005, Sun Microsystems, having problems remaining
competitive, announced the details of a plan to rent computers over the
internet. Instead of crunching numbers on their own in-house machines,
customers of the "Sun Grid" service can pay $1 for every hour that they use
a processor on one of Sun's computers, and $1 per month for every gigabyte
of storage. This is the sort of thing people have in mind when they talk about
grand, but often vague, visions of "utility computing" or "grid computing",
in which computer power is supplied when needed, like electricity, over a
network by a central provider. Unveiling the service Jonathan Schwartz,
Sun's number two, submitted a computing job to the Sun Grid. In a flash, the
answer was ready, at a cost of $12: in a few seconds, 12 processor-hours of
work had been carried out by hundreds of processors.
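
       The arithmetic behind that $12 answer is worth spelling out: under
the published rates, cost tracks processor-hours consumed, not wall-clock
time, so spreading a job across hundreds of CPUs makes it faster but no
cheaper. A toy calculation (the job shape is invented):

    # Sun Grid pricing as reported above: $1 per CPU-hour, $1 per GB-month.
    CPU_HOUR_RATE = 1.00   # dollars per processor-hour
    STORAGE_RATE = 1.00    # dollars per gigabyte-month

    def job_cost(processors: int, hours_each: float,
                 storage_gb: float = 0.0, months: float = 0.0) -> float:
        return (processors * hours_each * CPU_HOUR_RATE
                + storage_gb * months * STORAGE_RATE)

    # 12 processor-hours spread over 720 CPUs for one minute each:
    # finished in about a minute of wall-clock time, billed as $12 either way.
    print(job_cost(processors=720, hours_each=1 / 60))  # 12.0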

       It is all very clever. But offerings such as Sun Grid, while novel,
do not solve the ultimate problem: the efficient allocation of networked
computing resources. People do not think of their computing needs in terms
of, say, 50 processor-hours; instead, they have specific tasks of varying
importance and urgency, and want to get those tasks done economically,
using whatever resources are available. The issue therefore comes down to
economics as much as technology. As long as the number of computers and
users is small (as in a "cluster" rather than a genuine grid), resource
allocation can be done socially, or by an "omniscient" administrator who
simply decides who will be allowed to do what. But as soon as the grid
becomes big, any such arrangement will fail for the same reason that the
Soviet Union's economy broke down. Valuing and prioritising millions,
perhaps billions, of different transactions is too complex for any central
planner. Only a market mechanism of some sort can maintain order.

       Building just such a market mechanism is the nut that Bernardo
Huberman, a researcher at Hewlett-Packard (HP), and his team have been
trying to crack. The key, Dr Huberman realised, was to have a system that
can allow users to assign different priorities to tasks, to reflect their
importance. This rules out any system that would simply give each user a
priority without differentiating that user's many tasks. It also rules out a
reservation-style system of the sort that airlines use, since a lot of processor
cycles (like aeroplane seats) would end up unused, and the system would not
be able to accommodate new tasks as they arose, even if they were
extremely urgent. In a grid, it must be assumed, demand is changing
constantly and unpredictably, and so is supply (since individual host
computers on the grid come and go).

       Mr Huberman's answer is Tycoon, a piece of software for computing
grids that turns them into a sort of "stockmarket or clearing house", he says.
Users start by opening a bank account and getting credits. They then open a
screen that shows all the available processors, their current workloads, and a
price list. Users place bids for various processors, using a sliding price dial
that looks like a volume control. Allocation is proportional, so that if one
user bids $2 and the other $1, the first gets two-thirds of the resource and the
second one-third. If the deadline of one task suddenly moves forward, the
user can up his bid and immediately get more processor cycles for that task.
As users consume cycles, the software deducts credits from their account.
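
       The proportional-share rule described here is simple to state
precisely. The following is a minimal sketch of the idea, not Tycoon's
actual code (the real system adds accounting, security and much more), and
all names are invented for illustration:

    # Proportional-share allocation as described above: each user's
    # share of a processor is his bid divided by the sum of all bids.

    def allocate_shares(bids):
        """bids: user -> bid in credits; returns user -> CPU fraction."""
        total = sum(bids.values())
        return {u: (b / total if total else 0.0) for u, b in bids.items()}

    bids = {"alice": 2.0, "bob": 1.0}
    print(allocate_shares(bids))   # alice gets 2/3, bob gets 1/3

    # If bob's deadline suddenly moves forward, he raises his bid and
    # the reallocation is immediate, with no reservations to unwind.
    bids["bob"] = 4.0
    print(allocate_shares(bids))   # alice now gets 1/3, bob gets 2/3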

        The HP team has so far tried out Tycoon on a cluster of 22 Linux
servers distributed between HP's headquarters in Palo Alto, California, and
its offices in Bristol, England. Tycoon did well in these tests, and several
amusing animated films were rendered using its system. HP has now given
Tycoon to CERN, the world's largest particle physics laboratory and a
hotbed of grid-computing research, for more testing. This is only the
beginning, of course. Mr Huberman reckons that Tycoon, in its current form,
could run clusters of 500 host computers with perhaps 24 simultaneous
users. But the ultimate vision of grid computing is for one gigantic network
spanning the globe and accommodating unlimited numbers of users. So a lot
still needs to happen.

        For a start, the metaphor for computing grids as "utilities", similar to
water or electricity supplies, is misleading, since there is no equivalent of
litres or kilowatt-hours. Processor cycles are just one component of
computing resources, alongside memory, disk storage and bandwidth. Mr
Huberman would like to combine all of these factors into one handy unit,
which he wants to call a "computon" (a cross between "computation" and
"photon", the name for a packet of electromagnetic energy). Tycoon's
descendants would then help to allocate computons across the grid's global
market. Of course, Mr Huberman adds, that will happen in a different
decade. (The Economist Mar 10th 2005)
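
       The article leaves the "computon" undefined, and no standard
definition exists. Purely as an illustration of the idea, one naive reading
is a weighted bundle of the four resources named above; the weights below
are invented for the example:

    # Illustrative only: a "computon" as a weighted bundle of CPU time,
    # memory, storage and bandwidth. The weights are made up.

    WEIGHTS = {
        "cpu_hours":    1.0,    # processor time
        "memory_gb_h":  0.25,   # gigabyte-hours of RAM
        "disk_gb_h":    0.05,   # gigabyte-hours of disk
        "bandwidth_gb": 0.10,   # gigabytes transferred
    }

    def computons(usage):
        """usage: dict keyed as above; returns a single scalar unit."""
        return sum(WEIGHTS[k] * usage.get(k, 0.0) for k in WEIGHTS)

    print(computons({"cpu_hours": 12, "memory_gb_h": 24, "bandwidth_gb": 5}))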

                     DreamWorks and Hewlett-Packard

       In October of 2004 Hewlett-Packard Co., announced a three-year
multimillion dollar agreement with DreamWorks SKG to supply the
computer-animated film company with computer technology. No other
details of the financial arrangements were revealed at the time. But
DreamWorks designated H-P as its sole “preferred technology provider”.
The companies announced they among other things planned to work
together on “digital editing, digital cinema and digital asset management.”
The news followed the announcement, the day before, that DreamWorks
Animation SKG set its anticipated initial public offering at 29 million shares
which could raise up to $725 million.

       In April of 2005 Advanced Micro Devices (AMD), so far Intel's most
serious competitor, and DreamWorks announced a three-year strategic
alliance naming AMD the preferred processor provider for DreamWorks.
The objective was to “enable the company to experience significant
advantages in their computer-generated filmmaking process.” AMD
provided DreamWorks with a large number of Opteron processors with
“direct connect architecture” to comprise the company's next-generation
enterprise servers, workstations, render farm nodes, desktops and laptops.

      "The benefits of working with AMD have already had a substantial
impact on our business, which is why building a strong partnership with them makes
sense," said Jeffrey Katzenberg, co-founder of DreamWorks SKG and CEO,
DreamWorks Animation. "AMD technology will bring new levels of detail
and new capabilities to our animated feature films. Our artists and designers
are discovering that AMD is helping them break free of the limitations of
today's computing solutions," said Hector Ruiz, AMD chairman, president
and CEO. "They chose us because of our clear leadership in 64-bit
computing and dual-core technology. The Opteron processor provides
breakthrough performance that enables artists to create images never before
realized, and as a result, create some of the most technologically advanced
animated movies. DreamWork‟s “Madagascar" includes the largest crowd of
furry characters in a CG film to date, with more than 900 moving creatures
in a single scene. The faster rendering times and increased memory
bandwidth provided by the Opteron processors allowed DreamWorks to
meet this enormous challenge, and became an important creative component
of "Madagascar."

       In addition to expanding its render farms, DreamWorks is deploying
HP xw9300 workstations based on AMD's dual-core technology. The use of
these workstations builds on the technology partnership that HP and
DreamWorks informally created in 2001 to explore new creative frontiers
and redefine the art of filmmaking. Rendering is the process of
completing one scene by adding detailed texture, lighting, shading, etc. The
significant performance improvement delivered by the Opteron processor
can dramatically reduce render times, allowing artists to see the finished
product much more quickly, further explore their creativity and maintain
their creative rhythm and tempo. Faster rendering drives productivity and
reduces the creative hurdles to animation, ultimately delivering unique and
more visually stunning animation. "AMD64-based workstations and servers
give our artists ever-increasing levels of performance and enable our
creative talent to continue to push the limits of their imagination," said Ed
Leonard, chief technology officer, DreamWorks Animation SKG.

       "The AMD64 architecture will also enable the broadest range of PC
users to break free from creative limitations and to join the growing
community of performers and artists looking to use the same technologies
the pros use," said John Volkmann, corporate vice president, global
corporate marketing, AMD. "While your average consumer won't render an
entire movie on their PC, they do want to use the exact same 64-bit
technology used by the world's most creative minds."

       DreamWorks is expanding its use of AMD products as the partnership
matures. HP plays a critical role in helping DreamWorks achieve its
computer graphics-oriented goals through delivery of powerful and
manageable HP ProLiant servers. By collaborating with AMD, HP is trying
to ensure that production studios like DreamWorks make animators' wildest
visions a reality. "As the world's largest CG animation studio, DreamWorks
requires the industry's most powerful and robust technology," says James
Mouton, vice president, platforms, ProLiant servers, HP. "The winning
combination of AMD and HP helps animators add effects in real time and
render images faster than ever before."

                         IBM, Disney and Others

       Perhaps not surprisingly, also in June of 2005, The Walt Disney Co.
signed separate IT services contracts valued at more than $1.3 billion with
IBM and Affiliated Computer Services Inc.: a $730 million contract with
IBM and a $610 million deal with ACS, both set at seven years and allowing
Disney to outsource "certain back-of-house IT work" to the two vendors, said
Disney spokeswoman Michelle Bergman. She said that the affected Disney
IT employees will have the opportunity to be transferred to the services
vendors, though she declined to say how many workers will be affected. A
source put the number at 1,000, or about one-third of the Burbank, Calif.-
based company's IT staff. ACS confirmed that its contract will bring about
500 Disney workers into its fold. Bergman said that Disney expects the
move to "improve organizational flexibility and the effectiveness of existing
operations." The transition will take place over the next two months, said
Bergman.

        Under the agreement, IBM will support Disney's IT infrastructure,
which consists of mainframe systems, about 3,700 Unix and Intel-based
midrange servers, and 1.4 petabytes of data storage, according to IBM
spokesman John Buscemi. IBM will be responsible for the ongoing
development and support of key Disney software, including its SAP
implementation and approximately 90 legacy applications from Disney's
theme parks and its resort business. The applications will be supported on-
site at Disney facilities as well as at an IBM application center in Tulsa,
Okla., he said. "Under the contract, IBM will centralize some operations and
standardize processes, tools and methodologies across Disney's diverse
computing infrastructure," said Buscemi. He added that the new contract
builds on a long-term relationship that was expanded in 2001, when IBM's
business consulting services unit began work to consolidate Disney's
finance, human resources and payroll services onto a single SAP software
platform.

       Meanwhile, Dallas-based ACS will support Disney's technology
infrastructure and network architectures, as well as provide desktop help
assistance and some computer processing services, said ACS spokeswoman
Lesley Pool. As part of the deal, approximately 500 Disney IT workers "will
come aboard ACS and support Disney," said Pool. "This is a significant
opportunity to provide IT outsourcing services," Pool said. She pointed out
that the agreement will enable Disney to focus more closely on its core
businesses in the entertainment industry. "It lets us pick up the technology
offerings and bring them better technology, newer technology and take their
technology folks and begin to advance them in their technology career
paths," she said.

       Meanwhile, to the west in Honolulu, Pipelinefx, a leading provider of
enterprise server farm management software for 3d animation and
interactive entertainment companies, announced it achieved optimized status
in IBM's PartnerWorld industry networks program. Pipelinefx's flagship
product, qube! Remote Control, has also achieved IBM designation. Built
for the most demanding 3d animation and game production environments,
qube! Remote Control is a next-generation enterprise-class server farm
management system designed for environments of 20 to 10,000 processors
or more. qube! is highly customizable, extensively scalable and can be
integrated into any production workflow. qube! operates in Red Hat Linux,
Windows XP, 2000 and 2003, and Mac OS X environments. qube! comes
with a Maya job type that contains everything necessary to submit and
execute batch Maya jobs. qube! also supports Gelato, Shake, Softimage,
Discreet's 3ds Max, mental ray and many other 3d animation software
applications.
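
       The mechanics of such a batch render job are easy to picture. The
sketch below is not qube!'s API, which is not reproduced here; it simply
splits a frame range into chunks and shells out to Maya's command-line
renderer, whose -s and -e flags select the start and end frames:

    # Generic sketch of batch frame submission of the kind a render-farm
    # manager automates. A real manager would dispatch each chunk to a
    # different server and track failures; this one runs them in order.

    import subprocess

    def frame_chunks(first, last, size):
        """Yield (start, end) ranges of at most `size` frames."""
        for s in range(first, last + 1, size):
            yield s, min(s + size - 1, last)

    def render_scene(scene, first, last, size=10):
        for s, e in frame_chunks(first, last, size):
            subprocess.run(["Render", "-s", str(s), "-e", str(e), scene],
                           check=True)

    # render_scene("shot_042.mb", 1, 240)   # 240 frames in 24 chunks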

       In the case of Pipelinefx, the company's next generation software for
managing thousands of complex 3d animation rendering jobs has been
implemented on sixty dual-processor IBM blade servers known as eServer
BladeCenter. The solution implemented by the company allows film
companies specializing in 3d animated movies, special effects and computer
games to maximize throughput and efficiency and get the most value from
their hardware investment. "A large number of servers used for rendering 3d
animation jobs require a sophisticated interface for managing all the
production resources of these integrated systems. A server with a system-
level error or failure that is otherwise undetected by job queuing/batch
processing software can create a bottleneck preventing other jobs in the
queue from executing and retard overall system throughput. The company's
software integrates both levels of control in one sophisticated application,"
according to Troy Brooks, Chief Technology Officer at Pipelinefx.

        The implementation that helped qualify Pipelinefx was conducted for
Mainframe Entertainment, one of Canada's leading 3d animation film
companies known for a host of popular cartoons and animated projects.
"This project will ultimately become the 137th largest supercomputer
facility in the world," said Bill Spencer, CEO of Pipelinefx, "and
demonstrates the scalability and sophistication of our next generation
software solution for the digital media and interactive entertainment
markets." Pipelinefx's existing customer base includes industry leaders
doing 3d animation, special effects and state-of-the-art computer games,
such as Electronic Arts, the world's largest game company, Radical
Entertainment, Mainframe Entertainment and Reel FX Creative Studios.

       "Interactive entertainment and digital media are tremendous growth
markets that hold great potential," according to Jeff Au, Managing Director
of PacifiCap Group, Hawaii's largest venture capital firm and Pipelinefx's
main investor. "Partnering with IBM is key to the company's commitment to
providing the ultimate enterprise server farm management system for these
fast-growing markets." IBM provided independent software vendors (ISVs)
with comprehensive go-to-market programs specifically tailored to their
requirements. As part of the program, an ISV can team with IBM to
bring joint solutions to market faster, industry by industry, reflecting
how customers are buying technology today. ISVs who achieve the
optimized level have successfully enabled and validated their industry
applications on IBM infrastructure software and hardware.

       For instance in Montreal in 2004 GridIron Software, announced it
was working with IBM to bring GridIron XLR8 grid computing to interests
involved in digital-content-creation, business-intelligence and electronic
design automation and publishing in general. The XLR8 application is a tool
and runtime software that makes it simple to develop, use and manage
software with the added speed of parallel distributed computing using IBM
eServer, pSeries and xSeries servers running Linux. The software enables
computationally intensive applications to run faster on multiple computers
so IBM and its independent software vendors can offer enterprises a grid
computing solution that improves process workflow through enhanced
software application performance. And the GridIron-IBM system makes
existing grid deployments more valuable and broadens the range of solutions
available to companies considering grid implementation.
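
       GridIron's own tooling is not documented here, but the underlying
idea, splitting a computationally intensive job into independent pieces
that run concurrently, can be sketched generically. This uses Python's
standard multiprocessing on one machine as a stand-in for a grid and is
unrelated to the actual XLR8 product:

    # Generic illustration of parallel distribution: farm independent
    # work items out to a pool of workers.

    from multiprocessing import Pool

    def expensive(frame):
        # stand-in for a computationally intensive per-frame task
        return frame * frame + sum(i * i for i in range(200_000))

    if __name__ == "__main__":
        frames = list(range(1, 49))
        with Pool() as pool:               # one worker per local CPU
            results = pool.map(expensive, frames)
        print(len(results), "work items completed in parallel")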

             Super-computers, Animation and Special-effects

        Building one of the world's 10 fastest supercomputers takes a lot of
work, but the prestige is increasingly likely to be short-lived. In the latest list
of the 500 fastest supercomputers five of the top 10 systems are new to the
list. IBM dominates that top echelon: It built six of them, five using the Blue
Gene design that packs 1,024 processors into each six-foot tall cabinet. The
Top500 list is updated twice yearly and ranks computers by how many
trillions of calculations per second, or teraflops, they can sustain while
solving a dense system of linear equations in a benchmark called Linpack.
Supercomputers are used for
tasks such as automotive design and pharmaceutical research, but
developments in the market often can influence mainstream machines as
well. High-ranked systems increasingly are clusters of hundreds or
thousands of low-end machines rather than single powerful behemoths.
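
       The Linpack figure comes from timing a computation whose operation
count is known: factoring and solving a dense n-by-n linear system costs
roughly (2/3)n^3 + 2n^2 floating-point operations. The real Top500
benchmark (HPL) is a distributed-memory code; as a single-machine sketch
of the same measurement, using numpy's dense solver:

    # Toy Linpack-style measurement: time a dense solve of Ax = b and
    # divide the known operation count by the elapsed time.

    import time
    import numpy as np

    n = 2000
    rng = np.random.default_rng(0)
    A = rng.standard_normal((n, n))
    b = rng.standard_normal(n)

    t0 = time.perf_counter()
    x = np.linalg.solve(A, b)
    elapsed = time.perf_counter() - t0

    flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
    print(f"{flops / elapsed / 1e9:.2f} gigaflops sustained")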

       Turnover on the list is rampant. The slowest machine today is about as
fast as the collective performance of the first Top500 list in June 1993. The
November 1998 list had one system faster than 1 teraflops, but now all 500
exceed that mark. And the top performer, IBM's 65,536-processor Blue
Gene/L at Lawrence Livermore National Laboratory (used among other
things to model nuclear blasts) is faster than the collective performance of
the entire list of November 2001. IBM bumped the Blue Gene/L
performance from its March milestone of 135.5 trillion calculations per
second, or teraflops, to 136.8 trillion. And Big Blue expects to maintain its
top rank on the next version of the list in November as the Livermore Blue
Gene system is again doubled in size. "Our expectation is that it will
benchmark in the 270 to 280 teraflops range," said Dave Turek, vice
president of deep computing at IBM.

        A similar but smaller system, Watson Blue Gene arrived to take
second place with 91.3 teraflops. There are 16 Blue Gene machines total on
the list. Blue Gene systems cost about $2 million per rack, but IBM sells
partial racks as well, Turek said. The company is working on a separate Blue
Gene design code-named Cyclops that's geared for life sciences work, but
Turek said, "We have no plans to do anything with respect to
commercialization of Cyclops." IBM built more than half the systems on the
list--increasing from 216 systems on the last list to 259 on the current list.
"IBM remains the clear leader in the Top500 list and increased its lead,"
organizers said in a statement.

        Some of IBM's gains were at Hewlett-Packard's expense. HP dropped
from 173 systems to 131. But in the broader high-performance computing
market--not just the rarefied Top500 domain--HP leads Big Blue. According
to market researcher IDC, HP held 34 percent of the $1.9 billion market in
the first quarter of 2005, ahead of IBM's 28.2 percent, Sun Microsystems'
12.3 percent, Dell's 11.9 percent and SGI's 2.6 percent.

       Chipmaker Intel also reached a milestone on the list. For the first time,
more than half the systems--254 in total--use its Xeon processor. However,
use of the higher-end Itanium processor diminished from 84 in the last list to
79. Intel's top rival, Advanced Micro Devices (AMD), built the Opteron
processors used in the No. 10 system, a new machine called Red Storm built
by Cray for Sandia National Laboratories. It was clocked at 15.2 teraflops,
but a Sandia spokesman said the full system isn't expected to be running
until sometime before October. Cray and Sandia predicted performance of
100 teraflops a year ago. (June 22, 2005, CNET News.com)

        Director Peter Jackson's Lord of the Rings films end with an epic
battle on-screen. Behind the scenes, however, another struggle was under
way. As each movie in the trilogy went into production, visual effects studio
Weta Digital Ltd. scrambled to add the processing power needed to render
an increasing number of computationally intensive special effects shots. By
the end of the three-part project, the Wellington, New Zealand-based
company had built a massive, 3,200-processor 3d rendering server farm to
cope with the load. The installation is ranked on the Top500 supercomputer
list as one of the world's largest supercomputer sites. With some 2,400 of
those processors residing on blade servers (and the remainder on 1U, or
1.75-in.-high, servers), it's also one of the most compact.

       Weta and other visual effects studios are rapidly turning to large
clusters of blade servers, often running Linux, as they balance the need for
more processing power with the desire to minimize costs and maximize the
use of valuable floor space. Special effects are playing an increasingly large
role in movies because audiences want them, says Greg Butler, digital
computer graphics supervisor at Weta. "Film audiences expect visual effects
to keep blowing them away. The only way this is possible is through the
constant upgrading of our infrastructure," he says. With the Lord of the
Rings trilogy, the number of visual effects shots started at 540 in the first
film and roughly doubled for each of the next two movies.

       Industrial Light and Magic (ILM) in San Rafael, Calif., faces similar
pressures. "In the first Jurassic Park movie, we did 75 shots. Now, with a
Star Wars movie, every shot has some effect in it," for a total of 2,000 to
2,500 shots per film, says Chief Technology Officer Cliff Plumer. The
processing power required to render even a few shots is significant, says
George Johnsen, chief animation and technical officer at Threshold Digital
Research Lab in Santa Monica, Calif. "In the visual effects business, there's
no end to how many computers you can use," he says. A single shot can
range from a few seconds to several minutes. Each second of film includes
24 frames, each containing up to 4,096-by-3,112 pixels in 32- or 64-bit
color. Separate passes must be made for each object that requires rendering
in the frame and for attributes such as texture, lighting and reflections.
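
       Those figures imply daunting data volumes, as a quick calculation
shows (taking a 4,096-by-3,112-pixel frame at 64-bit colour, that is, 8
bytes per pixel):

    # Back-of-the-envelope data sizes for the film frames described above.

    width, height = 4096, 3112
    bytes_per_pixel = 8                       # 64-bit colour
    frame = width * height * bytes_per_pixel  # one frame, one pass
    second = frame * 24                       # 24 frames per second

    print(f"one frame : {frame / 2**20:.1f} MB")   # ~97 MB
    print(f"one second: {second / 2**30:.2f} GB")  # ~2.3 GB
    # With up to 150 passes per frame, the intermediate data for a
    # single shot can reach the terabyte range cited later on.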

       "In the [upcoming] movie Foodfight!, there is a scene with 13,000
extras, and they all have animation cycles," says Johnsen. And artists often
repeat the rendering process to improve quality. As many as 150 passes may
be required; a frame can be processed on only one CPU at a time and takes
48 to 72 hours to complete, he says. Threshold Digital already has 512
processors in its render farm and plans to double that using IBM eServer
BladeCenters equipped with dual-processor HS20 server blades in the next
three months. The move to blades has been swift. Like many other studios,
ILM was using stand-alone workstations from Silicon Graphics Inc. to
render images three years ago. Today, it has a 2,000-processor render farm,
affectionately named Death Star, and half of the processors in it reside on
blade servers from Boston-based Angstrom Microsystems Inc. The blades
are "taking over quickly," Plumer says. At night, all of ILM's desktop
computers are added to the render farm as well. "Our processes are working
24 hours a day, seven days a week," he says.

       As is the case in other industries, the studios are demanding more
from IT while budgeting less. "Budgets aren't what they were. [Blades]
allow us to be more efficient," says Johnsen. Server blades are also more
efficient to deploy and manage. "We can get a system in-house and online
within two days, where historically it would take us about a week to build a
rack of processors," says ILM's Plumer. While working on The Two
Towers, the second movie based on J.R.R. Tolkien's Lord of the Rings
novels, Weta suddenly found that it needed more horsepower. "We put in
500 processors in about three weeks, including building a new machine
room," says CTO Milton Ngan. Weta uses BladeCenters running dual-
processor HS20 server blades. The server racks, fully loaded, hold 84 blades,
or 168 processors -- a significant improvement over the density of Weta's 1U
servers.

       Management is also more automated. "With previous systems, we'd
have to physically go to each machine," Ngan says. The management
software for IBM's BladeCenter lets Weta use scripting to remotely
configure blades, update BIOSs and other firmware, and reboot or turn
individual blades on or off over the network. The increased processor power
and density of blade server farms has yielded significant benefits, but it has
also presented some unexpected challenges. "We packed [the blades] in
pretty tight, then ran into power and cooling issues," says Plumer. Individual
blades use less power -- for example, IBM says its HS20s are 57% more
efficient than its 1U servers -- but the blades are packed in much more
densely, pushing power within each fully populated rack as high as 15
kilowatts for the BladeCenter and even more for some other blade server
designs.
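
       The scripted management Ngan describes can be pictured with a sketch
like the following. The host names and the remote command are hypothetical
stand-ins; IBM's BladeCenter management software has its own interface,
which is not reproduced here:

    # Hypothetical sketch of scripted blade management: walk a list of
    # blades and issue a command to each over the network via ssh.

    import subprocess

    BLADES = [f"blade{n:02d}.farm.example.com" for n in range(1, 85)]

    def run_on_blade(host, command):
        """Run one command on one blade over ssh; return exit status."""
        return subprocess.run(["ssh", host, command]).returncode

    def reboot_all():
        for host in BLADES:
            if run_on_blade(host, "sudo reboot") != 0:
                print(f"failed to reach {host}")  # a real tool retries

    # reboot_all()   # left commented: this would reboot all 84 blades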

       At ILM, more reliable blades, more efficient rack designs and the
ability to spread out the blades to better balance the heat load in the room
seems to have solved the problem, Plumer says. Weta has dealt with hot
spots, but more improvements are needed, Ngan says. One small room
containing 1,000 processors has a concrete floor and a low ceiling that ruled
out having a raised floor. Weta sealed the racks to improve airflow, installed
three air conditioning units and piped air into the fronts of the racks to cool
the blades. The blades no longer overheat, but the air temperature at the top
of the rack is just under 85 degrees (75 degrees is the recommended
maximum). "We are building a new machine room that will be better
equipped to deal with blades," Ngan says.

      Threshold Digital has a rack that uses IBM's Calibrated Vectored
Cooling design to optimize airflow. Johnsen also installed an air
conditioning unit that injects air into the top of each rack and exhausts it out
the bottom. "We've had to do some very serious air conditioning to fill the
racks up. Instead of feeding the room, we're feeding the racks," he says.
"We're dumping five tons [60,000 BTUs] of AC into the racks." The power
requirements surprised Johnsen, but "because we have four times the density
of processors, that was a fair trade-off," he says. The concentration of power
created by migrating to blade server farms has also had ripple effects on the
rest of the studios' IT infrastructures, which were designed to accommodate
graphics workstations. "The faster processors have put a strain on our
storage systems, which put a strain on the backup systems, which put a strain
on our network," says Plumer. "On an average day, we push 75TB of data
across our network." ILM's new data center, due to open next year, will
include a 10 Gigabit backbone. Weta has already moved to 10 Gigabit
Ethernet.
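
       The relationship between rack power and cooling load in these
figures is straightforward: virtually every watt a rack draws becomes
heat, one watt equals about 3.412 BTU per hour, and one "ton" of air
conditioning removes 12,000 BTU per hour. Checking the numbers quoted
above:

    # Converting rack power draw to cooling load.

    BTU_PER_WATT = 3.412     # BTU per hour, per watt of load
    BTU_PER_TON = 12_000     # one ton of AC removes this much per hour

    def cooling_tons(kilowatts):
        return kilowatts * 1000 * BTU_PER_WATT / BTU_PER_TON

    print(f"{cooling_tons(15):.1f} tons")  # a 15 kW rack: ~4.3 tons
    # So Johnsen's five tons (60,000 BTU/h) per rack covers a fully
    # loaded 15 kW BladeCenter with a margin of headroom.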

       As for storage, Weta has more than 60TB of network-attached storage
(NAS) on 1,100 disk drives under the control of 17 filers. But processing
many similar frames in parallel created bottlenecks. "When you have a
couple of hundred processors wanting the same data, a single file server
can't handle that," says Ngan. Weta spreads the files across multiple filers
and developed a "virtualized global file system" to improve performance.
Threshold moved to a Fibre Channel storage network and IBM's general
parallel file system, a high-performance cluster file system that supports
concurrent file access. ILM is using a combination of NAS and storage-area
network devices as well as near-line storage to deal with the large volumes
of data that move on and off the network with each project. A single shot
can require a terabyte of storage, Plumer says, up from a few hundred
megabytes a few years ago, while the work on a single film may generate in
excess of a petabyte of data.
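
       One deterministic way to spread files across filers, far simpler
than, though in the spirit of, the "virtualized global file system" Ngan
describes, is to hash each path to pick its home filer. The sketch below
is illustrative only:

    # Spread files across multiple NAS filers so that hundreds of render
    # nodes don't all hammer a single server.

    import hashlib

    FILERS = [f"filer{n:02d}" for n in range(1, 18)]  # 17 filers, as cited

    def home_filer(path):
        """Map a file path deterministically onto one filer."""
        digest = hashlib.md5(path.encode()).digest()
        return FILERS[int.from_bytes(digest[:4], "big") % len(FILERS)]

    print(home_filer("/shots/kong/seq010/frame0001.exr"))
    # Every node computes the same mapping, so requests for one frame go
    # to one filer while different frames spread across all seventeen.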

       All three studios are looking for ways to get even more out of render
farms. ILM plans to double the size of its data center to more than 12,000
square feet next year. As Weta began production on a remake of King Kong,
Ngan says he contemplated phasing in blades that use Advanced Micro
Devices Inc.'s 64-bit Opteron chips or Intel Corp.'s extended memory 64
technology. He expects the larger memory space afforded by the 64-bit
CPUs to speed processing times. Threshold's Johnsen says the ultimate goal
is to break the cycle of using one processor to render each frame. "This is
why we are desperately developing multithread and relational grid strategies
for our rendering," he says. Johnsen sees blades as a key part of the
company's 10-year plan. "The natural progression is to some form of 3d grid
computing ... and blades are the next logical step," he says. (Robert L.
Mitchell Computer World October 2004)

       In June of 2005 two of Canada's most powerful distributed computing
environments, WestGrid and SHARCNET, became interconnected over a
dedicated high-speed optical link, the first step toward a pan-Canadian
network of high performance computing facilities. The new connection was
announced at the annual Ontario Research and Education Summit in
Toronto. Distributed high performance computing is a powerful tool for
research and scientific discovery, supporting collaborative research in such
areas as human genomics, astrophysics, high energy physics, environmental
protection, financial modeling, containment of infectious human and animal
diseases and the development of nanotechnologies. But interestingly, given
our market position in computer-graphics and production-network
technologies generally, no mention was made of conducting research in
relation to either media production or distribution.

       Computer networks are like highways. When all the signals are
operating, the lanes wide open and the pavement smooth, traffic zooms
through at top speed. Throw in a malfunctioning signal or a little
construction and the whole business slows to a crawl. When it comes to
network traffic, Vancouver's Apparent Networks makes the software that
acts as traffic reporter, traffic cop and tow truck all rolled into one. It could
be a computer network in a storefront insurance office or a corporate
network that circles the globe.

        When things start to slow down, Apparent's software can pinpoint the
problem and offer up the solution, presenting a virtual X-ray of the system
that lets the network engineer troubleshoot without even leaving his or her
desk. "We're the guys who can tell you don't cross the Lion's Gate bridge
because there's a lane out and we don't have to have a person on every street
corner to let us know that," said president and chief executive officer Irfhan
Rajani. He started Apparent Networks along with Fred Klassen, the brains
behind the company's network intelligence software, and Kelly Daniels,
another serial entrepreneur and a co-founder of Mainframe Entertainment.

       That talent for troubleshooting makes the company's products a
valuable commodity in a corporate world where network slowdowns and
inefficiencies can translate into huge costs. It has propelled the company into
prominence, most recently earning it the BC Technology Industry
Association's annual award for excellence in product innovation. That marks
the second time Apparent Networks has topped the BCTIA awards list. In
2003, the company won most promising start up.

       For Rajani it's a hat trick, with Apparent Networks being the third
company he has nurtured from start up to success. "They've all been
Canadian and it's so far, so good," said Rajani, chatting in the Gastown
offices that he managed to score for a great price simply because Apparent
launched in the midst of the tech meltdown when dot-com offices complete
with foosball tables and refrigerators full of pop and junk food were going
begging. "We seem to be doing reasonably well." The modesty belies the
client list that includes such heavyweights as Veritas Software, the B.C.
government, Telus, which is also an investor, and others.

       For the 42-year-old Rajani, it's a far cry from the days when he first
graduated with a bachelor of commerce degree from the University of
Calgary in 1986. The oil patch wasn't booming at the time and the job
market there wasn't conducive to paying off a hefty student loan. Rajani
started his apprenticeship as an entrepreneur working in sales with a Toronto
tech company, but probably about the time in his career when others would
be burning out from the 24/7 schedule, Rajani took off for a year-long travel
sojourn. The global education couldn't have hurt. He came back and co-
founded Zentra Computer Technologies, supplying storage solutions and
services to companies like PMC-Sierra and Macdonald Detwiller before
selling it and moving on to his second start up, the software company
Telebackup Systems. It sold in 1999 for $143 million, allowing Rajani and
his wife Brenda to take a year off to travel before returning to Vancouver
where they launched both a new company and a family. "When we returned
it was a case of, 'let's do this a little differently,' " said Rajani.

        "I had financing from the last company, we put together a
management team and we spent a great deal of time up front identifying
where the market need was." At the time, it was bucking a trend. Everyone
else -- well not everyone but a substantial number of dot-coms and techno
wonder wannabe companies -- were closing their doors. The entire industry
was, as Rajani points out, "in its nuclear winter." "So while they were going
splat, it seemed a bad time to start, but in retrospect, the timing was good,"
he said. Not only was space like the Gastown premises plentiful, but so were
tech employees. "We were able to get good engineers," said Rajani. "We
weren't competing with fly-by-night dot-coms."

       At the same time, the devastated tech economy demanded real
products and a real prospect of profits instead of mere promises. "It was a
time that required discipline," said Rajani. "Our technology was not overly
sexy but it was needed." The things that didn't disappear in the dot-com
demise were the networks, and Apparent took on the job of making them
work more efficiently, more effectively and without problems that could
slow and stop applications. If you've ever sat at your computer and grumbled
at the time it takes for it to work, or grind through a process, chances are
you'd appreciate Apparent's dedication to solving those problems. "This is
something that resonates with everyone, be it CEOs of Fortune 500
companies or consumers," said Rajani. "We're all dependent on these
networks that have proliferated over the past five years." The 40-person
privately held company had a 300-per-cent top-line growth in the last year
and Rajani says the challenge now is to keep it up. "It's about how do we
manage the growth and keep the momentum going," he said. "We really are
playing in the land of the giants and we have to make sure we don't get crushed
inadvertently."

            Arrested Development and Runaway Production

       Scott Dyer speaks about his adopted homeland with more animation
than the average Canadian. In fact, it was animation itself -- the art of
bringing still images to life -- that drew him here. "Animation is one of the
best parts of Canada. It's why I'm here, it's why I became a Canadian citizen
and it's why I intend to stay here," the 46-year-old said. A mathematician by
training, Mr. Dyer became a professor at Ohio State University and went on
to co-found an animation studio in Minneapolis. In 1997, he moved to
Toronto to join Nelvana, a studio that specializes in animated television
shows for children, where he is now the executive vice-president for
production and development.

       "Animation has always been important in Canada, and it's been
important for more than just consumption -- it's been an important cultural
art form. It has been appreciated… by the government as something that is
meaningful. You don't find that in many other countries," Mr. Dyer said. "I
think we sometimes forget how good we are at animation and how we truly
do dominate the world. Our products are everywhere." They certainly are.
Animated works by Canadians are celebrated at international festivals and
awards shows. Television series produced here air around the world.
Software packages designed by Canadian companies have become industry
standards. Studios that develop video games, a form of interactive
animation, are flourishing. And students trained at Canadian schools are
sought after by Hollywood studios and, increasingly, by firms closer to
home. The audience for animation and video games is growing, becoming
more mature and diverse; the avenues for distribution, from the Internet to
mobile devices, are expanding. And Canada is the focal point for the
development of the software that makes it all happen.

       It all began in the early days of the National Film Board, founded in
1939 with a mandate to make and distribute Canadian films. Norman
McLaren, the Scottish filmmaker who founded the NFB's animation
program in 1943, is still revered in the animation world. "We consider
McLaren to be the grandfather of our animation heritage," said David
Verrall, executive-producer of animation at the NFB. The government-
funded organization has fostered a long line of innovative artists, from the
work of Mr. McLaren to that of Caroline Leaf, René Jodoin and Chris
Landreth, who won an Oscar this year for an animated short film, Ryan.
"The animation program at the NFB has had the privilege, with the support
of the public purse, to explore animation filmmaking as an art form, to try
things out technically, to try things out artistically and to engage
practitioners from different backgrounds to explore what you could do with
animation," Mr. Verrall said.

       Nelvana's Mr. Dyer connects that exploration and the international
attention it drew, along with federal and provincial tax incentives supporting
animation production, to Canada's pre-eminence as an animation nation.
"That embrace of animation by the government really helped to create the
environment that drove all the software, that drove the schools, that drove
the creation of the studios," Mr. Dyer said. Ryan, an NFB co-production, is a
3d computer-generated, or CG, film. Mr. Landreth and the film's producers,
including Toronto's Copper Heart Entertainment, have taken home 43
international awards for it, along with that Academy Award. To create Ryan,
Mr. Landreth and his team used Maya, 3d software developed by Alias, a
Toronto-based company. Mr. Landreth knows Maya intimately. He used to
work for Alias, testing and pushing its software to make animation that looks
ever more real -- or surreal, judging by the hallucinatory imagery in Ryan.

       Canadian software has played "a dominant role" in the industry, he
says. If there is one area of animation in which Canada can be said to
quantitatively lead the pack, industry insiders say it is the software that
allows artists to create 3d animation and produce the special effects found in
most of today's feature films. Alias sports an Academy Award of its own in
the lobby of its downtown Toronto headquarters, won in 2003 for technical
achievement. Michel Besner, Alias's vice-president of development for
emerging markets, says "95 to 99 per cent" of the creators of animated
works -- films, TV shows, commercials and video games -- use software that
originates in Canada.

       "If it's not Alias with our 3d tools, it will be Autodesk's Discreet with
their video-editing solutions or Avid's division in Montreal, Softimage, for
video and 3d," Mr. Besner said. Rick Mischel is the chief executive officer
of Mainframe, a Vancouver studio that produces DVDs and its own line of
animated television shows. He likens software programs to such tools as a
painter's brush or sculptor's chisel. The programs allow an animator to create
an explosion or change the skin tone of a character with the touch of a
button. "It's the talent that brings the tool to life, but without the tools you're
nowhere," he said. "All companies use one of these Canadian software
suites."

        When Mr. Landreth was creating Ryan, he was the artist-in-residence
at Toronto's Seneca College. Twelve Seneca students worked on the film; all
12 are now employed in the industry, including several at Toronto's
C.O.R.E. Digital Pictures, currently working on a Disney feature. Mr.
Landreth naturally has high praise for the program and facilities at Seneca,
but, like many in the animation industry, he also singles out another Ontario
institution, Sheridan College in Oakville. "Sheridan has probably the best
reputation over the last 30 or 40 years of producing animation talent," he
said. "Only CalArts [California Institute of the Arts] has that kind of
reputation, but CalArts is a very studio-oriented school, a very Disney-
oriented school in particular, where Sheridan has more of an independent
vision."

        Sheridan receives about 800 applications a year for its 120 animation
spots. Angela Stukator, an associate dean at Sheridan who runs its animation
and new-media program, says the school teaches students to be prepared for
change in the rapidly changing industry. "We train our students to know that
their films are nothing if they are not on the screen, whether television or
film or games," she said. "We prepare the students to sell their artwork, to
sell themselves, to sell their ideas, to know how to pitch, to know how to be
entrepreneurs."

       As for the future of animation in this country, those in the industry
expect Canada will one day make full-length animated films of its own.
Mainframe has built a high-definition 3d studio in British Columbia, and
studios such as C.O.R.E. are earning experience working on projects for
international companies. The NFB's Mr. Verrall said the Canadian success
story will continue because artists and business people have seen the
potential for self-expression -- and profit -- in the art form. "The notion that
Canadians are good at animation is what continues to encourage animation,"
he said. "We just continue to celebrate the truth of our excellence in
animation." (Scott Colbourne Globe and Mail July 6, 2005)

        Before the bore-fest that was the (2004) Academy Awards slips
completely from our collective memories, it's worthwhile pondering just
why they were such a snoozer. It was uninteresting mainly because one
film, the third installment of The Lord of the Rings, bagged 11 Oscars,
including best picture. They were particularly boring for anyone employed
in the Canadian film industry. The reason: Tiny New Zealand, at the far end
of the planet, has emerged as the new hot spot for Hollywood productions.
All three Rings were filmed in New Zealand. Their production costs were an
estimated $400-million (US), the vast majority of which was spent in the
country. Thousands of production people, from costume designers to gaffers,
had endless work. Director Peter Jackson is using the films' success -- the
first two Rings hauled in almost $2-billion at the box office -- to create a
mini-Hollywood in New Zealand. The country took the films so seriously
that Pete Hodgson, the minister responsible for research, science and
technology, was given cabinet responsibility to oversee Rings-related
projects. We're not making this up.

       Mr. Jackson has also built one of the world's most advanced sound-
editing studios. It, along with all the other sound stages and film
laboratories, will make the country one of the most competitive locations for
foreign film production. The Last Samurai, with Tom Cruise, was filmed in
New Zealand because Rings convinced Warner Bros. that the country could
handle complex, big-budget productions. Mr. Jackson's next project, a
remake of King Kong, was also shot there. Harder to quantify are the spinoff
benefits, notably tourism and New Zealand brand building. They probably
will be huge.

        It wasn't supposed to be this way. For a long time, Canada was
probably the leading destination for so-called foreign location films or, to
the Americans, "runaway" productions. Canada was nearby and spoke the same
language. It had a cheap dollar. What's equally important, it had a film
infrastructure: studios (although of varying quality), crews, post-production
facilities, creative types, financing incentives. It meant you could fly the
directors and actors to Canada and leave almost everyone else behind. The
thought of anyone from Hollywood spending 15 hours on a plane to make a
film on a South Pacific island was ridiculous. "They were so wrong," says
Ken Dhaliwal, an entertainment lawyer at Heenan Blaikie in Toronto. He
says a sense of complacency had set in here in Canada.

        What can Canada learn from New Zealand's surprise success? Film
production in Canada is far from dead. But it's not booming. A (2004) report
prepared by the Canadian Film and Television Production Association
(CFTPA) and Heritage Canada says "Growth in the sector is unlikely"
because of the rising dollar and increased competition from, well, just about
everywhere. Australia and New Zealand are punching above their weight, as
is Ireland. Eastern Europe is coming on strong. Even Italy's long-forgotten
Cinecitta Studios, home of the Spaghetti Western, have come back to life. In
the 2002-03 year, foreign location production in Canada totalled $1.9-billion
(Canadian), up 8 per cent from the previous year, when the terrorist attacks
clobbered the industry. Previous increases had been far greater. There are a
few bright spots. Montreal is replacing Toronto as the city of choice for
foreign film production (unlike Toronto, Montreal actually has a few streets
refreshingly uncluttered with bland condo towers). Production in British
Columbia is still the strongest among the provinces, but fell for the first time
since the late 1990s.

       There is no easy way to get Canada back on top. Boosting the
financial incentives seems the quickest solution, but the political fallout can
be deadly. It's like building an opera house, oft described as a net transfer of
wealth from the poor -- the taxpayer -- to the rich guys who can afford opera
tickets. And you don't win political points by shoveling taxpayers' dollars at
Hollywood, where the cost of lunch at the Chateau Marmont can buy you a
used Buick. Hollywood studios have the equivalent of war rooms, where
accountants take a production budget and determine how much they can
save by spending it in various countries. Dozens of variables are fed into the
spreadsheets, from exchange rates and the cost of air travel to tax credits and
tax shelters. Out pops a number and suddenly hotels and crews are getting
booked in Wellington or Dublin or Bucharest. It's an emotionless analysis
and a couple of percentage points either way can put a country in the
running or drop it to the bottom of the list.
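
       A toy version of that spreadsheet shows how a couple of percentage
points can reorder the list. All numbers below are invented for
illustration, and real models juggle dozens more variables:

    # Toy location cost model: effective cost = budget, less the tax
    # credit clawed back on the share of spending that happens locally.

    BUDGET = 100.0   # $ millions

    # country: (share of budget spent locally, credit on local spend)
    LOCATIONS = {"Canada": (0.80, 0.16),
                 "New Zealand": (0.80, 0.125),
                 "Romania": (0.70, 0.20)}

    def effective_cost(local_share, credit):
        return BUDGET * (1 - local_share * credit)

    for name, (share, credit) in sorted(LOCATIONS.items(),
                                        key=lambda kv: effective_cost(*kv[1])):
        print(f"{name:12s} ${effective_cost(share, credit):5.1f}M")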

       Canada certainly lost some allure in 2001, when the feds scrapped a
production services tax shelter. The shelter covered only 3 or 4 per cent of
eligible expenditures, but the loss of even that small amount was enough to
tip the scales against Canada. It also delivered the message to Hollywood
that Canada wouldn't work as hard to attract foreign film production. A year
and a half later, the federal government boosted the value of the production
services tax credit, but by then the damage was done. If there's a lesson here,
it's beware of tampering with incentives that work, especially when other
countries are boosting theirs. Throwing money at Hollywood will never be
popular. But job creation is. The CFTPA report estimated that 51,300 people
are directly employed in film and TV production, with many thousands of
additional indirect jobs.

       Here's another idea: Specialize in a certain production area. It would
be wrong to assume New Zealand and the other countries that compete for
Hollywood's attention will fall out of favour soon, restoring Canada's once
glorious market share. So the goal might be to keep what Canada has on the
film front and build something new on top of it. Canada is a world leader,
for example, in children's programming and animation. It's also good at
making documentaries. The first two are borderless, international markets;
Nelvana's Franklin series travels everywhere. Developing a proper industrial
strategy that builds on Canada's success in children's programming and
animation might be the way to go. The British have done something similar
in TV detective and comedy programs. And, in the meantime, pray the dollar
doesn't go to 85 cents (U.S.). At that value, no amount of incentives will
keep Canada on Hollywood's A-list. (Globe & Mail March 09, 2004)

        So the network-based production strategy, created by Canadian
manufacturers of computer software in the 1980s and then generalized into
the productive sides of the film and broadcast sectors on a worldwide
basis by Canadian service providers in Vancouver during the late 1990s,
has as perhaps its most interesting economic characteristic, at least from
a Canadian perspective, a distinct tendency to shift the center of gravity
within these strategic consciousness-generating industries from the
traditional centers of LA and New York to what Harold Innis identified
during the late 1940s as the more “marginal” elements of the system.
South of the border, this important trend is mainly manifest in the
perceived and growing economic importance to Hollywood, as its traditional
marketplaces contract in the face of the Internet, of both “outsourcing”
and “runaway production”: really just two aspects of the same thing,
affecting different elements of the media, information and
consumer-electronics sectors at the same time.

        It's also important to realize that most of these engineering and
related economic trends are already ten to fifteen years old, and
accelerating as the massive labour pools for outsourcing and marketplaces
for media-related goods, services and technologies, like India and China,
come on line by way of the Internet. Meanwhile the still-analogue players
in LA, New York and Toronto one after another fall under the sway of, and
become dependent on, the telecommunications interests, technology
providers and manufacturers of consumer electronics, like Microsoft or
Intel, that now tend to dominate the sector from both political and
economic points of view, and that are all interested, not surprisingly, in
merger, buyout and vertical integration. These firms are also the dominant
elements of much larger and more strategic industrial sectors of the
political economy, centered on the military-industrial complex in the US.
And such companies clearly have much better connections, in fact the best,
in George Bush's war-oriented Washington at this point than, for instance,
the generally Democratic and wrongly-perceived-as-liberal Hollywood could
ever hope to have, all as Jack Valenti's imposing figure fades towards the
darkness of obscurity.