Lissa Meridan1 and Michael Donn2

        1.   New Zealand School of Music, Victoria University
        2.   School of Architecture, Victoria University


The authors have collaborated in running tertiary-level classes in time-based representation
of architecture and sonic arts where, for one piece, students collaborate in the production of a
single “animation + soundtrack”. The paper reflects on the interactions between students that
have taken the better work beyond the merely representational to the interpretive.

The representation assignment requires the visualization exercise to explore and explain a
significant piece of architecture through 4D time-based representation in ways that the mere 3D
or 2D still image cannot. Recognizing that this 4D representation is still projected on 2D media is
a key part of this work. The depth of understanding that can be added by music to the difficult
task of placing the architecture – the scene in most movies – at the centre of the plot, as the
actor, is what is explored and illustrated.

In standard texts on film technique such as Joseph Mascelli’s classic book “The Five C’s of
Cinematography” [1] (camera angles, continuity, cutting, close-ups, and composition), the focus is
the actor. Thus, the language used to describe the close-up describes what proportions of the
body/face are visible. Standard texts on architectural ‘animation’ focus not at this level of
technique but on the craft of image production. When the ‘star’ of the ‘movie’ is to be the
architecture, there is a necessity to rethink the film paradigm. This collaboration has begun to
develop the craft of time-based architectural representation to incorporate the traditional craft
roles of the film editor/director. A crucial role has been identified for the soundtrack to draw the
viewer into the architectural space via the musical imagination, to explain, illustrate and celebrate
the architectural space through a combination of sampled and/or abstracted sonic cues.
The collaboration of these two digital media converges to create a unique audiovisual experience
of the site.

1. Introduction

Since 2000, the authors, from the New Zealand School of Music and the School of Architecture
at Victoria University, have collaborated on an ‘animation and composition’ assignment bringing
together students from the Sonic Arts and Digital Craft programmes. This project draws on each
academic discipline to define a narrative using moving image and sound to produce a
scenography where architecture and landscape play the central role rather than simply providing
the ‘set’. These ‘film-makers’ have to come to terms with the meaning of conventional cinematic
concepts in this new narrative context.

Goals of architecture / building science assignment
All university courses linking architecture and music must start by laying to rest the ghost of
Goethe’s [2] oft-quoted aphorism: "Architecture is frozen music" [3]. This is normally interpreted
as design inspiration where the art of architecture is compared to the science of harmonies in
music. Proportion and harmony; golden section divisions of musical scores and building
proportions; rhythmic patterns of columns; all are common analogies drawn between building
structure and musical ‘structure’.

This collaboration between animator and composer asks the students to look well beyond this
analogy to the poetic interpretation of scenography in music and film. The animators select a
building design that they will ‘render and animate’. Complex building models constructed by
colleagues from previous years are made available. These buildings are mostly of conventional
‘architectural merit’, so background data about them is readily available from the library, with very
little searching. The animation narrative is an interpretation in the moving image of this building.

Goals of music assignment
It is a particular challenge for the student composer to create a soundtrack where the guiding
narrative is the architecture itself. Here, music is used to illustrate a representation of a physical
structure and the space it inhabits. So on the one hand, the Sonic Arts student is required to write
“functional” music, to serve the architecture itself. On another level, this fusion of animation and
music creates a new cinematic experience of the architecture, one that aims to facilitate "an"
experience of the building, to mediate the “Grand Tour” by creating an imposed sense of the space.

Composers must first consider how the addition of music can help to create or recreate an
impression of the building. Their music must support the chosen narrative of the director,
facilitating the “story-telling”. There are also acoustic and spatial considerations that combine
with the programmatic aspects of the soundtrack to aid a sense of directed movement through 4-
D space, and create a sense of shape and drama within a micro-form time-based work.

Goals of collaboration
The initial motivation for the collaboration was that it seemed like fun – for both the staff and the
students. It would take both the Sonic Arts and the Digital Craft students out of their comfort
zones, confronting their ability to make what they do real to another discipline.

The narrative is to be at a higher level than merely representing to the viewer the path of a
camera through every door and down every corridor in the building. The soundtrack is not merely
an ‘accompaniment’, it is an interpretation of the building. What the animators first imagine is that
music is the catchy beat they can record from their iPod. This exercise reveals the analytical and
creative potential of the soundtrack.

Similarly for the composers, the act of collaboration has become more than merely ‘adding a
soundtrack’. It is a practical study in collaboration with all the trials and dramas of conflicting
visions and egos.

2.      Project Description

Phase I of the animation paper follows a conventional movie story boarding process. Very early in
the course students develop an existing CAD model into an animation model. This animation
model is more than a three dimensional representation of the building. The animation model is a
reflection of the content of the story board. The articulation of parts that are to move separately,
that are to be of different appearance in different scenes, and that are to be animated with moving
textures is necessarily part of the animation model construction process.

At the end of Phase I all the animators present an animated PowerPoint presentation of their
movie story pitch to their animation and music colleagues. In addition to a series of high-quality
test renders, the story pitch includes: 1) a series of high-quality test images; and 2) a sequence of
at least twenty images representing the storyboard/story line for their planned animation. Each of
these story images is to be accompanied by a quotation or piece of text that explains the
"analysis" - the narrative of the film.

The ‘storyboard’ shows what key scenes will look like; describes each scene transition
(‘fade-to-black’, ‘cut’, etc.); and identifies places where sound might provide or reinforce
continuity. The result is a sense of a coherent whole: whether the film is to have a conventional
‘beginning – middle – end’ narrative, or a more figurative Eisenstein-inspired [4] style of narrative.

At the same time as this presentation, the animators must prepare an animation ‘production plan’.
On this plan they must document all the technical aspects of production: where each element of
the animation (image file, 3D object etc) is to be drawn from – its origin in their disk storage
systems; the render time of typical frames from each scene; and a clear quantification of the
render time budget for the whole film.
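The render budget in such a production plan is simple arithmetic, but writing it down forces the trade-off between scene length and per-frame quality into the open. A minimal sketch, where the scene names, frame counts and per-frame render times are invented for illustration:

```python
# Hypothetical render-budget sketch: scene names, frame counts and
# measured per-frame render times (seconds) are illustrative only.
scenes = {
    "exterior_dawn":  {"frames": 300, "sec_per_frame": 45},
    "atrium_flythru": {"frames": 500, "sec_per_frame": 90},
    "facade_peel":    {"frames": 200, "sec_per_frame": 120},
}

# Total frames and total machine time for the whole film.
total_frames = sum(s["frames"] for s in scenes.values())
total_sec = sum(s["frames"] * s["sec_per_frame"] for s in scenes.values())

print(f"total frames: {total_frames}")
print(f"render time : {total_sec / 3600:.1f} machine-hours")
```

A plan like this makes it obvious when a single high-quality scene would consume the entire render budget.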

The final step in the animation production process is editing the animation and the soundtrack
together. Each student has a deadline for the production of their soundtrack or animation. Tweak
week is the final step in the collaboration, where the Sonic Arts and Digital Craft students bring
together their independent files to produce the final cut, mix and master.

3.      Development of the Project

In the first years of the assignment, the ‘animators’ produced and presented A1 posters of their
buildings. Using design studio critique methods, these presentations have become a valuable
learning exercise. It became evident that the Sonic Arts students needed more at this initial point
than simply an interesting suite of images. The challenge and refutation style of a design critique
enabled the music students to better understand the project and their potential collaborators.

Over time, the collaboration and the presentation workshop have evolved into a means of building
a higher quality of self-awareness amongst the animators of what it is they are producing. In order
to clearly communicate the intention of the work to composers the animators find they have to
create a convincing narrative. It is not possible just to play with the rendering and editing software
and produce a conventional architectural ‘walkthrough’; it is clear that such an activity will produce
a failing grade. The architecture as actor must be a more integral part of the scenes than mere
backdrop / scenery.

Tweak week was instituted to allow a sensible time in a 6 week collaboration for the crucial
editing of the film. Since it was instituted, the final ‘tweak week’ has been assumed to be
singularly important to the learning of both composers and animators, directly affecting the
quality of the final work. In 2005, the group of students in each course worked more
independently than in the past. In only a few cases was real collaboration in evidence. Weaker
students see tweak week as time to add titles and credits. Better students gain a small taste of
the central role the film editor plays in the look and feel of a movie. Contrasting the 2002-4
movies (with very similar technology available) with the 2005 movies reveals a qualitative
difference in outcome where the final ‘edit’ became merely an assembly of ‘soundtrack’ alongside
‘animation’.

4.      The visualization process

Ultimately, the question that lies at the heart of the assignment for the animators is how to ‘give
life to’ the inanimate. The course is not about modeling a cartoon house with two eyes as
windows and a door that fits between them where a mouth would be and ‘speaks’ words.
Buildings do not move. Cameras do. But the swirling camera of the architectural walk-through is
what students must avoid. Moving objects in the scene, such as fog rising, cars driving past or
television pictures moving, are trivial.

The challenge is to provide movement in ways that are naturalistic or which create a reality that
makes the movement of the building or its components follow the created reality. Facades peeling
back; components sliding into place revealing construction process; the sun setting or rising; all
these can be made to appear to fit a film logic – to appear natural within the context of the film.

With the availability of good rendering software, it has been easy for over ten years for people in
universities and consultants’ practices to make animations / movies focusing on the built
environment. Unfortunately the bulk of these ‘walkthroughs’ end up as banal hyper-realistic fly-
past efforts with one single camera tracking through every room in a building or wafting aimlessly
round the outside. The viewer who is not rendered ill through motion sickness is still left with a
vague and often false impression of the building, the design concept and the proposed spatial
experience.

The negative issues associated with basic walkthrough animations are:
a) lack of a clear understanding of the representation vs interpretation debate in film;
b) emphasis on lighting quality and communicating materiality to the detriment of an awareness of
spatial richness;
c) lack of focus on audience reading of the narrative;
d) lack of visual connection between design concept and spatial representation;
e) inability to compare, contrast, vary rhythm of story telling with single camera path fly-throughs;
f) a background ‘soundtrack’ of the animators’ favourite CD track of the month whose contribution
to the understanding of the building is at best a bland anonymity and at worst a dominant,
copyright breaking distraction.

The Architecture as Actor course and the Sonic Arts course examine communication techniques
learned from film. Students are encouraged to look again at favourite films:
a) to develop a hypothesis of how an animation might differ from conventional film when the
‘actor’ is the scene not a person or a cartoon character;
b) to examine the work of an editor or scenographer and to understand how they tell stories within
movies – focusing on the manner in which editing enhances understanding of the scene;
c) to compare movies and their editing to gain an understanding of the editing conventions and
the manner in which they affect the pace, the rhythm, the structure of a narrative;
d) to analyse the narrative communication techniques increasingly described on DVD by directors
like Robert Rodriguez[5] and Peter Jackson[6] in the context of their films.

The following checklist identifies the elements of the story board that the students work through to
develop their movie. As there is very little precedent for placing the architecture as the actor in
film, students are requested to systematically consider the following questions in the development
of their storyboard. In the context of their particular narrative about the building:

Camera angles – for architecture as actor not scene
More difficult is the answer to the use of the close-up: is it a window detail? a dynamic Hitchcock
or Wellesian view? or the more conventional orthogonal views of CAD programs?

Continuity – for architecture as actor not scene
Rules of continuity ensuring that an ‘actor’ does not apparently and abruptly change during an
editing cut from one scene to the next seem initially easier when it is the same model with
different camera angles providing our different scenes. With the architecture as the actor,
however, there are as many potential pitfalls that can befall the director or editor: e.g. transitions
from one scene to another must consciously deal with time according to the film’s conventions
that ascribe time and spatial shifts to fades that are different to those associated with cuts.
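The distinction between a cut and a fade can be made concrete as a blend weight that runs over the length of the transition. A minimal sketch, not from the course materials:

```python
def dissolve(a, b, t):
    """Cross-dissolve between two frame (or audio sample) values.

    t runs from 0.0 (all a) to 1.0 (all b) over the transition;
    a straight cut is the degenerate case of a zero-length
    transition, jumping from t = 0.0 to t = 1.0 in one frame.
    """
    return a * (1.0 - t) + b * t

# Midway through a dissolve, the two scenes are equally weighted.
mid = dissolve(0.0, 10.0, 0.5)
```

The same weighting applies to an audio cross-fade, which is one reason a fade reads as a shift in time or place while a cut reads as continuous action.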

Cutting – for architecture as actor not scene
In film conversations, the narrative is pushed and pulled by the editor’s cutting process. The best
animations render a series of scenes (combinations of cameras, lights and action) that are
intercut to create the narrative. This narrative is created by the order and pace of the rhythm of
the intercuts. Most students struggle with telling the story without a voice-over or text overwritten
on screen.

Close-ups – for architecture as actor not scene
Standard texts on film describe ‘close-ups’, ‘group shots’, ‘long shots’ and other variations on
these that fix the amount of a frame that the actor fills. In the context of the narrative the student
must examine how useful a traditional ‘setup’ long shot might be.

Composition – for architecture as actor not scene
The most difficult aspect of telling the story of a building is composing shots that show the forms
of the interiors and relationships between them. Even conventional movies like “Panic Room” – a
film in which the architecture of the relationships of the rooms is crucial to the story line – revert to
computer aided design techniques like 3D line drawings.

5.      The music composition process

For Sonic Arts students, this project poses a number of challenges. Not only are they faced with
many of the compositional challenges of soundtrack writing, but the scale is reduced to a
“miniature” where these programmatic challenges must be met in a very concise manner.
Additionally, the context has been inverted from the usual storytelling pretext, and the challenge
is how best to engage an audience with a narrative which characterizes a building as its central
character.

Sound design
A sense of context is given to the building by the addition of sonic cues and ambiences. The
composer must consider what kind of sonic materials would be appropriate to the building and
how these can be manipulated over time to illustrate the animator’s chosen “sense” of the
building, or narrative. Should there be particular sound objects associated with certain aspects of
the building, and how can these objects be arranged to complement the camera angles, rate of
movement, and sense of gesture? How can sound be used to enhance the sense of light and
movement in the animation? The composer must select sound materials that will enhance the
representation of the narrative by creating sonic cues to characterize the space, and spatial
ambiences to create a convincing sense of the place. These cues and ambiences serve as the
basic musical materials from which the music can unfold the story, undergoing appropriate
transformation, providing contrasts and balance, reflecting the light and movement present in the
animation.

The initiation of the project for the composer begins at the Storyboard presentation. Here the
composer has the opportunity to audition the film-maker, and to select a project and “director”
with whom he/she feels an affinity (which is almost the complete opposite of the film industry
approach to hiring a composer!).

Supporting/Guiding narrative:
How sounds are arranged on the linear time continuum:
Because both animation and soundtrack evolve simultaneously in this collaborative project, very
little time remains (during tweak week) to successfully marry the two elements together. It is
therefore vital that the composer and animator communicate their independent progress
regularly, so that the composer’s sonic ideas can impose structural parameters onto the
animation, and vice-versa.

It is very difficult for the soundtrack to take shape in a linear manner before the animation is
complete. The soundtrack is compiled more as a collection of musical ideas and cues, which can
later be edited directly to the completed animation. Considerations such as rhythm, pace,
continuity, repetition, dynamic shape and growth are discussed as the animation begins to take
shape. The interpretive potential of the music is explored and discussed as the work
progresses. For example, how can the rhythmic structures of the building be reflected in the
soundtrack? How can the textures and tones, the light and shade of the animation be highlighted
or contrasted? Can musical repetition create a sense of unity and balance against the camera
movements? and so on.

It is in fact the combination of the soundtrack and the animation which creates the resulting
cinematic experience. The interaction between these two elements has the power to create a
sense of drama, energy and movement, to maintain a rhythmic impetus. Within this miniature
form, a sense of shape and climax can be achieved both visually and sonically. The “timing” or
unfolding of the narrative relies heavily on the ability of the composer to work with the camera
movement, capturing the pace and movements, and correctly judging the sync points.

The students are encouraged to analyse aspects of existing film soundtracks which provide a
particularly dynamic accompaniment to pictures: to examine how music is used to set the mood;
the links between characters and particular musical cues (leitmotif); the role of music in animating
a scene and creating context; and other more specifically stylistic aspects.

How sounds are arranged in the spatial dimension
The acoustical relationships between perceived space and experienced space are discussed, the
contrast between projected space and imaginary space, and the creative potentials of these
variants. An examination of indoor and outdoor contrasts, and recording techniques to capture
these contrasts, and how the addition of reverberation and other methods of placement of sounds
in the stereo field can create textural variety to give a sense of background/forground etc.
Subjective parameters need to be carefully considered to assist in providing a convincing sense
of location, its size and the objects in it. These parameters may be abstracted in a single musical
idea or several layers of sonic material. Which combination of sound and image best conveys the
ambience or atmosphere, which best captures the essence of the character the animator wants to
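The background/foreground placement described above can be sketched with a standard constant-power panning law; the function and values here are illustrative, not from the course:

```python
import math

def constant_power_pan(sample, pan):
    """Place a mono sample value in the stereo field.

    pan runs from -1.0 (hard left) to +1.0 (hard right). The
    constant-power law keeps perceived loudness steady as a
    cue moves across the field, unlike a naive linear fade.
    """
    angle = (pan + 1.0) * math.pi / 4.0  # map pan to 0..pi/2
    return sample * math.cos(angle), sample * math.sin(angle)

# A distant 'background' cue: attenuate it and sit it slightly left.
left, right = constant_power_pan(0.8, -0.3)
```

Combined with reverberation, attenuation and panning like this are the basic levers for pushing a cue into the background or pulling it forward.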

Technical considerations
The brief for the Sonic Arts students clearly states that all sound material must be original. This
requires that each student collects or creates all “samples” and musical materials for the
soundtrack. Careful consideration of recording technique, microphone placement, timbral
suitability, EQ, dynamic range etc must be made for each sound created for the soundtrack. This
material is edited, perhaps transformed, and arranged to support the narrative of the animation, to
suit the camera angle, range and pace of cuts, intrinsic or imposed rhythms, transitions etc.
Generally the animator provides a detailed time-line for the composer to cut the music to, so that
the time spent during tweak-week can be used to refine the details, tidy up the sync-points and
get the mix right.
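Such a time-line amounts to a cue sheet mapping the animator's sync points to named sounds. A minimal sketch, with invented timecodes and cue names:

```python
# Hypothetical cue sheet: timecodes (seconds) from the animator's
# time-line at which named sonic cues must land. Names are invented.
sync_points = [
    (0.0,  "ambience_in"),
    (8.5,  "facade_peel_whoosh"),
    (21.0, "interior_shift"),
    (33.2, "climax_chord"),
]

def cue_at(t):
    """Return the most recently triggered cue at time t, if any."""
    started = [name for start, name in sync_points if start <= t]
    return started[-1] if started else None
```

During tweak week, disagreements between this sheet and the final cut of the animation are exactly the sync points that need refining.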

Mixdown and Mastering
After the linear and spatial musical cues have been synchronized with the animation, the final
mixdown takes place in the Lilburn Electroacoustic Music Studios, at the School of Music. Here
the final cuts and cues are spotted and the animator and composer negotiate any changes. Any
post-production effects are added, final sound levels are set, and the final master is produced.

6.      Conclusion

Future developments in architectural visualization education must continue to explore the
‘architecture as actor’ question. The collaboration of these two courses has since 2000 clearly
demonstrated that the animation and the music are two equally important parts of an ‘animation
of a building’. Without the appropriate soundtrack, an animation is much less comprehensible.
Equally, with a poor soundtrack, the power of the message of the film is significantly weakened.
Eventually, the directors and editors of these ‘animations’ will develop their own approach – their
own signature approaches – just as, over time, their movie counterparts have.

However, this is not a process that research and academic papers alone will develop. The
collaboration of these two digital media has a potential that is demonstrated every time they
converge in a movie maker’s musical ‘soundtrack’: to create a unique audiovisual experience of
site. The collaboration encourages the exploration of Architecture as Narrative – is
it the experience that is ‘narrated’ or is there some further reading of the building mediated by the
audio-visual experience? Beyond this, there is also an enrichment of the craft of Communication
in architecture - exploration of how the combination of digital animation and Sonic Arts can aid the
communication of design, representation, experience.

The project facilitates an exchange of ideas that, in the best students, expands their work beyond
digital craft into both digital art and spatial analysis: composing time, space and place. The
collaboration assists each discipline by requiring them to understand each other’s language and
spatial / aural concepts. The process becomes a liquid transition of ideas between the two
practices. Discussion facilitates an examination of the realm where the two come together – the
spatial and the aural experience they are creating. There is, as usual in the creative process, no
recipe for success; rather, analogous to the ‘Five C’s of Cinematography’, a series of key issues
are beginning to be developed that the students’ visual and aural compositions must address:
Sound "cues": what in the field of semiotics might be described as that which is signified by a
particular sound. Real/imagined sonic events (sonic cues) create a sense of place/experience.
Questions arise as to how realistic or impressionistic these cues need be.
Sound “keys”: In animation there are visual ‘keys’ or markers of particular stages of the story
development. Sound can be used to provide the same place marking even with different
visuals. Questions arise as to how to hold a thought aurally or visually while the other part of the
narrative (i.e. visually or aurally in turn) proceeds.
Sound “continuity”: Inevitably a film is time-based. For improved communication of ideas or
understanding of space, movies often manipulate the space-based experience so that it is not
linear with the time base. What is required is focusing on questions of continuity of story, not
continuity of experience.
Sound and space composition complementarity: Soundtrack composition drives the forward
momentum of the time-structure, whilst the architecture – the space composition – drives the
narrative and suggests sonic materials. Questions arise as to how to make a space welcoming
aurally and visually; how to make a space delineate path, aurally and visually; how to move
beyond the foley editor’s ‘sound effect’ to a genuine ‘soundtrack’.
Sound as illustrator or emotional enhancer: Sound can be used as an emotive aid to draw the
listener into a way of hearing/ experiencing/ visualising/ creating a sense of place for a particular
site. Alternatively, as a "translation" of design parameters such as rhythm, pace, form etc., music
can be used to illustrate these design elements. Some of the best films use the latter to convey the
former. However, this process is not a recipe for success. The weaker composer looking for
‘inspiration’ in this manner can produce a banal piece of kitsch. Similarly, the weaker animator
looking to follow Disney’s Fantasia and use the music to ‘inspire’ the rhythm and presentation of
the animation, can produce a weak imitation of a ‘rock video’ all flash and no content.

As the ‘storyboard’ below, captured from the 2003 class DVD, illustrates, what is needed is more
experimentation and more participants: research by design, producing over time a significantly
improved understanding of these processes of production and design.

Figure 1: Ian Porter 2003: Animation of Santiago Calatrava’s Milwaukee Art
Museum exploring the mixing of drawings and models of the building and of
‘another reality’ the drawing board. Pace and rhythm of the music must somehow
distinguish the different ‘realities’ whilst still maintaining continuity.

7.    References

[1]   Mascelli, J., The Five C’s of Cinematography, Silman-James Press, Los Angeles, 1965.
[2]   In the Music section of the “Said What?” web site. Last accessed February 2006.
[3]   Martin, E., ed., Pamphlet Architecture 16: Architecture as a Translation of Music,
      Princeton Architectural Press, 1996.
[4]   Rabiger, M., Directing: Film Techniques and Aesthetics, Focal Press, Burlington, MA.
[5]   Rodriguez, R., Mexico Trilogy (El Mariachi / Desperado / Once Upon A Time In Mexico)
      DVD, Sony Pictures, (2005); Disk 1: "10-Minute Film School"; Disk 2: "10 More Minutes
      with Robert Rodriguez: Anatomy of a Shootout"; Disk 3: “Film is Dead: An Evening with
      Robert Rodriguez”, “Ten-Minute Flick School”, “Inside Troublemaker Studios”.
[6]   Jackson, P., Jackson, F. and Boyens, P. The Lord of the Rings - The Motion Picture
      Trilogy (Special Extended DVD Edition) New Line Home Entertainment, 2004.
