Read on to find out how she uses digital simulation as a compositional tool.
Many of the tools that I use for composing music are borrowed from other disciplines. In addition to sitting at the piano and transcribing thoughts onto manuscript paper, I also draw and model my musical ideas in 3D. To provide a little background before delving into the detail: I originally trained as an architect, but I have always played and written music, and I only began composing for acoustic instruments when I started my practice-based PhD in 2017. My research investigates creative reciprocities between sonic and spatial practices, and many of my projects have taken the form of site-specific, spatialised performances for acoustically-distinctive sites. The LSO Panufnik Composers Scheme offered an opportunity to explore spatial concepts within a musical context, beyond room acoustics.
In A Study of Passing Objects in an Accelerating Landscape the space of the orchestra becomes a study of perspectival depth, in the context of a train journey. The orchestra is conceptualised as a series of objects which pass us at increasing speeds. As the movement accelerates, the detail in the foreground becomes blurred and our focus gradually shifts towards slower changes in the mid and background, until all detail is smeared, and a distant horizon line is the only feature that holds our attention.
In addition to making creative decisions based on first-hand observations of a speeding train journey (during which some of the piece was written), I also simulated a number of train-journey-specific spatio-temporal phenomena using parametric 3D modelling and animation software, techniques more commonly used in architectural practice when designing spaces. These 3D tools enabled me to study the visual parallax between objects from the foreground to the background, and the changing appearance of a speeding object at different distances from an observer.
It’s important for me to mention that, despite existing as a collection of geometric objects defined as explicit entities in Cartesian space, this model is not intended to output a mathematical system for directly mapping geometric relationships to musical ones, as might be typical in the work of fellow architect-composer Iannis Xenakis. Instead, I wanted to use these tools to help me simplify the scene as framed by the train window. The objective was to interrogate the principles of perspectival construction, as defined in optical space, as possible concepts for organising musical material and questioning the construction of perspectival depth in musical space. It’s also worth mentioning that these tools were never intended to be seen by an audience; they are merely tools that I have developed as part of my spatiosonic* practice.
*Spatiosonic is a self-coined word Emma-Kate uses to describe correspondences between spatial and sonic concepts and/or behaviours.
Let’s have a more detailed look at three of these tools:
This video shows the musical score on the left and a plan view of the orchestra in the space at LSO St Luke’s on the right. As each section of the orchestra plays, the instruments are visualised as a simple square, the size of which depends on their dynamic. This plan view is then projected to form a perspective view by means of a simple two-point perspective principle.
In order to construct this image, we need to define a horizon line, two vanishing points, a station point, a picture plane, a ground line and an elevation, or height information. In the case of my orchestral piece, the picture plane is conceptualised as a vertical plane between the audience and the orchestra, centralised around the conductor. This decision followed a discussion with a conductor friend who stated that for large orchestras, where significant distances exist between musicians, the conductor can be thought of as defining a position in space at which the sound of each instrument is calibrated to arrive ‘in time’ (or at least from the perspective of the ears of the conductor). For this reason, it seemed appropriate to assign this space of ‘calibration’ to the job of the picture plane which, in an optical perspective, receives any geometry in plan before it gets projected into 3D space. The stage becomes the landscape and the different orchestral sections (and sometimes individual instruments) become objects within this landscape. The station point is conceptualised as a listener, in the audience.
In this visualisation, the station point (the listener) was placed towards the front-centre of the auditorium, though in theory they could be anywhere behind the picture plane (the conductor). The parametric nature of the software in which this model was made enables an instant re-drawing of the scene as we switch between listeners in different positions. The elevation (or height information) is derived from various harmonic relationships between each of the ‘objects’. The horizon line plays a particularly important role, both in the construction of the image as visualised in the animated tool and as a defining feature in the ‘landscape’ of the music, where it is imagined as a minor 2nd drone, which subtly oscillates throughout the piece to simulate the slow change that we observe along the horizon during a fast journey.
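To make the projection concrete, here is a minimal sketch of the similar-triangles principle behind this kind of plan-to-perspective construction (my own illustration, not the actual parametric model described above): a listener at the station point, a picture plane a fixed distance in front of them, and a plan ‘object’ projected onto that plane. All coordinates and distances are hypothetical.

```python
# Illustrative sketch (not the author's actual tool): projecting a point
# from the orchestra 'plan' onto a picture plane by similar triangles.
# The station point (listener) sits at the origin, looking along +y;
# the picture plane (the conductor's position) lies at y = plane_distance.

def project(point, plane_distance):
    """Project a 3D point (x, y, z) onto the picture plane y = plane_distance.

    Returns the 2D image coordinates (x', z') on the plane.
    """
    x, y, z = point
    if y <= 0:
        raise ValueError("point must lie beyond the picture plane")
    scale = plane_distance / y          # similar-triangles ratio
    return (x * scale, z * scale)

# A hypothetical 'object' 10 m past the listener, 4 m to the right and
# 2 m high, seen through a picture plane 2 m away:
print(project((4.0, 10.0, 2.0), 2.0))   # (0.8, 0.4)
```

Because the scene is a pure function of the station point and picture plane, moving the listener simply re-runs the projection, which is exactly the instant re-drawing a parametric model affords.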
The resultant 3D projection gives me a method for quickly visualising a range of musico-perspectival metaphors, against the typical layout of the orchestra, as experienced by an audience. This particular study highlights a series of interesting tensions and correspondences between the experience of real/physical perspectival space and virtual/metaphorical space as expressed within the music.
This study quite simply places a series of objects within a 3D space and registers both a sound and a mark on a score (at the bottom of the video) when these objects pass by a window (in yellow). Even though this animated 3D model is far less visually and audibly inspiring than a real journey through a beautiful landscape, it does provide an opportunity for studying how the alignment of objects stretching from foreground to background changes as speed increases.
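The mechanics of that study can be hinted at with a small sketch (my reconstruction under simple assumptions, not the author’s model): objects at fixed distances along the track, a window accelerating from rest, and the moments at which each object crosses the window. The distances and acceleration are invented for illustration.

```python
# A minimal sketch of the 'passing objects' idea: under constant
# acceleration a from rest, an object at distance s crosses the window
# at t = sqrt(2s / a), so evenly spaced objects arrive ever faster.

import math

def crossing_times(distances, acceleration):
    """Times at which the window reaches each object, starting from rest."""
    return [math.sqrt(2 * s / acceleration) for s in distances]

# Hypothetical objects 50 m apart, hypothetical acceleration of 2 m/s^2:
objects = [50 * i for i in range(1, 6)]
times = crossing_times(objects, acceleration=2.0)
gaps = [round(b - a, 2) for a, b in zip(times, times[1:])]
print(gaps)  # [2.93, 2.25, 1.89, 1.67] — the gaps shrink as speed builds
```

The shrinking gaps between events are the temporal ‘compression’ the study makes audible as the journey accelerates.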
Though the lines that appear at the bottom of the image look a little like a MIDI (Musical Instrument Digital Interface) score, they aren’t translated so literally, for example as rhythmic stimulus for a melody. Instead, the purpose of this study is to inspire and inform ideas relating to the alignment of orchestral objects (in the accelerating landscape) as they pass us, as a simulation of optical parallax*. Amongst many other decisions, this study informed a change in time signature by which objects are ‘compressed’ as they approach and appear to extend as they leave the scene: almost like a pseudo Doppler effect, but with a temporal consequence as opposed to a tonal one.
* Parallax is the apparent displacement of an object caused by a change in the observer’s point of view. In astronomy, it is an indispensable tool: it enables astronomers to measure the distances of faraway stars using trigonometry.
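The footnote’s trigonometric idea can be shown in miniature (with entirely hypothetical numbers): given a baseline between two viewpoints and the parallax angle through which an object appears to shift, its distance follows directly.

```python
# Parallax in miniature: distance = baseline / tan(parallax angle).
# Astronomers use the Earth's orbit as the baseline; here we use two
# hypothetical viewpoints a couple of metres apart on a moving train.

import math

def distance_from_parallax(baseline_m, parallax_deg):
    """Distance to an object from a baseline and its parallax angle."""
    return baseline_m / math.tan(math.radians(parallax_deg))

# Viewpoints 2 m apart; a distant tree appears to shift by 1 degree:
print(round(distance_from_parallax(2.0, 1.0), 1))  # 114.6 (metres)
```

The same geometry explains why foreground objects streak past a train window while the horizon barely moves: the nearer the object, the larger the angle it sweeps for the same baseline.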
The acoustic behaviour of the space in which the music is performed isn’t the only way in which sound and space are in dialogue in this project, though it is unavoidably still important as the space will always assert its material and volumetric opinions in relation to the sounds that inhabit it. Also, the clear localisation of individual sounds within the space of the orchestra will always exist as a feature of its live performance, as the amount of space that an orchestra occupies and the resultant distances between instruments is impossible to flatten, or reduce to a single region of space.
The orchestra is often discussed as a single entity, which is much easier to imagine when listening to a recording, which (despite the best efforts of the sound engineer) often squeezes the orchestra into a narrow stereo image in which the nuances of its spatial complexity can’t be captured or reproduced. However, both experiencing live orchestral concerts and verifying those experiences with a (heavily simplified) visualisation of the acoustic behaviour of the orchestra pose the question of whether this ‘entity’ could instead be considered as a series of small ensembles which are sometimes ‘in concert’ and sometimes not. Though this isn’t an entirely new idea, it is one that I’ve explored throughout the LSO Panufnik Scheme, as a result of being able to visualise direct sound and early reflections using acoustic simulation tools which are typically used in the design of new spaces, rather than the design of new sounds within existing spaces.
All of the tools discussed in this article are highly bespoke, and I’ve developed these (and others) in response to a personal desire to generate musical works which discuss and explore spatial ideas on both a physical and conceptual level. The LSO Panufnik Scheme has provided a fantastic opportunity to explore such ideas in depth and to interrogate the tensions and correspondences within the wealth of interactions between sound and space, in both physical and ‘virtual’ space.
To find out more about Emma-Kate’s work visit her website here.