Spat (short for Spatialisateur, French for "spatializer") is a real-time spatial audio processor that allows composers, sound artists, performers, and sound engineers to control the localization of sound sources in 3D auditory spaces. In addition, Spat provides a powerful reverberation engine that can be applied to real and virtual auditory spaces.
The processor receives sounds from instrumental or synthetic sources, adds spatialization effects in real-time, and outputs signals for reproduction on an electroacoustic system (loudspeakers or headphones).
Its modular signal-processing architecture and design are guided by computational efficiency and configurability considerations. This allows, in particular, straightforward adaptation to various multichannel output formats and reproduction setups, over loudspeakers or headphones, while the control interface provides direct access to perceptually relevant parameters for specifying distance and reverberation effects, irrespective of the chosen reproduction format.
Another original feature of Spat is its room-effect control interface, which relies on perceptual criteria. This lets the user intuitively specify the characteristics of a room without having to resort to an acoustic or architectural vocabulary.
Spat is, for instance, used for real-time 3D audio rendering with a 350-loudspeaker array in IRCAM's variable-acoustics concert hall.
Spat relies on an efficient signal-processing library programmed in C++ to provide state-of-the-art software technologies.
The software bundle comes as a set of Max/MSP external objects (i.e. plugins that can be inserted into the Max/MSP environment).
Spat contains more than 100 external objects, numerous abstractions, help patches, tutorials, a large HRTF database, etc.
Spat objects are highly configurable, and most of them support up to 250 input/output channels.
The main features are:
- sound spatialization (panning) in 2D or 3D. A non-exhaustive list of supported panning algorithms:
  - stereo (AB, XY, MS)
  - binaural (with near-field compensation) and transaural
  - vector-base amplitude panning (VBAP)
  - distance-based amplitude panning (DBAP)
  - vector-base intensity panning (VBIP)
  - nearest-neighbor amplitude panning
  - speaker-placement correction amplitude panning (SPCAP)
  - B-format and higher order Ambisonics (HOA) without order restrictions
  - near-field compensated higher order Ambisonics (NFC-HOA)
- artificial reverberation: multichannel scalable/tunable algorithmic reverberation based on feedback delay networks, and efficient zero-latency multichannel real-time convolution.
- perceptual control of the acoustic quality of the room: warmth and brilliance; presence/proximity of the sound source; room presence; early and late reverberation; heaviness and liveness. Easy control over the radiation of sound sources (aperture and orientation).
- low-level signal processing: equalization, Doppler effect, air absorption, etc.
- graphical user interfaces for controlling/authoring/monitoring the spatial sound scene.
- many objects to create/edit/transform spatial trajectories.
- many useful tools to manipulate multichannel audio signals: multichannel sound file player/recorder (spat.sfplay~, spat.sfrecord~) up to 250 channels; multichannel EQ, compressor, limiter, gate, etc.
- tools for room acoustics and/or speakers calibration: delays/gains measurement and correction; measurement, analysis and denoising of multichannel room impulse responses, etc.
- headphone monitoring of any multichannel stream.
- various audio effects: stereo enlargement, Leslie cabinet simulation, ping-pong delays, graphical equalizers, parametric equalizers, etc.
- etc.
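To give a flavor of the panning family listed above, here is a minimal 2D VBAP sketch in Python. It is an illustration of the textbook technique only, not Spat's implementation; the function name and the default ±45° loudspeaker layout are assumptions made for the example.

```python
import numpy as np

def vbap2d(source_deg, spk_deg=(-45.0, 45.0)):
    """Pan a source between two loudspeakers with 2D VBAP:
    solve L g = p for the gains, where the columns of L are the
    loudspeaker unit vectors and p is the source direction vector,
    then normalize the gains for constant power."""
    p = np.array([np.cos(np.radians(source_deg)),
                  np.sin(np.radians(source_deg))])
    L = np.array([[np.cos(np.radians(a)), np.sin(np.radians(a))]
                  for a in spk_deg]).T           # columns = speaker vectors
    g = np.linalg.solve(L, p)
    return g / np.linalg.norm(g)                 # constant-power gains
```

For a source at 0° between speakers at ±45°, this yields equal gains of 1/√2, i.e. a centered constant-power pan; a source aligned with one speaker sends all the signal to that speaker.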
The Spat bundle contains Max/MSP implementations of the award-winning Flux:: Ircam Tools plugins (Ircam-Spat, Ircam-Verb, Ircam-VerbSession, Ircam-HEar); see http://www.ircamtools.com.
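The feedback-delay-network reverberation mentioned among the features above can be sketched in a few lines. This toy 4-channel FDN (mutually prime delay lengths, an orthogonal Hadamard feedback matrix, a single feedback gain g < 1) only illustrates the principle; the delay values and gain here are arbitrary choices, and Spat's actual engine is considerably more elaborate.

```python
import numpy as np

def fdn_reverb(x, delays=(149, 211, 263, 293), g=0.7, n_out=4000):
    """Toy 4-channel feedback delay network (FDN) reverb.

    Mutually prime delay lengths avoid coincident echoes; the
    orthogonal (scaled Hadamard) feedback matrix, attenuated by a
    single gain g < 1, gives a stable, smoothly decaying tail.
    """
    H = 0.5 * np.array([[1,  1,  1,  1],
                        [1, -1,  1, -1],
                        [1,  1, -1, -1],
                        [1, -1, -1,  1]], dtype=float)
    lines = [np.zeros(d) for d in delays]      # circular delay lines
    ptr = [0, 0, 0, 0]
    y = np.zeros(n_out)
    for n in range(n_out):
        outs = np.array([lines[i][ptr[i]] for i in range(4)])
        y[n] = outs.sum()                      # simple sum as output tap
        fb = g * (H @ outs)                    # mix and attenuate feedback
        xn = x[n] if n < len(x) else 0.0
        for i in range(4):
            lines[i][ptr[i]] = xn + fb[i]      # write input + feedback
            ptr[i] = (ptr[i] + 1) % delays[i]
    return y
```

Feeding a unit impulse produces a progressively denser, exponentially decaying echo pattern, which is the basic mechanism behind this class of algorithmic reverberators.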
Fields of Application
- Film & Video
- Scientific Research & Development
- Virtual reality
- Sound Design
Spat can be used in several contexts:
- Live concerts, sound installations, and real-time spatialization. The composer can associate each note or sound event in the score with a room effect or a specific position in space. Spat can be controlled by a sequencer, a score-following system, or any other algorithmic approach. Being integrated into the Max/MSP environment, Spat can easily be linked to any remote control device (tracking system, tablet, smartphone, joystick, gestural sensors, etc.).
- Mixing and post-production. A spatialization module can be assigned to each channel of the mixing desk, providing intuitive, global control of the position of each source and its associated room effect.
- Virtual reality. The spatialized auditory component is essential to creating sensations of presence and immersion in virtual-reality applications and interactive installations. In such scenarios, Spat's binaural mode (3D reproduction over headphones) is particularly well suited. Binaural rendering is remarkably convincing when the processor is linked either to a tracking system (which follows the listener's position and orientation) or to gestural controllers.
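For a flavor of what head-tracked binaural rendering must compute per source, here is the classic Woodworth spherical-head estimate of the interaural time difference (ITD), one of the localization cues that has to be re-derived whenever the listener turns. This is a textbook model used for illustration, not Spat's implementation.

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Interaural time difference (seconds) from the Woodworth
    spherical-head model: ITD = (r/c) * (theta + sin(theta)),
    for source azimuth theta, head radius r, speed of sound c."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / c) * (theta + math.sin(theta))
```

A source at 90° azimuth yields roughly 0.66 ms with the default head radius; a real binaural renderer combines such interaural cues with measured HRTF filtering, as provided by Spat's HRTF database.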
T. Carpentier, M. Noisternig, O. Warusfel. Twenty Years of Ircam Spat: Looking Back, Looking Forward. In Proc. of the 41st International Computer Music Conference (ICMC), Denton, TX, USA, pp. 270–277, Sept. 2015.
Spatialisateur is an IRCAM registered trademark. The design of Spat and the reverberation module are protected under several French and international patents ([FR] 92 02528; [US] 5,491,754; [FR] 95 10111; [US] 5,812,674).
All other trademarks belong to their owners. Max/MSP is the property of IRCAM and Cycling’74.