Audiopaths and Buffers
Each DirectMusic segment plays on an audiopath that controls the flow of sounds from the performance to the synthesizer, then through DirectSound buffers where effects can be applied, and finally into the primary buffer, where the final output is mixed.
Note   The buffers referred to here are secondary buffers used for streaming and processing sound data.
Applications can create standard audiopaths and then play segments on them. For example, an application could create one audiopath for playing MIDI files to a buffer with musical reverb and another for playing WAV files to a buffer with 3-D control.
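The following sketch shows one way this might be done (error handling and object cleanup omitted). It assumes an initialized performance object named g_pPerformance (IDirectMusicPerformance8) and a loaded segment named pSegment (IDirectMusicSegment8); these names are illustrative, not part of the API.

    #include <dmusici.h>

    // Assumed to be initialized elsewhere (hypothetical names).
    extern IDirectMusicPerformance8* g_pPerformance;
    extern IDirectMusicSegment8*     pSegment;

    IDirectMusicAudioPath* pPath = NULL;

    // Create a standard shared path that sends output through a buffer
    // with music reverb, using 64 performance channels.
    HRESULT hr = g_pPerformance->CreateStandardAudioPath(
        DMUS_APATH_SHARED_STEREOPLUSREVERB,  // Standard path type.
        64,                                  // Performance channel count.
        TRUE,                                // Activate the path now.
        &pPath);

    if (SUCCEEDED(hr))
    {
        // Play the segment on the new path instead of the default one.
        hr = g_pPerformance->PlaySegmentEx(
            pSegment,     // Segment to play.
            NULL, NULL,   // No segment name or transition.
            0,            // Flags.
            0,            // Start time (play immediately).
            NULL,         // Segment state not needed.
            NULL,         // Nothing to stop first.
            pPath);       // Audiopath to play on.
    }

An application playing WAV files with positional sound could create a second path in the same way, passing DMUS_APATH_DYNAMIC_3D as the path type.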
More sophisticated audiopath configurations can be authored into a segment in DirectMusic Producer. For example, a nonstandard configuration might direct parts in a segment through different DirectSound buffers to apply different effects to them.
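When a segment contains an authored audiopath configuration, the application can retrieve that configuration and create the path from it before playing the segment. A minimal sketch, again assuming the g_pPerformance and pSegment objects from the previous example:

    IUnknown*              pConfig      = NULL;
    IDirectMusicAudioPath* pSegmentPath = NULL;

    // Retrieve the audiopath configuration embedded in the segment.
    HRESULT hr = pSegment->GetAudioPathConfig(&pConfig);

    if (SUCCEEDED(hr))
    {
        // Create and activate an audiopath from that configuration.
        hr = g_pPerformance->CreateAudioPath(pConfig, TRUE, &pSegmentPath);
        pConfig->Release();
    }

The segment can then be played on pSegmentPath with PlaySegmentEx, as shown earlier.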
An audiopath can be seen as a chain of objects through which data is streamed. An application can gain access to any of these objects. For example, you might retrieve a buffer object to set the 3-D properties of a sound source, or an effect object in a buffer to change the parameters of an effect.
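The following sketch retrieves the 3-D interface of the buffer in a path and positions the sound source. It assumes pPath is an audiopath created with the DMUS_APATH_DYNAMIC_3D type (as in the earlier examples).

    #include <dmusici.h>
    #include <dsound.h>

    IDirectSound3DBuffer8* p3DBuffer = NULL;

    // Retrieve the buffer object in the path and ask for its 3-D interface.
    HRESULT hr = pPath->GetObjectInPath(
        DMUS_PCHANNEL_ALL,         // Search all performance channels.
        DMUS_PATH_BUFFER,          // Stage: the DirectSound buffer.
        0,                         // First buffer in the chain.
        GUID_All_Objects,          // Any object class.
        0,                         // First matching object.
        IID_IDirectSound3DBuffer,  // Interface to retrieve.
        (void**)&p3DBuffer);

    if (SUCCEEDED(hr))
    {
        // Place the sound source in 3-D space.
        p3DBuffer->SetPosition(10.0f, 0.0f, 5.0f, DS3D_IMMEDIATE);
        p3DBuffer->Release();
    }

An effect object in a buffer can be retrieved in the same way by passing DMUS_PATH_BUFFER_DMO as the stage and the interface identifier of the desired effect.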