This experiment simulates wave modulation using Unity and Google Resonance.
The experiment is set up with two simulated speakers facing one another, with the listening position between them, initially at the halfway point. The speakers are initialised as 440 Hz sine waves, each 20 distance units (DU) from the listening position. The speakers' polar patterns are set to be highly directional towards the listener, while the listener has a figure-8 pattern.
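The geometry above can be approximated outside Unity with a simple free-field model: each speaker's signal reaches the listener with a propagation delay and 1/r attenuation, and the two arrivals sum. This is a minimal sketch of that idea, not the Resonance implementation; the speed of sound in DU/s is an assumption (1 DU = 1 m), and `listener_signal` is a hypothetical helper.

```python
import numpy as np

# Idealized free-field model of the setup: two equal sine sources facing
# a listener, summed with propagation delay and 1/r attenuation.
# Assumption for illustration: 1 DU = 1 m, so sound travels 343 DU/s.
SPEED_OF_SOUND = 343.0

def listener_signal(t, freq, d1, d2):
    """Sum at the listener of two sources at distances d1 and d2 (DU)."""
    s1 = np.sin(2 * np.pi * freq * (t - d1 / SPEED_OF_SOUND)) / d1
    s2 = np.sin(2 * np.pi * freq * (t - d2 / SPEED_OF_SOUND)) / d2
    return s1 + s2

t = np.linspace(0, 0.05, 2400, endpoint=False)  # 50 ms at 48 kHz

# Listener exactly halfway (20 DU each side): equal path lengths, so the
# waves arrive in phase and reinforce.
centred = listener_signal(t, 440.0, 20.0, 20.0)

# Offset the listener so the path difference is half a wavelength
# (~0.39 DU at 440 Hz): the arrivals are in antiphase and largely cancel.
half_wave = SPEED_OF_SOUND / 440.0 / 2
offset = listener_signal(t, 440.0, 20.0 - half_wave / 2, 20.0 + half_wave / 2)
```

Moving the listening position therefore sweeps through constructive and destructive interference, which is the mechanism the experiment probes.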
As the frequencies and distances change, the interaction between the two waves changes, introducing amplitude modulation (AM) and frequency modulation (FM), so the varying waveforms produce different audible modulation effects.
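The AM case has a textbook form: detuning one speaker slightly turns the sum into a carrier with a slow beating envelope. A short sketch of this, with illustrative frequencies rather than the experiment's exact settings:

```python
import numpy as np

# Beating as amplitude modulation: by the identity
#   sin(a) + sin(b) = 2 sin((a+b)/2) cos((a-b)/2),
# summing 440 Hz and 444 Hz sines gives a 442 Hz carrier whose
# amplitude follows a slow cosine envelope, heard as beats at
# |f1 - f2| = 4 Hz.
sr = 48000
t = np.arange(sr) / sr  # one second
f1, f2 = 440.0, 444.0

summed = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
envelope = np.abs(2 * np.cos(np.pi * (f1 - f2) * t))
```

The summed signal stays inside the predicted envelope, which is the AM component the waveform display should show.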
This was all visualised using a script that displays the output waveform.
The outcome was almost as expected, with both FM and AM being displayed; however, the AM was not functioning quite as it should. The AM only seemed to occur while a sound source was in transit, so there may be an issue in the way Google Resonance or Unity sums the waveforms, or in the way the space is simulated. The next step is to test different environments with Resonance's built-in reverb spaces, to see how introducing reflections modulates the sound further. I will also test alternative methods of sound spatialisation, to see what effects other software produces.
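One plausible reason AM would appear only during transit: a moving source continuously changes its path length, so the phase difference between the two arrivals sweeps through constructive and destructive interference (fades at roughly speed/wavelength per second), while the motion also Doppler-shifts the moving source. This is a hedged free-field sketch with made-up speeds and distances, not a claim about what Resonance computes internally:

```python
import numpy as np

# One static source and one receding source, summed at the listener.
# The changing path difference sweeps the phase relation, so the mix
# fades in and out (AM); the receding source is also slightly
# pitch-shifted (Doppler FM). Assumptions: 1 DU = 1 m, 2 DU/s motion.
c = 343.0
sr = 48000
t = np.arange(2 * sr) / sr  # two seconds
f = 440.0

d_static = 20.0            # fixed source distance (DU)
d_moving = 20.0 + 2.0 * t  # source receding at 2 DU/s

# Emission-time approximation f*(t - d/c): a growing d lowers the
# received frequency, and the 2 DU/s motion gives roughly 2.5 fades/s.
s1 = np.sin(2 * np.pi * f * (t - d_static / c)) / d_static
s2 = np.sin(2 * np.pi * f * (t - d_moving / c)) / d_moving
mix = s1 + s2

# Crude envelope: peak level over consecutive 10 ms windows.
frames = np.abs(mix[: len(mix) // 480 * 480]).reshape(-1, 480)
env = frames.max(axis=1)
```

If Unity/Resonance only re-evaluates the interference while positions update, a static scene could sit at one fixed phase relation, which would be consistent with AM showing up only in transit; that remains to be confirmed against the reverb and spatialiser tests planned above.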
I will also perform the same experiment in reality to test certain elements, though I will mostly stick to the simulated environment, as access to interesting spaces is limited at the moment.