How to stop SaveOutgoingStream from recording a 'muted' user?

Matt_Harmony
edited November 2022 in Photon Voice

TL;DR: Is there an option available to allow a Recorder to have 'silence' in the audio file when a user is 'muted'?

We are working on a project that requires the audio recording of full sessions; this ideally would be a single audio file that includes all incoming and outgoing audio.

With the suggested 'mute the AudioSource to mute a user' approach, the Recorder still captures the user's audio; it just isn't played back. This becomes a problem when using 'SaveOutgoingStream', as you still end up recording a muted user.

The alternative is to disable TransmitEnabled, which stops the 'muted' audio from being recorded into the final file. However, it also means the final file ends up shorter than the session, because no 'silence' is written while a user is muted.

Is there an option available to allow a Recorder to have 'silence' in the audio file when a user is 'muted'?

Or is this not supported by Photon Voice currently?

Best Answer

  • vadim
    vadim mod
    Answer ✓

    Add a 'Mute' property to SaveOutgoingStream and zero the buffer in IProcessor.Process() before writing it to the file if Mute == true.
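
    A minimal sketch of this first suggestion, assuming Photon Voice's IProcessor<T> interface (float[] Process(float[] buf) plus Dispose()) and a WaveWriter-style helper like the one in the SaveOutgoingStreamToFile demo; the WaveWriter constructor and WriteSamples call here are placeholders, not the demo's actual API:

        using System;
        using Photon.Voice;

        class SaveOutgoingStream : IProcessor<float>
        {
            // Toggle this from your mute logic; while true, silence is
            // written to the file instead of the captured voice.
            public bool Mute { get; set; }

            readonly WaveWriter writer; // file-writing helper (placeholder API)
            float[] silence;            // reusable zeroed buffer

            public SaveOutgoingStream(WaveWriter writer)
            {
                this.writer = writer;
            }

            public float[] Process(float[] buf)
            {
                if (Mute)
                {
                    // Write a zeroed buffer of the same length so the file
                    // keeps the session's real duration while the user is muted.
                    if (silence == null || silence.Length != buf.Length)
                    {
                        silence = new float[buf.Length];
                    }
                    writer.WriteSamples(silence, 0, silence.Length); // placeholder call
                }
                else
                {
                    writer.WriteSamples(buf, 0, buf.Length); // placeholder call
                }
                return buf; // pass the audio on unchanged for transmission
            }

            public void Dispose()
            {
                writer.Dispose();
            }
        }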

    Or create a separate IProcessor whose only purpose is to mute the audio stream, and add it before SaveOutgoingStreamToFile in the audio processing pipeline.
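
    A sketch of this second variant, under the same IProcessor<T> assumption: a dedicated muting processor placed before SaveOutgoingStreamToFile in the pipeline, so the save step receives zeroed frames:

        using System;
        using Photon.Voice;

        class MuteProcessor : IProcessor<float>
        {
            public bool Mute { get; set; }

            public float[] Process(float[] buf)
            {
                if (Mute)
                {
                    // Overwrite the frame with silence; every processor after
                    // this one (including the file writer) sees zeroed samples.
                    Array.Clear(buf, 0, buf.Length);
                }
                return buf;
            }

            public void Dispose() { }
        }

    Note that because Process() zeroes the buffer in place, everything downstream of this processor, including the encoder, receives silence too, so this variant fits when the user should be muted for remote listeners as well as in the file.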

Answers

  • Thank you for your response, Vadim.

    I realised that when I first tried to implement the 'Mute' property in SaveOutgoingStream I had made a mistake; re-implementing it correctly was a suitable solution.

    I am now doing something similar for the incoming audio, after which I'll need to combine the two streams into a single audio file.

    When they both try to write to the same WaveWriter, the result sounds distorted. I'm guessing the buffer values need to be adjusted based on the number of input sources, but I'm not certain, as audio is not my area of expertise. If you have any suggestions, they would be appreciated.

  • You can't write several audio streams into a single file. You need to mix all the streams and write the result. You can do this in real time, or after recording by mixing the contents of the separate files.
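
    For illustration, one way the mixing step could look: sum the corresponding samples of two buffers that share a sample rate and channel count, and clamp the result so the sum stays in the valid [-1, 1] range (an unclamped sum, or interleaved writes to one WaveWriter, is a likely source of the distortion described above):

        using System;

        static class AudioMix
        {
            // Mix two float-sample buffers (same sample rate and channel
            // count) into one by summing and hard-clipping to [-1, 1].
            public static float[] Mix(float[] a, float[] b)
            {
                int n = Math.Min(a.Length, b.Length);
                var mixed = new float[n];
                for (int i = 0; i < n; i++)
                {
                    float s = a[i] + b[i];
                    mixed[i] = Math.Max(-1f, Math.Min(1f, s)); // or scale by 1/voiceCount instead of clipping
                }
                return mixed;
            }
        }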

    Unity mixes all AudioSources in the scene when producing the game's audio output. You can intercept this mix and write it to a file in the OnAudioFilterRead method of a MonoBehaviour attached to the AudioListener object. If you also play the outgoing stream in the Unity scene, this would be the easiest way to get the mix.
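
    A minimal sketch of that interception, again treating WaveWriter as a placeholder helper (its constructor and WriteSamples call are assumptions):

        using UnityEngine;

        [RequireComponent(typeof(AudioListener))]
        public class MixRecorder : MonoBehaviour
        {
            WaveWriter writer;                        // placeholder file-writing helper
            readonly object writeLock = new object();

            void Start()
            {
                // Placeholder constructor: output path, sample rate, channel count.
                writer = new WaveWriter("session.wav", AudioSettings.outputSampleRate, 2);
            }

            // Unity calls this on the audio thread with the scene's final mix,
            // because this component sits on the AudioListener's GameObject.
            void OnAudioFilterRead(float[] data, int channels)
            {
                lock (writeLock)
                {
                    writer?.WriteSamples(data, 0, data.Length); // placeholder call
                }
            }

            void OnDestroy()
            {
                lock (writeLock)
                {
                    writer?.Dispose();
                    writer = null;
                }
            }
        }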