Linking PUN and Voice

Hi, I am using PhotonNetwork.Instantiate in the OnJoinedRoom callback to create player avatars on the local device and on all clients. This prefab has several PhotonViews to sync the position and rotation of various GameObjects. I also have Photon Voice running based on the DemoVoice scene, so I have a GameObject with VoiceConnection, Recorder, ConnectAndJoin and WebRtcAudioDsp components.
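For context, the spawning side looks roughly like this (a minimal sketch; the prefab name and spawn position here are placeholders, and the prefab must live in a Resources folder for PhotonNetwork.Instantiate to find it):

```csharp
using Photon.Pun;
using UnityEngine;

public class AvatarSpawner : MonoBehaviourPunCallbacks
{
    // Placeholder prefab name; the prefab must be under a Resources folder.
    [SerializeField] private string avatarPrefabName = "PlayerAvatar";

    public override void OnJoinedRoom()
    {
        // Creates the avatar locally and mirrors it on all other clients in the room.
        PhotonNetwork.Instantiate(avatarPrefabName, Vector3.zero, Quaternion.identity);
    }
}
```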

Both parts of this are working. However, I now need to know which players are speaking, and I thought I should be able to use a PhotonVoiceView on the player prefab to tell me this. It doesn't seem to be working though: IsSpeaking and IsRecording are always false. Do I need to do anything else to connect the two together?

While digging around trying to solve this I ended up adding a PhotonVoiceNetwork as well. At the top of its .cs file it says:
// It also sets a custom PUN Speaker factory to find the Speaker
// component for a character's voice. For this to work, the voice's UserData
// must be set to the character's PhotonView ID

I am not sure if I need to do something extra in my code to set the voice's UserData, or whether I should be using PhotonVoiceNetwork at all? This component also seems to duplicate some of the functionality of VoiceConnection. At the moment I am not using the SpeakerPrefab setting to create my player avatars, but should I be?
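If the UserData did have to be wired up by hand, I imagine it would look something like the sketch below. This is purely my guess at what the comment means (the Recorder.UserData and RestartRecording members are from the Photon Voice 2 API, but whether this step is needed at all is exactly my question):

```csharp
using Photon.Pun;
using Photon.Voice.Unity;
using UnityEngine;

// Hypothetical: tag the outgoing voice stream with the avatar's PhotonView ID
// so the PUN Speaker factory can find the matching Speaker on remote clients.
[RequireComponent(typeof(PhotonView))]
public class VoiceLink : MonoBehaviourPun
{
    [SerializeField] private Recorder recorder; // the scene Recorder, assigned in the Inspector

    private void Start()
    {
        if (photonView.IsMine && recorder != null)
        {
            recorder.UserData = photonView.ViewID; // factory looks Speakers up by this ID
            recorder.RestartRecording();           // restart so the new UserData is transmitted
        }
    }
}
```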

I am a bit confused as to what should be happening where and what Components I should be using. Can you please advise as to the best way to achieve this?



  • Hi @gstevenson,

    Thank you for choosing Photon!

    The Photon Voice 2 integration with PUN 2 is done using PhotonVoiceNetwork + PhotonVoiceView. These link the remote Recorder (outgoing voice stream) to the local Speaker (incoming voice stream) via UserData / PhotonView ID.
    PhotonVoiceNetwork extends VoiceConnection and is designed to mirror PUN's PhotonNetwork.

    The Photon Voice 2 integration with PUN 2 is a suggested approach: it is optional, but we believe it suits the most common use case, voice linked to a 'moving' networked character avatar. It is explained here.
    That said, you can use Photon Voice 2 with PUN 2 in any way you want.

    DemoVoiceUI is a demo that does not involve PUN.
    PunDemoVoice is a demo that involves PUN integration.
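    To illustrate the PhotonVoiceView side of that linking, a speech indicator on the avatar prefab could be driven like this (a sketch only; the IsSpeaking / IsRecording property names are from the Photon Voice 2 PUN integration, so verify them against your installed version, and the speakingIcon object is a placeholder):

    ```csharp
    using Photon.Voice.PUN;
    using UnityEngine;

    [RequireComponent(typeof(PhotonVoiceView))]
    public class SpeechIndicator : MonoBehaviour
    {
        [SerializeField] private GameObject speakingIcon; // placeholder UI element

        private PhotonVoiceView voiceView;

        private void Awake()
        {
            voiceView = GetComponent<PhotonVoiceView>();
        }

        private void Update()
        {
            // IsSpeaking: a remote stream is linked to this view and currently playing.
            // IsRecording: the local Recorder linked to this view is transmitting.
            speakingIcon.SetActive(voiceView.IsSpeaking || voiceView.IsRecording);
        }
    }
    ```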
  • Thanks for the quick response. I've spent another frustrating day trying to get this to work.

    I didn't mention before that I am building a Quest app, which I suspect is at least half the problem. I am not attempting to use OVR Lip Sync as there appear to be issues with that.

    I have tried to replicate the PunDemoVoice setup in my app, but without much luck. Unfortunately the PunDemoVoice scene doesn't work well on a VR device, but I am also having trouble getting it to work in the Editor. Although I can see the speech and speaker indicators being displayed on multiple characters, I can't actually hear anyone speaking. When deployed to the Quest you can just about make out the characters, but there are no speech or speaker indicators and no sound.

    I have added some code to my project, borrowed from the PunDemoVoice, to display the available microphone inputs. Three are listed on the Quest: Android audio Input, Android Camcorder Input and Android Voice Recognition. Any idea if I need to change this from the default one that Unity is picking? The code suggests the first one is what it will be using.

    With a build where I have enabled Debug Echo on the scene Recorder and enabled PhotonVoiceView.SetupDebugSpeaker (as described in 3b), I still cannot hear myself. I have also disabled VAD. On the root GameObject of the player prefab I have set up a PhotonView (with no Observed Components; I added this so that the prefab gets completely destroyed when the player leaves) and a PhotonVoiceView set up as described in 2a.

    If you can think of anything else I could try I would most appreciate it.
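    For reference, the device listing and an explicit selection could be done along these lines (a sketch: the exact property for setting the Recorder's device, called MicrophoneDevice here, differs between Photon Voice 2 versions, so check yours; Microphone.devices is Unity's standard device list):

    ```csharp
    using Photon.Voice.Unity;
    using UnityEngine;

    public class MicSelector : MonoBehaviour
    {
        [SerializeField] private Recorder recorder; // the scene Recorder

        private void Start()
        {
            // On Quest this logs the three inputs mentioned above; the first
            // ("Android audio Input") is what Unity picks by default.
            foreach (string device in Microphone.devices)
            {
                Debug.Log("Microphone found: " + device);
            }

            if (Microphone.devices.Length > 0)
            {
                recorder.MicrophoneDevice = Microphone.devices[0];
            }
        }
    }
    ```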

  • Hi @gstevenson,

    For the Unity Editor, maybe it's an issue with the 3D sound settings or the distance between players?
    Did you test on different machines to check that the audio devices are not faulty?

    For the Oculus did you try this?
  • Hi,

    I think the devices are working OK. If we use the SpeakerPrefab option on the VoiceConnection script to create our avatars (which have a Speaker component on them), then we can get it to work, i.e. two people can hear each other, so the mics and speakers are working. However I don't want it to work that way, as I am trying to use PhotonNetwork.Instantiate to create the avatars and then use a PhotonVoiceView to know which ones are speaking.

    I am not using the Oculus Avatars, so I don't believe the 'Can Own Microphone' issue applies?

    Is there anything extra I need to do about Android microphone permissions or should that be dealt with by Unity or PhotonVoice?
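    In case it helps anyone reading, this is how I understand the runtime permission could be requested explicitly on Android (a sketch using Unity's standard UnityEngine.Android API; Unity adds RECORD_AUDIO to the manifest when the Microphone class is used, but the runtime prompt may still need triggering manually):

    ```csharp
    using UnityEngine;
    #if UNITY_ANDROID
    using UnityEngine.Android;
    #endif

    public class MicPermission : MonoBehaviour
    {
        private void Start()
        {
    #if UNITY_ANDROID
            // Shows the Android runtime prompt if the user hasn't granted mic access yet.
            if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
            {
                Permission.RequestUserPermission(Permission.Microphone);
            }
    #endif
        }
    }
    ```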
