Streaming a live avatar with live mocap animations using PUN

Hello!

I am developing a multiplayer VR game in Unity, using Photon Unity Networking. In addition, one of the characters in the game uses the Xsens Awinda body motion tracking system, so that its avatar realistically follows every movement of the player. I am streaming this movement data into Unity locally from MVN Analyze. However, I can't seem to stream these animations over the network so that the other players see the same as I do. The avatar has an Animator component attached to it, but since the animations are streamed live from MVN, there are no layers or parameters for the PhotonAnimatorView to synchronize. Does anyone have advice on how to handle this problem?

Best regards
Håvard

Comments

  • Do you know what the data from MVN Analyze looks like? Is it a stream you can access, and how much data is it?
    If it's lean and accessible, you could possibly just forward it to the other clients (using RaiseEvent; see the first sketch below). The next questions would be: what happens if one of the frames does not arrive? Can the format cope with skipped frames, or anything like that?

    Alternatively, you might have to extract the needed data from the result, meaning: once the pose has been applied to the character, you could run a script that reads each node's position and rotation and sends that (e.g. as a byte[] you write). This data can be sent via RaiseEvent, or you can put a PhotonView on the character and use OnPhotonSerializeView (see the second sketch below).

    Hope that helps.
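
    A minimal sketch of the first idea, forwarding the raw packet with RaiseEvent (PUN 2 API shown; PUN Classic is analogous). GetLatestMvnPacket and ApplyMvnPacket are hypothetical placeholders for however you read the stream from MVN Analyze and feed a packet back into the avatar:

        using ExitGames.Client.Photon;
        using Photon.Pun;
        using Photon.Realtime;
        using UnityEngine;

        // Relays the raw MVN packet to the other clients via RaiseEvent.
        public class MvnPacketRelay : MonoBehaviourPun, IOnEventCallback
        {
            private const byte MvnPoseEventCode = 42; // any free custom code below 200

            private void OnEnable()  { PhotonNetwork.AddCallbackTarget(this); }
            private void OnDisable() { PhotonNetwork.RemoveCallbackTarget(this); }

            private void Update()
            {
                if (!photonView.IsMine) return;

                byte[] packet = GetLatestMvnPacket(); // hypothetical: raw bytes from MVN Analyze
                if (packet == null) return;

                RaiseEventOptions options = new RaiseEventOptions { Receivers = ReceiverGroup.Others };
                // Unreliable is fine for a 40 Hz pose stream: a dropped frame
                // is simply overwritten by the next full pose.
                PhotonNetwork.RaiseEvent(MvnPoseEventCode, packet, options, SendOptions.SendUnreliable);
            }

            public void OnEvent(EventData photonEvent)
            {
                if (photonEvent.Code != MvnPoseEventCode) return;
                ApplyMvnPacket((byte[])photonEvent.CustomData); // hypothetical: pose the avatar
            }

            // Placeholders for the MVN-specific plumbing.
            private byte[] GetLatestMvnPacket() { return null; }
            private void ApplyMvnPacket(byte[] packet) { }
        }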
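    And a sketch of the second idea: once MVN has posed the local avatar, read the pose off the bone hierarchy and synchronize it through a PhotonView that observes this component. The flat bone cache and the missing interpolation are simplifications:

        using Photon.Pun;
        using UnityEngine;

        // Serializes every bone's local rotation (plus the root position)
        // from the posed avatar and applies it on remote clients.
        // Add this component to the PhotonView's observed components.
        public class MvnPoseView : MonoBehaviourPun, IPunObservable
        {
            [SerializeField] private Transform root;  // hips/pelvis of the avatar
            private Transform[] bones;                // cached once, same order on all clients
            private Quaternion[] received;
            private Vector3 receivedRootPosition;

            private void Awake()
            {
                bones = root.GetComponentsInChildren<Transform>();
                received = new Quaternion[bones.Length];
                for (int i = 0; i < bones.Length; i++)
                    received[i] = bones[i].localRotation;
                receivedRootPosition = root.position;
            }

            public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
            {
                if (stream.IsWriting)
                {
                    // Owner: MVN already applied the live pose, just read it off.
                    stream.SendNext(root.position);
                    for (int i = 0; i < bones.Length; i++)
                        stream.SendNext(bones[i].localRotation);
                }
                else
                {
                    receivedRootPosition = (Vector3)stream.ReceiveNext();
                    for (int i = 0; i < bones.Length; i++)
                        received[i] = (Quaternion)stream.ReceiveNext();
                }
            }

            private void LateUpdate()
            {
                // Remote copies: overwrite the local pose with the owner's
                // latest one (no interpolation here, for brevity).
                if (photonView.IsMine) return;
                root.position = receivedRootPosition;
                for (int i = 0; i < bones.Length; i++)
                    bones[i].localRotation = received[i];
            }
        }

    Note that OnPhotonSerializeView runs at PhotonNetwork.SerializationRate (10 per second by default), so you will likely want to raise that, and eventually interpolate on the receiving side, to keep up with the 40 Hz source.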
  • Thank you so much for your reply!
    The data from MVN Analyze is sent at 40 Hz. How do I forward this data to the other clients? Would that go somewhere in the Xs Live Animator script? I am kind of new to this, so I am not completely sure how to handle these problems when I can't use the standard PhotonTransformView and PhotonRigidbodyView.

    Best regards
    Håvard