Streaming video/frames with RPC

Can someone help me debug what might be wrong with my streaming code?

Short summary: I have a phone app that controls playing/pausing/stopping videos on multiple Oculus Go devices, which connect through a pin code (the Photon room name). You can select an Oculus Go to watch (or it defaults to the only one if just one is connected) and it streams the headset's image to the phone through Photon.
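For context, the pin code is simply used as the Photon room name that the phone and the headsets join. Roughly like this (a simplified sketch, not the exact project code):

using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

// Simplified sketch of the pin-code connection: the pin is used
// directly as the Photon room name that phone and headsets share.
public class PinCodeJoin : MonoBehaviourPunCallbacks {

    public string pin = "1234";     // the "room name" entered on the phone

    private void Start() {
        PhotonNetwork.ConnectUsingSettings();
    }

    public override void OnConnectedToMaster() {
        // Join the room for this pin, creating it if it doesn't exist yet
        PhotonNetwork.JoinOrCreateRoom(pin, new RoomOptions(), TypedLobby.Default);
    }
}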

It works fine on Windows, and it used to work from Android to Android too, but now it only works when I run the app on the Oculus Go and run the phone app on the computer in the Unity editor.

So in my experience while testing:
Windows (Oculus) to Windows (phone) = works
Oculus to Windows (phone) = works
Oculus to Android phone = doesn't work (used to work)
Windows (Oculus) to Android phone = doesn't work (used to work)

https://drive.google.com/uc?id=1UdqI3NX5oUO7FkLnmlCtpBK3hZ410wS8

As you can see from the screenshot, I log all my values and have made sure the frames being sent are very small, but for some reason the phone doesn't seem to receive anything at all. There is nothing interesting in logcat either.
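For what it's worth, even a trivial test RPC helps separate the payload from the RPC path itself. Something like this (illustrative names only; it needs a PhotonView on the same object) should log on the receiving side whenever RPCs get through at all:

using Photon.Pun;
using UnityEngine;

// Minimal RPC "ping" to check whether any RPC reaches the other side,
// independent of the jpg payload. Attach next to a PhotonView.
public class RpcPing : MonoBehaviourPun {

    public void SendPing() {
        photonView.RPC("Ping", RpcTarget.MasterClient, Time.time);
    }

    [PunRPC]
    private void Ping(float sentAt) {
        Debug.Log("Ping received, sent at " + sentAt);
    }
}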

Real-time capture script, placed on the Oculus Go center eye camera:

using System.Collections;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;
using UnityEngine.Experimental.Rendering; // SRP event with the Camera[] signature
using Photon.Pun;

[RequireComponent(typeof(Camera))]
public class RealtimeCapture : SingletonBehaviour<RealtimeCapture> {

    VideoController videoController;

    [HideInInspector]
    public VideoPlayer videoPlayer;

    public Text targetFPSText;
    public Text imageQualityText;

    public float targetFPS = 6;
    [Range(0,100)]
    public int imageQuality = 25;
    private bool reading = false;

    private Camera camera;

    private void Start() {
        camera = GetComponent<Camera>();
        camera.forceIntoRenderTexture = true;
        videoController = VideoController.instance;
        updateValues();
    }

    private void OnEnable() {
        RenderPipeline.beginFrameRendering += RenderPipeline_beginFrameRendering;
    }

    /// <summary>
    /// Unbind begin frame rendering on disable
    /// </summary>
    private void OnDisable() {
        RenderPipeline.beginFrameRendering -= RenderPipeline_beginFrameRendering;
    }

    /// <summary>
    /// Forward the SRP begin-frame callback to OnPostRender
    /// </summary>
    /// <param name="obj"></param>
    private void RenderPipeline_beginFrameRendering(Camera[] obj) {
        OnPostRender();
    }

    /// <summary>
    /// https://docs.unity3d.com/ScriptReference/MonoBehaviour.OnPostRender.html
    /// </summary>
    public void OnPostRender() {
        bool canReadFrame = VideoController.instance.realTimeCaptureEnabled && PhotonNetwork.IsConnectedAndReady && !reading;
        if (canReadFrame) StartCoroutine(readFrame());
    }

    /// <summary>
    /// Read a frame after rendering it, encode it and send it to the master client via RPC
    /// </summary>
    /// <returns></returns>
    public IEnumerator readFrame() {
        reading = true;
        if (videoController.centerEyeCamera.targetTexture != null) {

            // Read the camera's render target back into a CPU-side texture
            Texture2D texture = new Texture2D(videoController.centerEyeCamera.targetTexture.width, videoController.centerEyeCamera.targetTexture.height, TextureFormat.ARGB32, false);
            texture.ReadPixels(new Rect(0, 0, texture.width, texture.height), 0, 0);

            byte[] jpg = texture.EncodeToJPG(imageQuality);
            //Debug.Log(jpg.Length/1024 + "KB");

            // Destroy the temporary texture so we don't leak one per frame
            Destroy(texture);

            // Send the frame/jpg as a byte array
            videoController.photonView.RPC("updateRealTimeTexture", RpcTarget.MasterClient, jpg, videoPlayer.frame);
        }

        // The FPS cannot go much higher than it currently is;
        // probably won't work with values much above 10
        yield return new WaitForSeconds(1 / targetFPS);
        reading = false;
    }
}
Two functions from the VideoController:

    private void renderFrameToRT(RealTimeFrame realTimeFrame = null) {
        RealTimeFrame frameData = realTimeFrame;
        if (frameData != null) {
            if (!realTimeRenderTexture.IsCreated()) realTimeRenderTexture.Create();

            try {
                // Decode the received JPG; LoadImage resizes the texture to the image's dimensions
                Texture2D texture = new Texture2D(realTimeRenderTexture.width, realTimeRenderTexture.height, TextureFormat.ARGB32, false);
                texture.LoadImage(frameData.jpgData, false);
                texture.Apply();

                // Destroy the previous frame's texture so we don't leak one per frame
                if (realTimeVideoTexture.texture != null) Destroy(realTimeVideoTexture.texture);
                realTimeVideoTexture.texture = texture;

                if (Debug.isDebugBuild) {
                    // Set debug info
                    frameReceived++;
                    lastFrameSizeInKB = frameData.jpgData.Length / 1024;
                }

            } catch (Exception e) {
                if (Debug.isDebugBuild) Debug.LogError(e.Message);
            }
        }
    }

    [PunRPC]
    private void updateRealTimeTexture(byte[] bytes, long currentFrame) {
        if(realTimeCaptureEnabled) {
            renderFrameToRT(new RealTimeFrame { frame = currentFrame, jpgData = bytes, });
            PhotonNetwork.CleanRpcBufferIfMine(photonView);
        }
    }
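
As an aside: RPCs travel over Photon's reliable channel. For continuous frame streaming, an observed component using OnPhotonSerializeView (with the PhotonView's synchronization set to Unreliable) is a common alternative, since a dropped frame is simply replaced by the next one. A rough sketch, not the code the project currently uses:

using Photon.Pun;
using UnityEngine;

// Rough sketch: stream the latest encoded frame through the observe
// mechanism instead of RPCs. Attach next to the PhotonView and add it
// to the view's observed components; PhotonNetwork.SerializationRate
// (10 per second by default) controls how often this is called.
public class FrameStreamer : MonoBehaviourPun, IPunObservable {

    public byte[] latestJpg;                        // set by the capture code
    public event System.Action<byte[]> FrameReceived;

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info) {
        if (stream.IsWriting) {
            // Always send something so writer and reader stay in sync
            stream.SendNext(latestJpg ?? new byte[0]);
        } else {
            byte[] jpg = (byte[])stream.ReceiveNext();
            if (jpg.Length > 0) FrameReceived?.Invoke(jpg);
        }
    }
}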

Comments

  • frecreate
    I am having trouble debugging this: my debug messages don't show up in logcat (see the on-device log sketch after these comments), and Build and Run in Unity gives me an error while building the APK, while installing manually works fine.

    As far as I can see, updateRealTimeTexture is never received on Android. I've looked around and have seen others with this problem. However, this code did use to work.
  • frecreate
    Alright, apparently I had to remove an AndroidManifest.xml. I am not sure what difference this makes; I guess Unity generates its own manifest now?
  • OK, so for anyone wondering: this is quite a confusing problem that is only present when you are also using the Oculus SDK in your project. The AndroidManifest.xml is sometimes generated by the SDK, and it then conflicts for some reason (probably permissions/settings, such as android.permission.INTERNET, that Unity would otherwise autogenerate) with the Photon RPCs.

    I am not trying to blame Photon or anything, but with this manifest present some RPCs appear to work while others don't. So this might just be a whole range of problems that stack up one after another and become a very confusing mess. But once again, it is probably just the problem of having Oculus and Photon in one project that must be built for Oculus, Android and iOS.
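
For anyone hitting the same logcat problem: this little helper (my own debugging hack, not part of the project above) mirrors Unity's log messages onto a UI Text so you can read them inside the headset:

using UnityEngine;
using UnityEngine.UI;

// Debugging hack: mirror Unity log output onto an on-screen Text element,
// useful when logcat output is unavailable on the device.
public class OnScreenLog : MonoBehaviour {

    public Text logText;            // assign a UI Text in the inspector
    private const int maxLines = 15;

    private void OnEnable() {
        Application.logMessageReceived += HandleLog;
    }

    private void OnDisable() {
        Application.logMessageReceived -= HandleLog;
    }

    private void HandleLog(string condition, string stackTrace, LogType type) {
        logText.text += "[" + type + "] " + condition + "\n";

        // Trim old lines so the Text doesn't grow without bound
        string[] lines = logText.text.Split('\n');
        if (lines.Length > maxLines) {
            logText.text = string.Join("\n", lines, lines.Length - maxLines, maxLines);
        }
    }
}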