Syncing large amounts of data on game load

When a client joins an existing game, a potentially large amount of information needs to be synced across (this is a voxel-based game, so there can be thousands of operations to sync in a small area).

The problem I am running into is that the client seems to have a hard time accepting this many buffered RPC calls at once. I get a few warning messages to that effect, and then my client gets disconnected.

I've switched my Photon views to "reliable delta compressed", which helped a lot, but I'm still running into this issue as soon as I'm syncing more than a few buildings' worth of RPC calls (one call per block, hundreds of blocks per building).

Is there a way to give the client more time to process the buffered RPC calls to avoid the disconnect? Should I be doing this differently?

FYI: I do not have a PhotonView on each block; I manage all blocks through RPC calls on one central PhotonView. I do use PhotonViews on all moving objects (creatures, players, etc.).

Thanks

Comments

  • Tobias
    What you are encountering is an expected issue. RPCs are very convenient for syncing actions, but they are far from efficient for syncing the kind of data you are trying to sync.

    If your world is big, PUN is not suitable for your case. You need to come up with interest management, a lean way to sync blocks (in chunks), and a way to update blocks within chunks.
    In short: it requires a more clever, custom solution. Sadly, we can only help at the high level. If the game is Minecraft-ish, you will most likely want to implement a custom server. There is no out-of-the-box solution for that.
  • Just to confirm.

    Are you saying I should be using and customizing Photon server, or are you saying I will likely need to write my own server?
  • iHaveReturnd
    A project I'm working on involves a large number of objects being synced, not to the same degree, but what we've learned from it may help you find a solution that works with PUN.

    You said that you have one RPC per block; this floods the network very quickly, and it can't handle sending that much traffic. We did two things to solve this.

    1. Send one RPC for a large group of blocks, like a chunk in Minecraft. In this RPC you pass a string parameter, and each character (or group of characters) in that string represents a value. You could assign one or more characters to a block type and parse the string when it is received. For example, you might pass something like "001:005" or "AA:AB:CF"; there, 001, 005, AA, AB and CF would each be a type of block (not the index of a block), just two different encodings. You can pass a very long string and iterate through it to work out what each block should be (see the sketch after this list). We found our connections were much more stable doing it this way than when sending a huge number of RPCs. A string is also very small in terms of how much data you send, compared to hundreds or thousands of RPCs.

    2. Send the RPCs with time delays through a coroutine or other delay mechanism. If you send only one to five RPCs every 0.1-0.5 seconds instead of bursting all of them at once, you get a much steadier stream instead of slamming all of the info into the network in one go.
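
    A rough sketch of both ideas. It assumes PUN classic's API (Photon.MonoBehaviour, PhotonPlayer, photonView.RPC); the hex encoding and the names ChunkSync, SendChunk, SyncChunk and SendWorldTo are made up for illustration, not the exact scheme we used:

        using System.Collections;
        using System.Text;
        using UnityEngine;

        // Lives on the one central PhotonView that manages all blocks.
        public class ChunkSync : Photon.MonoBehaviour
        {
            // 1. Pack one chunk's block types into a single string and send it in one RPC.
            //    Each block type becomes two hex characters ("00".."FF"), so a whole
            //    chunk travels as one string instead of one RPC per block.
            public void SendChunk(int chunkIndex, byte[] blockTypes, PhotonPlayer target)
            {
                StringBuilder sb = new StringBuilder(blockTypes.Length * 2);
                for (int i = 0; i < blockTypes.Length; i++)
                    sb.Append(blockTypes[i].ToString("X2"));

                photonView.RPC("SyncChunk", target, chunkIndex, sb.ToString());
            }

            [PunRPC] // use [RPC] on older PUN versions
            private void SyncChunk(int chunkIndex, string encoded)
            {
                byte[] blockTypes = new byte[encoded.Length / 2];
                for (int i = 0; i < blockTypes.Length; i++)
                    blockTypes[i] = System.Convert.ToByte(encoded.Substring(i * 2, 2), 16);

                // Apply blockTypes to the chunk at chunkIndex in your own world data here.
            }

            // 2. Spread the sends out over time instead of bursting them all at once.
            public IEnumerator SendWorldTo(PhotonPlayer target, byte[][] allChunks)
            {
                for (int i = 0; i < allChunks.Length; i++)
                {
                    SendChunk(i, allChunks[i], target);
                    if (i % 5 == 4)                            // after every 5 chunks...
                        yield return new WaitForSeconds(0.2f); // ...pause briefly
                }
            }
        }

    You would kick this off when a player joins, e.g. StartCoroutine(SendWorldTo(newPlayer, chunks)), rather than buffering everything.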

    If you want me to elaborate on the example or anything let me know. I hope this makes sense and helps you out!
  • TechCor
    Best way to send large amounts of data across any network.

    1. You need to serialize your data into one byte[] buffer. Use something like MemoryStream and BinaryWriter. This is more compact than strings (see the sketch after this list).

    2. Given the amount of data you are pushing, I would think you should try to compress the data with something like Unity.IO.Compression, but this is optional. It can save huge amounts of transfer time though.

    3. There is probably a maximum size you can send in one chunk; figure this out. Then send your compressed data in pieces, using an index into the byte array, and create an intentional delay of about 200ms between calls.

    4. Do everything in reverse. Decompress, read the data with BinaryReader. Bam. Fast, lean delivery.
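
    A minimal sketch of steps 1-4. It assumes GZipStream (from System.IO.Compression here; the Unity.IO.Compression port mentioned above offers a similar API), and the flat blockTypes array, maxChunkSize and method names are made up for illustration:

        using System;
        using System.IO;
        using System.IO.Compression;

        public static class WorldTransfer
        {
            // 1. Serialize everything into one byte[] with BinaryWriter,
            // 2. compress it with GZip,
            // 3. and slice the result into pieces small enough to send one at a time.
            public static byte[][] PrepareForSend(int[] blockTypes, int maxChunkSize)
            {
                byte[] compressed;
                using (MemoryStream ms = new MemoryStream())
                {
                    using (GZipStream gzip = new GZipStream(ms, CompressionMode.Compress))
                    using (BinaryWriter writer = new BinaryWriter(gzip))
                    {
                        writer.Write(blockTypes.Length);
                        foreach (int block in blockTypes)
                            writer.Write(block);
                    }
                    compressed = ms.ToArray(); // still readable after the stream is closed
                }

                int pieceCount = (compressed.Length + maxChunkSize - 1) / maxChunkSize;
                byte[][] pieces = new byte[pieceCount][];
                for (int i = 0; i < pieceCount; i++)
                {
                    int size = Math.Min(maxChunkSize, compressed.Length - i * maxChunkSize);
                    pieces[i] = new byte[size];
                    Array.Copy(compressed, i * maxChunkSize, pieces[i], 0, size);
                }
                return pieces;
            }

            // 4. Receiving side, in reverse: decompress and read with BinaryReader.
            //    'compressed' is all received pieces concatenated back in order.
            public static int[] ReadReceived(byte[] compressed)
            {
                using (MemoryStream ms = new MemoryStream(compressed))
                using (GZipStream gzip = new GZipStream(ms, CompressionMode.Decompress))
                using (BinaryReader reader = new BinaryReader(gzip))
                {
                    int count = reader.ReadInt32();
                    int[] blockTypes = new int[count];
                    for (int i = 0; i < count; i++)
                        blockTypes[i] = reader.ReadInt32();
                    return blockTypes;
                }
            }
        }

    Each piece would then go out with its index (via an RPC or RaiseEvent) with the ~200ms delay between sends; once all pieces have arrived, concatenate them in order and pass the result to ReadReceived.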

    As for using RPCs, I'm not sure if there is overhead that would cause problems. If you use Photon Server, you could modify RaiseEvent to be able to send to a specific player. Photon Server/Cloud should be fast enough for what you want to do. Very large amounts of data should go through a web server.
  • Guess I'll have to start working on more robust networking sooner than I had hoped.

    Thanks for the suggestions.