Is there a way to store a large piece of binary data in the Turnbased room?
Hi.
I'm trying to store a texture in the Turnbased room to make it persistent. The texture is quite large, and when I try to get a byte array from it and save it to the room's Custom Properties, there's a lag and Photon disconnects me.
Is there a convenient way to do that?
Or should I save it with a direct call to my Azure Website (the one behind the Photon Webhooks)?
Thanks
Best Answers
-
Hi @alandyshev.
Sorry, but room properties are just not made for this use case.
A 2.8 MB array is far too big to be used as a room property, and a Hashtable with half a million (!) entries, or a room with half a million (!) properties, would both be even worse.
Putting such amounts of data into room properties just won't work.
Why do you think you need to transfer the whole texture pixel by pixel?
The normal approach would be to generate the same texture on each client and only send, as a message, the parameters that the creation algorithm should use. Then, when one client does something that changes the texture, just let it send a message containing the information the other clients need to apply the same changes.
As an analogy, imagine a racing game.
When a player steers his vehicle left, you would just send the information that he did so, and let the receiving clients use the same algorithms they use for changing the course of their local car to change the course of that remote car. Each client might also regularly send up-to-date coordinates of its locally controlled car to the others, so that they can correct differences in position introduced by latency, but by no means would you send the coordinates of every single polygon of the car every frame.
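The delta-based approach described above can be sketched in a few lines (illustrative Python, not Photon API code; `apply_event` and `replicate` are made-up names):

```python
# Illustrative sketch: instead of shipping the whole texture, each client
# applies the same small change events locally. All names here are made
# up for illustration; this is not Photon API code.

def apply_event(texture, event):
    """Apply one paint event (x, y, color) to a texture dict."""
    x, y, color = event
    texture[(x, y)] = color

def replicate(events):
    """Replaying the same events yields the same texture on every client."""
    texture = {}
    for e in events:
        apply_event(texture, e)
    return texture

# Three tiny "paint" events instead of 2.8 MB of pixel data:
events = [(10, 20, 0xFF0000), (10, 21, 0xFF0000), (11, 20, 0x00FF00)]
client_a = replicate(events)
client_b = replicate(events)
assert client_a == client_b  # replicas converge without sending pixels
```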
@Kaiserludi Thanks for the answer.
The thing is that I need the image to be stored between sessions.
I've already switched to Photon Server and now I'm making a plugin that's working with Azure Storage and saving data from players' events.
First I tried to do that via webhooks, but some events were not handled - I think due to the low speed of the webhooks.
I hope to fix this by using plugins.
Thanks.
Answers
-
Hi,
what size are we talking about here?
Hello Markus.
I'm trying to add a texture of size 1024x512.
We get the raw data from the texture as a byte array and add it to the room's Custom Properties.
The length of the byte array is 2,796,224.
Photon disconnects when the SetCustomProperties method is called.
Earlier I also tried splitting the byte array into parts and uploading them one by one every 2 seconds, but Photon still disconnects before I can upload all the data.
Another approach I tried was to represent the texture as a hashtable of pixel index and color.
I even tried adding each pixel index as a Custom Properties key, so that only the newest changes to the texture are sent to others. But it doesn't work that way: it still disconnects once the total amount of data in the room's Custom Properties becomes too large.
I will be grateful for any advice.
Thanks.
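The chunk-splitting idea mentioned above can be sketched like this (illustrative Python; the ~1 KB chunk size and `split_chunks` name are assumptions, and as the answers below explain, this still would not avoid the per-client buffer limit on the Photon Cloud):

```python
# Illustrative sketch of splitting a large byte array into ~1 KB chunks.
# The chunk size is an assumption; on the Photon Cloud this still fills
# the per-client buffer, so it does not fix the disconnect by itself.

def split_chunks(data: bytes, chunk_size: int = 1000):
    """Split data into consecutive chunks of at most chunk_size bytes."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

raw = bytes(2_796_224)          # stand-in with the same length as the texture
chunks = split_chunks(raw)
print(len(chunks))              # 2797 messages at ~1 KB each
assert b"".join(chunks) == raw  # chunks reassemble losslessly
```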
Still haven't resolved this. Any advice? Please.
@alandyshev:
Still, I would strongly recommend against sending data of this size through Photon.
Photon is intended for sending small messages between players in real time.
Too big for our per-client buffer size on Photon Cloud: > 500k
Too big to be sent with UDP without resulting in an amount of fragments that might cause problems: > 100k
Too big to be sent without splitting it up into multiple UDP packets: > 1,200 bytes including overhead from UDP, Enet and Photon (how many bytes of actual payload this equals depends on how you structure your data - a flat byte array costs less overhead than a Hashtable of Hashtables - but ~1 KB should almost always be safe).
For messages that get sent very regularly (10 times a second or even more often), I would recommend keeping their size below 1 KB.
If a message only gets sent rarely (for example, once at the start of a match), then a size of multiple KB is still fine, but I would still recommend keeping it below 10 KB.
In exceptional cases something like 20 KB or even 50 KB might make sense, but messages that big usually indicate that you are doing something wrong: rethink whether there isn't an approach that would require a lot less data, or whether some data is simply not suitable to be sent as a Photon message and should instead be downloaded from / uploaded to a dedicated file server.
You can work around the 1,200-byte and 100k limits by selecting TCP instead of UDP as the network protocol in the Photon client.
There is no workaround for the 500k limit on Photon Public Cloud.
On a self-hosted Photon server instance you can configure the size of the per-client buffers for incoming and outgoing data. I personally know of an Enterprise customer who did exactly this and could then transfer their up-to-10MB save-game data through their custom Photon server using TCP.
They used the Photon C++ Client SDK for this, not PUN. I don't see why PUN should not also work with opRaiseEvent() payload sizes in these dimensions, when you adjust the server-side buffer config and set PUN's network protocol to TCP, but this has not been tested, let alone proven to work well in production, so there might be subtle unknown bugs.
However, doing this is NOT the recommended approach.
Instead, you should simply use an HTTP file server to transfer and store data like your images. Such servers are specialized for file transfer and are therefore much better suited to it than Photon.
Also, when you don't need to modify this data very often, it can make sense to compress it before sending it. Photon does not compress your payload for you.
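To put these limits in perspective, a rough back-of-the-envelope check against the 2,796,224-byte texture from this thread (zlib is used purely as an example compressor here; Photon itself does no compression):

```python
import zlib

# Sizes from this thread, checked against the limits quoted above.
payload = 2_796_224                    # raw texture byte-array length
print(payload / 500_000)               # ~5.6x the 500k Cloud buffer limit
print(-(-payload // 1_200))            # ~2331 UDP packets at 1,200 bytes each

# Compressing before sending can shrink compressible data considerably;
# zlib is only an example - Photon does not compress for you.
raw = bytes(payload)                   # all-zero stand-in: compresses far
compressed = zlib.compress(raw)        # better than real texture data would
assert len(compressed) < payload
```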
"The thing is that I need the image to be stored between sessions."
Another idea:
An alternative approach could be to store the parameters that led to the creation of exactly that image, plus the relevant data from any events that modified it, and then simply rerun those actions so that each client recreates the data locally at the start of each session.
This might result in a lot less data needing to be stored between sessions.
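A minimal sketch of this "seed plus event log" idea (the seed-based generator and all names are illustrative assumptions, not anything Photon provides):

```python
import random

def generate_texture(seed, n_pixels=16):
    """Deterministically recreate the base texture from a small seed."""
    rng = random.Random(seed)
    return [rng.randrange(256) for _ in range(n_pixels)]

def replay(seed, events):
    """Recreate the current image from the stored seed plus event log."""
    texture = generate_texture(seed)
    for index, value in events:   # each event: (pixel index, new value)
        texture[index] = value
    return texture

# The only data that needs to persist between sessions:
stored = {"seed": 42, "events": [(0, 255), (3, 0)]}

a = replay(stored["seed"], stored["events"])
b = replay(stored["seed"], stored["events"])
assert a == b        # every client recreates the identical image
assert a[0] == 255   # modifications are reapplied on top of the base
```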