Error processing an incoming Event

stucko
edited December 2013 in DotNet
Hi All,

I've been trying to send serialized data to the client by using the

"Photon.SocketServer.Protocol.TryRegisterCustomType" and "Protocol.GpBinaryV162.Serialize" methods.
It works nicely; I've even tried nesting custom objects inside a custom object. That too works nicely.
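For context, the wiring looks roughly like this. This is a minimal sketch: the `PlayerInfo` class, its fields, and the type code 1 are invented for illustration, and the exact `TryRegisterCustomType` signature can differ between SDK versions, so the registration call is only shown as a comment. The serialize/deserialize pair itself is plain .NET:

```csharp
using System;
using System.IO;

// Hypothetical custom type; any fields you can write to a stream will do.
public class PlayerInfo
{
    public int Id;
    public string Name;

    // Turn the instance into a byte[] -- this is what Photon transports.
    public static byte[] Serialize(object obj)
    {
        var p = (PlayerInfo)obj;
        using (var ms = new MemoryStream())
        using (var w = new BinaryWriter(ms))
        {
            w.Write(p.Id);
            w.Write(p.Name ?? "");
            return ms.ToArray();
        }
    }

    // Rebuild the instance from the byte[] received on the other side.
    public static object Deserialize(byte[] data)
    {
        using (var ms = new MemoryStream(data))
        using (var r = new BinaryReader(ms))
        {
            return new PlayerInfo { Id = r.ReadInt32(), Name = r.ReadString() };
        }
    }
}

// Server-side registration would then look roughly like (check your SDK version):
// Protocol.TryRegisterCustomType(typeof(PlayerInfo), (byte)1,
//     PlayerInfo.Serialize, PlayerInfo.Deserialize);
```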

However, now I face a problem. After serializing a custom object which has dictionaries (converted to a dictionary of objects) and lists (converted to arrays) of custom objects, and sending it successfully via PublishEvent, I got this on the client side:
Non matching Profiler.EndSample (BeginSample and EndSample count must match)
System.Object:__icall_wrapper_mono_array_new_specific(IntPtr, Int32)
ExitGames.Client.Photon.Protocol:DeserializeCustom(MemoryStream, Byte)
ExitGames.Client.Photon.Protocol:Deserialize(MemoryStream, Byte)
ExitGames.Client.Photon.Protocol:DeserializeHashTable(MemoryStream)
ExitGames.Client.Photon.Protocol:Deserialize(MemoryStream, Byte)
ExitGames.Client.Photon.Protocol:DeserializeParameterTable(MemoryStream)
ExitGames.Client.Photon.Protocol:DeserializeEventData(MemoryStream)
ExitGames.Client.Photon.PeerBase:DeserializeMessageAndCallback(Byte[])
ExitGames.Client.Photon.EnetPeer:DispatchIncomingCommands()

OverflowException: Number overflow.
ExitGames.Client.Photon.Protocol.DeserializeCustom (System.IO.MemoryStream din, Byte customTypeCode)
ExitGames.Client.Photon.Protocol.Deserialize (System.IO.MemoryStream din, Byte type)
ExitGames.Client.Photon.Protocol.DeserializeHashTable (System.IO.MemoryStream din)
ExitGames.Client.Photon.Protocol.Deserialize (System.IO.MemoryStream din, Byte type)
ExitGames.Client.Photon.Protocol.DeserializeParameterTable (System.IO.MemoryStream memoryStream)
ExitGames.Client.Photon.Protocol.DeserializeEventData (System.IO.MemoryStream din)
ExitGames.Client.Photon.PeerBase.DeserializeMessageAndCallback (System.Byte[] inBuff)
ExitGames.Client.Photon.EnetPeer.DispatchIncomingCommands ()
ExitGames.Client.Photon.PhotonPeer.DispatchIncomingCommands ()

Comments

  • Tobias
    I think you found a weak spot in our serialization code. I would blush but you couldn't see that anyways.

    Long story short: we somewhat feared that we couldn't support arbitrary nesting of objects within objects. This stuff is pretty complex to serialize and to recreate on the receiving end, so something will break at some point.
    Our focus wasn't so much a perfect de/serialization lib but a relatively lean byte code that is geared towards gaming.

    At the moment, I can't promise to fix this. Instead, if you need more complex structures, you can use any de/serialization code that gives you the results you need and let Photon only see (and send) the byte arrays it produces.

    I would need your code and requirements to have a look at what breaks and whether we can fix it at all. Maybe it's good input for the next version of our protocol, so please mail us if you're not giving away secrets. Mail to: developer@exitgames.com
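A minimal sketch of that byte-array approach, assuming the payload is (for example) a dictionary of ids to strings; `PayloadCodec` and the payload shape are made up for illustration. Photon then only ever transports the resulting byte[] as an ordinary event parameter:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Pack a whole lookup table into one byte[] so Photon only ever sees
// a byte-array parameter, never the nested structure itself.
public static class PayloadCodec
{
    public static byte[] Pack(Dictionary<int, string> table)
    {
        using (var ms = new MemoryStream())
        using (var w = new BinaryWriter(ms))
        {
            w.Write(table.Count);
            foreach (var kv in table)
            {
                w.Write(kv.Key);
                w.Write(kv.Value);
            }
            return ms.ToArray();
        }
    }

    public static Dictionary<int, string> Unpack(byte[] data)
    {
        var table = new Dictionary<int, string>();
        using (var ms = new MemoryStream(data))
        using (var r = new BinaryReader(ms))
        {
            int count = r.ReadInt32();
            for (int i = 0; i < count; i++)
                table[r.ReadInt32()] = r.ReadString();
        }
        return table;
    }
}
```

Per the supported-types list quoted later in this thread, byte arrays are supported up to INT_MAX elements, so this also sidesteps the custom-type length limit discussed below.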
  • Hi Tobias,

    Just to let you know, I've managed to avoid that error by making my custom data type smaller.

    Before this, my custom data type was 500k bytes in size; when I tried to send it from the server, a warning told me it was too large. So I reduced it to 34k bytes. This is when I successfully sent the data that the client side requested, but had the above error thrown by the client upon receiving it.

    After that I went ahead and dissected my custom data type (to see if a particular property was at fault) and sent it property by property. That worked fine (as in, no errors so far in serializing/sending and retrieving/deserializing).

    Then I experimented further by combining several properties. That worked fine too, until I reached a threshold of roughly 30k+ bytes for that custom data type.

    Then I reread the response given by Kaiserludi in the Server section of the forum.
    Supported types are:
    byte
    short
    int
    long
    float
    double
    bool
    string
    Hashtable
    Dictionary
    arrays of all of the types listed above (max size is SHRT_MAX for all except byte arrays, which are supported up to a size of INT_MAX), including jagged arrays
    Object arrays
    CustomType

    My understanding is that when sending data it's always in bytes, thus the max size is INT_MAX; the size limit above is the length of an array of a custom type, for example customType[SHRT_MAX].

    But it seems the error I'm getting is related to the total serialized size of my custom data type, which in this case was triggered when it went above 30k bytes for that single custom data type (a data type that consists of dictionaries of data types).

    I've now decided to just send smaller sets of data to the client.

    Btw, I'll prepare the code that caused the error above for your future use. I just wanted to know whether the size was really the issue.
  • Tobias
    Good that you mentioned the size you're sending.
    We didn't expect custom values to get this big, so we're currently sending them with a short-typed length. That is the limit you hit.
    I'm sorry we only realized this now.

    As our protocol was built with faster, smaller messages in mind, we decided to use only 2 bytes of length info, except for byte[], which has an int value. Also these length values are signed, which is limiting things, too.

    We are working on a new protocol version which will have compressed integers. These will lift the limits but it won't be available for another while.

    As you don't get any info about how much of your 500K has arrived while you're receiving it, and if the device or connection is bad, sending 500K in reliable commands is a lot and can cause disconnects. With that in mind, splitting and minimizing your data is a good idea in any case!
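That short-typed length can be made explicit with a guard before sending. A sketch; the class and helper names are invented, and the cutoff simply mirrors short.MaxValue (32767), which lines up with the ~30k threshold observed above:

```csharp
using System;

public static class CustomTypeLimit
{
    // Photon (at the time of this thread) writes custom-type payload
    // lengths as a signed 16-bit value, so anything above short.MaxValue
    // (32767) bytes cannot be represented. Check the exact cutoff
    // against your SDK version.
    public const int MaxCustomTypeBytes = short.MaxValue; // 32767

    // Returns true if the serialized payload can be carried as a
    // custom-type value; otherwise split it or send a plain byte[].
    public static bool FitsInCustomType(byte[] serialized)
    {
        return serialized.Length <= MaxCustomTypeBytes;
    }
}
```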
  • Kaiserludi
    Hi again.

    int is used for the element count of byte-arrays. This has nothing to do at all with the max size of custom types.

    500KB for a single custom type instance is pretty extreme. Photon is built for realtime communication; it's not optimized to be used as some kind of file transfer.
    Would you like to explain why you think you need to send so much data? I am sure we can give you some recommendations to reduce your data amounts drastically.
  • Hi Kaiserludi, Tobias,

    Thanks for all your prompt replies.

    Currently I'm sending all the basic data (things that won't change during the game), which are mostly structs on the server side.

    This data set will only be sent once when the game connects to the server.

    The 500k-byte send was a bug (data wrongly pulled from the DB). After fixing the bug we have roughly 30-40k bytes of data to be sent when the game initially connects to the server. Is 50k considered a lot?

    We will look to reduce this dataset's size in the future. Currently we are keeping references to objects (some are dictionaries and lists) within objects. From our analysis this is the big chunk of that 30-40k, since the same object is being serialized more than once (two child objects referencing a parent object means the parent object gets serialized twice, for example). This is what most of our code looks like:

    currentCase:
    class customObjectA
    {
      int id;
      string name;
    }
    
    class customObjectB
    {
      int id;
      List<customObjectA> listofchildrenobject;
    }
    

    One of our plans to reduce this is to not keep a reference to the object but instead just keep an id/identifier, which has a small size (int/short probably). But this approach would force us to change a lot of our code (and would not be as convenient to access), which is why we are not doing it now.

    alternativeCase:
    class customObjectA
    {
      int id;
      string name;
    }
    
    class customObjectB
    {
      int id;
      List<int> listofchildrenobjectid;
    }
    


    Did we do the serialization wrongly? For the "currentCase" data structure, could we serialize "customObjectA" just once, or not at all, since we serialized all the "customObjectA" instances before we serialized "customObjectB"?

    Any other way to optimize this?
  • Tobias
    If multiple ObjB instances reference a smaller set of ObjA instances, then you should serialize all ObjA once and reference them by ID. Otherwise, each ObjB owns its ObjA instances and you can serialize those within ObjB.
    Depending on the data you have, you could also apply compression to the byte[] you write (and decompress when reading).

    In general: yes, 50k is already quite a lot of data. We send only 1k per package, there is no API access to "how much data is still coming", and while a client receives this data there is a high chance it doesn't get real-time updates. Photon is built especially with real-time communication in mind.
    It should work, though :)
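Tobias's two suggestions can be sketched together. The `ObjA`/`ObjB` types mirror the thread's customObjectA/customObjectB, and `GraphCodec` is an invented name; each distinct ObjA is written exactly once and referenced by ID from ObjB, and the packed bytes can optionally be gzipped before handing them to Photon:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;
using System.Linq;

// Hypothetical types mirroring the thread's customObjectA/customObjectB.
public class ObjA { public int Id; public string Name; }
public class ObjB { public int Id; public List<ObjA> Children = new List<ObjA>(); }

public static class GraphCodec
{
    // Serialize each distinct ObjA exactly once, then write each ObjB
    // with child IDs only -- shared parents are no longer duplicated.
    public static byte[] Pack(List<ObjB> bs)
    {
        var distinctAs = bs.SelectMany(b => b.Children)
                           .GroupBy(a => a.Id)
                           .Select(g => g.First())
                           .ToList();
        using (var ms = new MemoryStream())
        using (var w = new BinaryWriter(ms))
        {
            w.Write(distinctAs.Count);
            foreach (var a in distinctAs) { w.Write(a.Id); w.Write(a.Name ?? ""); }
            w.Write(bs.Count);
            foreach (var b in bs)
            {
                w.Write(b.Id);
                w.Write(b.Children.Count);
                foreach (var a in b.Children) w.Write(a.Id); // ID only
            }
            return ms.ToArray();
        }
    }

    // Optionally gzip the packed bytes before sending.
    public static byte[] Compress(byte[] data)
    {
        using (var ms = new MemoryStream())
        {
            using (var gz = new GZipStream(ms, CompressionMode.Compress))
                gz.Write(data, 0, data.Length);
            return ms.ToArray();
        }
    }
}
```

Whether compression pays off depends on the data; highly repetitive text compresses well, already-compact binary data may not.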
  • Thanks Tobias, we will definitely take another look at the way our data is structured/sent.
    Will probably bug you guys again once we start restructuring our data.