photonsocketserver.exe Memory Leak

innocentchris
edited September 2014 in Photon Server
I have a question about a possible memory leak in Photon Server.
I monitored the memory usage of photonsocketserver.exe in the Windows Task Manager and noticed that it keeps increasing over time, slowly but steadily (about 100 KB/sec). I have checked whether any of my own code allocates or holds on to instances, but I could not find anything.

Does anyone know what's happening?

Comments

  • That's hard to say with so little information. ;)

    - which Photon version?
    - are you running "our" code that comes with the SDK (Default or Loadbalancing instance) or have you made modifications to the server-side code?
    - how many CCUs in total? Are there connects / disconnects while you monitor the memory usage? How many?
    - how much data is sent? Messages in / out? TCP or UDP?
    - does the memory usage decrease when all clients are disconnected?
    - any modifications to the PhotonServer.config?
    - did you manage to get a dump, for example with the .NET Memory Profiler? http://memprofiler.com/

    These are just the "starting points" at which I would look. We are not aware of a general memory issue in the latest Photon Server SDK. It is okay that Photon keeps allocating memory while clients are connecting and sending data, but the memory usage should decrease (at least partially) if your workload decreases.
    In most cases, memory "leaks" are caused by custom .NET code that creates objects and keeps hold of them forever.
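    To illustrate what I mean, here is a hypothetical sketch (not code from the SDK - the handler names are made up): a static collection that gets an entry per peer on connect but is never cleaned up on disconnect stays reachable forever and shows up as ever-growing instance counts in the profiler.

        // Hypothetical C# sketch of a typical "keep hold of it forever" leak.
        // OnPeerConnected / OnPeerDisconnected are made-up hook names.
        using System.Collections.Generic;

        public static class PlayerCache
        {
            // Gains one entry per connect; in leaking code nothing ever removes it.
            private static readonly Dictionary<int, byte[]> StatePerPeer =
                new Dictionary<int, byte[]>();

            public static void OnPeerConnected(int connectionId)
            {
                StatePerPeer[connectionId] = new byte[64 * 1024];
            }

            // The fix: release the reference again when the peer disconnects.
            public static void OnPeerDisconnected(int connectionId)
            {
                StatePerPeer.Remove(connectionId);
            }
        }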
  • Thanks for the checklist.
    I tried to figure out the cause using the .NET Memory Profiler, and I think this issue has to do with http://support.microsoft.com/kb/947862, because I found that the number of System.Threading.OverlappedData instances keeps increasing over time (there is a sketch of how such instances can pile up at the end of this comment).

    1. We are using 3.33. (By the way, why does 3.4 not allow serializing System.Guid? That is the main reason why we rolled back to 3.33.)
    2. We made modifications.
    3. None. The memory usage keeps increasing even without any connections.
    4. Not much. I ran an empty test project.
    5. It looks like it decreases, but it still keeps increasing gradually.
    6. Yes, such as the maximum number of connections and a few other settings. Is there a particular one I should look at?

    thank you.
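
    Here is a minimal sketch of what I understand the KB to describe (plain .NET socket code for illustration, not our Photon code): every outstanding asynchronous receive keeps a System.Threading.OverlappedData instance and its pinned buffer alive until the operation completes, so receives that never complete pile up.

        // Hypothetical repro sketch: pending BeginReceive calls that never
        // complete keep their OverlappedData instances (and buffers) alive.
        using System;
        using System.Collections.Generic;
        using System.Net;
        using System.Net.Sockets;

        class PendingReceiveSketch
        {
            static void Main()
            {
                var listener = new TcpListener(IPAddress.Loopback, 0);
                listener.Start();
                int port = ((IPEndPoint)listener.LocalEndpoint).Port;

                var serverSide = new List<Socket>();
                var clients = new List<TcpClient>();

                for (int i = 0; i < 100; i++)
                {
                    clients.Add(new TcpClient("127.0.0.1", port)); // connects, never sends
                    Socket s = listener.AcceptSocket();
                    serverSide.Add(s);

                    var buffer = new byte[8192];
                    // Registers an overlapped receive; the callback never fires
                    // because the client never sends, so the OverlappedData and
                    // the pinned buffer stay alive for as long as this is pending.
                    s.BeginReceive(buffer, 0, buffer.Length, SocketFlags.None,
                        ar => { try { ((Socket)ar.AsyncState).EndReceive(ar); } catch { } },
                        s);
                }

                Console.WriteLine("100 receives pending - check OverlappedData in the profiler.");
                Console.ReadLine();
            }
        }

    I have not written socket code like this myself, though, so I am not sure yet where the pending operations in our project would come from.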
  • Thanks for the information!

    The KB refers to asynchronous usage of .NET sockets in a case where the remote end stops sending / receiving data - have you implemented anything like that?

    Photon's socket handling is done in native code, and a memory leak in the Photon core would not show up in the .NET counters. In general, if Photon does not have any work to do, it does not increase its memory usage (you can start a "fresh" Photon from the SDK and see that it does not take much memory at all). So we need to find out what it is doing even without any / much traffic.

    I recommend that you:
    - upgrade to the latest Photon version, if possible
    - check your code carefully and see what it does when it is "idle" (no incoming client connections)
    - try to use default settings in PhotonServer.config

    In general, we don't recommend changing the PhotonServer.config settings (buffer sizes, threads, etc.) - in most cases the default settings are the best, and you should only change them if you are absolutely sure that you have a very specific issue / requirement and that the config change will definitely resolve it.

    Sorry that I cannot give a solution "out of the box" - I can only give hints on where to look.

    Oh, and regarding the GUIDs - you can just use byte[]:
    viewtopic.php?f=5&t=4467&p=17414&hilit=guid#p17414
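
    A quick sketch of the byte[] approach (standard .NET calls, nothing Photon-specific): convert the Guid to its 16 raw bytes before sending, and rebuild it on the receiving side.

        using System;

        class GuidAsBytes
        {
            static void Main()
            {
                Guid original = Guid.NewGuid();
                byte[] wire = original.ToByteArray();    // 16 bytes; byte[] serializes fine
                Guid restored = new Guid(wire);          // rebuild on the receiving side
                Console.WriteLine(original == restored); // True
            }
        }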
  • It looks like it was the Counter Publisher... at least the memory usage does not increase anymore after I excluded it from the application list in the config.

    Is that a possible explanation?
  • Interesting... I'll look into that. We are caching data in the Counter Publisher for a while if the remote endpoint (for example, the Photon Dashboard service) is not available, but that should definitely not add up to a significant memory leak.

    Thanks for the report.
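
    To illustrate the kind of pattern we will check (a hypothetical sketch, not the actual Counter Publisher code): a buffer that queues counter samples while the endpoint is unreachable becomes a leak if it has no size cap, because nothing ever drains it while the endpoint stays down.

        // Hypothetical sketch of a capped "cache while offline" buffer.
        using System.Collections.Generic;

        public class CounterBuffer
        {
            private const int MaxPending = 10000; // without a cap the queue grows forever
            private readonly Queue<string> pending = new Queue<string>();

            public void Enqueue(string sample)
            {
                if (pending.Count >= MaxPending)
                {
                    pending.Dequeue(); // drop the oldest sample instead of growing
                }
                pending.Enqueue(sample);
            }

            // Called once the remote endpoint is reachable again.
            public IEnumerable<string> Drain()
            {
                while (pending.Count > 0)
                {
                    yield return pending.Dequeue();
                }
            }
        }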