Realistic values for Incoming/OutgoingLossPercentage

Hi,
I'm trying to use the NetworkSimulationSettings for robust debugging, but I'm running into a couple of issues:

1. If I set the Incoming/OutgoingLossPercentage too high (25-50% and up), I lose too many packets and, once connected, time out within seconds.
2. If I set the Incoming/OutgoingLossPercentage too low, it may not be a realistic test (we are targeting mobile devices).
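
In case it's relevant, here's roughly how I'm enabling the simulation (a sketch from memory, assuming the NetworkSimulationSet properties exposed on the peer; names may differ slightly in your client version):

    using ExitGames.Client.Photon;

    // Grab the simulation settings from the peer PUN uses internally.
    NetworkSimulationSet sim = PhotonNetwork.networkingPeer.NetworkSimulationSettings;

    // Drop a percentage of packets in each direction.
    sim.IncomingLossPercentage = 25; // anything much higher and I time out
    sim.OutgoingLossPercentage = 25;

    // Turn the simulation on.
    PhotonNetwork.networkingPeer.IsSimulationEnabled = true;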

First, I'm curious: is there a setting I need to use to make UDP reliable with the standard PhotonNetwork.networkingPeer? I think it is reliable by default, but I'm a little suspicious given all the timeouts I experience when I set Incoming/OutgoingLossPercentage too high.

Second, what are some realistic values for the NetworkSimulationSettings that would be considered a robust test? I imagine there is no definitive answer, but what has worked for you?

Thanks

Comments

  • How high the rate needs to be really depends on the network type and the region. Even on mobile, you shouldn't expect devices to lose a massive number of packets. On the other hand, mobile latency is unstable enough that you shouldn't even consider doing realtime networking over 3G, as 95%+ of users will definitely not have pings in a range where the game would be enjoyable (they can easily go from 100ms to 5s+).

    The only really robust test is going to the areas you want the game field-tested in, or getting testers from those areas. You can't realistically test against 3G congestion in an artificial environment, even less so with a simple loss percentage that doesn't reflect the ping fluctuation, the related packet flooding, etc.


    As for reliable UDP: if you send the custom op as reliable, it will be reliable, yeah :) If you use an RPC, it will be reliable too. For the view sync, it's defined by the type of synchronization you use (see the sketch below).
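
    A quick sketch of all three (hedged: the exact overloads depend on your client version, and the op code, parameter, and RPC name here are made up for illustration):

        using System.Collections.Generic;

        public class ReliabilityExamples : Photon.MonoBehaviour
        {
            void Example()
            {
                // 1) Raw custom operation: the bool flag decides reliable vs. unreliable UDP.
                var parameters = new Dictionary<byte, object> { { 1, "payload" } }; // made-up payload
                PhotonNetwork.networkingPeer.OpCustom(1, parameters, true); // true = send reliable

                // 2) RPCs go out reliably.
                photonView.RPC("MyRpcMethod", PhotonTargets.All); // placeholder RPC name

                // 3) View sync reliability follows the PhotonView's synchronization mode
                //    (normally set in the inspector).
                photonView.synchronization = ViewSynchronization.ReliableDeltaCompressed;
            }
        }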
  • Timeouts will happen because of loss or very high latency. That's in the nature of things.
    The latency you simulate is added to regular latency. So if you usually have 50ms and the in/out simulation adds 50ms each, you more or less end up at 150ms.
    Most realtime games try to stay below 300ms, but 500ms is also OK. Jitter is "random lag" and will increase the roundtrip time variance. If the variance gets higher, commands will be repeated later and less often before a timeout, so don't add only Jitter.
    Loss should not be more than 5% or so.
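
    As a starting point, something along these lines (a sketch, assuming the usual NetworkSimulationSet properties on the peer; tune to taste):

        var sim = PhotonNetwork.networkingPeer.NetworkSimulationSettings;
        sim.IncomingLag = 50;            // ms, added on top of your real latency
        sim.OutgoingLag = 50;            // 50 in + 50 out + ~50 real = ~150ms roundtrip
        sim.IncomingJitter = 10;         // keep jitter modest
        sim.OutgoingJitter = 10;
        sim.IncomingLossPercentage = 3;  // stay at 5% or below
        sim.OutgoingLossPercentage = 3;
        PhotonNetwork.networkingPeer.IsSimulationEnabled = true;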
  • Thanks for the info from both of you!