Max Quotas/Limits for Networked Properties and StartGameArgs.PlayerCount


Hi,


I am looking for some clarity on the quotas and limits for a couple of things which I could not find in the manual or API reference.


-Are there any hard limits on data sizes for networked properties? For example, I am considering using networked properties to store a list of items a user has acquired. As you can imagine, a user may collect a large number of items, so this property could easily grow large. Are there any limits or performance issues to be aware of?


-Are there any limits or quotas on StartGameArgs.PlayerCount?
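
For reference, I mean the value passed when starting the runner. A minimal sketch of where it is set (the game mode and session name here are just placeholders):

    using Fusion;
    using System.Threading.Tasks;

    public class SessionStarter
    {
        // Sketch: PlayerCount caps how many players the session accepts.
        public async Task Start(NetworkRunner runner)
        {
            var result = await runner.StartGame(new StartGameArgs
            {
                GameMode = GameMode.Shared,   // placeholder mode
                SessionName = "my-session",   // placeholder name
                PlayerCount = 16              // maximum players for this session
            });
        }
    }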

Answers

  • AdaT

    I got my response on Discord, thanks @Inky_white, @Laags, @DavidJ, @emotitron

    A#1: Networked properties only consume bandwidth when their values change, so you can network data that doesn't change a lot. Some networked property types also require a Capacity attribute to specify how much space to reserve up front. You can network something like a dictionary of items, and only the entries that change get sent, but you should still network the minimum amount of data for best results.
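
    As a rough sketch of what that looks like (assuming Fusion 2-style networked collection declarations; the class name, property name, and capacity are just illustrative):

        using Fusion;

        public class PlayerItems : NetworkBehaviour
        {
            // Capacity reserves space for the collection up front; per tick,
            // only the entries that actually changed are sent to clients.
            [Networked, Capacity(64)]
            public NetworkDictionary<int, int> ItemCounts { get; }   // itemId -> count
        }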

    For example, in one of the YouTube tutorials, the player inventory is networked as a list of ints, with each int corresponding to the index of an item, rather than networking the item itself (which I'm not sure you can even do).
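
    Something along those lines (a sketch assuming Fusion's NetworkLinkedList; the AddItem helper and item-index scheme are hypothetical):

        using Fusion;

        public class PlayerInventory : NetworkBehaviour
        {
            // Store item indices (ints) rather than the items themselves;
            // the actual item data stays in local, non-networked tables.
            [Networked, Capacity(32)]
            public NetworkLinkedList<int> ItemIds { get; }

            // Hypothetical helper: only the state authority should mutate networked state.
            public void AddItem(int itemIndex)
            {
                if (Object.HasStateAuthority)
                    ItemIds.Add(itemIndex);
            }
        }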


    Q#2:

    With a properly tuned Area of Interest radius, can a single Fusion room running on a large EC2 instance handle a 3-4k concurrent-player load with NetworkRigidbodies and hitboxes? Or is it generally advisable to have smaller zones with a lower player count? The examples I've seen were tested with 200 players.


    A#2:

    - If you get even 200 players working smoothly in that sort of MMORPG I'll be impressed.

    - Should note they are not using NetworkRigidbodies for that BR case either.

    - My understanding is that a big chunk of the BR200 sample's CPU usage goes to the Playables used for tick-accurate animation.