
UNET HLAPI Network Bandwidth Inefficiencies + Suggested Changes

Discussion in 'UNet' started by Zullar, Sep 2, 2016.

  1. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    651
TLDR: Proposed changes would reduce UNET bandwidth by ~10x (depending on usage).

    After looking at how UNET uses Messages and OnSerialize/OnDeserialize to handle NetworkBehaviour SyncVars I think they are pretty inefficient.

There is a 32-bit dirty mask sent for each NetworkBehaviour synced script on an object. Any time a network packet is sent for that object, I think 32 bits are sent for each attached synced script. That means if you have 10 scripts, 10 × 32 = 320 bits are sent before any payload is added.

In addition, bools are sent as 8 bits instead of 1 bit. Stream classes use 8 bits as their minimum resolution, and I speculate this is the reason (it's always easier to work with whole bytes).

So let's give an example of how UNET currently works, with a worst-case scenario showing the inefficiency and the proposed improvement.
-You have 10 scripts, each with a SyncInt, SyncBool, SyncFloat, and SyncVec3.
-You change a SyncBool on NetworkBehaviour #2 to true.
-When an object network packet is sent, the data will be this:
Msg.Type 16 bits ushort
NetworkIdentity.netID (32 bits?)
Script 1: 32 bits: dirty mask = 00000000 00000000 00000000 00000000
Script 2: 40 bits: dirty mask = 01000000 00000000 00000000 00000000 + bool data 10000000
Scripts 3-10: 32 bits each: dirty mask = 00000000 00000000 00000000 00000000
A total of 376 bits is sent to transmit 1 bit of data (the bool on script #2).

    After looking at this I think it would be beneficial for UNET to change a few things.
    1: NetworkReader/Writer should be able to write bools as bits (not bytes). Example here: http://stackoverflow.com/questions/7051939/bit-based-binarywriter-in-c-sharp/7067744#7067744
    2: Each NetworkBehaviour script should have its own dirty bit.
3: The SyncVar dirty mask should be variable size and match the number of SyncVars.
    4: SyncBools should flag the NetworkBehaviour as dirty, but should NOT write to the SyncVarDirtyMask because the dirty flag is just as much bandwidth as the payload (1 bit). Their data should always be sent anytime the NetworkBehaviour flag is dirty.
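Suggestion 1 is the key enabler for the rest. As a rough illustration of the idea (a Python sketch for clarity — UNET itself is C#, and this is not UNET code), a bit-level writer buffers a partial byte so bools cost a single bit:

```python
class BitWriter:
    """Minimal bit-level writer: packs bools into single bits
    instead of whole bytes (illustrative sketch, not UNET code)."""

    def __init__(self):
        self._bytes = bytearray()
        self._bit_pos = 8  # bits used in current byte (8 = no byte open)

    def write_bit(self, value):
        if self._bit_pos == 8:        # current byte full: start a new one
            self._bytes.append(0)
            self._bit_pos = 0
        if value:
            self._bytes[-1] |= 1 << self._bit_pos
        self._bit_pos += 1

    def write_bits(self, value, count):
        """Write the low `count` bits of an unsigned integer."""
        for i in range(count):
            self.write_bit((value >> i) & 1)

    def to_bytes(self):
        return bytes(self._bytes)

w = BitWriter()
for bit in [True, False, True, True]:  # four bools -> 4 bits, not 4 bytes
    w.write_bit(bit)
print(len(w.to_bytes()))  # 1: one byte holds all four flags
```

The same `write_bits` call would cover variable-size dirty masks (suggestion 3), since a mask is just an N-bit unsigned integer.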

So let's repeat the example above with these changes.

Msg.Type 16 bits ushort
NetworkIdentity.netID (32 bits?)
Script 1: 1 bit: NetworkBehaviour dirty = 0
Script 2: 5 bits: NetworkBehaviour dirty = 1, SyncVarDirtyMask = 000, bool data = 1
Scripts 3-10: 1 bit each: NetworkBehaviour dirty = 0
Total: 62 bits instead of 376 bits!
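The two totals are easy to verify with quick back-of-the-envelope arithmetic (assuming the 32-bit netID and the field sizes listed in the example):

```python
# Current UNET layout: 16-bit msg type, 32-bit netID,
# a 32-bit dirty mask per script, plus an 8-bit bool payload.
current = 16 + 32 + 10 * 32 + 8
print(current)  # 376

# Proposed layout: one dirty bit per script, a 3-bit SyncVar
# mask on the dirty script, and the bool payload as 1 bit.
proposed = 16 + 32 + 10 * 1 + 3 + 1
print(proposed)  # 62
```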

    Why so much less bandwidth with these changes?

Scripts 1 and 3-10 only sent 1 bit each (a single 0) instead of 32 bits of zeros.
Script 2's dirty mask is only 3 bits (1 for the int, 1 for the vec3, and 1 for the float) instead of 32 bits.
Script 2's bool payload is 1 bit instead of 8 bits.

In addition, these changes would allow for more than 32 SyncVars on one script.

This illustrates a worst-case UNET scenario... but in typical usage I think this type of change might result in a ~3x reduction in bandwidth (it depends greatly on usage).

5: Additionally, instead of a ushort Msg.Type, this could also be variable length based on the number of registered messages, i.e. if there are <= 128 message types then this can be packed into 7 bits (2^7) instead of a constant 16 bits; <= 512 message types would use 9 bits (2^9).

6: Similarly (although this may be harder because it's constantly changing), the NetworkIdentity.netID could be reduced from 32 bits to a smaller number based on the number of current network objects, i.e. if there are only 16 objects in a scene then 4 bits (2^4) could be used to send the ID instead of a constant 32 bits.
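Suggestions 5 and 6 boil down to the same computation: a header only needs ceil(log2(N)) bits to distinguish N possible values. A hypothetical sketch (UNET does not do this today):

```python
import math

def bits_needed(value_count):
    """Smallest number of bits that can distinguish value_count values."""
    return max(1, math.ceil(math.log2(value_count)))

print(bits_needed(128))  # 7 bits cover up to 128 message types
print(bits_needed(512))  # 9 bits cover up to 512
print(bits_needed(16))   # 4 bits cover a scene with 16 netIDs
```

The catch with netIDs is that the object count changes at runtime, so sender and receiver would have to agree on the current width — which is why suggestion 6 is harder than 5.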

    With all these changes:
Msg.Type 7 bits (variable length; assume <= 128 message types)
NetworkIdentity.netID 4 bits (assume there are <= 16 objects; would be 5 bits for <= 32 objects, etc.)
Script 1: 1 bit: NetworkBehaviour dirty = 0
Script 2: 5 bits: NetworkBehaviour dirty = 1, SyncVarDirtyMask = 000, bool data = 1
Scripts 3-10: 1 bit each: NetworkBehaviour dirty = 0
Total: 25 bits instead of 376 bits!
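Again, the total checks out arithmetically, and it works out to roughly a 15x saving in this worst case:

```python
# Variable-length headers plus the per-script bit layout above:
# 7-bit msg type + 4-bit netID + 10 dirty bits + 3-bit mask + 1-bit bool.
total = 7 + 4 + 10 * 1 + 3 + 1
print(total)       # 25
print(376 / total) # ~15x smaller than the current 376 bits
```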

    *DISCLAIMER*: I think this is the way things work with UNET, but I am not 100% certain. Let me know if anything is incorrect.

    Hope this makes sense and thanks for reading. If anybody knows any ways to get this in front of a UNET dev please let me know.

*EDIT* The DirtyMask is written as a PackedUInt32, which will typically consume 8 bits, not 32 bits.
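For reference, here is my understanding of the packed-uint scheme (a Python sketch of the encoding as I believe UNET implements it — treat the exact thresholds as an assumption): small values fit in 1 byte, and the cost grows to at most 5 bytes.

```python
def write_packed_uint32(value):
    """Sketch of a UNET-style packed uint32 encoding (assumed thresholds):
    values <= 240 cost 1 byte; larger values cost 2-5 bytes."""
    if value <= 240:
        return bytes([value])
    if value <= 2287:
        return bytes([(value - 240) // 256 + 241, (value - 240) % 256])
    if value <= 67823:
        return bytes([249, (value - 2288) // 256, (value - 2288) % 256])
    if value <= 16777215:
        return bytes([250, value & 0xFF, (value >> 8) & 0xFF,
                      (value >> 16) & 0xFF])
    return bytes([251, value & 0xFF, (value >> 8) & 0xFF,
                  (value >> 16) & 0xFF, (value >> 24) & 0xFF])

print(len(write_packed_uint32(3)))        # 1: a small dirty mask costs 1 byte
print(len(write_packed_uint32(1 << 31)))  # 5: a full 32-bit mask costs 5 bytes
```

This is why a dirty mask with only low bits set usually serializes to 8 bits — but a mask whose high bits are set still pays the full price.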
     
    Last edited: Oct 13, 2016
    Deleted User, ThaiCat and Chom1czek like this.
  2. emrys90

    emrys90

    Joined:
    Oct 14, 2013
    Posts:
    755
I could be wrong about this, but I think the netId gets packed to 1-5 bytes based on the amount of data in it, like they pack uints.
     
  3. l3fty

    l3fty

    Joined:
    Mar 23, 2013
    Posts:
    86
    Hopefully someone from Unity can have a look at this, sounds like it would be a good optimisation.
Re: getting their attention, you could try tagging a few of the developers (if you can find which ones have been active in the networking section recently). They'd then get a little notification.

    Alternatively promise them some free chocolate? :D
     
  4. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    651
    I did some looking.

Messages use a short (16 bits) and are not compressed. MsgType.highest is 47, which would fit in 6 bits (2^6 = 64). If the user registered a lot of additional messages then 7 bits might be needed. But sending an extra 10 bits of overhead on every message is inefficient (since messages do everything!).

You are right, the NetworkID is a PackedUint32. I think it will usually only use 8 bits to send the uint32 unless there are a lot of objects.

By far the biggest inefficiency for me is the 32-bit dirtyBitMask for each script. In my case it's a real killer because I have a script for each character ability. I must attach ALL abilities... even untrained ones, because you can't add NetworkBehaviour components at runtime. So if I have 50 abilities and only 5 are trained (the other 45 are unused but attached), I'm still sending 32 bits for each of the 50 abilities (1600 bits!) of overhead any time the object is serialized!

Unfortunately, I think I need to write my own network serialization. The built-in serialization is just too inefficient and doesn't easily allow for client control or for clients to communicate with other clients. What I have so far seems to be working well: bits are written as bits instead of bytes, the dirtyMask is variable size, SyncVars whose networkSendIntervals are *almost* expired are piggybacked on a network packet if one is being sent, scripts without SyncVars write 0 bits, late-connecting clients get up-to-date info before the first frame, bools do not use a dirtyMask, and some bugs regarding SyncVar hooks not being called during scene changes are addressed. I'm working on client control now. If anybody is interested in the details, just poke me.
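For anyone curious, the variable-size dirty mask idea is simple to sketch (Python pseudocode of the concept, not my actual C# implementation — the 32-bit payload size is just an assumption for the example):

```python
def serialize_dirty(values, dirty_flags):
    """Emit one mask bit per SyncVar (N bits for N SyncVars), then
    payloads only for the dirty ones. Sketch of the idea, not UNET code."""
    assert len(values) == len(dirty_flags)
    bits = []
    bits.extend(1 if d else 0 for d in dirty_flags)  # N-bit dirty mask
    for value, dirty in zip(values, dirty_flags):
        if dirty:
            # Assumed fixed 32-bit payload per dirty SyncVar, low bit first.
            bits.extend((value >> i) & 1 for i in range(32))
    return bits

# A script with 4 SyncVars where only the second one changed:
bits = serialize_dirty([7, 42, 0, 9], [False, True, False, False])
print(len(bits))  # 36: 4-bit mask + one 32-bit payload
```

A script with zero SyncVars would emit a zero-length mask, which is how scripts without SyncVars can write 0 bits.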
     
    Last edited: Oct 13, 2016
  5. moco2k

    moco2k

    Joined:
    Apr 29, 2015
    Posts:
    294
I also started with a networked script for each character ability. I was wondering whether this was good/efficient design practice, so I tried to get some feedback here. Eventually, I revised my whole code architecture so that all ability scripts are now non-networked scripts, all handled by one single networked AbilityManager script attached to each player. In my solution, the particular abilities are referenced with unique ids across clients/server, and I use override function calls like OnCastServer within the ability scripts, which are invoked from the AbilityManager.
     
    Last edited: Nov 1, 2016
    Zullar likes this.
  6. Zullar

    Zullar

    Joined:
    May 21, 2013
    Posts:
    651
I have also done something similar. I use messages now since they seem to be robust. Using a UNET script for each ability has many issues:
-UNET inefficiencies (listed above)
-UNET RPCs not always called (bugs due to NetworkIdentity.observers falling off during scene changes)
-UNET Command hash bug (commands are sent to the wrong script if the method name is not unique... doesn't support inheritance)
-UNET SyncVars are buggy (they can de-sync, and hooks are not always called, due to NetworkIdentity.observers falling off and OnSerialize of late-connecting players clearing dirty bits)