I'm getting a bit bothered by my lack of understanding of what's really going on with NetworkBehaviour serialization. To confirm my setup: I'm overriding OnSerialize/OnDeserialize on a NetworkBehaviour and doing manual encoding in C#; SyncVars are not being used. So far everything does seem to work, but I feel a gap in my knowledge of what I'm doing, and that gap == future bugs.

My questions:

1. In OnSerialize, what happens when I write to the buffer and then return false? Should I be returning false before any writes occur?
2. Should I always return false when OnSerialize's initialState parameter is true, since that seems to imply a full state update for a client rather than an actual change in dirty data?
3. Does the sendInterval in the NetworkSettings attribute actually control the rate at which an overridden OnSerialize is called, or does it apply only to SyncVars?
4. Do the return value of OnSerialize and sendInterval simply provide two different ways of accomplishing the same thing (deferring updates until later)?

Thanks for any clarification. This project requires a lot of serialization that isn't appropriate for SyncVars and is preferable to RPCs, and I want to avoid creating a mess of confusion later if I'm doing things incorrectly.
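For concreteness, here's roughly the pattern I mean, as a minimal sketch against the legacy UNet HLAPI. The class name, the `health` field, and the `DirtyHealth` bit layout are placeholders, not my actual code; the dirty-mask-first encoding just mirrors what I understand the generated SyncVar code to do:

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Placeholder sketch: manual OnSerialize/OnDeserialize, no SyncVars.
[NetworkSettings(sendInterval = 0.1f)]
public class ManualSync : NetworkBehaviour
{
    const uint DirtyHealth = 1u << 0;   // one bit per manually tracked field

    int health;

    public void SetHealth(int value)
    {
        health = value;
        if (isServer)
            SetDirtyBit(DirtyHealth);   // request a serialization pass
    }

    public override bool OnSerialize(NetworkWriter writer, bool initialState)
    {
        if (initialState)
        {
            // Full snapshot for a client that is just gaining visibility.
            writer.WritePackedUInt32((uint)health);
            return true;
        }

        uint dirty = syncVarDirtyBits;
        writer.WritePackedUInt32(dirty);        // tell the reader what follows
        if ((dirty & DirtyHealth) != 0)
            writer.WritePackedUInt32((uint)health);
        return dirty != 0;                      // false when nothing changed --
                                                // but data was already written,
                                                // hence question 1 above
    }

    public override void OnDeserialize(NetworkReader reader, bool initialState)
    {
        if (initialState)
        {
            health = (int)reader.ReadPackedUInt32();
            return;
        }
        uint dirty = reader.ReadPackedUInt32();
        if ((dirty & DirtyHealth) != 0)
            health = (int)reader.ReadPackedUInt32();
    }
}
```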