Network bandwidth refers to how much data you have room to send over the wire. As you approach the limit, you will see increasing packet loss. If you go over the limit, you will eventually get disconnected from your session.
The XNA Framework measures bandwidth in bytes per second. Confusingly, network vendors like to measure it in bits per second (I suspect they do this to make their numbers look bigger). Even more confusingly, the two units end up with nearly identical abbreviations: bps versus Bps, or kbps versus kBps if you are talking about thousands of them. Be wary, and always check which you are dealing with.
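To see how easy it is to get tripped up, here is a minimal sketch of the conversion (the class and method names are my own, not anything from the XNA Framework):

    using System;

    class BandwidthUnits
    {
        // Vendors quote kilobits per second; divide by eight to get kilobytes.
        static double KilobitsToKilobytes(double kilobits)
        {
            return kilobits / 8.0;
        }

        static void Main()
        {
            // A "64 kbps" connection only moves 8 kilobytes of data per second.
            Console.WriteLine(KilobitsToKilobytes(64)); // prints 8
        }
    }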
In order to be playable by the majority of home Internet users, Xbox games are expected to work with as little as 8 kilobytes per second of both upstream and downstream bandwidth.
You might think you could figure out your bandwidth usage by adding up the size of all the data you are sending, multiplying by the number of machines you are sending it to, then multiplying again by the number of sends per second.
But you'd be wrong.
You forgot to include the packet headers. Every time you send a packet over the network, in addition to the data inside that packet, you must factor in:

- An IP header: 20 bytes
- A UDP header: 8 bytes
- The XNA Framework's own networking headers: enough to bring the total to around 50 bytes

That is ~50 bytes of header data per packet.
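Redoing the back-of-the-envelope estimate with that overhead included looks like this (a sketch: the names are mine, and the 50 byte constant is just the estimate from the list above):

    using System;

    class BandwidthEstimate
    {
        // Rough per-packet overhead from the list above (IP + UDP + XNA headers).
        const int HeaderBytes = 50;

        // The naive estimate: payload only.
        static int NaiveBytesPerSecond(int payloadBytes, int machines, int sendsPerSecond)
        {
            return payloadBytes * machines * sendsPerSecond;
        }

        // The real cost: every packet also carries the headers.
        static int ActualBytesPerSecond(int payloadBytes, int machines, int sendsPerSecond)
        {
            return (payloadBytes + HeaderBytes) * machines * sendsPerSecond;
        }

        static void Main()
        {
            // 8 bytes of data to 3 machines, 20 times per second:
            Console.WriteLine(NaiveBytesPerSecond(8, 3, 20));  // 480
            Console.WriteLine(ActualBytesPerSecond(8, 3, 20)); // 3480
        }
    }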
Consider an example where we are sending a single boolean value to a single other player, 60 times per second:

    1 byte of data + ~50 bytes of header = 51 bytes per packet
    51 bytes * 60 packets per second = 3,060 bytes per second

Whoa! We are using 3 kilobytes per second (remember we only have 8 in total) even for so little real data. Fully 50 of every 51 bytes, about 98% of our bandwidth, is being wasted on packet headers.
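For concreteness, here is roughly what that wasteful pattern looks like with the XNA Framework networking API. This is only a sketch: it assumes you already have a NetworkSession, and isCrouching is a made-up flag standing in for our single boolean.

    using Microsoft.Xna.Framework.Net;

    public class BooleanSender
    {
        NetworkSession session;          // assumed to already exist
        PacketWriter packetWriter = new PacketWriter();
        bool isCrouching;                // hypothetical game state flag

        // Called once per frame, 60 times per second by default.
        public void Update()
        {
            foreach (LocalNetworkGamer gamer in session.LocalGamers)
            {
                // One lonely byte of payload...
                packetWriter.Write(isCrouching);

                // ...wrapped in ~50 bytes of headers, every single frame.
                gamer.SendData(packetWriter, SendDataOptions.None);
            }

            // Pump the underlying network layer.
            session.Update();
        }
    }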
How can you survive this deadly attack of the killer packet header gremlins?