We've studied this bitrate issue a lot, mostly to help our customers in the field.
We've also studied the use of live streaming by weather people, thus my interest in this group.
We're actively planning future cameras around the needs of weather people, with exotic features so good that I'll get into trouble if I reveal them before we finish.
However, new products take a long time and so for now I'd like to help you comply with that internet provider.
I've never seen more than a 30% difference in bit rates among IP cameras, so I looked up Hikvision's bit rate recommendations. Their 720p 30fps at "best" quality calls for 6144 Kbps; at the lowest quality they recommend, it still needs 3072 Kbps. If one is streaming to YouTube on less than that, it's likely because the Hikvision can't stream there on its own and a computer is re-compressing the stream for it. A computer has a GPU to lean on, plenty of memory, and even that Pentium itself is no slouch, so it can pull the rate down in ways a little IP camera can't. It can also tinker with the quality settings and lower them when they're causing traffic problems, without you ever knowing.
But if you need a computer just to stream a given brand of IP camera, the streaming becomes less reliable, because there's more equipment in the path that can go wrong.
We set our default for 720p at 4000 Kbps, well toward the low end of what Hikvision recommends. And there's a lot of code in our cameras that automatically drops the bit rate when the internet connection is struggling. If you want to know precisely what's happening, put this in your overlay:
Q: $Sq D: $Sd P: $Sz $Sb
Q = current quality setting (higher number means lower quality), D = dropped frames, P = pauses inserted to go easy on the internet connection, and $Sb is the actual bit rate currently in use, regardless of what you specified in the setup menu.
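If you're logging the stream and want to pull those numbers out programmatically, here's a quick sketch. It assumes the overlay renders as plain text like `Q: 28 D: 0 P: 3 1200` (the sample values are made up for illustration); adjust the pattern if your overlay spacing differs.

```python
import re

# Parse a rendered overlay line like "Q: 28 D: 0 P: 3 1200".
# $Sq, $Sd, $Sz render after their labels; $Sb is the bare
# number at the end (actual bit rate in Kbps).
def parse_overlay(line):
    m = re.match(r"Q:\s*(\d+)\s*D:\s*(\d+)\s*P:\s*(\d+)\s*(\d+)", line)
    if not m:
        return None
    quality, dropped, pauses, bitrate = map(int, m.groups())
    return {"quality": quality, "dropped": dropped,
            "pauses": pauses, "bitrate_kbps": bitrate}

print(parse_overlay("Q: 28 D: 0 P: 3 1200"))
```

Logging that dict once a minute gives you a record you can show the provider if they ever question your usage.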
If you were using half the town's bandwidth, that will show up as a much lower actual bit rate on the overlay. You might be surprised what rate it's really running; 1200 is common for 720p. However, allowing up to 4000 in the setup menu does mean that, when it can, the camera will use all 4000, even if that means climbing to unnecessarily high quality settings.
Since your internet provider is keeping tabs, you can tell the camera not to do that by tinkering with the values, after you start from the presets. Always start with the presets, then tinker only if you must. It's possible to get decent views as low as 400, and one train station we helped had to run at 150 for a while because of poor internet service.
You could drop your 720p bit rate to 1200 on the H.264 tab, then set the quality to 30 instead of the default. The quality number is really a compression ratio, so 30 means more compression than the default 16. What that does is tell the camera: even if raising the quality would stay within the 1200 cap, stop once you reach 30, which still rates as high quality on the H.264 scale. Then it won't burn bandwidth needlessly just trying to reach the limit you set.
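Here's a toy model of that cap-plus-quality-target behavior. The real firmware logic is more involved; the function names, the 40-Kbps-per-step cost model, and the starting quality are all made up for illustration, but the stopping rule is the one described above.

```python
# Toy model of "improve quality until you hit either the bitrate
# cap or the quality target, whichever comes first".
# quality is the H.264 quantizer: HIGHER number = MORE compression.
def pick_quality(bitrate_at, cap_kbps=1200, quality_target=30, start_quality=51):
    q = start_quality
    # Improve quality (lower the number) only while the next step
    # stays under the cap AND we're still worse than the target.
    while q > quality_target and bitrate_at(q - 1) <= cap_kbps:
        q -= 1
    return q

# Stand-in cost model: each quality step costs ~40 Kbps (made up).
est = lambda q: (51 - q) * 40

print(pick_quality(est))                 # plenty of headroom: stops at the target, 30
print(pick_quality(est, cap_kbps=400))   # tight cap: stops early, above 30
```

With the target set, the camera settles at quality 30 and leaves the rest of the 1200 unused, which is exactly the point.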
Also consider: though people claim they can tell 30fps from 20fps, I've tested a few "experts," and their guesses were barely better than 50/50 when looking at a 20fps stream. So you could set that 720p stream to 20fps and lower the bit rate to 1000. If you lower the frame rate, always lower the bit rate too, or the lower frame rate won't save you anything. It should still be hard to notice, because at 20fps, 1000 buys you what 1500 does at 30fps.
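That "1000 at 20fps is more like 1500" claim is just per-frame arithmetic, and you can check it yourself:

```python
# Bits available per frame = bit rate / frame rate.
# 1500 Kbps at 30fps and 1000 Kbps at 20fps give each frame
# the same 50 Kb budget, which is why the lower setting
# shouldn't look any worse per frame.
def kb_per_frame(bitrate_kbps, fps):
    return bitrate_kbps / fps

print(kb_per_frame(1500, 30))  # 50.0
print(kb_per_frame(1000, 20))  # 50.0
```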
And consider: most viewers of live streams are on cellphones, and up to 60% of all viewers are being sent 144p by YouTube. YouTube really messes with the sizes! You may have set 720p, but most viewers are seeing less than half that.
Bottom line: if you want to keep that camera within what the provider asked, select 360p, lower the frame rate to 20, increase P frames to 63, set the bit rate to 400, and set the quality to 30. Then let me take a look.
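If it helps to have those bottom-line settings in one place, here they are as plain data with a quick sanity check (the key names are just labels, not the exact menu strings):

```python
# Suggested bottom-line settings from above.
settings = {
    "resolution": "360p",
    "fps": 20,
    "p_frames": 63,        # P frames setting on the H.264 tab
    "bitrate_kbps": 400,
    "quality": 30,         # higher number = more compression
}

# Sanity check: the per-frame bit budget these settings allow.
print(settings["bitrate_kbps"] / settings["fps"], "Kb per frame")
```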
As I said, that train station running at 150 looked fine, and no viewers complained about it.