freaknout: Just reading through, and excuse my ignorance, but can someone explain to me how the bitrate can remain the same at the same resolution when the FPS halves? Surely that’s just wrong?
For what it's worth, I imagine the change is down to ensuring quality of service once demand ramps up. The World Cup is not far away and that will put considerable strain on the network. A downgrade now is going to impact a lot fewer people than one in the middle of the World Cup!
Can I also ask, what's the best way to watch? I can use my iPad etc. but will need to cast to the TV. Would it be better to buy an Apple TV or something to avoid the casting?
Thanks
J
Because the bit-rate is the same. Bit-rate is a measure of bits per second, and that hasn't changed.
Generally when you encode video you give the encoder a target bit-rate and it will try to output video at that rate. This is the amount of information needed per second to make up the video. It is a pure data value; it doesn't take into account any of the video features of the stream, i.e. resolution, frame-rate or colour space.
You can transmit a 480p video at 25fps at 6Mbps and it would look great, as the information per-pixel-per-frame would be high. You can also transmit 2160p (4K) at 25fps at the same 6Mbps and the video would probably look rubbish, as the information per-pixel-per-frame would be a lot lower.
The same goes for frame rate. You could transmit 2160p at 1fps at the same 6Mbps and the information per-pixel-per-frame would be a lot higher than it would be at the same resolution at 25fps.
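If it helps to see the arithmetic, here's a rough back-of-the-envelope sketch in Python of how I'm comparing those streams (the 854x480 and 3840x2160 figures are just the usual pixel counts for those resolutions; real encoders are far smarter about where they spend their bits than this):

def bits_per_pixel_per_frame(bitrate_bps, width, height, fps):
    # bits available per second divided by pixels drawn per second
    return bitrate_bps / (width * height * fps)

BITRATE = 6_000_000  # 6Mbps, the same for every stream below

streams = [
    ("480p  @ 25fps",  854,  480, 25),
    ("2160p @ 25fps", 3840, 2160, 25),
    ("2160p @  1fps", 3840, 2160,  1),
]

for name, w, h, fps in streams:
    bpp = bits_per_pixel_per_frame(BITRATE, w, h, fps)
    print(f"{name}: {bpp:.3f} bits per pixel per frame")

At the same 6Mbps the 480p stream gets roughly 0.59 bits per pixel per frame, the 4K 25fps stream only about 0.03, and the 4K 1fps stream about 0.72, which is the point above in numbers.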
In the NZ NOG video Spark showed the encoding profiles; I don't have the material on hand. Both the 60fps and 30fps streams were at 6Mbps, meaning the theoretical information per pixel per frame for the 60fps stream was half that of the 30fps stream.
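To put rough numbers on it: 6,000,000 bits per second spread over 60 frames is about 100,000 bits per frame, while the same 6Mbps over 30 frames is about 200,000 bits per frame, so at the same resolution each frame of the 30fps stream gets roughly twice the data to work with.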
Note this is somewhat simplified; video encoding is more complex than this, but the main point stands.