I regularly see 1080i referred to as inferior to 1080p. Setting aside arguments about whether interlaced pictures are "worse" or "better" given the temporal/spatial resolution trade-off, could a helpful person explain to me why, in a TV transmission context, 1080i needs less bandwidth than 1080p? I'm confused because PAL SD bandwidth is identical regardless of whether it's 25 fps film or 50-field video being transmitted.
Assume frame rates are identical. Show your workings :-)
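For concreteness, here's a back-of-envelope pixel-rate comparison (assuming 1920×1080 active picture, and counting 1080i as fields of 540 lines — my assumptions, not gospel), which is exactly where my confusion comes from:

```python
# Back-of-envelope raw luma sample rates (assuming 1920x1080 active picture).
WIDTH = 1920

def pixel_rate(lines_per_picture, pictures_per_second):
    """Raw luma samples per second for one transmitted picture stream."""
    return WIDTH * lines_per_picture * pictures_per_second

p25 = pixel_rate(1080, 25)  # 1080p25: 25 full frames of 1080 lines
i25 = pixel_rate(540, 50)   # 1080i25 (i.e. 1080i50): 50 fields of 540 lines
p50 = pixel_rate(1080, 50)  # 1080p50: 50 full frames of 1080 lines

print(f"1080p25: {p25 / 1e6:.2f} Mpx/s")  # 51.84 Mpx/s
print(f"1080i25: {i25 / 1e6:.2f} Mpx/s")  # 51.84 Mpx/s -- same raw rate!
print(f"1080p50: {p50 / 1e6:.2f} Mpx/s")  # 103.68 Mpx/s -- double
```

So at identical *frame* rates the raw sample rates come out equal, which is why I don't see where the bandwidth saving comes from.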
Cheers,
Tom