Perhaps not quite on-topic, but close. Now that TV1 & 2 have moved to 1080i, all channels in NZ are interlaced. It seems to me that modern TVs must have good deinterlacing to display Freeview from their built-in tuners at decent quality.
So it makes sense to switch the HTPC to output 1080i and let the TV do the deinterlacing. That way you could get away with a much less powerful gfx card.
But I was wondering, what type of deinterlacing do TVs actually do? Do they have a motion/vector-adaptive deinterlacing chip these days?
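For anyone wondering what "motion-adaptive" means in this context, here's a rough numpy sketch of the idea: where two opposite-parity fields agree (a static area), weave them together to keep full vertical resolution; where they differ (motion), fall back to spatial interpolation (bob). This is purely illustrative, assuming a simple per-pixel threshold; no TV chip necessarily works exactly like this, and "vector adaptive" parts additionally do motion-compensated searches that this sketch doesn't attempt.

```python
import numpy as np

def motion_adaptive_deinterlace(top, bottom_prev, bottom_next, thresh=8):
    """Fill in the missing (odd) scanlines of a top field.

    top         : even scanlines of the current field, shape (H//2, W), uint8
    bottom_prev : odd scanlines from the previous opposite-parity field
    bottom_next : odd scanlines from the next opposite-parity field
    thresh      : illustrative motion threshold (not from any real chip)
    """
    h2, w = top.shape
    out = np.empty((h2 * 2, w), dtype=top.dtype)
    out[0::2] = top  # the lines we actually have

    # Static detection: if the two neighbouring bottom fields agree,
    # the area probably isn't moving.
    bp = bottom_prev.astype(np.int16)
    bn = bottom_next.astype(np.int16)
    static = np.abs(bp - bn) < thresh

    # Weave: use the temporal neighbours directly (full vertical detail).
    weave = ((bp + bn) // 2).astype(top.dtype)

    # Bob: spatially interpolate from the lines above and below.
    below = np.vstack([top[1:], top[-1:]])  # clamp at the bottom edge
    bob = ((top.astype(np.int16) + below.astype(np.int16)) // 2).astype(top.dtype)

    out[1::2] = np.where(static, weave, bob)
    return out
```

A good hardware deinterlacer is basically doing this per pixel (plus cadence detection for film content), which is why a TV with a decent chip can often beat a cheap GPU's bob-only output.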