Geekzone: technology news, blogs, forums
7851 posts

Uber Geek
+1 received by user: 787

Subscriber

  Reply # 195302 11-Feb-2009 13:01

pcruthven: I agree... I would like a higher bit rate. A better quality encoding process to get rid of all the pixelation on the screen.



How about Sony releasing a range of "Superbit Blu-ray" movies?!? Can the bonus extras and give me a quality visual experience. If you really need extras, include a bonus (cheap) DVD full of extras and bonus stuff.

Isn't that now being done with The Criterion Collection?

http://www.criterion.com/library/bluray

 

Pity that they region-lock all their titles. Plebs outside the US need not apply...

"26. Are Criterion’s Blu-ray discs region-encoded?
Yes. Criterion is licensed to sell most of its editions only in North America."

 





Regards,

Old3eyes


3282 posts

Uber Geek
+1 received by user: 208

Trusted

  Reply # 195428 11-Feb-2009 23:19

sbiddle: If you had a theoretical 3840 x 2160p screen then 720x576 content would have to be upscaled to fill that. This is just like sticking a magnifying glass on top of a newspaper and complaining the image looks grainy. Unless you have native content in that resolution then it won't look great.

What rubbish.  First you say that resolutions over 1080p (for example) will be indistinguishable due to the viewing distance, then you go on to say that for a given distance higher resolution will look worse.  It's a complete contradiction.

Do you honestly think if you had a 2160p screen, the picture would look worse?

By your original argument, even if a simple resizing algorithm were used the eye would be unable to distinguish the extra pixels, so what's the harm?  It is absolutely nothing like using a magnifying glass on top of a newspaper, unless you mean the pixel size remains the same (and hence the screen size is doubled).  I thought we were discussing decreasing pixel size, as evidenced by your quoting of the Visible Resolution Limits graph?  The image on the 50" screen isn't getting any bigger, unlike the magnified newspaper.
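
To put some quick numbers on it (Python, assuming a 50" 16:9 panel; just an illustration of the point, nothing more):

import math

# On the same 50" 16:9 screen the picture stays exactly the same physical
# size whatever the panel resolution -- only the pixel pitch changes.
# Nothing like a magnifying glass, which enlarges the image itself.

DIAGONAL_IN, ASPECT = 50, 16 / 9
width_in = DIAGONAL_IN * ASPECT / math.hypot(ASPECT, 1)   # screen width in inches

for name, horiz_px in [("1080p", 1920), ("2160p", 3840)]:
    print(f'{name}: image width {width_in:.1f}", pixel pitch {width_in / horiz_px:.4f}"')
# 1080p: image width 43.6", pixel pitch 0.0227"
# 2160p: image width 43.6", pixel pitch 0.0113"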

2584 posts

Uber Geek
+1 received by user: 5

Mod Emeritus
Trusted
Lifetime subscriber

  Reply # 195462 12-Feb-2009 08:49

bazzer, this is a common issue. Many people find that watching SD TV on an HD panel looks worse than watching the same signal on the same size SD panel.

Maybe you need to go down to your local TV store and look for yourself. You will have to ask them to put Sky or similar on the TV, because they know it shows just how bad they can look.

The same is true if you play a low-res video on a computer screen and then play it through a TV. From my experience it will always look better on the TV screen.

I can't explain why this is any better than sbiddle can; I can just confirm that it does happen.







Media centre PC - Case Silverstone LC16M with 2 X 80mm AcoustiFan DustPROOF, MOBO Gigabyte MA785GT-UD3H, CPU AMD X2 240 undervolted, RAM 4 GB DDR3 1033, HDD 120 GB system/512 GB data, Tuners 2 X Hauppauge HVR-3000, 1 X HVR-2200, Video Palit GT 220, Sound Realtek 886A HD (onboard), Optical LiteOn DH-401S Blu-ray using TotalMedia Theatre, Power Corsair VX Series 450W ATX PSU, OS Windows 7 x64

3282 posts

Uber Geek
+1 received by user: 208

Trusted

  Reply # 195467 12-Feb-2009 09:39

Nety: bazzer, this is a common issue. Many people find that watching SD TV on an HD panel looks worse than watching the same signal on the same size SD panel.

Maybe you need to go down to your local TV store and look for yourself. You will have to ask them to put Sky or similar on the TV, because they know it shows just how bad they can look.

The same is true if you play a low-res video on a computer screen and then play it through a TV. From my experience it will always look better on the TV screen.

I can't explain why this is any better than sbiddle can; I can just confirm that it does happen.

Think of it this way: the 2160p panel has twice as many pixels in each direction.  The "worst" algorithm we could use would simply copy each pixel to the three adjacent pixels.  In that case we are effectively increasing the pixel pitch and reducing the screen to 1080p.  How can this look worse than a native 1080p panel?

Especially considering sbiddle himself argued that increasing resolution is indistinguishable to the human eye!
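
Here's a rough Python sketch of that worst-case pixel-copy upscale (an illustration only, obviously not what any real TV scaler does):

def pixel_double(frame):
    """Copy each pixel of a frame (list of rows) into a 2x2 block."""
    doubled = []
    for row in frame:
        wide_row = []
        for px in row:
            wide_row.extend([px, px])   # copy the pixel to its right-hand neighbour
        doubled.append(wide_row)        # the doubled row...
        doubled.append(list(wide_row))  # ...and an identical copy for the row below
    return doubled

# A tiny 2x2 "1080p" corner becomes a 4x4 "2160p" corner showing the same picture.
for row in pixel_double([[1, 2],
                         [3, 4]]):
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]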

2584 posts

Uber Geek
+1 received by user: 5

Mod Emeritus
Trusted
Lifetime subscriber

  Reply # 195472 12-Feb-2009 10:02

As I said, I can't explain why it looks worse, I just know it does. It has been talked about several times on GZ, where people have asked why their nice new HD TV makes Sky TV look like crap when it used to look OK on their last TV.

I agree with you that more pixels should not be able to make a picture look worse than it does with fewer, but the simple fact is it can.
As I said, go have a look for yourself.

However, there are two issues here that I think are being treated as one when they are not.

The first is what we have been talking about: does a picture look better on a screen running at the same resolution as the image than the same picture does on a higher resolution screen?

The second is at what point you can no longer see an increase in resolution.

On the second point, I have to say that after watching a full HD TV last week I am not as convinced that the distances they give are correct, at least for me. I could see the difference from further back than it is suggested I should have been able to.

I think it would be a case of diminishing returns, but it may be possible on a home theatre projector screen to get advantages by going to a greater than 1080p resolution.








6666 posts

Uber Geek
+1 received by user: 568

Trusted

  Reply # 195475 12-Feb-2009 10:23

Yeah, going by that distance chart that's around, you really shouldn't be able to tell much of a difference on smaller TVs at normal viewing distances.  Larger TVs though, coupled with some people's closer viewing distances and better eyesight, mean you can notice a difference more than the chart says you should.

The big difference is definitely for projectors, where you're routinely talking a minimum 80" screen, and often 100" or 120".  At that size and standard viewing distances there is scope for more native resolution, and then for higher resolution HD content.  We're going to run into the problems listed above if we have content that's recorded below the native resolution of the TV.

I think that's got a lot to do with it as well.  We're seeing that with broadcast bandwidth compared to physical Blu-ray media etc.  Movies on TV3 are 1080i but they will look better on Blu-ray, even though it's essentially the same resolution (well, 1080p, but you get the idea).  It's one thing to have that grid of pixels, but you need to keep the bitrate feeding them high to really get the best out of the system.

41 posts

Geek


  Reply # 195869 15-Feb-2009 05:30

bazzer: I think the point is that it doesn't make much sense sticking to 1.78:1 ratio when lots (most?) widescreen movies are shot at 2.39:1.  I guess the idea is to mimic the theatrical experience at home, so shouldn't everything move towards that ratio?


I'm not sure most widescreen movies are shot at 2.39:1. From the data I have off the IMDB, for feature-length movies (not TV shows) released in 2008:
  • 20% had an aspect ratio of 1.78:1
  • 40% had an aspect ratio of 1.85:1
  • 29% had an aspect ratio of 2.35:1 (aka 2.39:1)

From what I can see, since around 1980 the flat 1.85:1 AR has overtaken 2.35:1 as the most common format for feature-length movies, with the split being roughly 50/25 since then, at least until the last couple of years. With the prevalence over the past 24 months of cheaper digital 16:9 equipment for new filmmakers, 1.78:1 is now becoming a very common release ratio for movies, so I'm not so sure it's a bad ratio to stick with for the next decade or more.

This is just my own data from the IMDB though and doesn't include movies that have no reported AR technical data (though virtually all 'mainstream' films do).
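
To put some rough numbers on why the ratio matters on a 16:9 panel (simple arithmetic only, nothing from the IMDB data):

# How many of a 1080-line 16:9 panel's lines a film actually uses once it
# is letterboxed, for the release ratios mentioned above.

PANEL_WIDTH, PANEL_LINES = 1920, 1080

ratios = {"1.78:1 (16:9)": 16 / 9, "1.85:1": 1.85, "2.35:1": 2.35, "2.39:1": 2.39}

for name, ratio in ratios.items():
    active = min(PANEL_LINES, round(PANEL_WIDTH / ratio))
    print(f"{name}: {active} of {PANEL_LINES} lines")
# 1.78:1 (16:9): 1080, 1.85:1: 1038, 2.35:1: 817, 2.39:1: 803

So a 2.39:1 movie only lights up about 800 of a 1080p panel's lines anyway, which is part of why the wider ratios come up in these discussions.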


217 posts

Master Geek
+1 received by user: 2


  Reply # 196261 17-Feb-2009 12:57

sbiddle: The reality is the greater the number of pixels, the closer you have to be to differentiate between the resolutions. Here's a chart showing optimum viewing distances for different resolutions.

To view 4K content on a 50" screen the optimal viewing distance is 3 feet. Nobody would ever sit this close to a 50" screen. Likewise the optimal viewing distance for 1080 content is around 6.5 feet.



The chart is an estimation of resolvable limits. That is, for a given display resolution on a given size screen, how close do I have to sit in order to be able to make out individual pixels? It is not a chart of optimal viewing distances; the suggestion that it's "optimal" to sit three feet from a 50" screen is just plain wrong. As you say, no one would sit that close. It's not optimal by any definition.
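
As a sanity check on those numbers, here's the usual back-of-envelope calculation in Python (one pixel per arcminute of 20/20 acuity, a common rule of thumb; I'm assuming that's roughly what the chart is based on):

import math

# Distance at which one pixel subtends one arcminute -- i.e. the furthest
# distance at which the pixels are still theoretically resolvable.

ARCMINUTE = math.radians(1 / 60)

def resolvable_distance_ft(diagonal_in, horiz_pixels, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pitch_in = width_in / horiz_pixels                       # pixel pitch
    return pitch_in / math.tan(ARCMINUTE) / 12               # inches -> feet

for name, px in [("1080p", 1920), ("2160p (4K)", 3840)]:
    print(name, round(resolvable_distance_ft(50, px), 1), "ft")
# 1080p 6.5 ft, 2160p (4K) 3.3 ft -- on a 50" screen, matching the figures quoted above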

I'm in agreement with Bazzer here. A larger native resolution and a decent scaling algorithm aren't going to make pictures look worse. Nety, when you're comparing an SD signal on an SDTV with SD on an HDTV, is the SDTV a CRT or is it also a flat panel like the HDTV? I'd guess it's a CRT, in which case you're not really comparing apples with apples :)

1 post

Wannabe Geek


  Reply # 196594 18-Feb-2009 20:22

I reckon the next big thing/standard will be 1080p 3D; until then it will just be improving the black level/contrast/colour accuracy etc.

155 posts

Master Geek


  Reply # 198386 27-Feb-2009 13:33

Isn't the reason that content often looks bad on a screen with a higher native resolution that the resolution of the image doesn't divide evenly into the number of pixels on the screen? Hence, some kind of algorithm has to be applied in order to decide what to do with the extra pixels. 720, for instance, is only 2/3 of 1080. So, when watching a 720p video on a 1080p screen, there are 50% extra pixels in each row and column that need to be filled in. One simple algorithm might be to repeat every second pixel, so a row of pixels that starts off like this

1 2 3 4 5 6 7 8

becomes this:

1 2 2 3 4 4 5 6 6 7 8 8

Of course, that means that the apparent pixels on the screen won't all be exactly the same size, so the image quality will degrade. I had a similar effect playing StarCraft on my friend's laptop screen years ago: some lines that were one pixel wide in the game's code were displayed as 2 pixels on the screen, while others were displayed as 1. However, there are upscaling algorithms that do a better job of this process - I'm not sure how they work, but my guess is that they avoid bulking up detail and instead opt to add the extra pixels in where there are big patches of colour. That's a guess though :D
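
Here's a rough Python sketch of that repeat-some-pixels idea (plain nearest-neighbour picking, just to show where the uneven widths come from; real scalers interpolate instead):

# Map 8 source pixels onto 12 destination pixels (the same 2:3 ratio as
# 720 -> 1080). Each destination pixel just picks the nearest source pixel,
# so some source pixels end up shown twice as wide as others.

src = [1, 2, 3, 4, 5, 6, 7, 8]
dst_len = len(src) * 3 // 2               # 12 destination pixels (x1.5)

dst = [src[i * len(src) // dst_len] for i in range(dst_len)]
print(dst)
# [1, 1, 2, 3, 3, 4, 5, 5, 6, 7, 7, 8]  -- pixels 1, 3, 5 and 7 appear twice, the rest once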

With this in mind, doesn't it follow that 2160p shouldn't degrade picture quality any more than 1080p does? After all, once the initial degradation of translating it to 1080p is done, simply doubling that up to 2160p shouldn't introduce any additional degradation - 1080 divides evenly into 2160!

In fact, you could take this argument a step further - if you can find a native resolution that both 720p and 1080p divide into evenly, then it would be able to play both formats with no loss of quality. Seeing as 720p is 2/3 of 1080p, multiplying 720p by 3 should give you that optimal resolution - 2160p!

So I would have thought that a 2160p TV would be able to display 720p images even better than a 1080p one can, like this:

1  1  1  2  2  2  3  3  3  4  4  4  5  5  5  6  6  6  7   7   7   8   8   8  

and at the same time, wouldn't display 1080p any worse than a native 1080p screen:

1  1  2  2  3  3  4  4  5  5  6  6  7  7  8  8  9  9  10 10 11 11 12 12

Of course, then there's the question of how well SD content would play on the screen. It would depend on the specific resolution. I guess most resolutions would look equally bad on both 1080p and 2160p screens, but if there was a resolution that was, say, half of 720p (360p? Does that even exist?), that one would look better on 2160p screens, just as 720p would.

Theoretically though (and I guess this is outside the realm of the realistic), a crazy-high-res screen could display all current resolutions decently. All you need is to find a number that they all divide evenly into. One such number could be obtained by simply multiplying all the resolutions together, which would give a truly astronomical number. However, there would almost certainly be a fraction of that number that would do the job, seeing as many resolutions are multiples of previous ones, and whenever that is the case you only need to include the largest multiple in that series of resolutions in the calculation for it to work.
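
For what it's worth, the smaller number that does the job is just the lowest common multiple of the line counts, and for the usual formats it isn't astronomical at all (a quick Python check, using vertical line counts only and assuming 480/576-line SD):

from functools import reduce
from math import gcd

# The smallest line count that every format divides into evenly is the
# lowest common multiple -- far smaller than multiplying them all together.

line_counts = [480, 576, 720, 1080]

def lcm(a, b):
    return a * b // gcd(a, b)

print(reduce(lcm, line_counts))
# 8640 -- i.e. 18 x 480, 15 x 576, 12 x 720 and 8 x 1080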

Wow, that's a lot of writing on something that I'm not even 100% sure about. As I said, that's the reason I *think* that upscaling sometimes looks worse, but I could of course be wrong.

41 posts

Geek


  Reply # 198554 28-Feb-2009 19:23

psk20: if you can find a native resolution that both 720p and 1080p divide into evenly, then it would be able to play both formats with no loss of quality. Seeing as 720p is 2/3 of 1080p, multiplying 720p by 3 should give you that optimal resolution - 2160p!


Calculating the optimal native resolution of a display (720p, 1080p, 2160p) is a bit moot once you consider overscan in TV signals and the processing that comes with it. Even if you send a 1080-line signal to a native 1080p set, the video processor will (by default) try to create about 2-4% overscan on all sides of the picture. In effect the processor might scale a 1080-line frame up to around 1130 lines and then take the middle 1080 lines and display those, clipping off the unused edges (the overscan). The perfect 1920x1080 grid match of the pixels is immediately lost with that resize.

As far as I can see, this means precise resolution calculations are a bit of a waste when TVs by default apply overscan adjustments to incoming signals anyway, even when the signal's line count matches the native resolution of the display. The reasons for overscan are a legacy from the days of CRT-based TVs, but they have carried over into the HDTV world. You can turn this overscan behaviour off on most TVs by hunting through the menus for the 1:1 pixel mapping or PC-mode setting, but it isn't enabled out of the box.
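
Here's a rough Python sketch of what that default behaviour amounts to (the 2.5%-per-side figure is just an example from the range above, not any particular set's processing):

# Scale a 1080-line frame up by a few percent, then crop the centre 1080
# lines back out -- the 1:1 pixel mapping is lost in the resize.

NATIVE_LINES = 1080
OVERSCAN_PER_SIDE = 0.025                                   # ~2.5% top and bottom

scaled = round(NATIVE_LINES * (1 + 2 * OVERSCAN_PER_SIDE))  # lines after scaling
cropped_per_side = (scaled - NATIVE_LINES) // 2             # lines thrown away top and bottom

print(scaled, cropped_per_side)
# 1134 27 -- roughly the "1080 scaled up to ~1130 then cropped" example above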

155 posts

Master Geek


  Reply # 198571 28-Feb-2009 22:25

Ahh, true.

Still, 102% or 98% would presumably be better than 150%, right?

It definitely complicates things... although, as you say, you may be able to switch that feature off.
