PS3 Scales 1080i To 480p On HDTVs
Dr. Eggman writes "According to an article from IGN, PS3 owners are finding that 1080i-only HDTV sets are scaling down launch games to 480p. The scale-down occurs because the launch games do not support 1080i; however, they should be scaling down to an HD resolution of 720p instead of 480p. It is unknown whether this is a hardware or software issue, or whether it can be patched soon." ABC News is reporting that a patch, which should be available to PS3 owners soon, will correct the backward-compatibility issues we discussed the other day.
Summary is a bit misleading (Score:5, Informative)
Article is unclear (Score:5, Informative)
Some PS3 launch games output at 720p.
A lot of 2-3 year old HDTVs cannot display 720p, but can do 1080i just fine.
But the PS3 is incapable of upscaling a game's graphics to 1080i (unlike the Xbox 360, for example).
Hence, the only resolution available to them is 480p.
To sum up: buy a PS3, hook it up to an HDTV, play in 480p. (Some games, some TVs.)
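The fallback described above can be sketched in Python. This is a hypothetical illustration of the mode negotiation, not Sony's actual firmware logic; the mode lists and function names are made up:

```python
# Hypothetical sketch of the mode negotiation described above: the console
# picks the best resolution that both the game and the TV support, with no
# hardware upscaling of the game's native output.
GAME_MODES = ["720p", "480p", "480i"]     # launch title: no 1080i support
TV_MODES = ["1080i", "480p", "480i"]      # older CRT HDTV: no 720p

def pick_output(game_modes, tv_modes):
    # Without a scaler, the console can only output a mode the game
    # itself renders, so on a 1080i-only set it falls through to 480p.
    for mode in game_modes:               # ordered best-first
        if mode in tv_modes:
            return mode
    return None

print(pick_output(GAME_MODES, TV_MODES))  # -> 480p
```

The Xbox 360 avoids this by inserting "1080i" into the achievable set via its hardware scaler, which is the difference the commenters keep pointing at.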
Re:1080i 720p? (Score:3, Informative)
Summary is wrong (Score:5, Informative)
No, the problem is that they don't support 1080i. The PS3 should be scaling from 720p to 1080i (which the 360 does), not 1080i to 720p.
The issue here is that older HDTVs only support 480p, 480i, and 1080i - not 720p. This is all stated very clearly in the article.
I know that commenters don't seem to read the articles on Slashdot, but shouldn't the submitters?
Re:Summary is a bit misleading (Score:2, Informative)
Re:Reason? (Score:3, Informative)
The 360's scaling is done before the signal is converted to analog. Anyway, if this were actually the problem, Sony could have just made the PS3 capable of scaling up to 1080i only on analog outputs. I don't know of a single HDTV out there with digital inputs that doesn't handle 720p, and there's no question that for games it's better to use 720p than 1080i.
Re:Out of curiosity (Score:3, Informative)
There's some question as to 1080i vs 720p in quality, and I have never heard of a 720i or 480i (although I see no reason why they might not exist).
Re:1080i 720p? (Score:5, Informative)
Sure, it only shows 540 lines "at a time", but the next 540 lines are not the same 540 lines - they're the ones in between the previous 540, making up the full 1080-line display. Your eyes don't work fast enough to see that one is off while the other is on, and the phosphors on the inside of the CRT keep their "glow" long enough to minimize or even eliminate flicker. Non-CRT sets, like DLP/plasma/LCD/D-ILA/SXRD, work a little differently: showing interlaced footage in a progressive manner can lead to visible "combing" unless the set de-interlaces, but I won't go into that here.
1080i has 1080 lines of resolution, but like your old standard-definition television, it refreshes every other line alternately. So the first half of the refresh cycle (1/60th of a second) refreshes lines 1, 3, 5, 7, etc. (field A) and the other half refreshes lines 2, 4, 6, 8, etc. (field B). So while it refreshes 60 times a second, it only shows you 30 full frames.
720p, conversely, shows you 60 full frames of 720 lines in sequence, per second.
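The arithmetic behind those two paragraphs, as a quick sketch (the helper names are mine, just to make the field/frame counts concrete):

```python
# Lines delivered per second and full frames per second for 1080i vs 720p.
def interlaced(lines, fields_per_sec=60):
    # Each field carries half the lines; two fields make one full frame.
    return {"lines_per_sec": (lines // 2) * fields_per_sec,
            "full_frames_per_sec": fields_per_sec // 2}

def progressive(lines, frames_per_sec=60):
    return {"lines_per_sec": lines * frames_per_sec,
            "full_frames_per_sec": frames_per_sec}

print(interlaced(1080))   # {'lines_per_sec': 32400, 'full_frames_per_sec': 30}
print(progressive(720))   # {'lines_per_sec': 43200, 'full_frames_per_sec': 60}
```

So 720p actually pushes more line-updates per second than 1080i, which is part of why it tends to win for fast-moving game footage.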
If it's shot at 1080/30p, it still gets broadcast as 1080i, and you still see 30 full 1080 line frames per second.
If it's shot at 1080/60i, it gets broadcast as 1080i, and you see 60 "half-frames" per second, because the movement of the subject changes between fields A and B.
If it's shot at 720/60p, it usually gets broadcast as 720p, but some stations only broadcast 1080i regardless of source, in which case each set of 720 lines would be interpolated to 60 full frames of 1080 lines, and then only half of each gets broadcast. Still looks great, but it's not as detailed.
If a station broadcasts at 720p regardless of source, it gets a little complicated. 1080i sources are basically converted to 540p and bobbed (field B is shifted up one line so the image doesn't shake up and down), and then stretched to 720p. It retains all the information of the 540 lines, but obviously doesn't have as much detail as 720 lines. Now, if the 1080i source was shot 1080/30p and gets broadcast at 720p, each frame needs to be downsized, and then repeated, to make up the missing 30 frames of the 60p signal.
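The line-count arithmetic in that bob-and-stretch path, as a quick check (variable names are mine; the ratios follow directly from the numbers above):

```python
# Each interlaced field is half the 1080-line frame, bobbed to a
# progressive frame, then stretched to fill the 720-line raster.
field_lines = 1080 // 2          # 540 real lines per field
stretch = 720 / field_lines      # 4/3 vertical scale to reach 720 lines
detail_kept = field_lines / 720  # fraction of a true 720p frame's detail
print(field_lines, round(stretch, 3), round(detail_kept, 2))  # 540 1.333 0.75
```

In other words, a 720p broadcast of 1080i material only carries 540 lines of real vertical detail, which is the "not as much detail" point above.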
Additionally, if a movie comes in as a 1080/24p source, it gets broadcast either as 1080i with 3:2 pulldown, or as 720p with 2:3 frame repetition (3:2 pulldown repeats fields; 2:3 repeats frames), in order to bring it up to 60 fields (for 1080i) or 60 frames (for 720p).
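The 3:2 pulldown cadence mentioned there can be sketched in a few lines (the function name is mine; the cadence itself is the standard one):

```python
# Film frames (24 per second) are spread across 60 interlaced fields by
# alternately holding each frame for 3 fields, then 2 fields.
def pulldown_32(frames):
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeats)
    return fields

film = list("ABCD")   # 4 film frames = 1/6 second of 24p
print(pulldown_32(film))  # -> ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
# 4 frames -> 10 fields, so 24 frames -> 60 fields: one second of 1080i.
```

The repeated fields are also what de-interlacers look for when they "reverse" the pulldown to recover the original 24 film frames.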
Confused yet?
It's not that hard when you understand why it is the way it is.
In the case of the PS3, it's pretty lame that 720p gets converted down to 480p, but since it's a slightly simpler process (1 full frame = 1 full frame vs. 1 full frame = 2 half frames), I can't really blame them for using it on the launch games.
I have an older CRT HDTV that only does 1080i/480p/480i and can't do 720p, so of course I'm disappointed, but all good things to those who wait.
Re:Out of curiosity (Score:3, Informative)
480i is just called either NTSC or SDTV, which is 480 lines interlaced. This is why you never hear it referred to as 480i. The whole naming by scan lines (i.e., 480p, 720p, 1080i/p) is a relatively recent phenomenon, since EDTV (480p) only came out about 10 years ago and called for a new naming system.
720i does not exist, to my knowledge. By the time they got to the 720p standard, there was no reason to go interlaced anymore, because progressive scan will ALWAYS be superior in image quality when displaying in its native format. Now, there is a bit of question as to whether 1080i or 720p is superior. I would say 720p is the safer bet, because interlacing causes a lot of weird flickery effects, and I would be totally up for sacrificing some screen resolution to get rid of those.
Re:Article is unclear (Score:3, Informative)