Talk:Progressive scan
This article is rated Start-class on Wikipedia's content assessment scale.
Twittering image
The twittering is annoying to my eyes, and makes it very unpleasant to even read the first part of the article, since the image stays on your screen unless you make the window smaller. It shouldn't be necessary for people to resize their window in order to read the article comfortably. Can't we do something about it? Make a "hide image" button or similar? — Preceding unsigned comment added by Wanze (talk • contribs) 14:46, 13 April 2011 (UTC)
- Agreed. The reason it was thumbnailed to begin with was that the seizure-causing propensity of the image had already been disputed previously. Someone just reverted it to full size a couple of days ago. —Preceding unsigned comment added by 98.110.37.72 (talk) 22:05, 15 April 2011 (UTC)
"Kell factor"
The current article (as of 27 October 2005) refers to a factor of 0.6 for determining the visual perception of vertical resolution on interlaced displays. Can anyone substantiate this? How precise is it? Some thoughts: Why is it not a factor of 0.5? How much does line blooming interfere with this perception? Is it dependent on the brightness/contrast settings of a particular screen?
- This is called the "Kell factor". I'd read before that it is 0.6 for interlaced displays and 0.9 for progressive. However, I did a Google search to verify this and found all sorts of conflicting info, with no one description predominating. Algr 09:49, 12 January 2006 (UTC)
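The numbers in this exchange can be turned into quick arithmetic (an illustration only; as noted above, the 0.6 and 0.9 figures themselves are disputed):

```python
# Perceived vertical resolution = active scan lines x Kell factor.
# The 0.6 (interlaced) and 0.9 (progressive) values are the disputed
# figures from the discussion above, not established constants.
def perceived_lines(active_lines, kell_factor):
    """Effective vertical resolution as perceived by a viewer."""
    return active_lines * kell_factor

print(perceived_lines(480, 0.6))  # interlaced NTSC: about 288 lines
print(perceived_lines(480, 0.9))  # progressive: about 432 lines
```

This also shows why the question "why not 0.5?" matters: the gap between 288 and 432 effective lines is large enough to be visible.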
Anti-Aliasing
- "This rough animation compares progressive scan with interlace scan, also demonstrating the interline twitter effect associated with interlace. The interlaced images use half the bandwidth of the progressive ones. The left-center image precisely duplicates the pixels of the progressive one, but interlace causes details to twitter. Real interlaced video blurs such details to prevent twitter, but as seen on the right-center image, such softening (or anti-aliasing) comes at the cost of image clarity. A line doubler could not restore the previously interlaced image on the right to the full quality of the progressive image on the left. Note - Because the refresh rate has been greatly slowed down, and the resolution is much lower than that of typical 480-line interlaced video, the flicker in the simulated interlaced portions and also the visibility of the black lines in this image are exaggerated."
- Could someone rewrite this part? The anti-aliasing part seems to generalize a problem with LCD/plasma displays. I am not aware of SD CRTs with anti-aliasing. Also, the "Note" seems a bit odd.
I feel that I am lacking the English skills to rewrite it myself. DCEvoCE (talk) 00:24, 22 March 2008 (UTC)
- His main point is that real footage is always intentionally softened a little to suppress interlace twitter. So twitter is not a big issue, even though the images suggest it is. The NOTE part is not quite right: the author is referring to the artificially slowed framerate as a cause of exaggeration, but it is a still image, so interlace doesn't show up at all. The first image could also be an interlaced one; there is no way to tell except by the comment saying "Progressive Scan". His point about the blocky resolution is valid, though. Evanh (talk) 14:35, 11 July 2008 (UTC)
SCART ?
Does progressive scan work over the SCART Euro connector?
- No. The bandwidth required would exceed what the SCART cable is capable of transmitting. It's also not included in the SCART standard. Even if in theory it were possible (and I think I read that somewhere), you would still need a device capable of outputting a progressive signal via SCART and a display capable of receiving a progressive signal via SCART. --- In my experience, most DVD players just output a composite signal via SCART if 'Progressive Scan' is enabled. DCEvoCE (talk) 00:24, 22 March 2008 (UTC)
- Only true for the SD composite wiring of SCART. There is also the RGB wiring of SCART, which is equivalent to component cabling in quality and capability. So SCART is in theory capable of higher screen modes than standard definition. Evanh (talk) 14:14, 11 July 2008 (UTC)
1080i equal or poorer quality than 720p?
From the article:
" HDTV standards such as 1080i (1920x1080, interlaced) in most cases deliver a quality equal to or slightly poorer than that of 720p (1280x720, progressive), despite containing far more lines of resolution."
I don't think that is an accurate statement.
My experience has been that if you put someone in front of two identical TVs showing the same thing, one in 720p and one in 1080i, people nearly always describe the 1080i picture as the "sharper" one (the exceptions being when there are very thin horizontal lines in the picture, or a small object moving quickly relative to the screen).
- Those are two entirely different things. 1080i has the higher resolution; 720p has the higher frame rate. It also depends a lot on what you are displaying on which display. I think an example might help: take your old CRT VGA monitor and your HDTV CRT capable of 1080i. 240p vs 1080i: which is better? That said, and considering that LCD/plasma displays have to deinterlace all non-progressive signals, I would recommend using 720p, as 1080i might not look as good on such a display. That's probably where the above statement in the article comes from. DCEvoCE (talk) 22:10, 21 March 2008 (UTC)
- They are both the same framerate unless you are talking about pulldown-encoded footage. But then it's not real interlace any longer. Uncompressed, interlaced is superior, period. In compressed form, though, that advantage is reduced. As for deinterlacing, I think only LCDs have to deal with this; there are very good adaptive algorithms that can both effectively motion-blur and also show a still with progressive-like detail, just as analogue CRTs have always been able to. I wouldn't be concerned about this. Evanh (talk) 14:09, 11 July 2008 (UTC)
Progressive Scan NTSC
Is the article implying that HDTV in the US can't support progressive scan at all? If so, this is in error. MSTCrow 10:28, 29 May 2006 (UTC)
- Yes, in fact the opposite is the case: LCD/plasma displays don't support interlaced signals. They have to deinterlace them. And this has nothing to do with whatever region they're from. DCEvoCE (talk) 22:12, 21 March 2008 (UTC)
New sections
I've split the article into sections dealing with progressive scan as an image recording technique and as a display technique. Both have different advantages and issues, and this confusion is the source of much debate and many questions. I hope this clears things up. Ricnun 15:44, 27 October 2006 (UTC)
Could be made clearer
I just don't think that this article really explains progressive scan. Examples and metaphors would be helpful. Also, a quick definition of its opposite (interlacing) might be helpful.
-- trlkly 01:13, 6 August 2007 (UTC)
- I did my best to overhaul this article to provide a clear definition of progressive scan and its applications at present. RyokoYaksa 12:32, 9 August 2007 (UTC)
Requires higher bandwidth?
Requires higher bandwidth transmission rates than interlaced video of the same display resolution. As such, progressive signals require higher-bandwidth mediums in order to work, such as component video, HDMI, and digital broadcasting.
This isn't always true, because progressive material usually uses a lower frame rate. So in the case of 60i vs 24p, it can need even less bandwidth than interlaced.
- This is not how it works in practice. DVD movies are encoded in 24p, but you still can't get progressive output without the use of component video cables. Console video games, for example, have varying framerates of 30 and 60 fps depending on the game. With either framerate, component video cables are still required for progressive scan output. The general lesson here is that the framerate of the material has almost no correlation to the bandwidth it uses when streamed to a display device. RyokoYaksa (talk) 04:06, 28 December 2007 (UTC)
- That is because analog cables always use 50 or 60Hz. But this is just for analog cables between DVD and TV, not digital broadcasting. —Preceding unsigned comment added by 88.101.76.122 (talk) 17:10, 28 December 2007 (UTC)
- -.-; The same concept applies to over-the-air TV broadcasts. It is still a constant stream of data regardless of the framerate of the material. EDTV is not broadcastable by analog television standards, but is broadcast digitally. RyokoYaksa (talk) 17:59, 28 December 2007 (UTC)
- It would be possible to broadcast 25fps progressive in analog and it would need exactly same bandwidth as usual PAL. Just because there is no standard for that doesn't mean it's not possible.--88.101.76.122 (talk) 12:16, 29 December 2007 (UTC)
- 25p progressive has been broadcast for decades by using 2:2 pulldown. Yes, a TV has to be smart enough to detect that the fields belong to the same progressive frame; this is what better TVs are able to do, just as 60i TVs are able to detect 2:3 pulldown and display full 1080p24 out of a 1080i60 video stream when a movie is shown. Mikus (talk) 19:29, 22 January 2008 (UTC)
- I believe 2:3 pulldown converts from 24 to 30 Hz. This suits the 60 Hz NTSC TV rather than the TV changing to 48 Hz to suit 24 Hz film. Also, a display doesn't have to be smart to deal with this encoding. If it's an analogue TV then it just displays the picture as interlaced, due to the pulldown being encoded into the interlaced signal. Scan-converting displays, which include all digital displays, do have to be smart to deal with deinterlacing of true interlaced pictures. Evanh (talk) 05:26, 17 February 2008 (UTC)
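The 2:3 cadence being described can be sketched as a field-repeat pattern (a toy illustration only; real pulldown splits each film frame into odd/even fields rather than repeating whole frames, but the counting is the same):

```python
# Sketch of 2:3 pulldown: four 24 fps film frames (A, B, C, D) become
# ten 60i fields, i.e. five 30 fps video frames, so 24 fps fits 60 Hz.
def pulldown_2_3(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 2 if i % 2 == 0 else 3  # alternate 2 fields, 3 fields
        fields.extend([frame] * repeat)
    return fields

fields = pulldown_2_3(["A", "B", "C", "D"])
print(fields)           # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(len(fields) // 2) # 5 video frames per 4 film frames
```

A "smart" receiver reverses this mapping by spotting which fields share a source frame; 2:2 pulldown is the simpler case where every frame contributes exactly two fields.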
Definition of Progressive Scan appears to be entirely wrong
I was under the impression that "Progressive Scan" as a name was introduced to label the encoding method used with films when converting them to video formats. Either that, or it's just another marketroid term that gets fitted to whatever they want the current trend to be.
With the original non-interlaced video format there was a resolution of around 250 lines per frame. When interlacing was added, the display doubled in resolution without reducing the framerate and without needing any change in bandwidth, while only the smallest tweaking of the video electronics was needed. It was a cheap analogue compression technique.
Video framerates are 50 or 60 hertz. All interlaced and non-interlaced video transmissions are at these framerates. Proper interlacing (i.e. non-progressive) encodes both time and space in every field and cannot be shown as a combined freeze frame.
Progressive scan performs a useful function when converting analogue film, since film is shot at the pedestrian framerate of 24 hertz. There is the opportunity to fiddle the encoding of an interlaced display to give an improved freeze frame by treating the two fields of the interlaced display as a single frame at 25 or 30 hertz. This is simply done by encoding only spatial data in the second field, leaving out any time component, since it didn't exist in the master film.
With digital compression, which replaces the reason for interlacing in the first place, and the new HDTV specs and displays to match, interlacing will never be used again. As such, progressive, which relies on the interlacing system, is also an out-of-date term relegated to history.
- Where did you get that? Yes, it would be a great thing, but it is not true. TV is broadcast interlaced in DVB too.--88.101.76.122 (talk) 12:25, 26 January 2008 (UTC)
- That is the reason interlacing existed - as a compression technique. So, it stands to reason that digital compression supersedes interlacing. When comparing two recordings at the same framerate the improvement in detail gained by interlacing is lost in the reduced effectiveness of the digital compression across alternate frames. Evanh (talk) 03:44, 27 January 2008 (UTC)
- You don't understand me. Interlacing IS used in DVB broadcasts. You are right, it is no longer necessary with digital broadcast, but you are wrong when you say it is no longer used. It is.--88.101.76.122 (talk) 11:45, 28 January 2008 (UTC)
- I can think of one reason why some studios might prefer interlaced encoding. For professional recording/editing, where the image is not stored with lossy compression, there is a distinct size advantage in using interlaced. The downside is that interlaced is more difficult to edit, though, and as capacities improve there will be less incentive to save space. Maybe it'll continue to be used, maybe it won't. Evanh (talk) 05:28, 17 February 2008 (UTC)
- No, there is no advantage. For 50 Hz countries it is the same size. At 60 Hz, uncompressed interlaced is even 3/2 the size of progressive. AFAIK, no one uses 50/60 Hz progressive, only 25/24 fps progressive. TV series are originally shot 24 fps progressive but broadcast 50/60 fps interlaced.
- That's true except you have halved the framerate and are no longer talking about an interlaced picture, even for PAL/NTSC on an analogue TV. That's part of my whole argument - that the above method of encoding is the true progressive scan system. The HDTV relabeling was a bad move. That said, what's done is done. Making clear the generational change is the best thing to do here. Evanh (talk) 06:40, 21 February 2008 (UTC)
- Huh? Of course I'm not talking about an interlaced picture when I'm talking about progressive. WTH did you want to say with that? —Preceding unsigned comment added by 88.101.76.122 (talk) 10:01, 21 February 2008 (UTC)
- Then that's the worst of both worlds: a DVB transmission using interlaced, then only feeding it pulldown-encoded video. —Preceding unsigned comment added by 203.97.117.35 (talk) 11:32, 21 February 2008 (UTC)
- I just noted that 1080p50 is not a currently supported screen mode. That little detail right there will be the sole reason why 1080i50 (1080p25) is in use. This appears to be due to MPEG-2 being the compression of choice for the moment. One assumes this will change in the future, along with better efficiencies at higher framerates. Evanh (talk) 13:40, 6 May 2008 (UTC)
As a side note, I believe "progressive" may have also been used to refer to the equivalent behaviour when loading large web-page images over slow links, where the image is spatially transferred in an even manner, resulting in low detail at first, with the finished image in much higher detail.
Evanh (talk) 13 January 2008 (UTC)
Applying the above to the example images at the top of the main article, all four examples can be from the same progressive scan source. The difference is the display itself:
- In the first picture the display has a higher bandwidth than the transmission and is configured for progressive scan decoding by assuming the second field is of the same time period as the first field.
- In the second picture the display is operating at the transmission rate and visually operates the same as the first picture when the source is progressively encoded. Although, the example is a still image so it looks the same even if the source was true interlaced.
- In the third picture the display is no different to the second. The difference is the source has been modified. Maybe a softening effect.
- In the fourth picture the display has higher bandwidth like the first but it's not configured for progressive scan decoding, and therefore is simply deinterlacing at full framerate by only working with a single field at a time. A relationship between the two fields can not be assumed with true interlacing.
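The display behaviours in the four points above boil down to two field-handling strategies; here is a toy sketch using lists of rows as stand-ins for scan lines ("weave" and "bob" are common deinterlacer names, my labels, not terms from this discussion):

```python
# "weave": combine the two fields of one frame into a full-height
# picture -- valid when the source is progressively encoded (pictures
# one and two). "bob": use a single field at a time, line-doubled --
# the safe fallback for true interlace (picture four).
def weave(odd_field, even_field):
    """Interleave two fields into one full-height frame."""
    frame = []
    for o, e in zip(odd_field, even_field):
        frame.extend([o, e])
    return frame

def bob(field):
    """Double each field line to fill the frame height."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame

odd = ["row0", "row2"]
even = ["row1", "row3"]
print(weave(odd, even))  # ['row0', 'row1', 'row2', 'row3']
print(bob(odd))          # ['row0', 'row0', 'row2', 'row2']
```

Weaving a true interlaced source would combine fields from two different moments in time, which is exactly why the fourth picture's display cannot assume a relationship between the fields.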
Evanh (talk) 05:39, 14 January 2008 (UTC)
To summarise: progressive is explicitly not sequential. The existing definition is that of sequential scan and should have its own page.
Evanh (talk) 12:22, 14 January 2008 (UTC)
- Evan, what you are talking about in "This is simply done by encoding only spatial data in the second field, leaving out any time component since it didn't exist in the master film" is called Progressive segmented Frame; this is a way to record progressive video using interlaced equipment and media. —Preceding unsigned comment added by Mikus (talk • contribs) 19:24, 22 January 2008 (UTC)
- Interesting, I just had a peek at that link; it appears to reinforce my point about the definition of progressive changing (it says the term was introduced in the late nineties). Reading further, it goes on to link to 2:2 pulldown, which appears to be the original name. I think some clarification and links to both these articles should be added to both the interlaced article and the progressive article. That way, there will be far less confusion about what progressive now means. Evanh (talk) 08:53, 26 January 2008 (UTC)
Evanh, you haven't provided any evidence for your argument. You simply assert a certain meaning for "progressive" ("representing a sequential image as two interlaced fields") and then assume that your definition is correct for the rest of your post. Even the progressive segmented frame article uses phrases such as "PsF allows the progressive format to be maintained", implying that "progressive" == "sequential". – Smyth\talk 11:58, 26 January 2008 (UTC)
- I acknowledged that, as of the "late nineties", progressive is now defined the same as sequential. Is that not okay? Obviously, the 2:2 pulldown encoding existed long before that date. I certainly knew it as "progressive scan" back then. The correct term appears to be 2:2 pulldown. And some references to this might help with all the confusion I've seen, where people keep asking about the difference between interlaced and progressive DVDs and TV broadcasts. Evanh (talk) 02:52, 27 January 2008 (UTC)
- PS, Please leave the disputed tag in place. It's needed so more people can address the issue. Evanh (talk) 03:44, 27 January 2008 (UTC)
- In addition to the confusion of naming: the practical difference of the pulldown encoding is that it halves the recorded framerate, given that the bandwidths of PAL/NTSC/DVD are fixed at the one rate. This is fine for film conversions, but it's a backwards step for TV/video, which appear to be making use of it. This has not been identified in the disadvantages section. Evanh (talk) 10:04, 27 January 2008 (UTC)
A disadvantage of 2:2 pulldown, compared to what?
- Compared to PAL/NTSC interlaced. The disadvantage is that 2:2 pulldown is half the framerate of interlaced. Pulldown encodes to 25/30 Hz via the 50/60 Hz interlaced transmission.
- This article is not about pulldown; it is about progressive scan. Progressive scan does not have to be transmitted at half the rate of interlaced scan; in fact it does not have to relate to interlaced scan at all, like 720p60. The scanning rate itself can be any rate you like. Try thinking in abstract terms. As a side note, I think there should be different terms for "scan" as representation on a display device and "scan" as transmission. These were the same thing for analog TVs, but they are different things now. Mikus (talk) 07:26, 18 February 2008 (UTC)
- Oops, missed your response when I was still composing further down. Yes, you've highlighted a good point there. There are three distinct areas: recording/editing, distributing/transmitting, and finally displaying. HDTV labeling of interlace and progressive only deals with displaying. And more specifically, display after an up/down conversion, i.e. converted to the display's native resolution, which puts us right back at reuse of the terms for exclusive naming of screen modes. A sorry state of affairs, imho.
- You mention that HDTV's interpretation does not have to reduce framerate or resolution to achieve the same 720p60 as 720i60. That's true, but it's also not a fair comparison, as it requires a connection with double the uncompressed bandwidth of the interlaced version to achieve it. Granted, this is only talking about the connection between the decoder and the scan converter, i.e. the HDMI connection, but it's still a fact. Best to keep the apples and oranges in their own baskets. This is also why 720p60 gets compared to 1080i60: they are equivalent uncompressed bandwidth. Evanh (talk) 13:16, 19 February 2008 (UTC)
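The "equivalent uncompressed bandwidth" claim in this exchange can be checked with back-of-envelope pixel rates (a rough sketch counting luma samples only, ignoring blanking intervals and chroma subsampling):

```python
# Uncompressed sample rate = width x lines-per-pass x passes-per-second.
# 1080i60 delivers 60 half-height (540-line) fields per second, while
# 720p60 delivers 60 full 720-line frames per second.
def pixel_rate(width, lines_per_pass, passes_per_second):
    return width * lines_per_pass * passes_per_second

p720 = pixel_rate(1280, 720, 60)   # progressive: full frames
i1080 = pixel_rate(1920, 540, 60)  # interlaced: half-height fields
print(p720, i1080)  # 55296000 62208000 -- the same ballpark
```

The two figures are within about 13% of each other, which is why 720p60 and 1080i60 get treated as bandwidth peers in this discussion.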
- I agree that the article has to emphasize that the whole interlace deal was a result of cramming more lines and a faster refresh rate into limited bandwidth, and the compromise known as interlacing was made. Now we have to deal with the same channel bandwidth set half a century ago, because broadcasters did not want to lose the number of channels. If bandwidth were not an issue, all TVs would have been progressive from day one. In an abstract world with no computing and bandwidth limitations, progressive scan is better than interlaced in each and every regard. Mikus (talk) 19:13, 19 February 2008 (UTC)
- For PAL/NTSC, it was a bit more than the number of channels locking things down. Compatibility was a huge issue for a long time. And they were able to keep notching up the visual quality over the years anyway. After that it seems there was no incentive to modify analogue instead of going digital. Evanh (talk) 11:43, 20 February 2008 (UTC)
Incidentally, if you agree that the word "progressive" does not usually now refer to 2:2 pulldown, then perhaps you should take this discussion to another page. – Smyth\talk 12:34, 30 January 2008 (UTC)
- No better place. It will serve as a heads-up for all those with the same confusion. I need to think up suitable additions to add to this article and the interlaced article then remove the disputed tag.
Let's start from the beginning.
Evan, you are saying that sending two movie half-frames as two video fields constitutes a progressive signal. Not exactly. Devices chained between a TV station and a TV set do not care what sort of signal they transmit; well, yes, it is 50/60 Hz consisting of two fields, but what exactly these fields represent is beyond them. A display device like a TV set can be progressively scanned or interlaced. CRTs and ALiS panels are natively interlaced; LCD, plasma and projection displays are natively progressive. AFAIK, there is no extra information in the video stream that tells a display device whether the transmitted video is supposed to be interlaced or progressive when interlaced transport is used, so the display device does its own analysis; this is why we have a new market for vendors like Faroudja or Silicon Optix or DVDO. If a display recognizes a signal as progressive, it will display it at once as a full frame, preserving resolution; if not, then it will treat it as interlaced. That is it.
- Except that when using two fields to convey a single picture, you've lost framerate. That's a step backwards.
- I don't get what you are arguing with. Before you said that this sort of transmission is progressive. Now you are saying that this is a step backwards. These are orthogonal statements, moreover, there is no step backwards. This sort of transmission makes sense for stations that broadcast in interlaced, it is a smart use of existing technology. If you want 60p, use 1280x720 until technology for broad use of 1080p60 emerges. —Preceding unsigned comment added by Mikus (talk • contribs) 07:31, 18 February 2008 (UTC)
- I'm talking about using two fields to make one picture. That halves the framerate compared to interlaced at the same uncompressed bandwidth. This is a step backwards. This is why people are confused about progressive, because it is inextricably linked to this encoding. Reusing the progressive label for naming HDTV screen-modes can only have been a deliberate obfuscation, imho.
- Another point: interlacing existed for its advantage in lossless storage/transmission. Linking it to a screen-mode naming scheme is also a twist on that one. All in all, it's a complete relabeling that is separate from pre-HDTV days. I'm getting a better understanding now, thanks. Evanh (talk) 08:39, 18 February 2008 (UTC)
On the other hand, there are ways to transmit a truly progressive signal, one that does not have fields in it, for example 720p60. I don't know whether natively progressive transmission exists for standard def.
- Yep, it's half the line count.
So what you call a "progressive signal" should be called something like "video originated as progressive, transmitted using interlaced transport, preserving its progressive nature, and intended for recovery back to progressive by a receiver". But if the receiver fails to recognize what you call progressive as progressive, the whole scheme falls apart. This scheme is called PsF for high-def applications; Sony also calls it progressive scan when it means sending progressive via interlaced equipment. This sort of field "packaging" is also called 2:2 pulldown. It might have been called progressive before HDTV with its native progressive recording and transmission capabilities appeared, but strictly speaking 2:2 pulldown is not equal to progressive scan. Mikus (talk) 19:13, 5 February 2008 (UTC)
- Thanks for the input and acknowledgment. Evanh (talk) 06:01, 17 February 2008 (UTC)
I would like to know what progressive scan means in regard to the sticker you see on DVD players. The DVD player is scanning the audio/video data from a DVD and then outputting that to the television. I don't see how all this talk about broadcasting and DTV etc. is even relevant when you are using a DVD player with a TV. It has nothing to do with the broadcast signals, does it? Also, an explanation from a consumer's perspective would be nice, since it appears that the manufacturers are putting these stickers on the front of their units and on the boxes to make the DVD players appear more feature-packed and better value to the consumer. Is there any benefit to buying a DVD player that has progressive scan over one that does not? What is the benefit? What does it do? What end benefit or detraction will the user experience? How do I follow this thread? --Avrfan (talk) 11:55, 22 April 2008 (UTC)
- I doubt it has any additional meaning; it's just using the new buzzword because it can also apply to anything using the old pulldown encoding. DVDs have no need for a screen mode of, say, 576p50, because DVDs are either true interlaced or operating at half-framerate progressive over interlace. As a side note, 576p25 is a misnomer; it's really 576i50 with pulldown encoding, i.e. the original progressive. All that said, it is possible that such a player could produce a 576p50 signal on its component outputs (or HDMI) even though DVDs don't use it. The player would perform a scan conversion from 25 Hz to 50 Hz on the fly to achieve this. Evanh (talk) 13:22, 6 May 2008 (UTC)
- Actually, it really is pointless for a DVD player to support higher than standard definition. If a screen can go higher, then it'll have all the fancy conversion hardware already built in to make the picture as good as you can get from a DVD. Evanh (talk) 14:53, 11 July 2008 (UTC)
- Actually, that isn't true. If a DVD player playing (say) 576i material were to have the output upconverted to 1080i and then fed to a 1920x1080 display, the perceived image is much sharper than if the 576i output were to be fed to a 720x576 display. Although this appears to be counter intuitive there are good technical reasons why this is so. I am not about to embark on an explanation here because it would require a large article on its own, but part of the explanation is because not all of the 720x576 pixels are presented to the viewer and thus interpolation takes place. 86.143.181.133 (talk) 17:15, 2 August 2009 (UTC)
- Lol, what a bullshit response! Up-converting in the display and up-converting in the player are the same thing. A higher-res display will do its own up-converting. Buying a DVD player just because it can do that function is wasted money, and it is quite pointless for the DVD player itself to do. Evanh (talk) 09:08, 12 December 2010 (UTC)
- I just don't get why you guys keep insisting that 576p would be "HD". All my SD DVDs that I author are 576p (when PAL), both before and after authoring and burning. The information is stored progressively as 576 lines on the DVD; it's only the standalone player that, on its analogue SD outputs, interlaces the signal on-the-fly by halving the spatial resolution and doubling the temporal resolution. I've never had a problem with my setup consisting of progressive disks, standalone players, and CRT monitors.
- When a standalone DVD player is "progressive", it either just outputs exactly what's on a progressive disk, or deinterlaces on-the-fly with an interlaced disk. The result is clearly still SD in either case. The only benefit is if you have a progressive monitor (which won't have to deinterlace the footage if your player already does it). AFAIK, a CRT might have trouble displaying such a genuinely progressive signal.
- Or are you guys saying that I'm "up-converting" my SD files into HD when capturing my old interlaced VHS tapes and then deinterlace them before authoring? Then what about those of my tapes that ain't even interlaced to begin with? Are they "HD"? :P Of course, all this applies to PAL, as we're talking about 576p.
- In any case, I guess the confusion stems from the fact that progressive scan can have four different meanings down the delivery chain:
- a.) Progressive vs. interlaced imaging/capturing/sampling in the camera (or telecine unit? not quite sure if the telecine unit should rather be filed under b.) below, as telecine is really a transitional process between a.) and b.) where an originally fixed medium, i. e. film, is sampled/captured again).
- b.) Progressive vs. interlaced storing of the footage on a medium.
- c.) Progressive vs. interlaced signal transmitted via a cable or over-the-air.
- d.) Progressive vs. interlaced displays.
- Every step of the way can be combined with other types (progressive or interlaced) done on the other steps. Anybody wanna do the math on how many combinations are possible? Even more combinations can be added when you realize that there's two different ways of "interlacing" a progressive(ly originated) image: a.) Both fields are taken from the same frame, and b.) One field is from one frame, and the other field is from either the next or the prior frame.
- Interestingly, when displayed on a CRT (or on a progressive monitor with spatial interpolation), this artificially "interlaced-in-post" footage can't be told apart from footage that's really still progressive, because the specific amount of motion blur of the original capturing process (see a.) in the bulleted list above) is not affected by this artificial post-production interlacing done for/in steps b.) through d.). It's this specific amount of motion blur that's part of the fabled "film look" that people intend to achieve with progressive imaging, as interlaced imaging in the camera differs from its progressive counterpart by having less motion blur due to a faster shutter speed, i.e. 50/60 fields per second instead of 24/25 frames per second.
- Too much motion blur, on the other hand, is also bad, as you can see when temporally deinterlacing (by field-blending) progressively originated footage that has been interlaced-in-post. The ideal amount of motion blur for both progressive and interlaced capturing results from an exposure time equal to the reciprocal of twice the rate you're shooting at, i.e. 1/48 s for 24p, 1/50 s for 25p, 1/100 s for 50i, and 1/120 s for 60i. That way, you'll have the ideal amount of motion blur when doing a "poor man's progressive scan" by field-blending your interlaced-originated footage. It's also why the standard shutter angle with a movie camera for chemical film is 180° (as 360 / 2 = 180). Shutter speeds deviating from this rule are only for specific artistic effect, or for emergencies when you have a very poor lighting situation and your lowest f-stop just isn't enough. --87.151.17.65 (talk) 01:34, 5 August 2012 (UTC)
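The shutter rule described above can be sketched in a couple of lines of Python (the function name `ideal_shutter_speed` is my own, purely illustrative):

```python
def ideal_shutter_speed(rate_hz: float) -> float:
    """Exposure time in seconds for a given capture rate
    (frames/s for progressive, fields/s for interlaced),
    per the 180-degree shutter rule: 1 / (2 * rate)."""
    return 1.0 / (2.0 * rate_hz)

for label, rate in [("24p", 24), ("25p", 25), ("50i", 50), ("60i", 60)]:
    print(f"{label}: 1/{2 * rate} s")  # e.g. 24p -> 1/48 s, 50i -> 1/100 s
```

The loop reproduces the four figures given in the comment above.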
Seizures
The seizure-causing propensity of this page is disputed... --Elindstr (talk) 04:01, 22 September 2008 (UTC)
I agree. That image should be behind a link with a warning. It could really take people who are sensitive to that off guard. --AnalogWeapon (talk) 19:50, 28 October 2009 (UTC)
I Don't Understand Something.
Sorry, but when the second paragraph starts with "This system...", which 'SYSTEM' is being referred to? Just prior to that, the article was talking about the "interlaced" system at the end of the first paragraph, and yet the article itself is in fact about "Progressive Scan". So is the author referring to the article's subject matter, or to what had just been written about, with "This" at that juncture?
Again, I apologise for sounding so darn pedantic, but I honestly do not know what to infer and am just trying to set up my plasma TV with my new Blu-ray recorder. Naturally I want to get the best bang for my buck. So thanks most genuinely to any good soul who might be able to help out.
PS. Believe it or not, I used to be the manager of a large TV/video store some years ago, but things have changed a lot in the time since I 'got out of the game', so to speak. I am, therefore, not a complete novice, but this is one I'd like to learn more about, for future reference and just plain old knowledge, to gain a deeper understanding and try to keep up to date. I often find myself being asked by family and friends for advice and would certainly hate to lead anyone astray with a half-baked understanding.
Thanks one and all, TTFN!
Outofthewoods (talk) 14:43, 20 June 2009 (UTC)
- The problem is that there are two separate definitions of "progressive", and this is not made clear in the main article. One definition is how an interlaced signal can be utilised as two halves of a single frame to make one still picture. The other definition is the names of HDTV screen modes. Evanh (talk) 07:35, 12 December 2010 (UTC)
- Although, as I've recently discovered, the HDTV names might be only that: just names. The distinction between interlaced and progressive for HDTV might still be framerate, not resolution, just as it was in PAL/NTSC days: interlaced being used for 50/60 Hz footage while progressive gets used for 25/30 Hz footage. In which case the main article is pretty mindless. Evanh (talk) 07:35, 12 December 2010 (UTC)
Benefits of Progressive Scan
I see an odd statement in this article. In the "benefits of progressive scan" section it says, "Higher vertical resolution than interlaced video with the same frame rate." This is just flat-out false. I have checked older versions of this article; it has been part of the article for a few years, but it has never been substantiated with a source. There is no vertical resolution increase, either real or perceived. I think it's time this gross misunderstanding went away.--68.32.17.238 (talk) 21:13, 6 September 2010 (UTC)
- Correct. Most of what's in the article is marketing hype, and some is plain wrong. The big feature that interlacing added was a doubling of vertical resolution while maintaining the same 50/60 Hz framerate. Sadly, progressive encoding has taken over (been heavily promoted), which halved the framerate down to 25/30 Hz. Evanh (talk) 07:08, 12 December 2010 (UTC)
- I was playing with an HDTV recording (from my PVR) on my PC just today (prompting me to revisit this page), and it dawned on me that the same encoding systems are still in play. I realised that interlaced/progressive encodings are still arbitrarily interchangeable on a field-by-field basis, exactly as they always were with PAL/NTSC. I bet the same syncs are still embedded in the HD video streams. Evanh (talk) 07:08, 12 December 2010 (UTC)
- I'm going to remove the statement. It's been there for too long. If anyone has any problem, then state your case. --68.32.17.238 (talk) 05:30, 19 January 2011 (UTC)
- You're wrong, it's interlacing that halves the resolution, exactly because each field contains only half the spatial resolution of a progressive frame (interlace = same image area, but only half the spatial sampling frequency). Deinterlacing is a process whereby the algorithm tries to "guess" the spatial information that's missing exactly because of the interlacing process. What's doubled with interlacing is temporal resolution, especially so because progressive 50/60 Hz modes are almost nonexistent when it comes to capturing techniques, i.e. what (both analogue and digital) cameras and telecine units are set to.
- It's exactly why the German term for a field (and it was the Germans at Telefunken who invented interlacing in 1930 or 1931) literally translates to "half-frame", because that's just what it is: half a frame, i.e. only half the spatial resolution of a progressive frame at the same line resolution ("same line resolution" when one progressive frame is compared to two interlaced fields added together). Interlacing has twice the temporal resolution at the same line resolution, but only half the spatial resolution of progressive imaging.
- CRTs tried to make up for the spatial information lost due to interlacing by dumb line-doubling (i.e. effectively halving the "half-frame" into a quarter-frame!), and then alternately displaying the top and the bottom field at a 50/60 Hz rate. You could see that quite clearly with CRTs and VCRs when you had a scene of interlaced footage with fast motion in it and hit pause. The result was a strong flicker in the image areas with motion in them, where the picture bounced back and forth, because the CRT was alternately displaying the top and bottom field (yes, there was also flicker due to bad tracking, but that's a different issue).
- So with a CRT, 480i really means 120 effective lines, and 240 effective lines on a progressive display (on which it must then be deinterlaced with a surely more sophisticated process than dumb line-doubling, such as spatial interpolation or temporal blending). On the other hand, 480p means 240 effective lines on a CRT, and really 480 effective lines on a progressive display. The same goes for PAL; I just used NTSC figures here because they're a bit easier to do the math with. So, progressive = double the spatial resolution compared to interlaced, no matter what type of display.
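The line-count arithmetic in this thread can be illustrated with a toy Python model in which a field is just a list of scanlines (all names here are made up for illustration; this is a sketch, not any player's actual algorithm): weaving two fields recovers every distinct line, while dumb line-doubling of a single field produces the same line count but only half the distinct detail.

```python
def weave(top_field, bottom_field):
    """Interleave two fields into one full frame (detail of both fields)."""
    frame = []
    for top, bottom in zip(top_field, bottom_field):
        frame.extend([top, bottom])
    return frame

def line_double(field):
    """Dumb line-doubling: repeat each line; no new detail is created."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame

top = [f"T{i}" for i in range(240)]     # 240 odd scanlines (NTSC-ish)
bottom = [f"B{i}" for i in range(240)]  # 240 even scanlines

woven = weave(top, bottom)
doubled = line_double(top)
print(len(woven), len(set(woven)))      # 480 lines, 480 distinct
print(len(doubled), len(set(doubled)))  # 480 lines, only 240 distinct
```

Of course, real weaving is only artifact-free when both fields come from the same instant in time, which is the crux of the whole debate above.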
- Anyway, I originally came here for a different reason. Why the heck is it called "progressive" anyway? That information's still missing (interlaced? :P) from the article. When I first came upon progressive scan as a marketing term around 2000, it was used for mid-range aka prosumer camcorders with a variable electronic shutter to keep computer monitors in the background from flickering. --87.151.17.65 (talk) 00:16, 5 August 2012 (UTC)
See Also 1440p
- There is no standard resolution called "1440p". We can assume they mean WQHD, but I don't think this page should reinforce the incorrect practice of adding "p" to the end of arbitrary pixel counts and calling them a resolution. (This is in regard to the see-also links at the bottom of the page.) 69.245.62.159 (talk) 15:27, 7 September 2012 (UTC)
Line Doublers ..?
In section 1, 'Example of Interline Twitter', the term line doublers is used without further explanation or hyperlink. Could somebody please add an explanation of this term, or a link? --86.159.93.132 (talk) 10:11, 16 November 2014 (UTC)
Image Order?
'On the left there are two progressive scan images. In the middle there are two interlaced images and on the right there are two images with line doublers ... A line doubler shown in the bottom center picture cannot restore the previously interlaced image'
I am extremely confused by this: if the 'line doubled' examples are only shown in the right-hand column, then the image at bottom-center cannot be line-doubled ...?
@ 2A01:5EC0:7807:E1AC:1:0:A243:79B4 (talk) 02:59, 7 January 2024 (UTC)