Multisync monitor


A multiple-sync (multisync) monitor, also known as a multiscan or multimode monitor, is a raster-scan analog video monitor that can properly synchronise with multiple horizontal and vertical scan rates.[1][2] In contrast, fixed frequency monitors can only synchronise with a specific set of scan rates. They are generally used for computer displays, but sometimes for television, and the terminology is mostly applied to CRT displays although the concept applies to other technologies.

Multiscan computer monitors appeared during the mid-1980s, offering flexibility as computer video hardware shifted from producing a single fixed scan rate to multiple possible scan rates.[3] "MultiSync" specifically was a trademark of one of NEC's first multiple-sync monitors.[4]

Computers

History

Early home computers output video to ordinary televisions or composite monitors, utilizing television display standards such as NTSC, PAL or SECAM. These display standards had fixed scan rates, and only used the vertical and horizontal sync pulses embedded in the video signals to ensure synchronization, not to set the actual scan rates.

Early dedicated computer monitors also often relied on fixed scan rates. IBM's original 1981 PC, for instance, was sold with a choice of two video cards (MDA and CGA), each intended for use with a matching IBM monitor that likewise used a fixed scan rate. The CGA timings were identical to NTSC television, whereas the MDA card used a custom timing with a higher resolution to provide better text quality. Early Macintosh monitors also used fixed scan rates.

In 1984, IBM's EGA added a second resolution which necessitated the use of a monitor supporting two scan rates, the original CGA rate as well as a second scan rate for the new video modes.[5] This monitor as well as others that could be manually switched between these two sync rates were known as dual-scan displays.[6]

The NEC Multisync was released in 1985 for use with the IBM PC, supporting a wide range of sync frequencies including those for CGA, EGA, various extended forms of those standards marketed by third party vendors, and standards yet to be released.[4]

IBM's 1987 VGA standard, in turn, expanded to three fixed scan rates. At this point, PC and Mac owners with multiple graphics cards required unique monitors for each of them,[7] and by the late 1980s all of the below computer video standards required monitors which supported a small number of specific frequencies:

  1. PAL, NTSC, CGA: ~15.7 kHz horizontal scan, 50 or 60 Hz vertical scan
  2. EGA: 15.7 kHz (CGA compatible mode) or 21.8 kHz horizontal scan, 60 Hz vertical scan
  3. VGA: 31.5 kHz horizontal scan, 60 or 70 Hz vertical scan. No support for CGA/EGA timings; CGA/EGA resolutions are transmitted to the monitor at VGA-compatible timings.
  4. XGA: 35.5 kHz horizontal scan, 87 Hz (43.5 Hz interlaced) vertical scan (plus VGA modes)
  5. Many different display formats for Macintosh, Sun, NeXT, and other microcomputers

After 1987's VGA, the IBM-compatible market began to develop Super VGA cards that used many different scan rates. VESA's VBE established standardized methods for outputting many different resolutions from one card, and the later Generalized Timing Formula permitted graphics cards to output arbitrary resolutions.

By the late 1990s, graphics cards for microcomputers were available with specifications ranging from 1024x768 at 60 Hz to at least 1600x1200 at 85 Hz.[8] In addition to these higher resolutions and refresh rates, during system boot on systems like the IBM PC the display would operate at a standard low resolution, such as the PC standard of 720x400 at 70 Hz. A monitor capable of displaying both the boot mode and the higher resolutions would need to scan horizontally across a range from roughly 31 kHz to 68 kHz or more.
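The horizontal scan rates quoted above follow from simple arithmetic: active lines times vertical refresh, plus the lines spent in vertical blanking. A rough sketch of that calculation (the ~5% blanking overhead is an assumed round figure; real modes vary, e.g. the 720x400 text mode actually scans at 31.47 kHz because its blanking is larger):

```python
# Rough estimate of the horizontal scan rate a mode demands of a monitor:
# active lines x vertical refresh, inflated by an assumed vertical
# blanking overhead. Real modes use mode-specific blanking, so treat the
# results as ballpark figures only.
def horizontal_khz(active_lines, vertical_hz, blanking_overhead=0.05):
    total_lines = active_lines * (1 + blanking_overhead)
    return total_lines * vertical_hz / 1000.0

# 720x400 at 70 Hz boot mode: ballpark of the ~31 kHz VGA figure
print(round(horizontal_khz(400, 70), 1))
# 1024x768 at 85 Hz: ballpark of the ~68 kHz upper end quoted above
print(round(horizontal_khz(768, 85), 1))
```

This is why a monitor covering both the boot mode and a high-refresh desktop mode needs a wide continuous horizontal range rather than a few fixed rates.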

In response, VESA established a standardized list of display resolutions, refresh rates, and accompanying timing for hardware manufacturers.[9] This was superseded by VESA's Generalized Timing Formula, which provided a standard method to derive the timing of an arbitrary display mode from its sync pulses,[10] and this in turn was superseded by VESA's Coordinated Video Timings standard.
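The idea behind GTF can be illustrated with a heavily simplified sketch: given only the active line count and target refresh rate, derive a plausible total line count (and thus horizontal frequency). The two constants used here are genuine GTF baseline minimums, but the real formula also computes horizontal blanking, sync widths, and the pixel clock, all omitted here:

```python
# Simplified sketch of the idea behind VESA's Generalized Timing Formula:
# derive a total line count for an arbitrary mode from its refresh rate.
# Not the full formula -- only two of GTF's baseline constants are used.
MIN_VSYNC_BP_US = 550.0    # GTF minimum vertical sync + back porch time (us)
MIN_V_FRONT_PORCH = 1      # GTF minimum vertical front porch, in lines

def estimate_total_lines(active_lines, refresh_hz):
    # First-cut line period, pretending there is no vertical blanking yet
    h_period_us = 1e6 / (refresh_hz * active_lines)
    vsync_bp_lines = round(MIN_VSYNC_BP_US / h_period_us)
    return active_lines + MIN_V_FRONT_PORCH + vsync_bp_lines

total = estimate_total_lines(768, 60)   # 794 lines for 1024x768 at 60 Hz
print(round(total * 60 / 1000, 2))      # horizontal frequency in kHz
```

Even this crude version lands close to GTF's published ~47.7 kHz horizontal rate for 1024x768 at 60 Hz, which is the point of the standard: both the graphics card and the monitor can reconstruct the same timing from a handful of parameters.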

Implementation

Early multisync monitors, designed for systems that used a small number of specific frequencies (such as CGA, EGA and VGA, or built-in Macintosh graphics), supported a limited set of fixed frequencies. On the IBM PC, the desired mode was signaled from the graphics card to the monitor through the polarities of one or both of the H- and V-sync signals.[5]
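As an illustration of this polarity-based signaling, the commonly documented VGA scheme used the combination of horizontal and vertical sync polarities to tell the monitor how many active lines to expect, so it could set vertical size before the picture appeared (the table below reflects that scheme; treat the exact mapping as indicative rather than authoritative):

```python
# Hedged illustration of VGA-era mode signaling: the (H, V) sync polarity
# pair told a dual/multi-frequency monitor the active line count of the
# incoming mode. "+" = positive-going sync pulse, "-" = negative-going.
VGA_POLARITY_TO_LINES = {
    ("+", "-"): 350,   # e.g. EGA-compatible 640x350 modes
    ("-", "+"): 400,   # e.g. the 720x400 text mode
    ("-", "-"): 480,   # e.g. 640x480 graphics
}

def expected_lines(h_polarity, v_polarity):
    # Returns None for combinations the scheme did not define
    return VGA_POLARITY_TO_LINES.get((h_polarity, v_polarity))

print(expected_lines("-", "-"))  # 480
```

A monitor built this way needs no frequency-measuring circuitry at all; it simply switches between a few preset deflection settings based on two logic levels.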

Later designs supported a continuous range of scan frequencies. The NEC Multisync, for example, accepted horizontal scan rates from 15 to 31 kHz,[4] derived from the timing of the sync signals rather than their polarity.[11] Displays like these could be used with multiple platforms and video cards as long as the frequencies were within range.

Modern monitors produced using the VESA frequency standards generally support arbitrary scan rates between specific minimum and maximum horizontal and vertical rates. Most modern multiscan computer monitors have a minimum horizontal scan frequency of 31 kHz.[12]

In both multisync and fixed-sync monitors, timing is important to prevent image distortion and even damage to components.[13] Most modern multiscan monitors are microprocessor controlled[14] and will refuse to attempt to synchronise to an unsupported scan rate, which usually protects them from damage.
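The protection logic described above amounts to a range check before the deflection circuits are driven. A minimal sketch, assuming typical late-1990s PC monitor limits (the specific numbers below are illustrative, not from any particular model):

```python
# Sketch of the protection logic in a microprocessor-controlled multiscan
# monitor: measure the incoming sync frequencies and refuse to sync when
# either falls outside the supported range. The ranges are assumed,
# typical figures for a late-1990s PC monitor.
H_RANGE_KHZ = (31.0, 96.0)   # supported horizontal scan range
V_RANGE_HZ = (50.0, 160.0)   # supported vertical refresh range

def can_sync(h_khz, v_hz):
    return (H_RANGE_KHZ[0] <= h_khz <= H_RANGE_KHZ[1]
            and V_RANGE_HZ[0] <= v_hz <= V_RANGE_HZ[1])

print(can_sync(31.5, 70))   # VGA text-mode timing: in range
print(can_sync(15.7, 60))   # CGA/NTSC timing: below the 31 kHz floor
```

A monitor without this check, asked to scan far outside its design range, risks exactly the component stress the VESA FAQ warns about.[13]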

Non-CRT monitors

The multisync concept applies to non-CRT monitors, such as LCDs, but is implemented differently.

LCD monitors are fixed-pixel displays, where the number of rows and columns displayed on the screen are constant, set by the construction of the panel. When the input signal has a resolution that does not match the number of pixels in the display, the LCD controller must still populate the same number of image elements.

This is accomplished either by scaling the image up or down as needed, creating a picture that does not have a 1:1 relationship between LCD image elements and pixels in the original image, or by displaying the image unscaled in the center of the monitor, filling the spaces on all sides with black pixels. While stand-alone LCD monitors generally accept a wide range of horizontal scan rates, the majority of LCDs accept only 60 Hz to 75 Hz vertical scan rates. In recent years, LCD monitors designed for gaming have appeared on the market offering vertical scan rates of 120 Hz and up.[15] These monitors are usually referred to by their specific maximum refresh rate.
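The two handling strategies for a non-native input resolution can be sketched as a simple placement decision (names and the mode parameter are illustrative, not any controller's actual API):

```python
# Sketch of an LCD controller's two options for a non-native input:
# "scale" stretches the image to fill the panel (no 1:1 pixel mapping),
# "center" shows it unscaled inside black borders. Returns the placed
# image rectangle as (x, y, width, height) in panel coordinates.
def place_image(input_w, input_h, panel_w, panel_h, mode="scale"):
    if mode == "scale":
        # Fill the whole panel; the scaler resamples the input
        return (0, 0, panel_w, panel_h)
    # Center unscaled; everything outside the rectangle is black
    x = (panel_w - input_w) // 2
    y = (panel_h - input_h) // 2
    return (x, y, input_w, input_h)

# 1280x1024 input centered on a 1920x1080 panel
print(place_image(1280, 1024, 1920, 1080, "center"))
```

Real scalers add refinements such as aspect-ratio-preserving scaling (filling only two sides with black bars), but the trade-off is the same: sharpness of a 1:1 pixel mapping versus a full-screen image.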

Television

CRT televisions are typically designed to operate only with the video standard of the country they are sold in (PAL, NTSC, SECAM), but some sets, particularly broadcast monitors, can operate on multiple standards.

References

  1. ^ "What's the difference between fixed frequency and multisynchronous monitors?". stason.org.
  2. ^ "Standards FAQ". VESA - Interface Standards for The Display Industry. Retrieved 2020-08-16. "Multimode monitors can measure the incoming sync signal frequencies and thus sync to any frequency within their range of operation."
  3. ^ "MultiSync 25th Anniversary – The Evolution of the MultiSync". NEC Display Solutions, Ltd. Archived from the original on 1 November 2021.
  4. ^ a b c InfoWorld. InfoWorld Media Group. 1986-10-27.
  5. ^ a b IBM Enhanced Color Display Manual (PDF). p. 1.
  6. ^ InfoWorld. InfoWorld Media Group. 1988-08-22.
  7. ^ InfoWorld. InfoWorld Media Group. 1988-08-22.
  8. ^ InfoWorld. InfoWorld Media Group. 1997-12-15.
  9. ^ PC Mag. Ziff Davis. July 1993.
  10. ^ "Standards FAQ". VESA - Interface Standards for The Display Industry. Retrieved 2020-08-16. "Q: How will GTF help the monitor automatically set itself to any timing format? / A: GTF defines the relationship between syncs and video signals at any frequency of operation. The display can measure the incoming sync frequency, and thus can predict where the image will start and finish, even though it may not have been preset at that operating point."
  11. ^ "PC Mag 1987-03-31". Internet Archive. 31 March 1987. Retrieved 2020-08-16.
  12. ^ "Converters | RetroRGB". Retrieved 2020-08-16.
  13. ^ "Standards FAQ". VESA - Interface Standards for The Display Industry. Retrieved 2020-08-16. "Sync signals for displays drastically affect the quality, performance and even reliability of CRT displays. Even small differences in timing parameters can significantly affect image position and size, causing problems for the user. Difference in blanking times can lead to excessive power dissipation and electrical stress in the scanning circuits, or at the other extreme, incomplete or distorted images being displayed."
  14. ^ "Standards FAQ". VESA - Interface Standards for The Display Industry. Retrieved 2020-08-16. "In order to identify the mode, most present day multiple frequency monitors use a simple microcontroller to measure syncs."
  15. ^ "List of 120Hz Monitors – Includes 144Hz, 240Hz". Blur Busters.
