In video engineering, color framing refers to the sequence of fields in a composite video signal over which the video frame timing and the chrominance subcarrier (in particular, the color burst) cycle through all possible phase relationships.

The exact nature of the color frame sequence depends on the video standard being used. In the case of the three main composite video standards, PAL video has an 8-field (4 frame) color frame sequence, and NTSC and SECAM both have 4-field (2 frame) color frame sequences.
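These sequence lengths follow from the nominal ratio of subcarrier frequency to line frequency: the fractional number of subcarrier cycles left over per frame determines how many frames pass before the subcarrier phase realigns with frame timing. As a rough sketch (using the nominal NTSC ratio of 227.5 cycles per line and the nominal PAL ratio of 283 + 3/4 + 1/625 cycles per line; SECAM's sequence arises differently, from its frequency-modulated subcarrier, and is not modeled here):

```python
from fractions import Fraction

def color_frame_length(cycles_per_line, lines_per_frame):
    """Number of frames until the subcarrier phase realigns with frame timing.

    The residual (fractional) subcarrier phase accumulated per frame must sum
    to a whole number of cycles; exact rational arithmetic gives the repeat
    length directly as the denominator of that residual.
    """
    cycles_per_frame = Fraction(cycles_per_line) * lines_per_frame
    residual = cycles_per_frame - int(cycles_per_frame)  # leftover phase per frame
    if residual == 0:
        return 1
    return residual.denominator

# NTSC: 227.5 subcarrier cycles per line, 525 lines per frame
# -> half a cycle left over per frame -> repeats every 2 frames (4 fields)
ntsc = color_frame_length(Fraction(455, 2), 525)

# PAL: 283 + 3/4 + 1/625 cycles per line, 625 lines per frame
# -> 3/4 cycle left over per frame -> repeats every 4 frames (8 fields)
pal = color_frame_length(Fraction(283) + Fraction(3, 4) + Fraction(1, 625), 625)
```

The function name and structure are illustrative, not from any standard API; the frequency ratios are the nominal values defined by the NTSC and PAL standards.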

Preserving the color framing sequence across edits, and between channels in video effects, was an important issue in early analog composite videotape editing systems. Cuts between different color sequences caused jumps in subcarrier phase, and mixing two signals of different field dominance produced color artifacts in the signal that was out of sync with the output color frame sequence.

To help prevent these problems, SMPTE timecode includes a color framing bit, which indicates that the video material the timecode refers to follows a standard convention for synchronizing the timecode with the color framing sequence. If the color framing bit was set in both sources, the editing system could guarantee that color framing was preserved by constraining edit decisions to maintain the correct relationship between the timecode sequences, and hence between the color framing sequences.
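The constraint amounts to keeping both sides of an edit on the same phase of the color frame cycle. A minimal sketch of how an edit controller might nudge a source in-point to match the record side (the function name and the one-frame-shift policy are illustrative assumptions, not taken from any particular editing system):

```python
def adjust_in_point(source_frame, record_frame, period=2):
    """Shift a source in-point forward by the fewest frames needed so that
    source and record sides land on the same color-frame phase.

    period is the color frame sequence length in frames: 2 for NTSC and
    SECAM (4 fields), 4 for PAL (8 fields). Assumes frame counts are
    aligned to the color framing convention signaled by the timecode's
    color framing bit.
    """
    offset = (record_frame - source_frame) % period
    return source_frame + offset

# Example: NTSC edit where the record side is at frame 21 (odd phase)
# and the requested source in-point is frame 10 (even phase); the
# in-point moves forward one frame to restore matching phase.
new_in = adjust_in_point(10, 21, period=2)
```

Real edit controllers could equally shift the edit point on the record side, or reject the edit; shifting the source in-point is just one policy.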

Color framing has become largely an issue of historical interest, first with the advent in the 1980s of digital composite video timebase correctors and frame stores, which could regenerate the color frame sequence of a composite signal at any phase, and later with analog component video editing and modern digital video systems, in which subcarrier phase is no longer relevant.
