Why the EGA can only use 16 of its 64 colours in 200-line modes

This was a question which puzzled me when I first found out about it, but now that I understand all the history behind it, it makes perfect sense.

The IBM PC (5150) was originally designed with output to an NTSC television in mind - hence the 4.77MHz clock speed (4/3 the NTSC colour carrier frequency, allowing the video output and CPU clock to share a crystal). It was thought that home users would generally hook their PCs up to the TV rather than buying a separate, expensive monitor. Another major limiting factor in the design of the CGA was the price of video memory - the 16KB on the card would have been fairly expensive at the time (it was as much as the amount of main memory in the entry-level PC). TV resolution is 912x262 at CGA 2-colour pixel sizes in non-interlaced mode, but TVs don't show all of the image - some of those scanlines and pixels are devoted to sync signals, and others are cropped out because they would be distorted due to the difficulties of approximating high-frequency sawtooth waves with high-voltage analogue circuitry. So the 320x200 4-colour and 640x200 2-colour packed-pixel modes were chosen because they were a good fit for both 16KB of memory and TV resolutions.
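
For concreteness, here's that memory arithmetic as a quick Python sketch - the mode names and sizes come from the text above, everything else is just arithmetic:

```python
def bytes_needed(width, height, bits_per_pixel):
    """Video memory needed for a packed-pixel mode, in bytes."""
    return width * height * bits_per_pixel // 8

CGA_RAM = 16 * 1024  # 16KB of video memory on the CGA card

modes = {
    "320x200 4-colour": bytes_needed(320, 200, 2),  # 2 bits per pixel
    "640x200 2-colour": bytes_needed(640, 200, 1),  # 1 bit per pixel
}

for name, size in modes.items():
    print(f"{name}: {size} bytes, fits in 16KB: {size <= CGA_RAM}")
```

Both modes come out at exactly 16000 bytes - a snug fit in 16384.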

That system did work quite well for many home users - lots of CGA games have 16-colour composite output modes. But it wasn't so good for business users. These users tended not to care so much about colour, but did care about having lots of columns of text - 80 was a common standard for interfacing with mainframes and for printed documents. But 80-column text on a TV or composite monitor is almost completely illegible, especially for colour images - alternating columns of black and white pixels in a mode with 320 pixels horizontally get turned into a solid colour by NTSC. So for business users, IBM developed a completely separate video standard - MDA. This was a much simpler monochrome text device with 4KB of memory - enough for 80 columns by 25 rows of text. To display high-quality text, IBM chose a completely different set of timings - 370 scanlines (350 active) by 882 pixels (720 active) at 50Hz, yielding a 9x14 pixel grid for high-fidelity (for the time) character rendering. The character clock is similar (but not identical) to that of the CGA 80-column text mode (presumably 16.257MHz crystals were the closest they could source to a design target of 16.108MHz). To further emphasize the business target of the MDA card, the printer port was built into the same card (a printer would have been de rigueur for a business user but a rare luxury for a home user). Business users would also usually have purchased an IBM 5151 (green-screen monitor designed for use with MDA) and IBM 5152 (printer).
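
Those numbers can be sanity-checked with a couple of lines of Python, using only the figures quoted above:

```python
PIXEL_CLOCK = 16_257_000   # Hz - the crystal on the MDA card
TOTAL_PIXELS = 882         # pixels per scanline, including blanking and sync
TOTAL_LINES = 370          # scanlines per frame, including blanking and sync

refresh = PIXEL_CLOCK / (TOTAL_PIXELS * TOTAL_LINES)
print(f"refresh rate: {refresh:.1f} Hz")   # ~49.8Hz, i.e. roughly 50Hz

# 720 active pixels in 9-pixel cells, 350 active lines in 14-line cells
print(720 // 9, "columns x", 350 // 14, "rows")   # 80 columns x 25 rows
```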

CGA also had a digital TTL output for displaying high-quality 16-colour 80-column text (at a lower resolution than MDA) on specially designed monitors such as the IBM 5153 - this seems to have been much more popular than the composite output option over the lifetime of these machines. The two cards used different memory and IO addresses, so could coexist in the same machine - real power users would have had two monitors, one for CGA and one for MDA (and maybe even a composite monitor as well, for games which preferred that mode). The 9-pin digital connectors for CGA and MDA were physically identical and used the same pins for ground (1 and 2), intensity (6), horizontal sync (8) and vertical sync (9), but CGA used 3, 4 and 5 for primary red, primary green and primary blue respectively whereas MDA used pin 7 for its primary video signal. MDA also used a negative-going pulse to indicate vertical sync while the CGA's vertical sync pulse is positive-going.

So for a while these two incompatible standards coexisted. The next major graphics standard IBM designed was the EGA, and one of the major design goals for this card was to be an upgrade path for both home and business users that did not require them to buy a new monitor - i.e. it should be compatible with both CGA and MDA monitors. This was accomplished by putting a 16.257MHz crystal on the card and having a register bit to select whether that or the 14.318MHz one would be used for the pixel clock (and by having the on-board video BIOS ROM program the CRTC appropriately). By 1984, it was not out of the question to put 128KB of RAM on a video card, though a cheaper 64KB option was also available. 64KB was enough to allow the highest CGA resolution (640x200) with each pixel able to display any of the CGA's 16 colours - these would have been the best possible images that CGA monitors such as the IBM 5153 could display. It was also enough for 4 colours at the higher 640x350 resolution - allowing graphics on MDA monitors. With 128KB you got the best of both worlds - 16 colours (from a palette of 64) at 640x350.
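
The same back-of-the-envelope arithmetic for the EGA configurations, again using just the numbers from the text:

```python
def bytes_needed(width, height, bits_per_pixel):
    """Video memory needed for a mode, in bytes (4 bpp = four planes)."""
    return width * height * bits_per_pixel // 8

print(bytes_needed(640, 200, 4))  # 64000 bytes - fits the 64KB option
print(bytes_needed(640, 350, 2))  # 56000 bytes - 4 colours at 640x350 also fits
print(bytes_needed(640, 350, 4))  # 112000 bytes - needs the 128KB option
```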

IBM made a special monitor (the 5154) for use with the EGA. This monitor could display both 200-line and 350-line images (deciding which to use by examining the vertical sync pulse polarity), and allowed users to take advantage of all 64 colours available in 350-line modes. The video connector was again physically the same, and pins 1, 3, 4, 5, 8 and 9 had identical functions, but pins 2, 6 and 7 were repurposed as secondary red, green and blue signals respectively, allowing all 64 possible colours. But IBM wanted this monitor to be compatible with CGA cards as well, which meant that in 200-line mode it needed to interpret pins 3-6 as RGBI instead of RGBg and ignore pins 2 and 7. So even with a 5154 attached, the EGA had to generate a 4-bit signal in 200-line modes, disabling pins 2 and 7, just as if it were connected to a CGA monitor.
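
A little Python sketch makes it concrete why this costs exactly 48 colours: model a colour as the six TTL signals on the connector, then drop the two pins a 200-line monitor ignores and reinterpret pin 6 as intensity. The pin/signal assignments are the ones described above; the bit ordering here is only for illustration.

```python
from itertools import product

def seen_by_200_line_monitor(R, G, B, r, g, b):
    # Pins 3, 4, 5 carry primary R, G, B; pin 6 (secondary green) is read
    # as intensity; pins 2 (secondary red) and 7 (secondary blue) are ignored.
    return (R, G, B, g)

all_signals = list(product((0, 1), repeat=6))                 # 64 possible colours
distinct = {seen_by_200_line_monitor(*s) for s in all_signals}
print(len(all_signals), "->", len(distinct))                  # 64 -> 16
```

Every one of the 64 possible signals collapses onto one of only 16 distinguishable RGBI values.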

I guess the designers thought that sacrificing 48 of EGA's colours in 200-line modes was a small price to pay for making the EGA monitor compatible with CGA cards. Presumably they thought that if you had an EGA card and an EGA monitor you would be using 350-line mode anyway, or be running legacy CGA software which wouldn't miss those extra colours.

One thing I haven't mentioned here is the PCjr graphics. For the purposes of the discussion above it's essentially the same as CGA (it has the same outputs) but it's more flexible and slower due to the use of system RAM as video RAM, as many 8-bit microcomputers did in the 80s.

42 Responses to “Why the EGA can only use 16 of its 64 colours in 200-line modes”

  1. VileR says:

    Nice! This is something I've often wondered about, as an erstwhile EGA owner. Hi-res (350-line) games were rare, but they got to pick and choose their colors out of the full EGA palette; while low-res (200-line) games, which were prevalent, were limited to the 16 fixed CGA colors.
    Back then, it seemed as if the low-res modes were needlessly crippled - I mean, in those days of limited video RAM, lower resolution should've meant *more* colors, not less!... right?

After I wised up a little, I assumed that this was simply a means of preserving compatibility with RGBI/CGA displays: just set the jumpers correctly, and you get the assurance of 4-bit RGBI signals that your monitor can handle. But that raised the question: why didn't they add *separate* 200-line modes with customizable palettes, for use with specialized EGA monitors that could understand 6-bit signals?

Now that I've read your post, it seems that the key factor in this limitation was the 5154 monitor design: Vsync polarity was inextricably tied up with scanline count and pinout configuration. So 200-line graphics with the enhanced 6-bit signal just couldn't be transmitted, even though the card had the capacity for it (at least with 128KB/256KB RAM). Am I getting it right? :)

    • Andrew says:

Yes, exactly - because the 5154 has to be compatible with CGA cards as well as EGA cards, if you feed it a 200-line (vsync positive) signal it has no way of knowing whether it is connected to a CGA card or an EGA card, and therefore has to be conservative and assume a 4-bit signal. That in turn means that the EGA card has to generate a 4-bit signal in 200-line mode. If you reprogrammed the registers right, I think it would be possible for the EGA card to generate a 200-line 6-bit signal, but no monitor would display it correctly because they are all CGA/200-line-EGA compatible and therefore assume that all 200-line signals are 4-bit.

      • Bero256 says:

Actually, you would be able to display it correctly on a TV which has an RGB input. Since the 200-line mode uses the same frequencies as NTSC, you can just combine the primary and secondary RGB signals as well as the sync signals and adapt it to SCART. European TVs are better for this, as practically all of them from at least the late 80s have RGB SCART.

        • Andrew says:

Yes, that's true. You'd need a little circuit to change each channel from primary+secondary TTL to analogue, but that's very simple. Of course, that would not have been a useful solution for the design of the EGA card and 5154 monitor. A better solution might have been to add another connection (it only needs to be a single wire) between the EGA card and the monitor: to tell the monitor that it is connected to an EGA card (and should interpret the signal as RrGgBb), to tell the EGA card that it is connected to an EGA monitor (and should generate RrGgBb signals), and thereby to tell the software that it can reprogram the palette registers in 200-line modes. But I guess IBM felt that would be too complex, and besides, VGA was on the horizon, which solved all these problems and more. Perhaps they would have considered it if they had used a connector with 10 or more pins on the CGA and MDA instead of 9, so that there would have been room for that sense signal without adding another connector.

  2. [...] Reenigne blog Stuff I think about « Why the EGA can only use 16 of its 64 colours in 200-line modes [...]

  3. Scali says:

    I'm glad I found this.
I've been playing around with EGA a bit, and couldn't figure out why, although I could write values to the palette registers, at most 16 unique colours were displayed in 320x200. Each colour was duplicated 4 times in the total palette.
    This was on a VGA card, but apparently it emulates this EGA behaviour. I found one other card so far that did NOT emulate it, and showed all colours as expected.

    So I read your blog and tried to use mode 0x10 (640x350) instead of 0x0D (320x200). And indeed, in that mode, the palette works just fine.
    Nasty behaviour anyway. For games/demos, 320x200 is a far more important resolution. 640x350 was just too slow (and wouldn't allow for double-buffering on real EGA cards with only 128k memory).

    Now I not only know what is going on, but also why: IBM in their infinite wisdom made their EGA monitor backward-compatible with CGA. Just great. That's why I'm missing 2 intensity bits, and the third is changed to global intensity.

    I do wonder though: this seems to be purely the monitor's doing? I could not find any information on any EGA registers which would control any kind of colour remapping mode. So I wonder if the EGA card just output the colours in 320x200... meaning that the right type of monitor would actually display them all... but do such monitors even exist in the first place?

    • Andrew says:

Yes, it's purely the monitor's doing - from the point of view of the EGA card there's no difference in colour handling between 200-line and 350-line modes - it's just a question of setting the palette registers to the desired values. If you set bits 5 or (especially) 3 in the palette register you could potentially damage a CGA monitor, or the EGA card when connected to a CGA monitor, though, since pin 2 (bit 3) is ground on CGA.

      There's no monitor that can do all 64 colours in 200-line modes for that reason, and because EGA was quickly replaced by VGA, which has a much more flexible colour output (the DAC registers) with 18 bits instead of 6.

      • Scali says:

        Ah yea... I have been reading the original IBM EGA programmer's manual, which I found here: http://www.minuszerodegrees.net/oa/oa.htm

        It is not too clear about this situation, but it does talk about how the card behaves with a CGA monitor connected, and with an EGA monitor connected.
        So it works both ways: the EGA monitor is backwards compatible with 200-line CGA output, but also, the EGA output is backwards compatible with a CGA monitor. You can't use the 350-line modes, but the 200-line modes still work (and obviously with the same reduced colours).

        I figured I could work around it on VGA cards because they will probably remap the EGA palette to the first 64 entries of the VGA palette. So if I modify the VGA palette directly, I can probably force the full EGA palette.
        It's just a bit of a catch-22: it would only work on VGA cards, and if I want to support VGA anyway, I might as well use proper mode 0x13 or mode X (which funny enough is mostly exploiting the graphics controller functionality introduced with EGA, which is what got me interested in exploring EGA in the first place... On EGA I could do the same tricks: smooth scrolling, double-buffering, fast 'blt' operations using the latches, and quick fill routines writing to multiple planes at a time).

        • Andrew says:

          There is one other possibility I just thought of. You could probably do a "tweak" 320x350 mode by using the horizontal settings from a 320x200 mode and the vertical settings from a 640x350 mode. Then you could change the CRTC registers so that it only generates 200 active scanlines for a 320x200 mode that is seen by an EGA monitor as a 350-line mode, so all 64 colours would be available. However, the pixel aspect ratio would be that of the 350-line mode, not the 200-line mode. I don't think there's any way around that.

          • Scali says:

            Yes... The site I just linked to also has a manual of the IBM 5154 monitor. It mentions the remapping quite explicitly. They say the mode is enabled based on the polarity of the vertical sync signal. Positive pulse is CGA/200-line, negative pulse is EGA/350-line.

            So I guess there is some freedom in reprogramming the CRTC registers, as long as you keep the polarity negative.
            There might be a tweaked mode in there somewhere.

            For now I've just added some code to detect VGA, and in that case, it overwrites the VGA palette after setting mode 0x0D.

            • Andrew says:

Yeah, the sync pulse timings need to match the sync polarity, so with a negative-polarity vsync you still need to generate 366 hsyncs per vsync at 21kHz instead of 262 at 15.7kHz, but within that structure you can generate whatever number of active lines you want.

              • Scali says:

                By the way, this trick of setting VGA palette values in an EGA mode gave me an idea:
                My EGA polyfiller is slightly faster than my unchained VGA mode one, since I only need to fill 4 bits per pixel. A dword write results in 32 pixels rather than 16.
                Also, because they are actual bitplanes, I can do transparency/glenz effects more easily, by just disabling certain planes while writing.

                So then it dawned on me: Triton's Crystal Dream has the fastest polyfiller I've ever seen. Even on a simple 386sx-16 it runs very smoothly. Also, it has a glenz sequence which does not appear to slow down the polyfiller at all (I have tried using the graphics controller's logical or mode in VGA, but you need to read per byte to fill the latches, then write back 1 byte, so you lose the hyperspeed filling of 16 pixels per store, and are limited to 'only' 4).
                But... it does not seem to use that many colours. Theoretically it could be using 16 colours, but those 16 colours would be 18-bit VGA palette entries.

                So I decided to check... and guess what? It sets mode 0x0D, not mode 0x13. So it's not a fast unchained mode VGA filler as I originally thought. Nope, it's EGA!

                • root42 says:

                  Interesting, but does it still use VGA palette registers? Because it seems it uses very smooth transitions in colors of the flat shaded polys.

                  • Andrew says:

                    I think it probably does. So although it uses an EGA mode, it won't display properly without a VGA card. Since other parts of the demo use real VGA modes, it's not like they were sacrificing compatibility just to make the shading a bit smoother.

  4. [...] the EGA palette? Apparently there is a method to this madness, but what? Eventually I landed upon this blog post, and there was an explanation at last! Apparently IBM wanted to keep compatibility between CGA [...]

  5. Yuhong Bao says:

Anyone here know why the EGA graphics mode was 350-line instead of 400-line? If you are wondering why I ask, look up "NEC PC-98".

    • Andrew says:

I had assumed (as I said in the post) that it was due to compatibility between EGA and MDA (both using an MDA monitor with an EGA card, and using an EGA monitor with an MDA card). Though having said that, I'm not sure how compatible they really were, since some of the EGA 350-line modes have a different horizontal frequency than the MDA 350-line modes. I'm not sure what the NEC PC-98 had to do with it though - presumably that machine's 400-line mode was based on interlaced NTSC-compatible timings.

      • Yuhong Bao says:

I mean, it was one of the biggest reasons why the non-IBM-PC-compatible NEC PC-98 in Japan lasted for so long. Japanese characters were typically displayed as 16x16 pixels per character, and 25 lines * 16 pixels equals 400 lines. The EGA was short by 50 lines!

        • Andrew says:

          Oh, interesting! Is Japanese text not legible with 14 lines then? And is Japanese usually displayed in text mode anyway? I thought there were too many characters so a graphics mode would be used, in which case it would be trivial to display 21 rows of 16-line characters on an EGA screen in graphics mode.

  6. VileR says:

    A curious finding that sent me back to this post: *some* extended EGA clones could indeed output 6-bit color in 200-line modes... or so it would seem. (The plot thickens?)

    I recently came across these screenshots on Mobygames -
    http://www.mobygames.com/game/dos/rambo-iii/screenshots/gameShotId,541305/
    http://www.mobygames.com/game/dos/rambo-iii/screenshots/gameShotId,541312/
    http://www.mobygames.com/game/dos/rambo-iii/screenshots/gameShotId,541319/

    As seen in the first shot, the game supports two "EGA Enhanced 320x200 16 color" modes - Boca and Paradise, as opposed to "Standard IBM Type". The other two screenshots show this "enhanced color" in action. According to the guy who submitted them, they were created using the PCE/ibmpc emulator (http://www.hampa.ch/pce/pce-ibmpc.html).

    The colors in these screenshots are a subset of the full 64-color EGA palette, outside the CGA color range. In true 320x200 mode, this shouldn't be possible on an EGA monitor, as your original post shows.
Have you run into such extended EGA cards before? I'm wondering how they made this work - is it actually a "tweaked" 350-line mode like you described in an earlier comment (having a wonky aspect ratio on a real monitor)? A true 200-line mode available only on non-standard (or multisync) monitors? My guess is the latter, but there are probably other options I'm unaware of.

    • Andrew says:

My guess would be the latter too. I think it has to be one of those two options (either it works with a standard EGA monitor, in which case it's really a 350-line mode, or it doesn't and isn't). The reason I think it's a VGA monitor with an EGA card that happens to have a VGA connector is that if it was a tweaked 350-line mode it would work fine with a standard IBM EGA, and it wouldn't be necessary to have separate modes for Boca and Paradise cards. But it's quite possible that those two cards have different registers for enabling the secondary intensity outputs and/or programming the DAC.

  7. VileR says:

    I went and tried the game under the PCE emulator myself. Funny thing: the emulated card is vanilla IBM EGA w/ a ROM dump, but both the Boca and Paradise modes produce 6-bit "enhanced color". I guess PCE doesn't fully implement the way an EGA monitor would treat the signal; in DOSBox, for instance, both options result in a fallback to the CGA palette, making the colors all wrong (the expected behavior for standard EGA in 320x200).

    All the more confusing that two separate modes were needed, though. I'm not entirely sold that the targeted monitors were VGA - the menu has "CAUTION!" next to these two options, and that suggests some sort of potential damage if you don't know what you're doing. As you've noted in an earlier comment, messing with palette bits could do that to a CGA screen, but anything that needs a VGA connector should do fine.... not sure what to make of it.

    • Andrew says:

      I'm not at all surprised that emulators wouldn't handle these modes correctly - it's pretty rare to find an emulator that supports something that obscure.

      My guess is that the warning is to caution against using those modes with a standard EGA card, or possibly with a digital monitor plugged into the Boca or Paradise card - they might have both kinds of connector.

      • Scali says:

        Well, I know that at least some (most?) Paradise EGA cards don't have a VGA connector. They use a standard 9-pin connector. They predate VGA anyway.
        Have never seen a Boca card, but googling for info gives a vague reference to the Multi-EGA card having a 15-pin analog connector.

        I wonder if the fact that the Paradise cards came with 256k had anything to do with it... That they used a tweaked 350-line mode that only worked with extra memory.

  8. MacD says:

I do remember that on my 12MHz 286 AT EGA PC the picture used to display a heavy scanline effect in vertical 200 modes, while the picture was smooth and clear in hi-res 640x200 mode. This always bugged me.

I also couldn't understand how in hell the EGA had such colour limitations in 320x200 mode while my Amstrad CPC could actually do better than a CGA (it can choose freely from a palette of 27 colours - a 3x3x3 RGB cube).

IBM's industrial choices clearly left the PC almost a decade behind other systems such as the Atari ST where games were concerned.

Amstrad was even bothered because its apparently superior CGA on the PC1512 (which could display 640x200 in 16 colours) wasn't totally compatible with the lame CGA standard.

    Thank you for this interesting article.
I wish they had simply added a switch on monitors... Had the EGA benefited from its choice of 64 colours in low-res modes, it could actually have competed quite well with the Atari ST graphically speaking (an AY-3-8910/YM sound chip would have been neat too).

    • MacD says:

      (oops, i meant the picture was smooth and clear in 640x350x16 video mode... sorry for all the typos)

    • Andrew says:

      They couldn't have added a switch to all the CGA monitors that had already been sold, though - I think the main point of the 200-line modes on the EGA card was that they could be used with a CGA monitor. Having 64 colours available in those modes for fast games on an EGA monitor quite possibly just never occurred to them - the PCjr was their solution for games anyway (and that did have 3-channel sound).

I could write a whole other post on PC1512/CGA compatibility. There was no monitor compatibility concern there - the 1512's monitor contained the power supply for the machine, so the monitor and system unit couldn't be used separately. Software-wise, it was *almost* compatible - in about 7 years of use I only came across one CGA game which wasn't PC1512 compatible (Astro Dodge). It was compatible apart from the CRTC registers (maximum scan line was supported, but none of the registers that controlled the timing). I believe this was done to make it impossible for software to damage the monitor (incorrect CRTC values could damage flyback transformers on many other machines of the era).

      PCs weren't behind for that long though - VGA was introduced in 1987 and solved all these problems, but games continued to target CGA and EGA for a while after that due to the huge installed base.

  9. r00tb33r says:

There's something not clear to me. In 640x350, how can I get all 64 colors on the screen at once? I understand the four memory planes, resulting in four bits defining the color attribute. I figure there are additional planes for a pixel's palette index...?
Okay, so we call this a 16 color mode, but why is there a 64 color palette that can be dynamically redefined? Why do we map the 16 colors to 16 of the 64 entries in the palette? Why aren't we just redefining the 16 directly, and what's the point in having the other 48?
Consider a program I wrote: it sets 6-bit values for each RGB color channel for each of the palette entries. Since I am assigning and reassigning those entries dynamically, I've got an 18-bit color space to work with, and it looks pretty darn good. What I can't figure out is what's the point of having 48 more palette entries if I can just assign any value in 18-bit color space to the seemingly only 16 attributes I can get on the screen at once...? Why is it 18-bit color space -> 64 entry palette -> 16 attributes, and not 18-bit color space -> 16 entry palette -> 16 attributes?

    • Andrew says:

      The EGA doesn't have any modes designed for displaying more than 16 colours on screen at once, but you can get all 64 colours on screen at the same time with careful timing, redefining particular palette entries to particular colours on particular scanlines.

      The EGA's four memory planes each control one bit of the (4-bit) palette index - there aren't any additional planes.

The rest of your comment seems to result from a confusion between EGA and VGA. The EGA does not have an 18-bit, 256-entry external palette like the VGA does - only the 6-bit, 16-entry internal palette. The VGA has both, which seems redundant at first glance, since anything you could do with the internal palette you could do with the external palette instead. However, the VGA was designed to be software compatible with the EGA, so the internal palette remains as a compatibility constraint. Hope that clears it up!
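
That two-stage lookup on the VGA can be sketched in a few lines of Python. The palette contents here are invented purely for illustration - only the indirection structure matters:

```python
# 16-entry internal (EGA-style) palette: each entry is a 6-bit value (0-63).
internal_palette = list(range(16))

# External DAC palette: 18-bit entries (6 bits each of R, G, B). Only the
# first 64 entries matter here; the values are made up for this sketch.
dac = [(v, v, v) for v in range(64)]

def vga_colour(attribute):
    """4-bit attribute -> internal palette entry -> external DAC -> colour."""
    return dac[internal_palette[attribute & 0x0F]]

# On a real EGA the chain stops after internal_palette: its 6-bit output goes
# straight to the monitor as two-level RGB signals instead of indexing a DAC.
print(vga_colour(9))
```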

      • r00tb33r says:

        Perhaps my hardware is producing unexpected results or we are misunderstanding each other.
I'm using mode 10H (QuickBasic SCREEN 9), which is the 640x350x16 EGA mode, and I replace palette entries, first writing the entry number, then the 6-bit red value, 6-bit green value, and 6-bit blue value, resulting in an 18-bit color space. For example, I can fill the entire palette with 64 shades of red.
        This tutorial demonstrates this technique:
        http://www.petesqbsite.com/sections/tutorials/tuts/kinney_palette.txt

        Now, the question is, even though I'm using an EGA mode, is my modern hardware doing something not normally possible on a true EGA card?

I'm also playing around with other video registers and I was able to output 400 scanlines in the 640x350x16 mode, but so far only 356 lines show picture (6 extra), and the remaining 44 are shown as blank. Any way to get those 44 scanlines to show a picture? I'm definitely able to write to those memory locations - the 6 extra scanlines I already have are showing pixels I wrote to those locations.

        • Andrew says:

          That is an EGA mode, but you must be using a VGA card (as a true EGA card does not have ports 0x3c7-0x3c9). The external palette works just fine in EGA (and even CGA) modes on a VGA card, as you have noticed. But if you run the same program on a machine with a real EGA card you won't see any palette changes.

The argument about monitor compatibility between CGA and EGA (which the actual blog post is about) does not apply to EGA and VGA, because EGA cards and VGA cards use different monitors (an EGA card uses a TTL digital monitor, a VGA card uses an analogue monitor). So EGA modes on a VGA card can use the external palette, but CGA modes on an EGA card can't use the internal palette, because for all the EGA card knows it might be connected to a CGA monitor.

          You can reprogram a VGA to 400 or 480 scanlines by changing the CRTC registers. At the very least you'll probably need to reprogram the Vertical Display Enable End register (0x12), the Start and End Vertical Blanking registers (0x15 and 0x16). Possibly also the Start and End Vertical Retrace registers (0x10 and 0x11), the CRTC overflow register (0x07), the Vertical Total register (0x06) and (depending on your monitor) the Synchronization Signal Polarity field of the Miscellaneous Output Register (port 0x3c2). There are tables of all the ports and BIOS register values for different modes for CGA, MDA, PCjr, EGA and VGA at https://github.com/reenigne/reenigne/blob/master/8088/cga/register_values.txt . Though to get a 640x400x16 mode it might be easier to start with BIOS mode 0x0e and turn off the scanline doubling in CRTC register 0x09.
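
As a sketch of the register poking involved, here's the standard CRTC index/data access protocol (index to port 0x3D4, data to 0x3D5 for a colour-mode base address), with a stub standing in for real port I/O and placeholder values rather than a working 400-line register set:

```python
writes = []  # records (port, value) pairs in place of real OUT instructions

def outb(port, value):
    """Stub for a hardware port write; on DOS this would be an OUT."""
    writes.append((port, value))

def set_crtc(index, value):
    """Write one CRTC register: index on port 0x3D4, data on port 0x3D5."""
    outb(0x3D4, index)
    outb(0x3D5, value)

# The registers named above; 0x00 is a stand-in, not a working 400-line value.
for index in (0x06, 0x07, 0x10, 0x11, 0x12, 0x15, 0x16):
    set_crtc(index, 0x00)

print(len(writes) // 2, "CRTC registers written")
```

Note that CRTC registers 0x00-0x07 are normally write-protected by bit 7 of register 0x11, so a real implementation would clear that bit first.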

          • r00tb33r says:

            Oooo mighty thanks, mate!
            This was quite confusing because some other places call EGA modes "DAC modes" for some reason, so I was assuming it was analog, with backwards compatibility with CGA.

            Thanks for the register reference, should come in handy.

Would it ever have been possible to push 400 scanlines on EGA-proper hardware (not modern VGA)? If I'm understanding the register reference correctly, it's not possible, but I don't actually have period hardware to test that.

            When you say BIOS mode 0x09, are you referring to the EGA or VGA mode? That register reference has both EGA and VGA modes corresponding to the same numbers.

            • Andrew says:

              I've never heard EGA modes called DAC modes before! The DAC is another name for the external palette hardware on the VGA (ports 0x3c7-0x3c9).

              The EGA card can generate 400 scanlines (subject to memory constraints) but trying to display the wrong number of scanlines on a genuine EGA monitor might damage it!

              The BIOS mode 0x09 I was referring to is an EGA mode. It's also a VGA mode because VGA has a superset of EGA from the perspective of software. It's a 640x200x16 mode on both.

              • Rich R. says:

                In this thread, you say that an EGA card could damage a monitor if it sends a signal with more scanlines than the monitor can handle.

                Since an EGA card is CGA-compatible, that means it's capable of detecting what kind of monitor it's hooked up to, right? Because it would damage a CGA monitor if it sent a 350-line signal.

                So if the software is creating a 200-line graphic, why can't the EGA card send 4-bit color when it detects a CGA monitor, and 6-bit color when it detects an EGA monitor?

                • Rich R. says:

                  Never mind, the Wikipedia article notes that the monitor type must be set manually on the card. https://en.wikipedia.org/wiki/Enhanced_Graphics_Adapter#Specifications

                • Andrew says:

                  Since I wrote the above I have learned some more about the consequences of feeding monitors signals with the incorrect timings, and have changed my mind about what I wrote above. Some MDA monitors (notably the IBM 5151) could be damaged like this, but most other monitors (including all CGA and EGA monitors that I know of) have phase-locked loops which keep the horizontal raster frequency within spec to avoid damage. So the monitor wouldn't be damaged, it would just not display the correct image.

                  As you found out, the EGA card can't detect what type of monitor it's connected to. The VGA can, but there were no such facilities in the CGA/MDA/EGA connector.

                  • Rich R. says:

                    Now I'm starting to wonder why EGA had 64 colors, AT ALL.

                    The design goal of EGA was to do everything that CGA, MDA, and Tandy could do, and be compatible with all the old monitors. They could have done that with just the RGBI colors, and would not have had to redesign the cable.

                    Software game companies that had aggressively optimized their RGBI 200-line artwork for the Tandy would be unlikely to waste time trying to pick slightly better colors (shading?) even if EGA could show them; and if they retooled the artwork for 640x350 resolution but kept the SAME colors, that alone would let them say "Look how much more awesome we are on EGA!"

                    • Andrew says:

                      Software and monitor compatibility with CGA, MDA and PCjr was one of the design goals of EGA, but not the only one. Another would surely have been to design the best graphics adapter possible with the technology of the time (within a certain budget). Going to 64 colours didn't require redesigning the cable (there were enough spare pins in the CGA/MDA cable for 6 bits of colour data) and allowed IBM to sell a whole new monitor to people who could afford it. IBM probably weren't so concerned with what software companies had done in the past as what they would be able to do in the future. One other reason that there weren't very many 640x350 games is that EGA was superseded relatively quickly by VGA - if VGA had been delayed and EGA had been the state of the art then there probably would have been a lot more 640x350 games.

  10. Keith Harvey says:

Peter Norton once wrote a BASIC program that forced the EGA to display 64 colors at once instead of the standard 16. I'm trying to figure out how to do the same under the DOSBox emulator.

    • Andrew says:

      Presumably it's just a matter of waiting for a particular scanline and then setting the palette registers in such a way that the same colours appear at consistent places on the screen. I'm pretty sure that would work under current versions of DOSBox with no modifications - I think that does line-by-line rendering and supports several games and demos that do similar tricks (at least on CGA and/or VGA).

    • Christopher Martin says:

      Did you manage to make a program to do this?

  11. MacDeath says:

    Wikipedia :
    " A few third-party EGA clones (notably the ATI Technologies and Paradise boards) feature a range of extended graphics modes (e.g., 640×400, 640×480 and 720×540), as well as automatic monitor type detection, and sometimes also a special 400-line interlace mode for use on CGA monitors."

This may answer the Rambo III questions.
If the card could produce a 640x400x16 mode with free choice of palette, it could possibly emulate 320x200x16 with a "free palette", but you certainly had to have a compliant monitor - perhaps VGA, EGA, or a third-party one.
