This document recounts the early days of fun computer stuff on the IBM PC, so whenever I refer to "the computer" or a similar generic term, I'm talking about an IBM. If I mention the "Tandy sound chip", I'm referring to the sound chip that was present in both the IBM PC Jr. and the Tandy clones (it's the same chip).
Many demosceners go through changes in life that affect them profoundly--changes that either bring them into or force them out of the demo scene. I find myself in the latter, which saddens me a bit, as I have not yet given back fully to the scene that has given me so much. I also welcome the change, as it is time for me to pass the baton onward to the newer generation of people who will redefine what demos are. I look forward to what tomorrow's enterprising youth with a love of computers can bring to the demo scene.
But the new blood probably has no idea what was interesting before demos came around in 1990. Sure, everyone can download GR8 / Future Crew and have a laugh or two, but that's the tail end of what I call "the golden age of PC fun." How did anything become interesting or cool on the PC--a machine with no dedicated graphics or music hardware? What held our attention before demos came along?
Good question. Well, this article will attempt to describe the changes in my life that brought me into the scene--what was interesting to me as I grew up with a PC, starting out in 1984 with a new 4.77 MHz processor and ending in 1990, the proper birth of the PC scene. It will be fascinating to some, and only mildly interesting to others, but it needs to be remembered in the sea of Pentium Pros, Windows 95, and accelerated hardware. Trust me, it used to be nothing like it is now. :-)
Also, I will try to cover some of the more eclectic technical information where I still remember it clearly, so if you want more info on how something was done, email me and there's a good chance I can find it somewhere. Hope you can parse BASIC code. :-)
So, if you've ever wondered what was interesting or "demo-like" on the PC before demos came around, this article will probably explain it. Let's journey backward, shall we?
Let's start briefly with something that held my fascination long before demos existed (and probably what contributed to their birth): Games. It's not too surprising that the most impressive stuff on any home computer is a game. Games are, in fact, some of the hardest things in the world to program, and in the early days, nothing was more true.
Think about it: You had to do many or all of the following: Accept input, display graphics (quickly), play sound effects, process artificial intelligence and other game elements, and do it all quickly enough so that the user didn't feel cheated out of a decent gaming experience. And, of course, the original PC threw a multitude of problems in the way of doing all this.
So it's obvious that a well-written game that was fast and fun to play was incredibly impressive, since things just simply weren't fast on the PC. I will mention games liberally in my musings. Just thought you'd like to know. ;-)
How do you make a bland, boring "business" computer do something impressive? Good question. While newer (1984) versions of the PC (IBM's PCjr, and Radio Shack's clone, the Tandy 1000) had more sound and graphics capabilities, the older PC did not, and that was what had the largest installed user base. So, you had a few areas to improve on: Sound, Graphics, and anything that would speed up the system. Since a program's first impression was usually how it looked, many chose to improve the CGA graphics.
CGA (Color Graphics Adapter) always struck me as finicky. The video memory layout was interlaced, meaning the first 8K held every even-numbered scanline and the last 8K held every odd-numbered scanline. Plus, there were 192 unused bytes at the end of each 8K section, which became annoying to skip as well. Finally, you had one of two fixed palettes of nasty, ugly colors (cyan, magenta, and white; or red, green, and yellow) to choose from. While some of the best artists could deal with four colors adequately, the crappy color set usually disgusted the user.
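To make the interlacing concrete, here's a small Python sketch of the address arithmetic for the 320x200 4-color mode. This is a modern illustration of the layout described above, not period code; the function name is my own.

```python
def cga_offset(x, y):
    """Byte offset into CGA video memory (segment B800h) for pixel
    (x, y) in 320x200 4-color mode: even scanlines live in the first
    8K bank, odd scanlines in the second bank at offset 8192, and
    each 80-byte row packs 4 pixels per byte (2 bits each)."""
    bank = (y & 1) * 8192   # odd scanlines start at offset 2000h
    row = (y >> 1) * 80     # 320 pixels / 4 pixels per byte = 80 bytes
    return bank + row + (x >> 2)

# The 192 wasted bytes per bank: 100 rows * 80 bytes = 8000 bytes
# displayed, so bytes 8000..8191 of each 8K bank are never shown.
```

Note that a straight linear copy through memory would interleave two halves of the picture, which is exactly why the layout was so annoying to work with.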
Well, I quickly found out that by fiddling with the video registers, I could get an unadvertised (in my manuals, anyway) color palette that seemed to combine the two. The resulting 4-color palette had black, white, cyan, and red--much nicer. I'm sure this palette was advertised somewhere, because some game I had used it, which is what got me sending random bytes to the video ports in the first place.
Sending random bytes to the video registers, by the way, was extremely dangerous! If I had known that then, I wouldn't have even tried at all; I was lucky I didn't break anything. The monitors back then (1987 and earlier) were fixed frequency monitors, and ran at low frequencies. If you told the video card to do something outside of those frequencies, you stood an excellent chance of damaging the monitor. A friend of mine was trying to get the rumored CGA 16-color mode to work (more on this later) and destroyed his Compaq's built-in monitor, a small monochrome monitor that emulated color with different shades of gray. He went to the nearest Compaq dealership to get it replaced, and when they refused to fix it, he answered with something like, "Well, I could demonstrate how easy it is to break the monitor to your customers by typing in a three-line BASIC program on, oh... this machine over here," and pointed to their top-of-the-line machine that some customers were huddled around. They quickly replaced his monitor to shut him up. :-)
Let's face it: CGA was crap. The C64 blew it away in terms of speed, and even an Apple ][+ had more colors to choose from. Anything you could do above what it was normally used for, quickly or differently, was neat. So, a bit of experimentation revealed that you could change the background color, usually black, to one of the 16 text colors. This allowed for a quick flash (possibly for an explosion in a game), which was slightly impressive since nothing about CGA was quick. This trick was also commonly used to change the black background to blue, which made it easier to draw bright and colorful pictures if you didn't need the color black. The same technique could change the foreground and background color of the monochrome graphics in CGA's 640x200 2-color mode. (Love that 640x200 1:2.4 pixel aspect ratio--NOT!) This wasn't useful for games, but was used to make black-and-white pictures look less... ugly.
Other palette fun could be had by changing the border color--essentially an overscan hack. It was cute to have the border change colors while everything else stayed put. I didn't even know the border could be a different color for a while--no program bothered to change it. Another cute hack was to write to the ports to change from 320x200x4 to 640x200x2 on the fly without erasing video memory, essentially faking a color-to-mono toggle switch.
The only game I ever ran that used some of the above tricks to do some primitive copper effects was California Games (1987). Its video selection menu had the normal CGA 4-color mode, but it also had a mysterious CGA "more-color" mode. The "more-color" mode was only used in two places: The title screen and the hackeysack portion. Why? Because those screens had a clear horizontal division of a graphic on the top half of the screen and one on the bottom half. The "more-color" mode would switch palettes at a certain scanline to display one set of colors on the top graphic, and a different set of colors on the bottom. As you can imagine, this was unnervingly time-critical, and self-programming vertical-retrace interrupts took too long (hey, you needed all the speed you could get) on a 4.77MHz machine, so this trick only worked on the one machine that they could hard-code the values into: The original IBM with an original CGA. I always used to think that this mode didn't do anything until I brought it over to my friend's house and saw it work. It worked pretty well, oddly enough. Maybe I was just easy to impress. :-)
Here's a bit of trivia: Checking for Vertical Retrace has been the same procedure since CGA on up--you simply monitor port 3dah. Until I discovered that port, I always wondered why delaying 13 or 14 ms before updating the screen made it nice and smooth, but that's because 1000 ms / 70 Hz (the screen refresh rate) is about 14.2 ms. ;-) Nobody ever bothered to monitor the vertical retrace port unless they were trying to avoid snow on CGA screens.
Snow, you ask? What, you don't know about snow? CGA boards, for a reason I can't remember, displayed "snow" in text mode whenever you wrote to the screen memory directly (the BIOS writes avoided snow, but were terribly slow). Since I'm typing this article on a CGA PC right now (I'm not kidding--my word processor runs in 384k! ;-), I'll describe what it looks like: Every time video memory is touched, small horizontal white lines one character cell wide appear and disappear all over the screen. It can get really annoying after a while, so many people waited for vertical retrace before writing to the screen. This was much slower, but reduced snow a great deal, and was still faster than using the BIOS to output text.
email@example.com had the following extra information to offer:
"If my memory serves me right, the snow on the CGA was because when the CPU and the video card both tried to access the single-ported video memory, the CPU would lock out the video card's access until the read was completed."
CyberTaco clarified this a bit:
"I believe you were wondering what causes CGA snow (wow, it's been a long time!)? It goes like this:
"The ram in the standard IBM CGA card was what was called single-ported, meaning that it could not be written to and read from at the same time. If you were writing to the ram, it simply ignored read requests and you got a random result from it instead. Single-port ram was used at the time both because it was cheaper than dual-port ram (which can read and write at the same time, like we're all used to), and because IBM was (is?) staffed by a bunch of idiots. :-)
"End result: When scrolling the screen up a line, every single piece of character memory has to be written to the memory area corresponding to one line above where it was. While this is happening (sloooowly), every time the physical picture-generating hardware goes to read the ram to figure out what dot to put where on the monitor, it keeps getting random results over and over because the ram is being written to during the scroll. The end result? Random dots all over the screen instead of text, resembling static, or.... snow. :-)"
16 colors in graphics mode with a stock CGA board? Believe it or not, there were not one but two legitimate ways to get more than 4 colors on a CGA card: An undocumented "low res" mode (which I'll talk about later), and CGA's Composite Color mode. Both had drawbacks, unfortunately.
But many game companies wanted every edge they could get over the competition, so many decided to use it. (Starflight (1986) from Electronic Arts was the first mainstream game I can remember that used it well.) The drawback, as with any sort of compatibility effort back then, was speed--if all your graphics were in 16 colors, you had two problems on your hands: You either had to convert the 16-color pictures down to 4 on the fly, which was slow, or you had to provide both sets of converted pictures on the disk, which took up too much room and cost more money. (Remember that back in the first half of the 1980s, disks were still fairly expensive--the best price I could get on a 360K floppy in 1984 was a dollar a disk.)
Still, some companies saw the light and did it, and the result was colorful graphics on almost any system. Besides, you could use a lookup table to quickly plot the low-res sprites, because a word (two bytes) plotted 8 pixels in CGA or 4 pixels in EGA/Tandy (16 colors). (Plus, this gave Tandy/PCjr owners some extra speed because they had a native 160x200 mode, so they got more color without slowing down.) California Games from Epyx probably supported the most graphics options of any game in 1987 using this technique, including CGA, EGA, CGA "more-color" (see above copper trick), CGA Composite color, Hercules Mono, MCGA/VGA 16-color, and Tandy/PCjr. All the graphics and sprites (except for the font) were in 160x200. Mindscape also did the same thing with Bop'n'Wrestle, Uridium, and Bad Street Brawler (hi Lord Blix!).
Many companies used the 160x200 trick even if they didn't have 16 colors to store, simply because the sprites took up half the space. All the "Maxx Out" games from Epyx like Rad Warrior used this even though the graphics were mostly 4 colors.
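The lookup-table idea can be sketched in modern Python. This is a minimal illustration, not period code: the 16-to-4 color reduction map below is entirely made up (real games tuned it per picture), and the table maps each low-res source byte (two 4-bit pixels) to one CGA byte (four 2-bit pixels, each source pixel doubled), so plotting two source bytes at once writes one word--8 CGA pixels, as the text describes.

```python
# Hypothetical 16-color -> 4-color reduction table (an assumption;
# the real mapping depended on the picture and the CGA palette).
REDUCE = [0, 1, 1, 3, 2, 1, 3, 3, 0, 1, 1, 3, 2, 2, 3, 3]

# expand[src] -> one CGA 4-color byte: each 4-bit low-res pixel
# becomes two adjacent 2-bit CGA pixels (horizontal pixel doubling).
expand = []
for src in range(256):
    left, right = src >> 4, src & 0x0F
    l2, r2 = REDUCE[left], REDUCE[right]
    expand.append((l2 << 6) | (l2 << 4) | (r2 << 2) | r2)

def word_for(src_hi, src_lo):
    """Two source bytes -> one 16-bit word = 8 CGA pixels."""
    return (expand[src_hi] << 8) | expand[src_lo]
```

The win is that all the per-pixel color conversion happens once, when the table is built, instead of every time a sprite is drawn.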
Finally, some adventure game companies found a unique way to store 200 or more full-screen pictures on a single 360K disk: Vectors. (By "Vectors", I'm referring to a series of points that define the beginning and end of a line, the outline of a polygon, etc.) Regardless of how large or complex the picture was, you could usually fit a decent-looking scene into about 2K, because all you were storing were polygon outlines/definitions. A blue circle with a radius of 100 pixels located at (100,130) could take up over 10K as a raw sprite/bitmap, but it only took up enough bytes to describe "circle, at (100,130), radius 100, color blue" when you stored it as its vector definition. That could be as little as 7 bytes if you did it haphazardly: a byte for the element type, a word each for the X and Y coordinates, a byte for the radius, and a byte for the color.
Come to think of it, I'm sure the game programmers were much more frugal than that; this was when games had to fit on a single 360K disk. The previous structure would've probably been optimized even further:
|Draw Element||first 3 bits of first byte||Holds 8 different types: Square, Circle, Line, Pixel, Fill Point, etc.|
|Fill Color/Pattern||last 5 bits of first byte||Allows for 32 different colors or patterns|
|X Coordinate||word||Must be a word, since the maximum (320) wouldn't fit in a byte (255)|
|Y Coordinate||byte||Could be a byte, since the maximum (200) fits into a byte (255)|
|Radius/Size||byte||A byte easily covers the radius of 100 in our example|
So, even though 7 of the 16 bits in the X-coordinate word are wasted and could be used for even more things, we're down to five bytes for the information required to draw a large circle on the screen. You can pack this down even further, which I'm sure they did, but I included the above merely as an example of the thought process involved.
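The optimized five-byte record packs neatly with Python's struct module. A minimal sketch: the element-type numbering and the `pack_circle` helper are my own assumptions for illustration, not a reconstruction of any real game's format.

```python
import struct

# Hypothetical element-type codes (3 bits allow 8 of them).
ELEMENTS = {"square": 0, "circle": 1, "line": 2, "pixel": 3, "fill": 4}

def pack_circle(x, y, radius, color):
    """Pack 'circle at (x,y), radius r, color c' into the 5-byte
    record described above: 3-bit element type + 5-bit color in the
    first byte, then word X (320 won't fit in a byte), byte Y,
    byte radius."""
    first = (ELEMENTS["circle"] << 5) | (color & 0x1F)
    return struct.pack("<BHBB", first, x, y, radius)

record = pack_circle(100, 130, 100, 1)   # the blue circle example
assert len(record) == 5                  # vs. ~10K as a raw bitmap
```

A whole screen built from a few hundred such records easily fits in the ~2K figure mentioned above.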
Another nice advantage to this system was that you could compose your pictures with 16 colors and let the drawing algorithm pick the closest color or dither pattern when it drew the vector objects on the screen at runtime. But, as with all computer programs, there was the classic speed vs. size tradeoff: Drawing the polygons took a lot of time, sometimes as much as six seconds for the picture to assemble, which looked sloppy. There were other drawbacks to this system as well; the artist was forced to use an uncommon vector-editing program, which usually had to be developed in-house, and the nature of the whole vector format procedure made it hard to produce pictures with a lot of fine detail. Still, Sierra used it in King's Quest I (1984), and the classic text/graphics adventure Transylvania (1983) from Polarware used it as well. (In fact, when Polarware was still called Penguin Software, they marketed a vector drawing package specifically built for making small graphics for games called The Graphics Magician.)
Ah, now the fun part: 16 color graphics on a standard CGA card. IBM actually announced this mode in January of 1982, when the second version of the IBM PC came out; it's mentioned in their technical documentation, but they evidently saw no need to provide any real documentation for it unless asked. I guess nobody asked. :-) By the time I finally figured out how to program the mode, it was 1990 and I had a VGA card, so I never had a chance to use it in anything. Some games did use it, though, including a good shareware shoot-'em-up called Round 42 and two commercial adventure games, one from Sir Tech (remember Wizardry?) and another from Macrocom called ICON (1984). Here's how you program the mode:
The 160x100 "graphics" mode was actually a text mode. The video registers were changed so that normal 80x25 color text mode squeezed four times as many rows of text characters onto the screen to produce an 80x100 text screen with the bottom 75% of each text character chopped off. Then, the screen was filled with ASCII 221, which is the left vertical bar. Each character on the screen was used to display a pair of horizontally adjacent pixels by adjusting the foreground and background color of that character. The "blink" option on the video board had to be turned off so that pixels didn't blink when the right-half pixels (which use the text background color) had a color value greater than 7. (Text characters normally blink if the background color is in the 8...15 range, but who uses blinking any more?)
Okay, cute trick. You could make it work in today's world by modifying it for EGA and VGA boards as well; on VGA boards, the text character height is normally 16, so it would be changed to 4. EGA boards, however, normally use a text height of 14 pixels. There is no way to change the text height to exactly one quarter of 14 pixels, so a text height of 3 would have to be used, which is slightly too small, but that's what you get for using EGA after 1990. :-)
Technical details: Blink is suppressed on CGA boards by setting bit 5 of port 3d8 hex to 0. On EGA and VGA cards, blink is suppressed by using subfunction 3h, function 10h of interrupt 10h and setting BL to 0. Trivia: The 160x100 mode uses 16000 bytes at b800:0000 hex to store 16000 4-bit pixels, which is twice the amount of memory that would be required in a normal graphics mode--so each even byte of video memory is wasted to store the ASCII character 221. Moving blocks of video memory around was both fast and annoying, since only the odd bytes (storing the text foreground and background colors) had to be modified.
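Here's a minimal Python simulation of plotting in this mode. The bytearray stands in for the 16000 bytes at b800:0000, and `set_pixel` is a hypothetical helper of my own, not code from any of the games mentioned.

```python
# Simulated text-mode memory: even bytes hold ASCII 221 (the left
# vertical bar), odd bytes hold the attribute (two 4-bit pixels).
vram = bytearray(16000)
for i in range(0, 16000, 2):
    vram[i] = 221

def set_pixel(x, y, color):
    """Plot a 4-bit pixel in the 160x100 tweaked text mode. Each
    character cell covers two pixels: ASCII 221 shows foreground on
    its left half, background on its right, so the left pixel is the
    attribute's low nibble and the right pixel is its high nibble."""
    cell = (y * 80 + (x >> 1)) * 2 + 1   # attribute byte of the cell
    if x & 1:                            # right pixel -> background
        vram[cell] = (vram[cell] & 0x0F) | (color << 4)
    else:                                # left pixel -> foreground
        vram[cell] = (vram[cell] & 0xF0) | color
```

Note how only the odd (attribute) bytes are ever touched once the screen is filled with ASCII 221--exactly the annoyance described above when moving blocks of video memory around.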
Well, that's enough about CGA's 16-color "graphics" mode. I was just so happy when I figured it out that I just had to spew somewhere. :-) And ICON? They added even more depth by using the entire ASCII character set--or, at least, the top four scanlines of the entire ASCII character set. (The bat's wings were the top four scanlines of ASCII #2, which was the smiley!) Some of the results were fantastic, but that's ANSI from hell that I don't think anyone would want to ever attempt again.
Demosceners are familiar with speeding up graphics with cute math and hardware tricks. But what do you do when the machine is simply too slow? Speed up your hardware. It sounds silly, but there were actually several ways to make the machine run faster--via software--by modifying certain bits of hardware. For example, the CPU spent time refreshing early DRAM so that the RAM wouldn't lose its contents. This refresh rate was adjustable and usually done way too often, so you could usually lower it a bit and gain more speed for your programs. Speed increases of 5% to 10% were not uncommon; I got a 15% increase by lowering the DRAM refresh rate gradually until the computer locked up. :-) I then used the last setting that didn't lock up the computer. Handy thing to have in your autoexec.bat, that.
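The refresh tweak boils down to reprogramming channel 1 of the PC's 8253/8254 interval timer, which triggers DRAM refresh on these machines. A Python sketch of the arithmetic involved (the clock rate is the standard PIT input frequency; the safe divisor was machine-specific, as the lock-up experiment shows, and the function is my own illustration):

```python
PIT_HZ = 1_193_182          # standard 8253/8254 input clock

def refresh_interval_us(divisor):
    """Microseconds between DRAM refresh requests when PIT channel 1
    is programmed with the given divisor. The BIOS default of 18
    gives a refresh request roughly every 15 microseconds."""
    return divisor * 1_000_000 / PIT_HZ

default = refresh_interval_us(18)   # BIOS default, ~15.1 us
relaxed = refresh_interval_us(36)   # half as many refresh cycles
# stolen from the CPU -- at the risk of the RAM forgetting bits.
```

Each refresh steals bus cycles from the CPU, so doubling the interval between refreshes is where the 5% to 15% speedups came from.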
But probably the most common method that game programmers used to speed up the computer was to go for the guts and bypass the operating system in a big way--by creating self-booting programs. This solved several problems at once: it freed the program from DOS's memory footprint and overhead, and gave it total control of the hardware.
Of course, you had to be a damn fine coder to do this. Being merely proficient in assembler programming wasn't enough--remember all those friendly DOS services you use? Gone. So, you'd better be prepared to write your own mini-DOS when you needed one.
Sometimes self-booting programs were mandatory; it was the only way to have any decent form of copy-protection. Stuff like that was really hard to crack. In fact, I don't think anyone in today's world would have the stomach to attempt it. Not only does the game boot (hel-LO!), but it then proceeds to stomp all over memory, usually obliterating the debugger. Your only chance is to dump the boot sector and attempt to disassemble it manually. (While the game programmers were tough back then, so were the crackers, so it got done somehow.) The most common programs that come to mind are the early Accolade and Electronic Arts games; Pinball Construction Set and Music Construction Set (forever classics), for example. Of course, the classic Wizardry did this as well, although it took them two tries to get it right: The first version of Wizardry was not very friendly to non-perfect-100% IBM compatibles, so the more you played the game, the less successful the disk reads were. Odd... The second version played just fine, however. They even took the time to improve the graphics.
Let's sidetrack for a minute or so on copy-protection schemes. Although they had nothing to do with graphics or sound explicitly, I still found them absolutely fascinating, since they were also very hardware-intensive.
Copy-protection, for those too young to remember, was a method of doing something to the diskette the game came on that made writing a perfect duplicate impossible, which in turn prevented you from copying the disk and giving it to all your friends, robbing the software company of potential sales. A typical method of checking went something like this: When the game started, it checked to see if it was running on the original diskette by looking for a specific piece of data stored in an extra sector hidden in a specially formatted track. If it didn't find that data, it aborted. This way, you could only make the program run by running it off of its original diskette. (Nowadays, "copy-protection" is usually as simple as the program asking you to look up a word on a certain page in your software manual and type it in, or for CDROM games, checking to see if it is running on the CDROM device.)
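As a toy illustration of the check just described: the "disk" below maps (track, sector) pairs to data, and the track/sector numbers and signature are made up for the example.

```python
SECRET = b"COPYLOCK"   # data hidden in the specially formatted sector

def passes_protection(disk):
    """Return True if the hidden extra sector carries the expected
    data -- i.e. if we appear to be running from the original disk."""
    return disk.get((39, 9)) == SECRET

original = {(0, 1): b"boot code", (39, 9): SECRET}
diskcopy = {(0, 1): b"boot code"}   # DISKCOPY misses the odd sector

assert passes_protection(original)
assert not passes_protection(diskcopy)
```

The real checks lived in assembler and talked to the floppy controller directly, but the logic was exactly this simple: look for something a standard copy cannot reproduce, and abort if it's missing.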
Copy-protection used to be included on everything, from games to business software, simply because software was extremely expensive back then, and a couple hundred copies of a program could actually make or break a software company (no, I'm not kidding). People just didn't buy software all the time because a typical game (for example) was $50. That's normal in 1996, but very expensive for 1984. (Think about it: $50 in 1984 is like $95 today due to inflation. I don't know about you, but I'm not willing to pay $95 for a computer game.)
I wasn't any good at cracking back in 1984 (I barely knew general programming, let alone assembler), so I had to become good at figuring out the copy protection scheme instead if I wanted to duplicate the program. While many crackers learned DEBUG inside and out, I learned protection schemes and how to duplicate diskettes: extra sectors hidden on specially formatted tracks, other oddball formats, and physical "laser hole" damage that no drive could ever write back.
If it was so easy (relatively speaking) to figure out these formats, then why didn't everybody just write bitcopy or nibbler programs to analyze the diskette and make perfect copies of everything (except the laser hole, of course)? It wasn't quite that easy: While the IBM floppy controller could read all of these formats, it did not have the ability to write all of them. A third-party company (usually the diskette duplication facility itself) specially prepared the diskettes using custom floppy controllers. To this day I don't know if the read-all-but-not-write-all phenomenon was a decision made on purpose by IBM's engineers or just a hardware glitch that software companies took advantage of. (Probably a hardware glitch.) Either way, I eventually broke down in 1987 and bought a Copy ][ PC Option Board, which went between the floppy drive and controller, allowing me to write those special formats. Trivia: To this day, there is only one diskette I have never been able to duplicate, even with the help of my Option Board: A Cops Copylock ][ demo diskette that I sent away for (Cops was a third-party copy-protection library you could purchase to copy-protect your own programs). I never found any programs that actually used Cops as the copy protection scheme, which was fortunate, since I couldn't copy it. :-(
Ah, music. Certainly one of the most interesting things done with computers today in the demoscene; in fact, at least twenty times more MOD/S3M/IT/XM music is being put out per year than demos, and that figure is steadily increasing. Until Sami came out with Scream Tracker in 1990, there was no native digital/multi-channel composition program. Heck, until 1986, there wasn't even any sound hardware you could hook up to your PC. Sure, the PCjr and Tandy had their own 3-voice sound chip built in, but I didn't have a Tandy or PCjr.
That didn't stop me. :-) Or anyone else, for that matter. People did the best with what they had, with surprising results for the time and hardware. Ladies and Gentlemen, I present to you the evolution of composed/tracked personal computer music hardware and techniques, from the point of view of a fledgling demoscener (which means I'll conveniently ignore MIDI).
This has been covered many times before, so I won't rehash The Beginning Of Demos excessively; however, a few tangible examples would probably serve to help you picture the transition.
Pirating games led to the birth of the demo scene. It's time everyone who is in deep denial about this fact comes out and accepts it. You can see this in the early loaders for pirated games; as early as 1988 you can see some demo-like effects in small loaders. The loader for Bad Street Brawler is a perfect example of this: The screen starts as static; then, like a TV pirate intercepting a TV channel, the screen flickers with the title graphic, until finally the title graphic is fully displayed. The entire program was only 128 bytes, and was tacked onto the front of a CGA screen dump.
This "loader mentality" took a while to grow, but it eventually did. Many early intros were simply one effect, but to be impressive, it had to be an effect never before seen on the PC--and "never before seen" could mean one of two things.
Well, our look back has ended, and not a moment too soon, since I've taken up valuable time that you could've used to write the newest demos. All I ask is that the next time you take a look at the demos today that contain incredibly complex 3D objects, particle systems, and multi-channel digital music, or the next time you compose music with 64 digital channels in Inertia Tracker, you think about what it used to be like in the old days. Who knows... Maybe some of you will be as innovative as we old-timers needed to be in dealing with such limited, slow hardware.
Then again, the driving force behind demos is that the hardware is only as limited as you think it is. ;-)
It just wouldn't be fair for me to end this article without a list of old demos that you should check out if you want to see the best of the best back in the old days (from 1990 to 1992). Keep in mind that these demos run at the full framerate on a 386 running at 16 MHz:
|Chronologia||Cascada||VGA||Sound Blaster, Internal speaker, LPT DAC|
|Putre Faction||Skull||VGA||Sound Blaster|
|Unreal vers. 1.1||Future Crew||VGA||Sound Blaster, Gravis Ultrasound, LPT DAC|
|EGA Megademo||SpacePigs||EGA||Internal speaker, LPT DAC|
And, as always, if you're interested in demos in general, please feel free to check out PC Demos Explained or download the PC Demos FAQ list.