Reason for the Amiga clock speed



The Amiga used a CPU rated for 8 megahertz, but clocked at 7.14 megahertz. What was the reason for this number? I remember it was something to do with a multiple of the frequency of the video circuitry, but I forget the details.


Posted 2017-01-24T19:36:02.353

Reputation: 7 596



The architecture of most "color computers" of the 70s-80s was very tightly built around the NTSC color video standard.

Almost all of them had a 14.31818 MHz crystal. Note that this is four times the 3.579545 MHz frequency of the NTSC color standard, which was called a "color clock". They divided that crystal frequency to derive their actual clock frequency. For instance:

  • Apple II was 1.023 MHz (1/14 of crystal, 3.5 color clocks per CPU cycle) and used a 1 MHz rated CPU.
  • Atari 2600 VCS was 1.19 MHz (1/12 of crystal, 3 color clocks per CPU cycle).
  • Atari 400/800 was 1.79 MHz, (1/8 of crystal, 2 color clocks per CPU cycle) and used a 2 MHz rated CPU.
  • Even the IBM PC used this same crystal that everyone else was using, dividing it by 3 for the CPU clock (4.77 MHz). (But keep in mind the memory clock is 1/4 of that, so memory throughput was crystal/12, the same as the Atari 2600 - ha!) Why choose a multiple when video cards had dedicated video RAM and ran on their own clocks? Despite staggering chip-fab capability, the IBM PC team was pathological about using off-the-shelf designs and components. And this "kept the door open" to a future "color computer" design with shared video RAM, which came to fruition as the IBM PCjr.
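The divider arithmetic in the list above is easy to check with a short script. This is just a sketch: the dividers come from the list above, and the exact chroma value of 315/88 MHz is the NTSC definition of the color subcarrier.

```python
from fractions import Fraction

# NTSC color subcarrier ("color clock"), exactly 315/88 MHz by definition,
# and the ubiquitous 4x crystal derived from it.
CHROMA_HZ = Fraction(315_000_000, 88)   # 3.579545... MHz
CRYSTAL_HZ = 4 * CHROMA_HZ              # 14.318181... MHz

# machine -> divider applied to the 14.31818 MHz crystal (from the list above)
dividers = {
    "Apple II":      14,
    "Atari 2600":    12,
    "Atari 400/800":  8,
    "IBM PC":         3,
    "Amiga":          2,
}

for name, div in dividers.items():
    cpu_hz = CRYSTAL_HZ / div
    per_cycle = CHROMA_HZ / cpu_hz      # color clocks elapsing per CPU cycle
    print(f"{name:14s} {float(cpu_hz) / 1e6:8.5f} MHz "
          f"({float(per_cycle):g} color clocks per CPU cycle)")
```

Dividing the crystal by 2 gives the Amiga's 7.15909 MHz, i.e. exactly two color clocks per 68000 cycle.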

Why didn't these color computers just operate asynchronously and run the CPU at max spec, while the video system operated on the color clock? Because memory was at a premium in those days, so they used memory-mapped video. This was dual-ported in the simplest possible scheme**, which required the memory clock to be in lock sync with the video system. The Apple II even used the video system to accomplish dynamic memory refresh.

So you see: the Amiga, the last of the machines built with the "color computer" mindset (heh, speaking of another), deliberately chose a CPU speed again divided down from that same familiar crystal, and again lockstepped to the NTSC color clock. This allowed them to leverage all the color-computer design that had come before, rather than having to reinvent the wheel.

It's difficult to imagine, in this day and age of 4K video, gigabytes of video RAM, and GPUs that can mine Bitcoin faster than the CPU, that we were so pious in our worship of the NTSC standard. Now that we no longer need the thing, we can admit it was kind of a lousy standard.

** To digress on pre-Amiga-age dual-porting: On the Apple II, the video system got every other RAM cycle whether it needed it or not (hence running half the speed of the Atari), and in classic Wozniak style, he rearranged the memory mapping of video so the video system would also do dynamic RAM refresh. On the Atari, the far more sophisticated ANTIC system interrupted the CPU to take the memory cycles that video needed, as well as to do dynamic RAM refresh. This meant available CPU power varied between display modes and with whether you were in vertical blank (a significant amount of time when the raster was off the visible screen). In a lot of Atari games, the CPU "rode the raster", spending its very limited cycles preparing changes to colors or sprites that would happen on the next scan line, waiting for horizontal blank, executing on cue, and then preparing the next. All the computational work of playing the game happened after the raster got to a low-attention area like the bottom scoreboard, and during the vertical blank period. All this was coded in assembler of course, with painstaking cycle-counting to make sure the operations could happen in the requisite time, given the cycles ANTIC would steal for the video mode you requested.



Reputation: 940

@cbmeeks I thought PAL -> Predictably Always Lousy – Neuromancer – 2018-11-20T00:43:46.273


IBM had tried and failed, repeatedly, to develop a personal computer based on their own designs - for example, the model 5100 Portable Computer from 1975. We had one in the computer lab when I was in college. It had a microscopic screen and was just totally awful to use. It was followed by the Model 5110, which wasn't much better. The 5120 was followed by the System/23, which was powered by an Intel 8085 - and led to IBM choosing the 8088 for the IBM PC. – Bob Jarvis – 2017-01-25T13:26:45.020

Re NTSC being a lousy standard: I've heard of NTSC being jokingly referred to as "Never The Same Color." – Wayne Conrad – 2017-01-25T17:44:00.203

I thought it was "Never Twice the Same Color". Unlike PAL's "Perfect At Last". lol – cbmeeks – 2017-01-25T18:51:30.570

From the point of view of 1955, NTSC seems pretty brilliant. From the point of view of as early as 1975, NTSC doesn't seem as amazing. And by 1985 it was already merely apathy that kept it entrenched. Or maybe lack of other technologies and market demand for a new standard. Anyway, thinking about the original invention of color TV that could be displayed by a black and white TV just as well, I personally find that impressive. – Todd Wilcox – 2017-01-26T13:25:06.483

The funny thing about the Amiga was that everything else in the system had asynchronous clocking capability. It was just the RAM timing for video that required it. You could do crazy stuff like pull the MC68000 CPU out of the socket and replace it with a 68030 piggy-back board that had its own 50 MHz clock, faster RAM, and an even faster (64 MHz) 68881 floating point unit. And most models (even without the CPU board) supported "Fast RAM" (above the address range of the video chip) that wasn't bothered by the dual access and was thus faster. – Tonny – 2017-01-26T14:16:25.457

Might be worth noting that this is only about the timing of NTSC, as the Amiga (like a lot of others) also supported RGB output, which does not suffer from any color issue of NTSC when directly connected to the display device. – Holger – 2017-01-26T15:09:49.270

@ToddWilcox My father worked at the UK GPO (who had the contract for BBC telecoms) in the '60s and was present at a demo from the German guys showing off PAL - they first showed NTSC, and "und now, ze PAL!" - and the colours all settled down. Apparently everyone went "wow" :-) Then it was all decided, and he spent the next few years travelling up and down the UK with a scope doing signal correction on the new cables to handle the 625-line colour. – SusanW – 2017-01-26T20:08:09.577

"Why choose a multiple when video cards had dedicated video RAM and ran on their own clocks?" Don't forget that the 14.31818 MHz crystal was readily available, cheaply, in bulk. Sure, you could cut your own crystal to get exactly the frequency you wanted, but really, why? Part of the design goal of the 5150 was to get a product out the door -- cutting corners where doing so meant saving time at marginal detriment to the final product was probably done regularly. – a CVn – 2017-01-27T12:10:41.273

This is the correct answer. For authority, check my name against the names on the Amiga patents. The original cost goal was aggressive. Clock crystals cost money. So did synchronizer circuits and the buffering required to create logic that safely crosses clock domains. So we architected the whole system to use 1 basic clock, the one required for accurate NTSC video timing: 4x color burst. Divided by 2 for the 68000. – hotpaw2 – 2017-03-31T03:21:47.103

The chipset clock crystals are actually 28.63636 MHz (NTSC) and 28.37516 MHz (PAL), just take a look inside or at the schematics. – Zac67 – 2017-06-22T13:09:49.057

@hotpaw2: The VIC-20 should, if anything, have been more cost-sensitive than the Amiga, but it uses a dot rate of (8/7)chroma; the C64 was likewise cost-sensitive, and it uses (16/7)chroma. – supercat – 2017-06-23T20:45:46.003


Nice answer. ;-) The "riding the raster" approach was also called racing the beam. In fact there's an interesting book about the Atari 2600 architecture called Racing the Beam. – Craig – 2017-06-28T22:47:43.020


When color video was introduced in the USA, the horizontal scan rate was set at precisely 15750 * (1000/1001) Hz (roughly 15734.266 Hz), and the color sub-carrier frequency (also called the chroma clock) was defined as 227.5 times the horizontal scan rate, i.e. roughly 3579545.4545 Hz. Many computers of that era use a multiple of the chroma clock as the pixel clock (the Amiga, for example, uses 4x chroma, or 14318181.818 Hz).

On systems where video generation shares a memory bus with everything else, it's generally necessary that a fixed relationship exist between the video frequency and the CPU clock. In the case of the Amiga, the system clock is 1/2 the dot clock (2x chroma), which is a bit faster than some earlier machines, though the 68000 doesn't do as much each clock cycle as some other processors.

Some other typical machines:

  • Atari 2600: dot clock=chroma; CPU clock=chroma/3
  • Atari 800: dot clock=chroma*2; CPU clock=chroma/2
  • Apple II: dot clock=up to chroma*4; CPU clock=chroma*2/7
  • Commodore VIC-20: dot clock=up to chroma*8/7; CPU clock=chroma*2/7
  • Commodore 64: dot clock=up to chroma*16/7; CPU clock=chroma*2/7

The IBM PC had video memory that was separate from the main memory, and was designed to allow the video clock to be independent of the CPU clock. Nonetheless, the original PC with a CGA card used a dot clock of chroma*4, and a CPU clock of chroma*4/3.
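The chroma-relative ratios above can be tabulated with exact rational arithmetic. A sketch, with the machine ratios taken from this answer and the exact chroma value of 315/88 MHz from the NTSC definition:

```python
from fractions import Fraction

CHROMA = Fraction(315_000_000, 88)  # NTSC chroma clock in Hz, exactly 315/88 MHz

# machine -> (dot clock, CPU clock), both as multiples of the chroma clock
machines = {
    "Atari 2600":       (Fraction(1),     Fraction(1, 3)),
    "Atari 800":        (Fraction(2),     Fraction(1, 2)),
    "Apple II":         (Fraction(4),     Fraction(2, 7)),
    "Commodore VIC-20": (Fraction(8, 7),  Fraction(2, 7)),
    "Commodore 64":     (Fraction(16, 7), Fraction(2, 7)),
    "IBM PC (CGA)":     (Fraction(4),     Fraction(4, 3)),
    "Amiga":            (Fraction(4),     Fraction(2)),
}

for name, (dot, cpu) in machines.items():
    print(f"{name:16s} dot {float(dot * CHROMA) / 1e6:7.4f} MHz   "
          f"cpu {float(cpu * CHROMA) / 1e6:7.4f} MHz")
```

Keeping everything as a Fraction makes the shared-clock relationships exact: the VIC-20, C64, and Apple II all land on precisely the same 1.0227 MHz CPU clock.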



Reputation: 6 224

That makes sense, but something I'm still curious about: if the color clock was 227.5 times the horizontal scan rate, I would expect that to mean the achievable horizontal resolution for color graphics was 227 (or maybe somewhat less if you leave a safety margin for overscan), but e.g. the Commodore 64 got up to 320 horizontal. How? – rwallace – 2017-01-25T09:44:10.417

The color burst wasn't a digital clock signal like the dot clock and CPU clock were. In a digital clock, the parts of the signal of interest are the points where it peaks or where it crosses zero. Instead, the color burst is a reference clock, and you encode information by sending a signal that differs from the reference by a small amount. The difference in amplitude is measured to provide one color channel, and the difference in phase produces the other color channel, and with 2 channels you can encode both hue and saturation. – Ken Gober – 2017-01-25T14:29:54.137

@rwallace: Imagine one were to place a filter over a TV screen which had a pattern of red-orange-yellow-green-blue-magenta-red-orange-etc. stripes on it (as a continuous range of colors, rather than discrete colors). That's pretty much how NTSC video works, with about 170-190 pixels in the visible portion of the screen (depending upon how tightly the screen is cropped). Some platforms like the Apple II and Atari use a horizontal scan rate that's about 0.23% slow and runs at a rate of chroma/228. On these machines, the pattern of stripes is stationary. – supercat – 2017-01-25T14:54:03.113

@rwallace: On machines that use chroma/227.5, however, alternate scan lines use opposing color phase. Some machines use 262 lines/frame (an even number) causing every frame to have the same phase, while others use 263 (an odd number), meaning that color phase switches every frame. – supercat – 2017-01-25T14:55:58.883


I think you meant 7.15909 MHz.

7.15909 MHz is twice the NTSC color burst frequency (3.579545 MHz). The NTSC color burst frequency is 455/2 times the line rate, the line rate is 262.5 times the field rate, and the field rate is 60 * 1000 / 1001 (59.94 Hz).


On PAL systems, which have different field rates, line rates, etc., the CPU is clocked at 7.09379 MHz.
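The NTSC chain above can be checked end to end. A quick sketch of the arithmetic, starting from the field rate:

```python
from fractions import Fraction

# NTSC timing chain, kept as exact rationals throughout
field_rate  = Fraction(60 * 1000, 1001)       # 59.94... Hz
line_rate   = field_rate * Fraction(525, 2)   # 262.5 lines per field
color_burst = line_rate * Fraction(455, 2)    # 227.5 color clocks per line
cpu_clock   = 2 * color_burst                 # NTSC Amiga 68000 clock

print(f"field rate   {float(field_rate):10.5f} Hz")
print(f"line rate    {float(line_rate):10.3f} Hz")
print(f"color burst  {float(color_burst):12.3f} Hz")   # 3579545.455 Hz
print(f"CPU clock    {float(cpu_clock):12.3f} Hz")     # 7159090.909 Hz
```

The fractions multiply out to exactly 315/88 MHz for the color burst, which is why the 14.31818 MHz (4x) crystal value also comes out exact.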

Ken Gober


Reputation: 7 356

I believe you meant 525/2 not 455. – cbmeeks – 2017-01-24T21:25:05.850

But why tie the CPU to the video display refresh rate when the two should be unrelated? I'm guessing this is to ensure consistent HBlank/VBlank interval timings for games, but the Amiga was meant for creative software - so why? – Dai – 2017-01-24T23:19:29.963

In general it's easier, cheaper and more reliable to tie everything to a single master clock than to implement multiple clock sources and safe clock domain crossings. – Peter Green – 2017-01-25T00:05:33.430

The most important acronym for older home computers was "BOM". And you had to keep it small. – jdv – 2017-01-25T01:45:22.407

I remember there was a program that could put the PAL Amiga into NTSC mode and make it run slightly faster at the cost of losing some lines of resolution. – Matthew Lock – 2017-01-25T02:25:03.110

@cbmeeks 455/2 checks out if you calculate it, where are you getting 525/2? Are you sure you're not thinking of PAL? – rwallace – 2017-01-25T09:40:59.483

The motherboard was common for both NTSC/PAL and jumpered. I had a switch soldered to mine, as I regularly moved between USA & Europe. I didn't realize that it was affecting clock speed, else I would have left it set at the highest, NTSC, rate and used my multi-standard TV as a display, thus “overclocking” when I was in Europe :-) – Mawg – 2017-01-25T12:45:25.413

@rwallace actually, I believe I was thinking about the number of scanlines of NTSC which is 525. And, since they are interlaced, that would leave you with 262.5. – cbmeeks – 2017-01-25T14:15:46.950

I think it's worth adding that the PAL clock is more of a hack, since it had to be chosen to match the display area for a 640 pixel wide nominal resolution but on a PAL TV, while maintaining most of the inter-chip timings. This is why a PAL A600/A1200 needs an extra crystal to get a color output on the composite and RF connectors. – pipe – 2017-01-25T14:39:33.157

@Mawg: newer revisions of the A500 (and newer Amiga models in general) can switch between NTSC and PAL via software, which is related to the timing only, as the color modulation had to be done by an external box (“tv modulator”) anyway. If higher CPU clock was your goal, you could even connect an external clock source (originally intended for genlocks) at the video port. You need, however, a display device that can cope with the higher video frequency then. – Holger – 2017-01-26T15:16:08.060


The accepted answer is good; however, there is something more that needs to be said here.

No discussion in this kind of detail on the hardware design of the Amiga should pass without some kind of mention of Jay Miner. In addition to leading design on the Amiga, he also designed the 2600 and the Atari 400/800, which (as Harper mentioned) used the same scheme. That's more than half of the designs listed in Harper's answer. This was clearly his go-to design for clocking.

Jay Miner was indisputably one of the fathers of home computing. Arguably the most important one. His contributions should not be forgotten.

It is of course well-known that the signatures of Miner, his dog, and the rest of the development team are embossed inside the (removable) case top of most Amiga 1000s.



Reputation: 183

While I agree that Jay Miner was brilliant and contributed to the evolution of computers, I wouldn't agree he was "the father of home computing". I would put Chuck Peddle higher in that list than Jay Miner. Bill Mensch too. – cbmeeks – 2017-01-25T18:55:37.360

@cbmeeks - I did pretty clearly say that part was arguable. Buy me a beer sometime and I'll happily hold up my end of the argument though. :-) – T.E.D. – 2017-01-25T18:57:57.063

Everything is arguable. But until the world decides what "personal computers" even means, there can be no "father" of them. My point is that, while Miner was a rock star, Chuck Peddle and Bill Mensch contributed to that fictitious list every bit as much. In fact, many people did. Even...dare I say it...Steve Jobs. – cbmeeks – 2017-01-25T19:03:46.497

IIRC, this is not a correct answer. Jay wanted to clock the 68000 CPU faster, but was talked out of it to keep the custom chip designs simple, while producing more accurate NTSC video output timing than the Apple II or Atari 400/800. – hotpaw2 – 2017-03-31T03:30:55.677


Synchronizers, which are required for reliability and signal integrity across any asynchronous clock domain crossing in any main data path, cost lots of dual-rank registers, which have to be tightly characterized in circuit design and layout. This costs die area.

Async design also makes testing of the prototype emulator and the chips themselves vastly more difficult, making hard-to-find bugs far more likely on any product with such an aggressive design schedule.

The easiest way out was to run all the wide fast paths (CPU, memory, BLIT, NTSC video) off the same derived clock edges.



Reputation: 2 766


I believe the simple, concise answer is that it could synchronize with the video signal.

This meant totally smooth 50/60 Hz scrolling, and it also let the machine overlay graphics on the signal if you had a genlock.



Reputation: 11