The Trouble with Gerrold: Using all the petaflops
November 14, 2012 —
The first personal computers were called microcomputers, and many of them were powered by an 8-bit Z80 chip running at 2MHz. If you had a couple thousand dollars to spare, you could load the machine up with a whopping 64 kilobytes of RAM.
Then Moore’s Law kicked in.
Actually, it’s not a law, it’s an observation. In 1965, Gordon E. Moore, the cofounder of Intel, published a paper in which he noted that the number of components in integrated circuits had doubled every year from 1958 to 1965. He predicted that the trend would continue for at least another 10 years.
He understated the case.
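Doubling once a year compounds startlingly fast; a back-of-the-envelope sketch (illustrative only, with a hypothetical starting count of 64 components) shows what Moore's observation implied over a decade:

```python
def projected_components(start_count, start_year, end_year, doubling_years=1):
    """Project a component count assuming a doubling every `doubling_years`.

    Illustrative only: the starting count and the strict one-year doubling
    period are assumptions for the sake of the arithmetic, not Moore's data.
    """
    return start_count * 2 ** ((end_year - start_year) / doubling_years)

# Doubling every year for ten years is a 1,024-fold increase:
print(projected_components(64, 1965, 1975))  # 65536.0
```

The point is simply that annual doubling multiplies the count by about a thousand per decade, which is why a ten-year extrapolation looked so audacious in 1965.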
Before the end of the 1970s, it was obvious that an 8-bit microprocessor was insufficient. The 16-bit 8088 powered the first IBM PCs. Actually, it was a 16-bit chip with an external 8-bit data bus so it could use cheaper supporting logic chips, but that was the beginning of the x86 processor line.
The 286 chip ran at 6MHz, then 8MHz, and eventually 12.5MHz. This was the real beginning of the race for power. The 286 was followed by the 386, which eventually reached 33MHz. It was Intel's first 32-bit x86 chip, and it was fast enough to make Windows a practical operating system.
The 486 eventually reached 50MHz. In those days, every advance in speed was significant. It made a noticeable difference in the response time of every piece of software, and that fueled the hunger for ever more powerful upgrades.
Instead of a 586, Intel released the Pentium chip, which went through several iterations over the next few years, each time getting faster and more powerful. The first Pentium premiered in 1993 and ran at 60MHz. By 2000, the Pentium III was running as fast as 1.13GHz. The Pentium 4, introduced in 2000, could hit 3.6GHz on the straightaway. The Pentium D hit 3.73GHz.
Throughout the 1990s, processor speeds continued to climb. All those extra clock cycles allowed programmers to add more features to their software, and even to create whole new categories of software. We moved from word processors and spreadsheets to speech recognition, image processing, CGI rendering, video editing, and powerful 3D games.