I’ve heard that one from many of my tech friends, and Apple’s anemic increase in PowerBook CPU speed started the ball rolling.
Two years ago Apple introduced the 17-inch PowerBook at 1 GHz. I bought one. It’s a great machine. Once you try that much screen real estate you won’t go back. The new PowerBooks weigh in at 1.67 GHz. Two years. 0.67 GHz. That’s not much of an increase.
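Do the math and the gap is obvious. Here’s a quick back-of-the-envelope sketch in Python; the clock speeds are the real ones, and the doubling is the expectation most of us carry around in our heads:

    # The popular expectation: speed doubles every two years.
    # Here's what that would have predicted for the PowerBook.
    start_ghz = 1.0                 # 17-inch PowerBook, two years ago
    predicted_ghz = start_ghz * 2   # a doubling would mean 2.0 GHz
    actual_ghz = 1.67               # what Apple actually shipped
    print(predicted_ghz, actual_ghz)  # 2.0 vs. 1.67: a 67% gain, not 100%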
So, whatever happened to Moore’s Law about computers doubling in power every two years?
It turns out that it doesn’t work that way.
First, the “law” was originally postulated by Gordon Moore, a co-founder of chip giant Intel (as in “Wintel”). Most of us have it wrong. Moore didn’t state that computer chips would double in power every couple of years.
Moore was talking about transistors: the density of the computer chips that drive PCs. Density doesn’t always equate to power and, as we’ve learned in recent years, doesn’t have squat to do with gigahertz.
Computing “power,” as Mac users know, isn’t tied up in the megahertz or gigahertz clock speed of a chip. It’s more complicated than that. So complicated that I don’t fully understand it, won’t pretend to, and it doesn’t matter much anyway.
Here’s a snippet from the Intel web site concerning Gordon Moore and Moore’s Law.
“Gordon Moore made his famous observation in 1965, just four years after the first planar integrated circuit was discovered. The press called it ‘Moore’s Law’ and the name has stuck. In his original paper, Moore observed an exponential growth in the number of transistors per integrated circuit and predicted that this trend would continue. Through Intel’s relentless technology advances, Moore’s Law, the doubling of transistors every couple of years, has been maintained, and still holds true today. Intel expects that it will continue at least through the end of this decade. The mission of Intel’s technology development team is to continue to break down barriers to Moore’s Law.”
Hmmm. That’s not a doubling of computer “power” every couple of years, is it? Unless Intel has rewritten the “Law” in a fit of revisionist history.
The reality is that PC users simply got used to a doubling of megahertz and gigahertz every couple of years and assumed that meant “power.” It doesn’t.
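For what it’s worth, the trend Moore actually described is simple to write down. Here’s a sketch; the starting transistor count is an illustration (roughly a Pentium 4 of 2000), not an official Intel figure:

    # What Moore's Law actually describes: transistor counts doubling
    # roughly every two years. The starting figure is illustrative only.
    def transistors(start_count, years, doubling_period=2.0):
        return start_count * 2 ** (years / doubling_period)

    # A chip with 42 million transistors in 2000 would be expected
    # to carry about 168 million by 2004:
    print(round(transistors(42e6, years=4) / 1e6))  # -> 168

Notice that nothing in that formula says a word about clock speed, or about how fast your word processor feels.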
A Mac running a 1.4-GHz PowerPC chip is, across many applications and services, about the same speed as a PC running a 2.8-GHz Intel Celeron chip. So, gigahertz and “power” don’t necessarily go hand in hand.
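One rough way to see why is to treat performance as clock speed multiplied by the work done per clock tick. The figures below are invented for illustration; they aren’t measured values for either chip:

    # A crude model: performance = clock speed x instructions per clock (IPC).
    # The IPC values here are made up purely to illustrate the point.
    powerpc_ghz, powerpc_ipc = 1.4, 2.0  # shorter pipeline, more work per tick
    celeron_ghz, celeron_ipc = 2.8, 1.0  # long pipeline, less work per tick

    print(powerpc_ghz * powerpc_ipc)  # 2.8 billion instructions per second
    print(celeron_ghz * celeron_ipc)  # 2.8 -- the same, at half the clock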
You won’t get me to define “power” either. Power, as they say, is in the eye of the beholder.
While it’s true that Intel continues to pack the transistors onto silicon slabs, doubling density about every two years (and they’re confident it’ll continue for a few more), that density doesn’t translate into computers that are twice as fast or twice as powerful.
Do we need a new law? How can we tell that new chips are really more powerful? For example, I’m using a Power Mac G5 with dual 2.5-GHz CPUs right now. On screen, everything happens at about the same pace as on a Mac running Mac OS 9.x four years ago.
What happened? Benchmarks here and there say this Mac can blow the doors off the fastest Pentium 4 available. I’ve got 128 megabytes of video RAM. Yet, all things considered, the Mac doesn’t “feel” any faster or more powerful despite the gigahertz, RAM, video RAM, and Mac OS X.
Power corrupts. Absolute power corrupts absolutely. Absolutely, the new Macs are faster than previous Mac generations. At some tasks. Try rendering some video on a new Mac and compare times with anything running Mac OS 9.x on a pre-G4 or pre-G5 chip.
Ouch. I feel your pain.
So, under the hood, what we have today is measurably “faster” than Macs and PCs of years past. Is there a “measurement” that we could use to determine power, speed, etc.? I’d like one.
Here’s another one. More RAM is better. Low-end Macs like the eMac and the new Mac mini ship with only 256 megs of RAM. Whoa. That’s too little. Every report I read on the Mac mini said that 256 megs of RAM was woefully inadequate and that it should be at least 512 megs.
Really? Says who? Everyone. Well, almost everyone.
The real Mac folks over at the MacsOnly web site ran an exhaustive test of a Mac mini with 256 megs of RAM, with 512 megs of RAM, and with 1 gigabyte of RAM.
Guess what? In most of the benchmarks, the 256-meg Mac mini ran about the same as the 512-meg Mac mini.
More RAM is better, right? Apparently not. Or, at least not all the time.
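Want to check whether your own Mac is starved for RAM? The telltale sign is paging to disk. Here’s a minimal sketch using Mac OS X’s vm_stat utility; the exact output format varies from one OS version to the next, so treat it as illustrative:

    # Rough check for RAM pressure on Mac OS X. A pageout count that
    # climbs while you work means the system is swapping to disk,
    # and more RAM would actually help. Assumes the vm_stat utility;
    # its output format varies by OS version.
    import subprocess

    output = subprocess.run(["vm_stat"], capture_output=True, text=True).stdout
    for line in output.splitlines():
        if "Pageout" in line:
            print(line.strip())

If that number barely moves under your normal workload, the MacsOnly results make sense: the extra RAM was never being touched.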
We need new measurements. More RAM doesn’t always equate to better performance. More gigahertz doesn’t always mean a faster computer. More transistors packed into a hotter chip doesn’t always equate to more “power” (however we define it).
We need new measurements. Unfortunately, someone else will have to come up with a standard measurement of computing performance. Fortunately, it may not matter much.
A Mac mini performing “well” with 256 megs of RAM may be a testament to Mac OS X as much as anything. What measurements do we have of OS performance? Security? Crashes and/or reliability? Look and feel (eye candy)?
We need new measurements.
What do you think? More RAM or not? What’s more “power” in a CPU? Is Mac OS X so good that there’s not as much need for RAM (for most applications) these days? How should these things be rated, compared, and measured? To share your thoughts with other Mac users, click on the Comments link below.