Monday, July 26, 2010

Did the Great Recession Slow Moore's Law?

Almost certainly not:

You might think it would.  You might suppose that recessions, particularly the most severe recession in fifty years, would cause technology companies to become nervous, slow down investment in research and development, cancel new product projects, etc.  Thus, you might think a recession would cause a slowing of innovation, which would in turn affect future economic growth (since technology is a key driver of economic growth).  Because Moore's law summarizes a large amount of investment in the entire semiconductor, materials science, and computer engineering space, and since it's a key enabler of all kinds of other technology developments, it's a particularly interesting place to look for this effect.

I've been wondering about this question for a while, but today was the first day I've gotten a biggish chunk of time to look into it.  I poked around on Intel's website for a couple of hours but failed to find any metric that they consistently report for their microprocessors.  So I ended up looking at the SPEC CPU2006 set of benchmarks.  This is designed to test the performance of a whole computer system, but with a particular focus on the microprocessor (the central computation engine of a computer).  According to the website for this set of benchmarks:
CPU2006 is SPEC's next-generation, industry-standardized, CPU-intensive benchmark suite, stressing a system's processor, memory subsystem and compiler. SPEC designed CPU2006 to provide a comparative measure of compute-intensive performance across the widest practical range of hardware using workloads developed from real user applications.
The SPEC corporation defines the standards, and then maintains a repository of results from a variety of different vendors who submit them, assembled into separate listings for each quarter since Q3 2006, when the current set of SPEC benchmarks was defined.  There are a variety of possible ways to use this data for assessing improvements in computer speed. I decided to focus on two-processor servers (which are standard workhorses in data centers everywhere, and for which there was ample data throughout the full interval from 2006 Q3 till now).  I looked at throughput (total number of test work items possible per unit of time) using both integer and floating point test suites (different kinds of basic numerical calculations commonly used in computer applications), for both Intel and AMD, the two main microprocessor vendors for the PC market.  For each of these four categories, I picked the best performance result submitted during the quarter.

You could do this a lot of other ways, but this gives me four metrics that are reasonable choices, and presumably allows me to avoid highly vendor- or metric-specific effects.
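
For concreteness, here's a minimal sketch of that selection step in Python, assuming the submitted results have been exported to a CSV file first (the file name and column names are hypothetical -- SPEC publishes its results as web listings, not in this format):

```python
import csv
from collections import defaultdict

# Hypothetical CSV export of SPEC CPU2006 "rate" (throughput) results for
# two-processor servers, with columns: quarter (e.g. "2008Q1"), vendor
# ("Intel" or "AMD"), suite ("int" or "fp"), and score.
best = defaultdict(float)  # (quarter, vendor, suite) -> best score that quarter

with open("spec_cpu2006_2p_results.csv") as f:
    for row in csv.DictReader(f):
        key = (row["quarter"], row["vendor"], row["suite"])
        best[key] = max(best[key], float(row["score"]))

# e.g. best[("2010Q2", "Intel", "fp")] is then the fastest Intel
# floating-point throughput result submitted in 2010 Q2.
```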

Here is the raw data for the four metrics (up to Q3 2010, which is incomplete, so could change if newer faster platforms are introduced between now and the end of September).  There is also a grey box for the duration of the Great Recession.  The NBER hasn't come out with final dating for the end of the recession, but June 2009 is a reasonable guess (unless the economy contracts again now, and the NBER ends up deciding it's all one big recession, but I doubt that).


As you can see, the fastest two-processor servers have been improving by leaps and bounds throughout the period, and there is no particular slowing in improvement during the recession, or immediately after it (you might expect a lag, since it takes 18-24 months to design and bring to market a new chip).

The four metrics have improved by factors of 8.2 to 10.25, depending on your pick, over these four years.  The average doubling time is between 13.5 and 15.2 months, depending on the metric.  These compare very favorably with traditional definitions of Moore's Law (a doubling of the number of transistors every 18 months).

Here's another way of looking at the same data.  Taking 2006 Q3 as a baseline, I compute the log, base 2, of the ratio of each quarter's value to the base quarter.  Thus each unit of 1.0 on the y-axis corresponds to another doubling, while the x-axis counts months.
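
As a sketch of that calculation (the scores and dates below are placeholders, not the actual SPEC numbers), both the doubling count on the y-axis and the average doubling time quoted above fall out of the same transformation:

```python
import math

# Placeholder best-per-quarter scores for one metric; 2006 Q3 is the baseline.
months = [0, 12, 24, 36, 48]            # months since 2006 Q3 (illustrative)
scores = [100, 170, 300, 520, 900]      # illustrative values only

# Units of the y-axis: log2 of the ratio to the baseline quarter,
# so 1.0 = one doubling, 2.0 = two doublings, and so on.
doublings = [math.log2(s / scores[0]) for s in scores]

# Average doubling time over the whole span: elapsed months divided by
# the total number of doublings achieved.
print(f"doublings: {[round(d, 2) for d in doublings]}")
print(f"average doubling time: {months[-1] / doublings[-1]:.1f} months")
```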


You can see that progress comes in spurts as new generations of chips are introduced, with plateaus of a few quarters in between, and you can also see that AMD fell badly behind Intel and then has played catch-up in a big leap at the beginning of 2010.  But I just don't think the data provide any support for the idea that the recession caused any interruption in Moore's Law.

Presumably, there is some level of economic distress that would interrupt Moore's law, but it would have to be significantly worse than what we've been through.

13 comments:

  1. Hi Stuart,

    nice post, but one might argue with the wording:

    "since technology is a key driver of economic growth"

    but is it? I always thought that the key driver of economic growth (EG) is the increasing use of fossil fuels, and that research, technology and innovation are "just" the result of such cheap energy abundance.

    One might also say that the key driver of EG is an inflation- (or debt-, credit-, and interest-rate-) based economy, which enables (forces?) us to consume more and more.

    And finally (if you are right), wouldn't we want to *stop* technology, if we want to stop EG, as the basic cause of most environmental problems and unsustainability?

    Indeed, what has to change if one wants to stop EG?

  2. And OT.

    James Hansen has just published a paper on a theoretically possible coal phase-out by 2030:

    http://pubs.giss.nasa.gov/cgi-bin/abstract.cgi?id=kh04000r

    One comment by Kit P on RR's blog says:

    "Hansen is another doomer that the media loves but has no credibility with me. While I would like to see nuclear increase 10% and coal decrease 10%, building 50 new nukes over the next 20 is possible but a task of significant magnitude. Building 100, wow! Building 200? Somebody has lost their mind."

    So now we have it: Simmons and Hansen (poster children of PO and CC) are discredited doomers...

  3. Alexander:

    I would certainly agree that energy in sufficiently available quantities is the most critical input to society. However, the Romans had coal, and it didn't drive the slightest bit of economic growth. Similarly, back when oil was first being put to use, gasoline was regarded as a toxic and explosive waste product from the production of kerosene for lighting, which was the main application of oil. It took all manner of innovations (steam engines, railroads, internal combustion engines, etc, etc) for the energy sources to actually be put to use in economic production, something that has played out over centuries, and is still playing out today.

    On the Hansen/Simmons comparison - I don't agree. Hansen is a first-rate climate scientist, with a long track record of highly influential work to prove it. I'm not aware of major public statements he has made on climate science that have turned out to be embarrassingly wrong. I think one might argue that his forays into public policy are less successful, but I at least see that as driven by a combination of passion for the issue (coming from his knowledge of the science), and perhaps some naivety about the world outside of climate science.

  4. Hi Stuart,

    on Hansen - I 100% agree, I just wanted to show that some (how many?) people regard him as a doomer (unfortunately, also very intelligent people).

    Now that Steve Schneider has passed away, Hansen seems to be alone in the unfair climate fight. There are lots of people who hate Hansen, for reasons I have never discovered.

    On technology/energy - OK, this seems to be (since technology is there) a self-feeding process - i.e. more technology brings more energy and vice versa. This can go on until there is no more (known and usable) of that energy. Either one finds an alternative to the energy/technology, or we will collapse, in my opinion.

    Economists usually suggest endless substitutability and thus endless EG. I think diminishing returns are often neglected...

    Alex

  5. In one of its versions, Moore's law came undone rather abruptly with the introduction of multi-core CPUs.

    Where's my 10GHz PC? It should be here by now.

    Back in the day you could double the performance of a number crunching app simply by buying a new machine. Now you have to rewrite in order to leverage the parallelism. That's a real difference since some things can't be made parallel. And there is effort involved (which is bad :-D).

    If they had wanted to, Intel could have introduced parallel processing at the retail level in the early 90's. But there was no need to. Now, it's a sign that things are bogging down and hitting physical limits. If not, we'd have 12-core 10GHz processors for our Windows 7.

    Reminds me a bit of the redefinition of oil as all-liquids. As soon as that happened, the game was up.

  6. Benno:

    It's quite true that the GHz pretty much hit the wall, and we started getting more cores instead. You can have a 12-core processor, however - AMD has started shipping them, albeit not at 10GHz.

    So the effect is that those of us who design algorithms/software (which is what I do for a living - high throughput statistical analysis systems), have to be smarter than we used to have to be. So it's another torque on the Gini coefficient - if you can do lots of parallelism, you are worth more, and if you can't, less.

    It's also driving the trend for lots of virtualization - make all those cores appear to be a large number of separate machines to run legacy single-threaded apps.
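
    A minimal sketch of the difference (illustrative Python, nothing like a real high-throughput system): the serial version needed no changes as clock speeds rose, while the parallel version only works because these particular work items happen to be independent.

    ```python
    from multiprocessing import Pool

    def crunch(n):
        # Stand-in for a CPU-bound work item.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        jobs = [200_000] * 32

        # Serial: one core does everything, no rewrite required.
        serial = [crunch(n) for n in jobs]

        # Parallel: the same work spread across all available cores --
        # only possible because the items don't depend on each other.
        with Pool() as pool:
            parallel = pool.map(crunch, jobs)

        assert serial == parallel
    ```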

  7. The issue in computers is that if you don't spend on R&D, you don't save money - you go under because your competitors beat you and control the market. The pace of change is so fast and demanding that Intel or AMD would probably cut almost anywhere before they cut research.

    Actually, this is true to an extent about many fields - for example, automotive technology continues to advance despite a rough economy (in fact, perhaps because of it). Look at GM - they went bankrupt because they refused to develop technology (for example, the EV1, which could easily have been supported as a side project for a decade on SUV profits) while Toyota brought out the Prius and prospered.

    The more I think of it, the more it seems to me that economic slowdowns have very little effect on technology... The 1930s saw tremendous technological development despite a depression. Perhaps the causes of technological stagnation and reverses are not well understood.

  8. Adamatari:

    Yeah, I sort of agree. A few years ago, I looked at the patent data, and it turns out there are clear drops in patent filing due to major economic setbacks. However, I suspect that inventors and their managers are quite good at figuring out which innovations really matter, and the patents that get dropped are the crappy portfolio-filler ones, and the stuff that's really important for technical/economic progress still happens (and still gets patented).

  9. Big Taiwanese chip firm says Moore's Law has 10 years left & lays out the economics that could kill it sooner.

    Link

  10. "It's quite true that the GHz pretty much hit the wall, and we started getting more cores instead."

    These days, the dominant factor limiting clock rate is heat dissipation. Some of the high-end Intel processors have been run at up to 7 GHz -- using liquid nitrogen to cool them. And all of the newer parts include features that allow the OS to reduce the clock rate for different parts of the chip in order to reduce power consumption and heat build-up. Some OSs are pretty aggressive about using those features whenever load on the CPU drops off, even for a few seconds. With a watt-meter between my box and the wall outlet, I can watch the power draw increase while a page loads, then drop significantly while I'm doing nothing except read it.

  11. Michael:

    Yes! In fact one of the interesting things I learned from this exercise is that the latest mobile processors have features to allow the OS to briefly run the processor above the rated speed in order to handle a burst of computation demand, then allow it to fall back so the device doesn't overheat.

    The other thing that is quite interesting to me about this trend is that it means it's now meaningful to talk about the carbon emissions of a piece of software. If a software designer or programmer improves the algorithms in a widely deployed piece of code such that it uses less CPU, that will now translate into less energy consumed and less carbon emitted. Until the last few years, the software couldn't have much effect on the energy consumption of the hardware.

    It would be interesting to attempt estimates of the total carbon emissions of Microsoft Excel, or whether Linux or Windows has a lower emissions profile, etc.
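
    A back-of-the-envelope sketch of that kind of estimate (every number below is a made-up assumption, purely to show the shape of the arithmetic):

    ```python
    # All of these figures are assumptions for illustration, not measurements.
    machines = 10_000_000          # installed base running the software
    cpu_seconds_saved = 5.0        # per machine per day, from a better algorithm
    watts_per_core = 20.0          # assumed marginal CPU power draw under load
    kg_co2_per_kwh = 0.5           # assumed grid carbon intensity

    kwh_per_day = machines * cpu_seconds_saved * watts_per_core / 3_600_000
    tonnes_co2_per_year = kwh_per_day * 365 * kg_co2_per_kwh / 1000

    print(f"~{kwh_per_day:,.0f} kWh/day, ~{tonnes_co2_per_year:,.0f} t CO2/year")
    ```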

  12. The speed of light in a vacuum is about 300,000,000 meters/second, and electromagnetic waves or electrons cannot travel faster than that through any medium. Thus information cannot travel faster than that either. Moore's law is butting up against this limitation.

    Current technology is in the micrometer range, so the theoretical upper limit on clock frequency would be around 300,000,000 meters/second divided by 0.000001 meters, or 300,000,000,000,000 Hz (3 × 10^14 Hz). Actual physical dimensions of most chip components are in the range of tens, hundreds, or even thousands of microns, which pushes down the practical limit on clock frequency. Capacitive coupling between adjacent circuits imposes another limit on clock frequency.

    With this in mind, I believe that Moore's law has just about run its course and needs to be replaced by another law. Anyone up to the task?

    Of course nanotechnology (three more orders of size reduction in each direction) might allow the industry to hang onto Moore's law a few more decades. But the traditional mask-based chip fabrication processes will have to change. What will it be, swarms of intelligent nanobots depositing arrays of atoms or molecules to construct dense three-dimensional circuit components in the nano range? Will the material change? Perhaps more emphasis on organic molecules?
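
    The same arithmetic, extended down toward the nanometer scales mentioned above (signal-propagation limit only; capacitive coupling and heat would push the practical ceiling lower):

    ```python
    C = 300_000_000  # speed of light in vacuum, m/s -- upper bound on signal speed

    # Max clock rate if a signal must cross the given distance once per cycle.
    for distance_m in (1e-4, 1e-6, 1e-9):
        print(f"{distance_m:g} m -> {C / distance_m:.1e} Hz")
    ```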
