You might think it would. You might suppose that recessions, particularly the most severe recession in fifty years, would make technology companies nervous: slow down investment in research and development, cancel new product projects, and so on. Thus, you might think a recession would cause a slowing of innovation, which would in turn affect future economic growth (since technology is a key driver of economic growth). Because Moore's Law summarizes a large amount of investment across the entire semiconductor, materials science, and computer engineering space, and since it's a key enabler of all kinds of other technology developments, it's a particularly interesting place to look for this effect.
I've been wondering about this question for a while, but today was the first day I've gotten a biggish chunk of time to look into it. I poked around on Intel's website for a couple of hours but failed to find any metric that they consistently report for their microprocessors. So I ended up looking at the SPEC CPU2006 set of benchmarks. This is designed to test the performance of a whole computer system, but with a particular focus on the microprocessor (the central computation engine of a computer). According to the website for this set of benchmarks:
CPU2006 is SPEC's next-generation, industry-standardized, CPU-intensive benchmark suite, stressing a system's processor, memory subsystem and compiler. SPEC designed CPU2006 to provide a comparative measure of compute-intensive performance across the widest practical range of hardware using workloads developed from real user applications.

The SPEC corporation defines the standards and then maintains a repository of results submitted by a variety of vendors, assembling them into separate listings for each quarter since Q3 2006, when the current set of SPEC benchmarks was defined. There are a variety of possible ways to use this data for assessing improvements in computer speed. I decided to focus on two-processor servers (which are standard workhorses in data centers everywhere, and for which there is ample data throughout the full interval from 2006 Q3 until now). I looked at throughput (total number of test work items completed per unit of time) using both the integer and floating-point test suites (different kinds of basic numerical calculations commonly used in computer applications), for both Intel and AMD, the two main microprocessor vendors for the PC market. For each of these four categories, I picked the best performance result submitted during the quarter.
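For concreteness, here's a minimal sketch in Python (with pandas) of the kind of per-quarter aggregation I'm describing. The file name and the quarter/vendor/suite/score columns are invented for illustration; SPEC's actual published results look different.

```python
import pandas as pd

# Hypothetical export of SPEC CPU2006 throughput ("rate") submissions for
# two-processor servers. The file and column names are assumptions for
# illustration, not SPEC's actual published schema.
results = pd.read_csv("spec_cpu2006_2p_servers.csv")  # quarter, vendor, suite, score

# For each (quarter, vendor, suite) combination -- e.g. (2007Q2, Intel,
# int_rate) -- keep only the best score submitted during that quarter.
best_per_quarter = (
    results.groupby(["quarter", "vendor", "suite"])["score"]
    .max()
    .unstack(["vendor", "suite"])  # one column per (vendor, suite) metric
    .sort_index()
)
```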
You could slice this data a lot of other ways, but these four metrics are reasonable choices, and using all of them presumably helps me avoid effects that are specific to one vendor or one metric.
Here is the raw data for the four metrics (up to Q3 2010, which is incomplete and so could change if newer, faster platforms are introduced between now and the end of September). The grey box marks the duration of the Great Recession. The NBER hasn't come out with final dating for the end of the recession, but June 2009 is a reasonable guess (unless the economy contracts again now, and the NBER ends up deciding it's all one big recession, but I doubt that).
As you can see, the fastest two-processor servers have been improving by leaps and bounds throughout the period, and there is no particular slowing of improvement during the recession or immediately after it (you might expect a lag, since it takes 18 to 24 months to design a new chip and bring it to market).
The four metrics have improved by factors of 8.2 to 10.25, depending on your pick, over these four years. The average doubling time is between 13.5 and 15.2 months, depending on the metric. Those doubling times compare very favorably with the traditional definition of Moore's Law (a doubling of the number of transistors every 18 months).
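If you want to redo that arithmetic: when performance improves by a factor F over M months, the doubling time is M · log 2 / log F. A quick sketch, with the 48-month span (2006 Q3 through 2010 Q3) as my assumed interval:

```python
import math

def doubling_time_months(improvement_factor: float, span_months: float) -> float:
    """Months per doubling, solving 2**(span / T) = factor for T."""
    return span_months * math.log(2) / math.log(improvement_factor)

# Rough check against the factors above. The 48-month span is my assumption;
# the exact doubling times depend on which quarters you treat as endpoints.
for factor in (8.2, 10.25):
    print(factor, round(doubling_time_months(factor, span_months=48), 1))
```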
Here's another way of looking at the same data. Taking 2006 Q3 as a baseline, I plot the log, base 2, of the ratio of each quarter's result to the baseline quarter's result. Thus each unit of 1.0 on the y-axis corresponds to another doubling, while the x-axis counts months.
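In code, that transformation is just a log-base-2 normalization against the baseline quarter. A minimal sketch, using a made-up stand-in for the best_per_quarter table from the earlier snippet:

```python
import numpy as np
import pandas as pd

# Made-up stand-in for the best_per_quarter table from the earlier sketch:
# best score per quarter for each metric (the values here are invented).
best_per_quarter = pd.DataFrame(
    {"int_rate": [100.0, 140.0, 210.0], "fp_rate": [90.0, 130.0, 200.0]},
    index=["2006Q3", "2006Q4", "2007Q1"],
)

# Doublings relative to the 2006 Q3 baseline: log2(score / baseline score).
# Each unit of 1.0 is one more doubling, matching the y-axis described above.
doublings = np.log2(best_per_quarter / best_per_quarter.iloc[0])
```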
You can see that progress comes in spurts as new generations of chips are introduced, with plateaus of a few quarters in between. You can also see that AMD fell badly behind Intel and then played catch-up in a big leap at the beginning of 2010. But I just don't think the data provide any support for the idea that the recession caused any interruption in Moore's Law.
Presumably, there is some level of economic distress that would interrupt Moore's Law, but it would have to be significantly worse than what we've been through.