It's important to realize that whereas Moore's law was originally an empirical description of the semiconductor industry's behavior, it has long since become a consciously planned effort. The entire semiconductor industry coordinates its efforts to move to ever smaller "feature sizes" (the size of the individual logic elements in a chip). The coordination is accomplished via the International Technology Roadmap for Semiconductors, in which international teams and working groups collaborate to lay out a fifteen-year roadmap for the industry, revised or updated every year.
The most recent roadmap is the 2009 version, and I spent a little time with it this morning. I would guess that most scientists or technologists from outside the semiconductor industry would have a reaction similar to mine: this thing is the product of some alien civilization that has figured out how to make central planning work. The idea of all the major competing players in an industry collaborating to map out the R&D activities required to accomplish the next 15 years of progress is just extraordinary. But there it is. It's full of detailed schedules for when the industry will introduce different feature sizes, and the challenges it has to overcome to do so.
Here's the key table from the executive summary:
The general idea is that feature sizes, measured in nanometers, keep shrinking. Take the "Printed Gate Length", which goes from 47nm to 8nm over 15 years: since the number of elements per unit area scales as the square of the shrink factor, a given unit of area will hold (47/8)² ≈ 35 times as many logic elements. Wafer size is also planned to increase from the current 300mm diameter to 450mm (the wafer is the disc of silicon from which chips are cut, the basic unit of manufacture in fabrication facilities), contributing another factor of (450/300)² = 2.25. That gives an overall increase in logic per wafer of about a factor of 75 over 15 years, which corresponds to a doubling time of about two and a half years. So the industry is apparently slowing down a little bit, but it still has every intention of keeping going for a long time.
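Here's that arithmetic as a quick back-of-envelope sketch in Python (the inputs are just the figures quoted above, as I read them off the roadmap's table; nothing here is computed from the roadmap itself):

```python
import math

# Figures read off the roadmap's executive-summary table:
gate_now, gate_then = 47, 8        # printed gate length, nm
wafer_now, wafer_then = 300, 450   # wafer diameter, mm
years = 15                         # roadmap horizon

density_gain = (gate_now / gate_then) ** 2    # ~35x logic elements per unit area
wafer_gain = (wafer_then / wafer_now) ** 2    # ~2.25x silicon area per wafer
total_gain = density_gain * wafer_gain        # ~75x logic per wafer

doubling_time = years / math.log2(total_gain) # ~2.4 years per doubling
print(f"{density_gain:.0f}x density, {total_gain:.0f}x total, "
      f"doubling every {doubling_time:.1f} years")
```

Note that this is logic per wafer, i.e. per unit of manufacturing throughput, not logic per chip.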
I don't remotely have the competence to assess the scale of the technical challenges the industry faces in accomplishing this. Still, when a large, extremely competent, and well-funded industry has a track record of improving performance at a steady rate over forty years, and has extremely detailed plans for how to keep doing the same thing for the next 15 years, I think one would be a little foolish to bet against them.
So, to put it in round numbers, over the next two decades there is likely to be roughly a hundredfold increase in the amount of available computing capacity. Anything that can be done with computation will therefore get significantly better, cheaper, and lighter. Robots and machines will become faster and more skillful. Mobile phones/computers will be lighter and have better screens. Cameras will be smaller and much more pervasive, and the software for processing their images will be much more powerful. Games and movies will have much better and more realistic animation. Video-conferencing will be much higher quality.
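As a sanity check on that round number (my own back-of-envelope arithmetic, not a figure from the roadmap), a hundredfold increase in twenty years implies a doubling time of about three years:

```python
import math

# What doubling time does "a hundredfold in two decades" imply?
factor, years = 100, 20
implied_doubling = years / math.log2(factor)
print(f"implied doubling time: {implied_doubling:.1f} years")  # ~3.0 years
```

That's a little slower than the roughly 2.4-year doubling implied by the roadmap figures above, so if the industry delivers on its plans, the hundredfold estimate is, if anything, conservative.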
In short, as the physical environment continues to slowly degrade, the virtual environment is likely to get much better. I imagine that will lead people to spend more and more time in the virtual environment.