In this post, I would like to argue for the above ordering of problems. I mean the '>' symbol in two senses: "A > B" meaning both "The main impact of A will fall later in time than the main impact of B", and also "A is a more serious and fundamental threat to humanity than B". While a full explication of the arguments would occupy a number of books, today you are going to have to make do with a single measly blog post, albeit longer than usual.
Peak Oil > Financial Crisis
I think it's indisputable that we are in the middle of the financial crisis right now. We had the main private sector crisis in 2008; now we are finding out which sovereign governments are going to end up needing to default, and perhaps that will induce a second wave of private sector defaults. By contrast, I think the arguments for a very near-term peak in oil supply have started to look quite weak. For one thing, work on the expansion of Iraqi oil supply continues apace; and even if Iraqi production couldn't expand much, it's likely the 2008 peak of oil supply will be exceeded anyway.
However, I've never been able to get too terribly excited about the financial crisis as a massive long-term threat to humanity. At the end of the day, debt is not a physical quantity. The fact that humanity, collectively, has written too many debt instruments means that we have been too optimistic about the future, and created more promises than can actually be serviced. However, that doesn't create any fundamental physical constraint on our activities: it just means that the excessive promises need to be renegotiated to be more in line with our actual future capabilities. This process will be painful and difficult in the short term, but I can't see how it poses any fundamental difficulty to the continued operation of civilization. There have been financial crises and sovereign debt defaults for many centuries, and we have survived them: we will very likely survive this one too.
By contrast, peak oil, when it does come, represents a significant physical constraint, and will require a large scale transformation of a number of important infrastructure elements, one way or another, over the course of a few decades. It's also an unprecedented situation - nobody has written papers about the many previous oil peaks over the centuries. Since infrastructures like transportation are critical to the operation of society, if we were to blow the handling of peak oil, it could be quite dangerous.
Thus I argue that peak oil will probably come later than the financial crisis, but is ultimately more important and threatening.
Climate Change > Peak Oil
At the same time, I don't believe peak oil by itself is really a mortal threat, unless we seriously mishandle it. Fundamentally, our society is so wealthy and creates such a large economic surplus relative to our actual bodily needs that we have tremendous amounts of waste and potential for conservation. We can perfectly well get to work, or the grocery store, on scooters instead of in SUVs if we have to, or electric hybrid bicycles if it comes to that. We can take fewer foreign vacations, and we can video-conference instead of flying off on business trips. We may not want to, we may bitch and moan, but it's certainly possible to do these things rather than experience the end of the world. Fundamentally, what the post-peak period requires is that we get a few percent more efficient in our oil usage each year. We've already proven our ability to do that in the late seventies and early eighties.
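To put a number on "a few percent more efficient each year", here is a minimal sketch of how modest annual efficiency gains compound. The rates and horizons are round-number assumptions of mine, purely for illustration:

```python
# Minimal sketch: how "a few percent more efficient each year" compounds.
# The rates and horizons below are illustrative assumptions, not data.
for rate in (0.02, 0.03, 0.04):          # assumed annual efficiency gain
    for years in (10, 20, 30):
        remaining = (1 - rate) ** years   # fraction of today's oil use left
        print(f"{rate:.0%}/yr for {years} yrs -> {remaining:.0%} of today's usage")
```

At 3% a year, for example, oil use per unit of activity halves in roughly 23 years - which is the scale of adjustment the post-peak decades would demand.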
So the real threat of peak oil is that we'll refuse to accept what needs to be done, and instead start fighting resource wars and the like (see Cheney, Dick, Iraq 2003). Converting ever larger fractions of our food supply to fuel is another spectacularly dumb approach, and one we can choose not to take. The issue is thus more social/political/cultural than it is technological.
However, the great thing about peak oil, especially compared to climate change, is that when it occurs (and even in the run-up, as we saw in 2005-2007) it will call attention to itself in an impossible-to-ignore way. When you have not quite enough oil, the price goes through the roof, and everyone immediately realizes that there is a major not-quite-enough-oil problem, and begins thinking in terms of smaller cars, hybrid cars, skipping the family vacation this year, etc. The price feedback is immediate and relentless until enough conservation has been achieved for however much oil is available in a particular year. And the same thing will be true all the way down - if at some point we need to transition from gas scooters to electric scooters, there will be $40/gallon gas pointing out to us the immediate urgency of that step.
By contrast, climate change lacks the immediate personal feedback. We all pump the greenhouse gases up into the air, and there is no immediate consequence to us - no equivalent to paying a high price at the fuel pump. Instead, the gases gradually accumulate and the climate slowly warms, and the oceans slowly warm, and over the decades, the floods, the deadly heatwaves, the hurricanes, the forest fires, the disease outbreaks, the coastal cliff collapses, the species extinctions and bizarre ecological catastrophes all get more and more common and more and more serious. But because both the climate system and the major infrastructure in the economy have decades and decades of lag built in, by the time the problem is so incontrovertible that everyone can agree on its true urgency, it will be very late, and we will already be locked into an enormous amount of damage. Thus ultimately, I think climate change poses the more serious threat because it doesn't map so naturally to the way human incentive structures work.
And by the same token, I think the bulk of the climate change problem will occur later than peak oil. I think it's pretty much inconceivable that peak oil will occur later than around 2030, but climate change will just really be getting going then, and will gradually turn into a more and more furious rolling catastrophe, with the worst impacts in the second half of this century, and getting more and more severe until we finally get the situation under control (or succumb to it).
Singularity > Climate Change
But still, as enormous as the transformation required by climate change is, I don't think it's the biggest monster under the bed. True, most of our industrial society has been powered by fossil fuels for a couple of centuries, and it's all we've ever known as individuals. But perfectly good high-net-energy alternatives exist in modern wind, solar, and nuclear power, and there's no in-principle reason we can't transition to them, provided we make a serious all-out effort.
And the good thing is, although it's the largest such transition we've ever attempted, it's fundamentally a kind of challenge that we in Western Civilization recognize. We've created a problem, and the solution requires technological innovation. We need better renewables, smarter grids, more efficient electric cars. This stuff is fun for techies to think about, and Silicon Valley is already on the job, with massive investment in clean-tech in the last five years. A defining feature of western civilization is that we value innovation highly. The patent was invented and became widespread in medieval Europe; with one minor exception for the Greek city-state of Sybaris, there is no evidence of earlier civilizations using patents. Braudel, in comparing medieval European civilization to other world civilizations of the time, points out that Europe was the only place with a clear concept of fashion. Other civilizations simply didn't have the concept that it was valuable to dress or decorate in a new and different way than your forebears. So while other civilizations have certainly created many innovations, none has institutionalized and glorified the innovation process in the way that ours has. And to this day, across the contemporary political spectrum, innovation is valued and seen as the way to solve all our problems.
However, the problem that we are gradually innovating more and more of our fellow citizens out of jobs, and thus out of any meaningful stake in our society, is not one that obviously lends itself to a solution via innovation. In that case, innovation looks to me like the problem, not the solution.
I have not yet talked much about the singularity on this blog, so let me briefly recap the main ideas of the techno-optimist computer scientists who have mostly thought about this, and then give my own take.
The basic idea was first formulated by computer scientist and science fiction author Vernor Vinge, but in recent years the most visible exponent has been computer scientist and inventor Ray Kurzweil. I was first introduced to these ideas via the latter's book The Singularity is Near, which I take to be still a reasonably current exposition of Kurzweil's thinking.
The argument goes like this:
- Throughout pre-history and history, the rate of evolution has accelerated. Physical evolution invents new forms slowly; once humanity came on the scene, prehistoric cultural evolution proceeded faster; with writing, faster still; and now a globally integrated, Internet-connected society innovates faster than ever. The rate of change is now breathtaking.
- In particular, computers are getting faster and more capable. Physical computation speed is increasing exponentially according to Moore's law, under which (roughly stated) the amount of computation that can be done by one chip doubles every two years. There are no major physical limitations that will prevent Moore's law continuing decades into the future.
- Computers are now slowly getting smarter and smarter as we solve particular problems one at a time. Eg, the voice recognition problem is slowly coming under control, face recognition is starting to be a commercial proposition, driving a vehicle is now at the pretty-decent-research-prototype stage, etc.
- By the mid 2030s, the computers produced each year will exceed the information processing capacity of the entire human population, and by the mid 2040s they will exceed it by a factor of a billion. (A back-of-envelope sketch of this kind of extrapolation appears just after this list.)
- With ever-increasing progress in neuroscience, we will be able by then to reverse-engineer the human brain, and use computers to create general human-quality intelligence, except that it will run faster, and keep getting faster, than the basic version that biology has managed to produce to date.
- Once machine intelligence surpasses human intelligence by a sufficient margin, it will accelerate the rate of intellectual and technical progress far beyond anything seen at present, and the acceleration itself will keep increasing. The name "singularity" comes from a comparison with a black hole, which has an event horizon beyond which one cannot see: once ever-accelerating machine intelligence takes over, things will happen so fast and furiously that we today have no idea what will happen and cannot predict the consequences.
- Nonetheless, you shouldn't worry about any of this, because we will be able to use all this amazing technology to augment and transform human consciousness. We will be able to reverse engineer how we work, modify and improve on it, download ourselves into new/better/synthetic bodies, cure world hunger, etc, etc, etc... So life will be better overall.
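As a concrete illustration of the fourth bullet, here is a minimal back-of-envelope sketch of this kind of extrapolation. Every constant in it - the doubling time, the assumed ops/sec of a human brain, the baseline compute - is a rough assumption of mine for illustration, not Kurzweil's published figure:

```python
import math

# All parameters are rough assumptions for illustration, not Kurzweil's numbers.
DOUBLING_YEARS = 2.0               # Moore's-law doubling time (roughly stated)
BRAIN_OPS = 1e16                   # assumed ops/sec equivalent of one human brain
POPULATION = 7e9                   # approximate world population
BASE_YEAR, BASE_OPS = 2005, 1e17   # assumed total compute produced in base year

target = BRAIN_OPS * POPULATION    # ops/sec of the entire human population

# Number of doublings, and hence years, until annual production crosses target.
years = DOUBLING_YEARS * math.log2(target / BASE_OPS)
print(f"crossover around {BASE_YEAR + years:.0f}")   # ~2064 on these assumptions
```

Note how sensitive the date is to the inputs: on these particular assumptions the crossover lands decades later than Kurzweil's mid-2030s, and halving the doubling time pulls it roughly thirty years earlier. That sensitivity is part of why I treat the timing claims with caution below.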
Obviously, in compressing an entire long book into seven bullet points, I am leaving out a variety of nuances and supporting lines of evidence, which readers are welcome to explore on their own, or we can discuss further on this blog in future. My overall reaction to the book was to first feel incredibly depressed for a few weeks, and then spend a number of months in research gradually sorting out which parts I believed and which I didn't. Herewith my take:
- (Accelerating technical progress). I think, with some caveats, this is broadly qualitatively true. Our society does innovate faster than, say, the Ubaid proto-civilization, and in turn, the Ubaid were undoubtedly innovating faster than Homo erectus. However, you can, and Kurzweil does, overstate the issue. Probably the best metric of the overall rate of technical progress in the industrial era is growth in gross world product, and that has been a few percent a year throughout that time - it is not dramatically accelerating recently, though industrial growth is faster than pre-industrial growth.
- (Moore's Law). The historical rate of progress in computer speed is incontrovertible and general technical consensus is that there is no near term barrier to it continuing (eg see this book for a summary of the issues). I don't see a reasonable basis for questioning this, though of course future events are always somewhat uncertain.
- (Computers getting smarter). This is incontrovertibly true. While the speed with which this would happen was overstated in the early years of AI, it is the case that with each passing decade more and more things traditionally requiring human expertise can now be done by computers (chess, voice recognition, driving, etc).
- (Available computer power will exceed human information processing capability in 25 years or so). As a statement of raw computing power, this seems plausible, though of course the further we try to extrapolate Moore's law, the less certain the result.
- (Ability to reverse engineer human intelligence and produce a synthetic equivalent also within about 25 years). Here I think Kurzweil gets onto thinner and thinner ice. At the moment, we computer scientists have no real idea how to produce anything like the generality and flexibility of human reasoning, so it's going to take major conceptual breakthroughs before we are able to do so. The time estimates here are therefore much less certain than he indicates - raw computational power alone doesn't make for a general, flexible intelligence, and fundamental breakthroughs are hard or impossible to predict in advance. At the same time, it does seem reasonable to project that the present general direction will continue, with more and more tasks becoming automatable; and as neuroscience, cognitive science, and computer science all advance, it seems likely there will be major progress in these areas. However, I think the timing is much harder to project than he allows.
- (Further acceleration and singularity). The further out we go, the more speculative it all gets. However, I don't dispute the general proposition that machine intelligence comparable to or better than human intelligence will be a complete game-changer and things will never be the same again and we can't predict what the world would look like afterwards. That much seems elementary.
- (This will be good for the average human). Here I found Kurzweil's reasoning very fantastical... In particular, I think he seriously neglects the fact that medical progress is far slower than computer progress and there are excellent unavoidable reasons for this: new medical technologies require massive clinical trials to establish their safety and efficacy, while new generations of chips and computers don't. Therefore, progress on silicon intelligence will continue to be much faster than progress on biological intelligence. This might not be so good for many of the biological types...
The choice of "singularity" as the metaphor for all this is interesting, I think. Falling into a black hole would not normally be considered a pleasant or desirable prospect, yet this was the metaphor picked by people who think this will be a good thing.
I think the key issue to analyzing the goodness, or otherwise, of these trends is to look at why/how technologies get adopted. I've spent the last decade working in technology startups building and selling new computer technologies, so I think I have a decent understanding of this. There are basically two main selling points to any new technology:
1) This is whizz-bang cool and will enhance your social status to own.
and then, either:
2a) This will enhance your productivity and allow you to do more/better of whatever it is you do in less time (for a utilitarian or business proposition).
2b) This will be fun to do (for a consumer/entertainment proposition)
Usually point 1), plus either 2a) or 2b), is required to close the sale. Technologies do not get adopted because they generally improve human welfare; they get adopted because they work on these points for specific sets of customers. Thus corporate voice recognition phone systems did not come around because all our lives are improved by them, but because they allow companies to handle more customers with fewer staff, thereby providing a return on investment to the project. If in the future we get automated vehicle driving systems, the reasons will have nothing to do with overall human welfare, but with the specific profitability equations of trucking firms or taxi firms.
Likewise, new consumer technologies do not get deployed to the benefit of poor people, they get deployed to the benefit of people who can and will pay for them (aka customers).
And this will continue to be the case in the future.
So in my view, the dominant social effect of all these ever-accelerating-integrated-globalized-world-stuff is to create a divide: if you are sufficiently smart/skilled/creative to stay employed and take advantage of all the new toys, life gets better and more fun and you feel more powerful and enabled. But if you are not sufficiently smart/skilled/creative (or you just decide the game is fucked), then you are out of a job and now you can't afford most of the fun stuff and not only that, you live in the midst of a society that you know doesn't have any use for you and doesn't value you.
And unfortunately, we have this: [chart: US employment/population ratio for men 25-54, trending down for decades] and this: [second chart to the same effect]
So the fraction of people below the divide is growing steadily. Slowly, but steadily. In fact, if you look at the trend, it's pretty close to linear. Here is the employment/population ratio for men 25-54 again, with a linear and quadratic fit. They barely differ: [chart: the ratio with linear and quadratic fits overlaid]
Given the way technology is adopted by businesses, one of the effects of a singularity that I would expect is that the employment/population ratio would fall to zero. Machine intelligence, if equally good, will always displace human intelligence from the workplace because it doesn't insist on going home at night, doesn't have any human rights, and can be replicated in arbitrary quantities (meaning it will cost much less) and be customized to the task at hand. If we extend the employment/population ratio out to 2100, and extrapolate the trends, we get this: [chart: the fits extrapolated to 2100]
The orange circle shows Kurzweil's approximate singularity date with my contention that it will be signalled economically by an employment/population of zero. I don't find it plausible that that curve is going to bend over that far, but maybe. I wouldn't necessarily have a great deal of confidence in a linear or quadratic extension either - it's hard to believe that another few decades of Moore's law won't have some pretty massive effects on the economy, and on employment in particular.
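For readers who want to play with this extrapolation themselves, here is a minimal sketch of the fitting. The data points are illustrative stand-ins that roughly resemble the shape of the men-25-54 series, not the actual BLS numbers:

```python
import numpy as np

# Illustrative stand-in points, NOT the actual BLS series:
# (year, employment/population ratio for men 25-54).
years = np.array([1960, 1970, 1980, 1990, 2000, 2009])
epop  = np.array([0.95, 0.93, 0.89, 0.88, 0.87, 0.81])

lin  = np.polyfit(years, epop, 1)    # linear trend
quad = np.polyfit(years, epop, 2)    # quadratic trend

# Extrapolate both fits out to 2100.
future = np.arange(2010, 2101, 10)
for y, l, q in zip(future, np.polyval(lin, future), np.polyval(quad, future)):
    print(f"{y}: linear {l:.3f}, quadratic {q:.3f}")

# Year at which the linear trend hits zero - my suggested economic signature
# of a singularity (employment/population = 0).
print("linear trend reaches zero around", round(-lin[1] / lin[0]))
```

On these made-up points the linear trend doesn't hit zero until well past 2100, which is consistent with my doubt that the real curve will bend over that far by Kurzweil's dates.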
My best guess is that a singularity of some kind is probably coming, but slower than folks like Kurzweil think. However, just as the gravitational tidal forces from a black hole tear you apart some time before you reach the singularity, I think the tidal forces of increasing automation are already at work and will become gradually more stressful over the coming century.
Finally, I note that there are complex interplays between these various threats.
For example, consider the effects of peak oil in the context of increasing automation. Reversalist thinkers like the Archdruid, or Jim Kunstler, believe that one effect of peak oil will be to cause replacement of machines by human labor again (on the theory that with less energy, we'll need to use more human labor, just as we did back when we had less energy before). However, this certainly isn't evident in the data to date, and it's not at all obvious that it follows. Consider, for example, the decision whether to lay off the security guards and replace them with security robots. This is a decision that companies will take individually based on the economics and quality of the security robots, but presumably over time these will improve, and there will be more robots and fewer humans doing physical security, and the employment-population ratio will drift a little lower.
However, consider the implications for oil demand. An employed security guard will be driving to work in his (or her!) pickup truck, and if you pay him enough, he may even fly off for a yearly vacation, or to his daughter's wedding on the other side of the country. If he's not employed, he'll probably use far less oil in those ways, and the robot that replaces him will run on a little electricity and no oil.
So it's not at all clear that less oil implies less automation. Similarly, addressing climate change by switching to renewable power doesn't prevent continued automation either, because machines can be fundamentally more efficient than humans at converting sunlight to work.
And so I think that, in the longer term, the threat to most of the nine or ten billion people there will soon be is that the economy simply won't need us in order to operate. And if we address the financial crisis, solve peak oil, and fix climate change with ever increasing amounts of cleverness and innovation, we will probably simply hasten that day.
Thanks for a very interesting post. The one bit I would critique is the notion of declining employment. There is far more to our economy than manufacturing. We have artists, musicians, nail salons, professors. In short, we like to provide each other with all kinds of services that will constitute a future economy.
Right now, our manufactured items could be made by machine just as well as being made in China .. no discernible difference to us. More swathes of our needs will be provided more efficiently, but when have we ever run out of needs? Never.
Also, the word singularity might have a plain mathematical meaning of discontinuous asymptote or uncalculatable point, etc, rather than referring to black holes necessarily.
Your ordering of threats makes sense, but I would personally rate climate change over the singularity, since climate change is a sure thing and a very bad thing under BAU, (extinction is also a forever thing), while the singularity is both uncertain in its occurrence, and may have very positive net impact whenever it, or something like it, occurs.
Burke:
At least in this post I didn't refer to manufacturing employment. All the stats are just overall (US) employment/population ratios for men, which have been gradually going down for decades.
Burk: (sorry about the 'e' earlier!)
Also, what new services in particular do you see yourself requiring in the future from people who weren't very successful in the education system? How much do you anticipate being willing to pay on an hourly basis for folks like that?
I could imagine demanding musical services, or bad movie services (youtube), or dog-walking services, escort services, etc. Remember that we will have money to spend that doesn't need to be spent on necessities any more.. those are all freely available.
Money is still valuable, but as claims on services of our fellow humans, and perhaps dispensed in a socialistic scheme of some kind. Just as food is pretty freely available today to the hungry in the US, other necessities will become minimal for even poverty-level people as time goes on. Employment will doubtless go down, but my point is that it will never be zero, since we always find ways to serve each other.
As far as education is concerned, it will also transform, from the regimentation currently still modeled on building obedient factory workers, to something where we simply try to bring out the individual desires and talents of each person, whether or not they are "productive" in the ancient sense. Education will be life-long and one of the many services we give each other even as machines are capable of doing virtually everything better than we can, even learning and teaching.
When we implant and become machines .. that is another matter entirely. It is difficult to predict what will become of us then. Perhaps a rump class of organic humans will be housed in zoos, as the Amish are now, sort of.
Your employment stats are too optimistic. Take into account the mass incarceration boom of the past twenty years and unemployment looks a whole lot worse.
It is worthwhile to ask whether incarceration is being used as a tool of social control to minimize the disruption of having increasingly large numbers of people underemployed, especially those -poor, dark- people who have historically been marginalized.
And whether, in turn, having such a large number of people incarcerated then makes them less likely to ever enter the labor force in the future.
I've never understood why singularity != Terminator. Why wouldn't we use robots on the battlefield? It's so much more efficient to have an expendable and armored machine doing the killing. Isn't that a big part of what the UAVs are all about? If those robots become sufficiently intelligent, why take their orders from us? Why would conscious, intelligent machines want to serve us?
There are plenty of potential stressors out there, but I've thought for a long time that if they don't lead to war, they aren't a real threat to civilization. In the short and medium terms, the only real threat to the safety of our societies is our fellow humans.
I think that's mostly true even for climate change. If a history is written of what ended modern civilization at the end of this century, the big story will be the wars, with the background the increasing drought problems, crop failures, and mass migrations.
So the question for me is, what combination of stresses would push significant parts of the world population into war? Financial stresses, and related resource restrictions, were sufficient for WW2. It doesn't take too much more than a stressed population, some problem they can blame on someone else, and someone to excite them to arms. A few years of high double-digit underemployment and a decline in oil exports and we're most of the way there.
Part of our problem in the U.S. is our work ethic. From the European perspective, if there was so much excess labor in the system, then let's have a shorter work week and even more vacation - time off for personal events, etc. We'd do well to practice living with lower official "productivity" and instead enjoy more culture. Certainly figuring out how to take more time off and share the wealth from the production of our robots should be easier than figuring out climate change.
ReplyDeleteGreat post. Only 2 ideas to add:
1. Did you take the 2nd and 3rd world into account in your ideas/writing?
2. If you look at all the problems in isolation, your ranking seems OK to me. But I think we should not forget about the power of complex systems.
Erin:
I look at the employment/population ratio, rather than unemployment, for just this kind of reason - I think there are a lot of folks who aren't officially counted as unemployed because they dropped out, or are operating in some criminal subculture, or are incarcerated etc, and so they don't count as actively looking for work. But nonetheless, the ever expanding numbers of such people strike me as a sign of a society with problems. (I'm not saying all such individuals have made an invalid choice for them, but as a societal phenomenon, it worries me).
Burk:
Money is still valuable, but as claims on services of our fellow humans, and perhaps dispensed in a socialistic scheme of some kind.
Exactly. To the extent people are not part of the means of production, if they are to have the means of sustenance from society, then it must be given to them in some form of welfare. But in our society, we have very few positive experiences with either large scale socialism, or primarily welfare funded communities. Both have tended to be social disasters. My assumption is that, because we are basically social primates, we have a need to feel that we are contributing to the tribe, and when we feel useless to society, we feel bad - our lives become meaningless - and many of us, in that situation, will start to act out in some way.
Kjm:
The best coverage of that topic that I know of is Peter Singer's book Wired for War. It's a pretty chilling read. The robotic revolution in military affairs has definitely started and is well underway.
Very interesting Stuart. Your points regarding singularity will take me a while to digest. They raise issues about what life actually is and what we humans mean by "we humans." Clearly we've been linked to technology, and in a sense are tools of our tools, for a LONG time.
Regarding peak oil, the latest EIA data (with the customary retroactive revisions) show that 2005 had higher production of crude oil and lease condensate than 2008.
I think the breakdown in complex systems will proceed rapidly enough that I'm not particularly worried about a singularity affecting us.
I think you really put climate change/ecological collapse way farther out than the evidence suggests (to me). I phrase it that way because I think many of the things we see happening right now in our ecosystems are not necessarily solely, or even mostly, due to climate change, but have a climate component.
But consider the following symptoms of ecological breakdown already occurring:
1) accelerating species loss.
2) rapidly increasing numbers of dead zones in the oceans.
3) acidification of those same oceans.
4) widespread loss of corals.
5) the death of vast areas of Rocky Mountain forests.
6) the die-off of honeybee colonies.
7) loss of ecosystems due to introduced species.
8) die-off of amphibians worldwide.
9) continued rapid deforestation.
10) melting of Tibetan glaciers.
11) accelerating instances of droughts and floods worldwide.
This short list is just off the top of my head, but I think the collapse of ecosystems is farther along than you suggest, and I think there is a risk that the economic effects of these will lead to contraction/collapse/reordering of societies MUCH sooner than post 2030-50.
There have been some good pieces on the singularity at Slashdot. A consensus opinion is that Kurzweil is much too optimistic for his own good, that he overstates his case quite dramatically, sort of a Matt Savinar for nerds. Well, MS himself no doubt has some dorky followers too...
Kurz made quite concrete bold predictions in 2005, too:
The 2010 Scenario. Computers arriving at the beginning of the next decade will become essentially invisible: woven into our clothing, embedded in our furniture and environment. They will tap into the worldwide mesh
[snip]
We'll have very high-bandwidth, wireless communication to the Internet at all times. Displays will be built into our eyeglasses and contact lenses and images projected directly onto our retinas.
[snip]
By focusing multiple sets of beams on a wall or other surface, a new kind of personalized surround sound without speakers is also possible. These resources will provide high-resolution, full-immersion visual-auditory virtual reality at any time.
[snip]
We'll have real-time translation of foreign languages, essentially subtitles on the world, and access to many forms of online information integrated into our daily activities. Virtual personalities that overlay the real world will help us with information retrieval and our chores and transactions.
Etc. Glad to see you point out that so much of this tech is driven by whiz bang impulses, i.e., gaming. Many of the Slashdotters referred to much more nuanced commentators on this than Ray, forget the names.
Now, what will drive the development of tech if we're cutting back drastically on PC purchases, or are ironically out of a job in the first place because of that advancing tech? Or if we're forced to resort to carpooling to get to work because of fuel shortages, and our partner isn't interested in tooling over to Best Buy on our way home? We can shop online of course, but where does that leave BB? I really liked Big Gav's TOD piece about hooking up carpoolers via cell phone, btw - that could fill some of these type gaps, assuming they are implemented successfully.
There are several issues that I have with the discussion of a singularity. As a PhD in Neuroscience, let me say that our fundamental knowledge of how cortex -> consciousness is still very primitive. Not that the science gap couldn't be closed, but powerful computers are accelerating our understanding of that area as much as they are accelerating, say facebook pages served per day.
I doubt that a computer will ever beat a human on energy/computation. With 1 calorie of food energy, our brains are able to do a staggering amount of computation. With current energy prices, a high-end server already costs more to power over its lifetime than it does to buy. Ever-increasing processing power requires ever-increasing energy use. A computer is a more "efficient" machine than an ICE in terms of the "work" done, but it is still bound by the limits of energy.
I also think that the singularity is really the odd man out in your list. It truly is still science fiction, while all the others are to some degree "here and now"
-Preston
in my above comment, that should have been "powerful computers are *NOT* accelerating our understanding of that area as much as they are accelerating, say facebook pages served per day."
KLR - yeah, I also think Kurzweil is exaggerating the speed with which things will occur.
ReplyDeleteptone:
I'll certainly defer to your neuroscience expertise! I did read 3 or 4 neuroscience texts a year or two back, as well as a bunch of stuff on the current state of neural implants. The conclusion I came to is that progress on understanding and interfacing to the brain is likely to be much slower than ongoing progress in silicon. It seems that it's inherently difficult to make a precise interface to a big jelly blob, and in addition, the need for brain surgery and clinical trials of devices seems like it will make progress way slower than the 18mo-2 year product generation turnaround of the chip industry.
I don't know about the ultimate end point in Mips/Watt, but improving it has been a big trend in recent years - primarily driven by mobile devices. Rack servers could be made far more power efficient if that was a major sales driver. There is a whole green data center movement these days, so I expect there is some pressure for improvement. However, in the specialist security appliance space, I've never heard of a customer mentioning it. For five and six figure specialized boxes, the energy cost of running them is neither here nor there.
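To make ptone's energy point concrete, here is a crude sketch of the comparison. Both sets of numbers are rough, contested order-of-magnitude estimates that I'm assuming purely for illustration:

```python
# Rough, contested order-of-magnitude estimates, assumed for illustration only.
BRAIN_WATTS  = 20.0    # often-quoted power draw of a human brain
BRAIN_OPS    = 1e15    # one common (disputed) estimate of brain ops/sec
SERVER_WATTS = 300.0   # a circa-2010 rack server
SERVER_OPS   = 1e11    # order-of-magnitude server ops/sec

brain  = BRAIN_OPS / BRAIN_WATTS      # ops per joule, brain
server = SERVER_OPS / SERVER_WATTS    # ops per joule, server

print(f"brain : {brain:.1e} ops/J")
print(f"server: {server:.1e} ops/J")
print(f"brain advantage: ~{brain / server:.0e}x")
```

On these numbers the brain has roughly a five-order-of-magnitude edge in computation per joule, which supports ptone's point - though as I say, Mips/Watt on the silicon side has been improving steadily.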
Very interesting. I'd love to see the book version of this. Question, though: Is it just a coincidence that your two uses of the > symbol are perfectly congruent? Why is each more serious challenge also further in the future? Does this say something about the nature(s) of these disparate problems, or about the way you're thinking about them, or as I mentioned, just a coincidence?
Also, it seems to me that if the first three problems are serious enough (and the efforts to deal with them are nearly all-consuming, as it seems they might get to be), then I don't see how the singularity will happen. If our societies and economies are being strained by recurrent financial/economic crises, the effects of peak oil, and the effects of climate change, and also massive and disruptive attempts to "solve," blunt, delay, mitigate, or adapt to these problems ... then from where will come the funding, the energy, the special tools, equipment and materials, the highly-educated and highly-paid workers, and the secure facilities (and the overall social stability) -- all of which will be required in order to do the work which could bring the singularity about? Seems to me that fields like artificial intelligence and informatics will get lost in the shuffle, or at least become relatively stagnant. We will have more urgent things on our plates, which will attract whatever attention, energy, talented people and funding can be spared.
Long time reader..very interesting post!
So can I ask when you do see the peak happening and do you see a severe shortage in the 2014-2016 range, like I keep reading on peak sites?
I think I would put the financial crisis 1st. If we have full-on collapse in the next 2-20 years, like some are saying, that will greatly undercut the funding needed to do any mitigation of the other problems. I believe the financial crisis will deliver the 1st and most serious blow (assuming it hasn't already). I hope I am wrong. Although, even if I am, there is x, y, and z waiting to take its place.
Feeling rather gloomy today! Maybe it's all the rain. :-)
Singularity is nothing but wishful thinking, a reflection of the ultra-utilitarian mindset that can't see anything else, and most artificial intelligence research is pure junk.
Lanie:
Welcome to comments!
Mike:
Excellent points/questions. On your first point, I think one issue where I'm probably forcing things a bit in order to be cute is on the relative timing of singularity/automation issues and climate change. I think really these are both things that are causing trouble already, and will get steadily worse over the century. So really the timing of these two is probably roughly similar in so far as we can tell, which is not very precisely.
Other than that, I think there is something a little more than coincidence to the relationship - humans are better at responding to acute crises than ones with a lot of lags built in, because we have fairly short discount horizons.
And I generally agree with your second point - if we fail to navigate the earlier challenges successfully, then we won't be worrying about the later ones. The way I think about it is more "even if we deal with the financial crisis, peak oil, and climate change, we still face an automation singularity! Yippee!"
Good, thoughtful post, with an interesting take on the singularity thing.
Your point about "separation" or "stratification" is the most salient. Technology is not a leveling force in human affairs, and when the weight of the futurist singularity falls on the 7-9 billion of us, only a relative few will benefit.
I don't think we'll get that far due to the other points you noted (peak oil is undersold somewhat in your analysis, as it directly relates to "peak food".) The coming resource wars over water, oil and shrinking strips of arable land will be effective population curbs (as will our fascist tendencies to incarcerate/terminate anyone who faintly resembles a threat to "our way of life".) We--by whom I mean "most of humanity"--won't see Kurzweil's dream; we'll just be statistics counted up by those increasingly powerful computers run by the rich people in their heavily fortified bunkers.
Very insightful, thank you.
The "economy" may continue to function on fewer humans - and continue to facilitate exchanges beneficial to the parties to such transactions. But what happens when approaching the point where there are no humans - and transactions are apparently between entities of artificial intelligence?
Will such entities have the incentive and impetus to continue such transactions on their own, and if they do, will humans amount to detritus?
Your robot example does not take into account the massive industrial and transportation infrastructure required to both build and upkeep it.
Aside from the fact that without cheap oil all of that infrastructure will be more valuable as scrap metal unless it is being used for essential services like food production, or more unfortunately, warfare, the real question is: Is it more efficient to have a robot in place, or to have guards living on site with an attendant victory garden?
When oil is at $50 a gallon I think it's pretty clear that the latter wins out by a landslide, and so it is that kind of arrangement that you should expect to see in the future.
Without fusion power, the singularity is a religious movement. How will you power this grand singularity? Even with the necessary energy it is not at all a given. With some luck fusion will remain out of reach until humanity has learned to reintegrate itself with the natural world, or else we are sunk.
What humankind needs to realize is that we are destroying the most sophisticated technology ever devised, and that is functional ecosystems. Considering the hubris of industrial culture, I'm not holding my breath.
Here, pop culture raises its ugly head in the form of the 'singularity', which is a phenomenon categorized somewhere between the 'Unabomber' and Washington Capitals' winger Alex Ovechkin.
This is written on a computer that is 100 times as fast and has 200 times as much memory as 'the old one' but is less useful, because the old one runs Photoshop and Word and this one doesn't.
You have to know Steve's First Law of Economics: the cost of managing a surplus (memory or computing power or dollar reserves) rises with the surplus; eventually the cost exceeds the value of the thing itself.
The singularity is too expensive and so is pop culture, which requires autos, coal mines, fly ash avalanches, nuclear 'dead zones' and massive oil spills among other things.
As for finance not having 'real' effects, please Google 'The Great Depression'. One of its unintended consequences - the Brüning deflation - resulted in Adolf Hitler, and you know what happened next.
I jumped over here from your comment on Kunstler. Very thoughtful posts, and the comments too. I'd like to recommend Richard Powers' novel Galatea 2.2 for a very meaty meditation on artificial intelligence. Myself, I'm getting tired and a little suspicious of the glibness and glee from Kunstler and the Archdruid. I agree that technology will continue to put human beings out of work. What I don't see here is the interplay between peak oil and climate change. They aren't separate. If we get to the point where the price of oil skyrockets, then we will already have done irreversible damage to the climate. So, where you say something about getting a grip on the climate later in the century - it will be too late, the feedbacks will be out of control. So, if we continue on the path of innovating and manufacturing whiz-bang goodies for a shrinking elite, we are going to fry the planet. Tell me I'm wrong, please.
ReplyDelete10in10 - I'm not saying the time for action is in the second half of the century, I"m saying the worst of the damage will occur then. There are huge lags between action and response. From when we really decide to get serious, it will take several decades to substantially decarbonize the economy, and then there are an uncertain but significant number of decades added by the ocean's heat capacity. So the time to start acting was "a number of years ago".
Right.
ReplyDeletesingularity is impossible until the controlling weak link (the human species) is replaced:
http://geaugailluminati.wordpress.com/2009/07/01/the-human-race-is-almost-finished/
A personal anecdote concerning automation of jobs. In a previous life I was a crew member on E3 AWACS aircraft for the USAF. This is a 707 type airframe with a typical crew load of 18-30 individuals (4 on the flight deck, the balance in the back), depending on the mission. It was initially fielded in the late 1970s as computer systems were just becoming advanced enough to be deployed in airborne C3 applications. Other than some computer, navigation and radar system upgrades, the current fleet is not much changed. The newest C3 aircraft on the block, the Aussie 737-based Wedgetail, will perform much the same mission with vastly improved battle management capabilities and half the crew (8-10). Eliminated are 2 flight deck positions (Engineer and Navigator) and numerous "technician" positions. The E-3 has technician specialists for the communication, computer display and radar systems, for a total of 4 mission crew techs. I'm not sure about the Wedgetail, but modern automation has probably reduced the number of techs to 1 or none. Of course, in this example, fewer people is a good thing, but the point is that military automation will have 2 major impacts: 1) fewer people will be employed by the military, which in past economic downturns has provided employment to many who otherwise would compete for scarce civilian employment and 2) enable technologically and/or financially gifted nations to wage wars with far fewer humans in the line of fire than their less fortunate opponents.
Stuart, I'm sorry I missed this post when it first came out...
One question I have: How do I know that looking at only male employment trends is meaningful? The last few decades have seen tremendous social changes with regard to gender. What have female employment trends been? Same trend or different? What is the employment trend for both sexes? It seems to me that if you're comparing robots to humans, you ought to look at all humans.
Burk: "Thanks for a very interesting post. The one bit I would critique is the notion of declining employment. There is far more to our economy than manufacturing. We have artists, musicians, nail salons, professors. In short, we like to provide each other with all kinds of services that will constitute a future economy. "
So far (for the past few decades, and the foreseeable future), these jobs in general pay less. There are many, many scutwork jobs for every professor job. If you look at the old 'symbolic analyst' idea, those are precisely the jobs which are soon to be wiped out.
And BTW (referring to the original post) it's not just a matter of 'smart/skilled/etc.', it's a matter of ownership. If you own these AI assets, you draw profits from them. If you don't, too bad - and that's most people.
What we've seen in the developed world, especially in the USA, is a trend towards a majority being economically displaced sharply downwards, a minority not profiting but making a living, and a tiny minority reaping virtually all of the increased income.
What you are saying makes no sense. It assumes an exponential growth rate rather than a linear one.
The whole singularity thing is bogus, a red herring and a waste of time. If Kurzweil had to bet his family's lives on it, he wouldn't.
Just regarding Moore's Law. It has reached the very limit of semiconductor technology at the quantum level. You can't divide a track that is one atom wide in half. Even Moore says it would be totally ridiculous to expect the Law to continue forever. It was never even a law in the first place but an OBSERVATION, more relevant to Intel's production cycle.