I agree, and something similar to this needles me periodically whenever my mind drifts into dorm room bull session mode. You see, I believe that we're only a few decades away from true artificial intelligence. I might be wrong about this, but put that aside for the moment. The point is that I believe it. And needless to say, that will literally change everything. If AI is ubiquitous by 2040 or so, nearly every long-term problem we face right now—medical inflation, declining employment, Social Security financing, returns to education, global warming, etc. etc.—either goes away or is radically transformed in ways we can't even imagine.

I agree with Kevin that AI is the biggest deal out there. I also think it's coming, though I suspect it will take longer than 2040 to arrive fully* and that it will arrive in stages over the course of a number of decades. Indeed, in important ways we can already feel the early effects.
It's worth thinking about some things that won't change as a result of AI, or at least not quickly. Here's a draft list:
- There will still be 9 or 10 billion humans on the planet (or may the gods and goddesses help us).
- They will still want to live in big warm houses with lots of stuff, and travel around as much as they are able.
- They will still want to have sex and raise children.
- They will still be prone to getting very pissed if anyone tries to take their stuff away.
- The economy will still consist of competing corporations regulated by governments (it's just that both the corporations and governments will over time employ fewer and fewer humans).
- Humans will still have all the legal rights (I just don't see why we are ever going to think it's in our interest to give legal rights to algorithms).
It follows pretty immediately that most of our environmental problems, for example, won't be going anywhere as a result of AI.
On the other hand, we are going to have to go through some massive wrenching cultural adjustments in our ideas of work and dependency and how we derive meaning from our lives. Jamais Cascio recently coined the term "the Burning Man Future," which I like. In particular, it captures the idea that the entire culture is increasingly going to have to become like what is currently a hippie artist fringe. Either that, or we need to decide that there are some things we really don't want to invent and stop working on this stuff.
* It's probably worth pointing out that I have spent most of my career designing statistical reasoning algorithms for a living, so my intuitions here are at least somewhat educated. Of course, I could be as wrong as the next expert usually is.
Hi Stuart,
Isn't it also plausible to expect another two issues to exert huge influences of their own:
1. Inequality -- the rich get richer and society stratifies materially and socially.
2. "The Long Descent" (for lack of another phrase) -- energy and climate challenges cause huge societal upheaval, making technology hard to develop in the way we have in the past decades.
It seems like ubiquitous or at least commonplace AI isn't really consistent with issue 1 or issue 2 (both together or individually). I'm not asserting anything about how likely I think these are to dominate vs. what you've described, and I think on some level issues 1 and 2 are actually incompatible. It's just that I don't understand how these co-exist with your thinking about AI.
You probably don't realize that culture, when it exists, is the proof that AI and other singular little myths are just that, modern myths (comparing rather poorly to ancient ones).
"Humans will still have all the legal rights (I just don't see why we are ever going to think it's in our interest to give legal rights to algorithms)."
ReplyDeleteDepending on how you construct the subset "our"* and "interest", the marginal benefit of that grant of rights has been modest and short-lived for the broad majority of humans living under developed/mature economies. We've seen fit over a hundred years ago to grant legal rights to certain legal fictions displaying emergent sentience, even occasionally in excess of those rights granted to natural persons, despite the concentrated benefit and diffuse cost. There are many other examples of the greater good for the greatest number having no apparent role in the distribution of rights.
* As general advice, always ask "Who's this 'we', kimosabe?" -- unpack the pronouns. The technique's helpful to locate hidden assumptions even in debate with the intellectually honest. It's even more enlightening when, applied to political hackery, one can watch the group referred to by the same second-person pronoun change with each occurrence.
What do you and Drum mean by AI?
What are the properties that distinguish an AI system from a non-AI system?
I disagree with Marxmarv that the benefits of legal rights have been modest.
The radical egalitarianism that is the foundation of capitalism forced the ending of slavery. That was a huge gain for the former slaves. It extended the vote to unlanded men, and then to women. The logic of equality forbade discrimination on the basis of race and gender, and most recently it ended discrimination on the basis of sexual orientation. Marxmarv may not be a member of any of the target groups, but that doesn't mean the gains have been modest.
This egalitarianism is so universal that we don't see it -- we're like fish swimming in its water. It's the reason why I am doubtful that we will withhold all rights from human-like AIs. Of course we will at first, but after a few decades? I'm not so sure.
If we get to the point of human-equivalent AIs, that is. The story of AI research has been a series of repetitions of "hmm, it's harder than we thought".
It's interesting to note that this age of expanding legal rights corresponded very closely with the age of expanding fossil fuel energy. One might wonder whether, absent such a wonderful surplus of energy, the expansion of legal rights would have been so broad or lasting.
While I can't speak for marxmarv, I took his comments to be more in line with that idea: that, given all of human history, the expansion of rights has been somewhat modest (still not extending to many persons alive today) and recent.
I also think his point about non-person persons is valid... we have, after all, decided that corporations are persons, at least to the extent that they seem to have virtually all the legal rights, but few of the responsibilities. Makes one wonder what type of "personhood" somebody might decide to apply to an AI entity.
I don't think you really need a true AI to make a lot of people redundant.
I don't care to speculate on whether machines can "think" (whatever that is exactly).
But automated driving systems, things like Deep Blue used to mine data...
It's hard to pin down how much of our current society could be put out of work directly, or by changing the way we do things.
An example of changing the way we do things would be electric cars. My best guess is that they will be much more reliable than internal-combustion automobiles.
No muffler shops, no radiator shops, a drastically decreased need for mechanics. Heck, if automated driving really takes off, maybe a reduced need for body work too.
I mean, automobile manufacturing has been trending down in employment numbers basically all my life. But the tail of people involved with dealing with cars (insurance, maintenance, that sort of thing) is truly vast. I wouldn't be surprised if around 5% of total US employment is directly involved with autos and keeping them running in one way or another.
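For what it's worth, here's a rough back-of-envelope sketch of that 5% guess -- every number in it is an assumed round figure for illustration, not an official statistic:

```python
# Back-of-envelope check on the "~5% of US employment is auto-related" guess.
# All figures below are assumed round numbers for illustration only, not
# official statistics; plug in real data to sharpen the estimate.

auto_related_jobs_millions = {
    "vehicle and parts manufacturing": 0.8,
    "dealerships": 1.8,
    "repair and maintenance shops": 0.9,
    "gas stations": 0.9,
    "auto insurance and finance": 0.6,
    "parking, towing, car washes, etc.": 0.5,
}

total_us_employment_millions = 140  # assumed ballpark for total employment

auto_total = sum(auto_related_jobs_millions.values())
share = auto_total / total_us_employment_millions

print(f"Auto-related jobs (assumed): {auto_total:.1f} million")
print(f"Share of total employment:   {share:.1%}")
# With these made-up-but-plausible numbers the share works out to roughly 4%,
# the same ballpark as the 5% guess above.
```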
Just one thing, but I can see the employment numbers really nosediving with respect to this.
And then there are all the other direct replacements of workers, and additional things that could be changed.
If you think about construction, it really is kind of Rube Goldberg compared to how it could be done.
I mean for god's sake, every time I think about roofs I'm amazed we still use shingles.
AI? We can't even accurately develop a playable simulation of our own environment:
http://bit.ly/YMRqBC
I had high hopes for SimCity, as I wanted to see if I could create an agrarian/light industrial/light commercial society with minimal cars. Guess I'll have to keep waiting.
Does anyone know why exactly AI would reduce ANY of these problems?
For example, Social Security financing is solved by taxing the more well-off workers who make over $100K, period -- but won't there be fewer workers if AI is replacing more jobs? Or solving global warming? I see AI as completely irrelevant to our big problems.
"Humans will still have all the legal rights (I just don't see why we are ever going to think it's in our interest to give legal rights to algorithms)."
What is a corporation if not an algorithm?