The singularity is a speculative notion referring to the point at which exponential innovation generates a fundamental transformation of human civilisation. As Murray Shanahan puts it at loc 78 of his book The Technological Singularity:
In physics, a singularity is a point in space or time, such as the center of a black hole or the instant of the Big Bang, where mathematics breaks down and our capacity for comprehension along with it. By analogy, a singularity in human history would occur if exponential technological progress brought about such dramatic change that human affairs as we understand them today came to an end. The institutions we take for granted—the economy, the government, the law, the state—these would not survive in their present form. The most basic human values—the sanctity of life, the pursuit of happiness, the freedom to choose—these would be superseded. Our very understanding of what it means to be human—to be an individual, to be alive, to be conscious, to be part of the social order—all this would be thrown into question, not by detached philosophical reflection, but through force of circumstances, real and present.
How we should interpret this notion remains controversial. My own instinct is to see it as a form of techno-religion, delineating the point at which we transcend through our technological creations. But it is also something I feel we need to take seriously if we are to understand it, particularly as a framework for the future shaped by the conditions of late capitalism. It is in this sense that I was intrigued to see acceleration so explicitly invoked as a force which could be harnessed in order to drive this innovation. From pg 44 of the same book:
The last of these options raises the possibility of a whole virtual society of artificial intelligences living in a simulated environment. Liberated from the constraints of real biology and relieved of the need to compete for resources such as food and water, certain things become feasible for a virtual society that are not feasible for a society of agents who are confined to wetware. For example, given sufficient computing resources, a virtual society could operate at hyper-real speeds. Every millisecond that passed in the virtual world could be simulated in, say, one-tenth of a millisecond in the real world.
If a society of AIs inhabiting such a virtual world were to work on improving themselves or on creating even more intelligent successors, then from the standpoint of the real world their progress would be duly accelerated. And if they were able to direct their technological expertise back out to the real world and help improve the computational substrate on which they depended, then the rate of this acceleration would in turn be accelerated. This is one route to a singularity-like scenario. The result would be explosive technological change, and the consequences would be unpredictable.
My point is not to dispute the scientific plausibility of this but rather to ask how the notions in play come to acquire the resonance they do for those advocating and exploring the prospect of the singularity.