The following article is part of a six-part Transhumanity.net series, comprising a single, previously unpublished paper delivered at the 2010 Humanity+ UK conference in London. The presentation was originally titled “Shock Level Five: Augmented Perception, Perceptuo-Centrism, and Reality”.

Future Shock Levels (abbreviated to Shock Levels, or simply “SL”) are a classification system introduced by Eliezer Yudkowsky (1999), intended to categorize degrees of familiarity with technological concepts and developments. The system comprises five Shock Levels (SL0–SL4) describing various technologies, or attitudes to technology, and a person may be considered to have reached a specific SL if they are not particularly surprised, impressed, or worried by the technologies typical of that level. Shock Level Zero (SL0) is included in the scheme to describe the “average person” of 1999. At each SL, a few examples of representative technologies are given, followed by the kinds of people whom one might typically expect to be comfortable with the concepts at that level. For the sake of clarity in further discussion, it is worth including Yudkowsky’s full description of the scheme here:

SL0: The legendary average person is comfortable with modern technology – not so much the frontiers of modern technology, but the technology used in everyday life. Most people, TV anchors, journalists, politicians.

SL1: Virtual Reality, living to be a hundred, “The Road Ahead”, “To Renew America”, “Future Shock”, the frontiers of modern technology as seen by Wired Magazine. Scientists, novelty-seekers, early-adopters, programmers, technophiles.

SL2: Medical immortality, planetary exploration, major genetic engineering, and new (“alien”) cultures. The average SF fan.

SL3: Nanotechnology, human-equivalent AI, minor intelligence enhancement, uploading, total body revision, intergalactic exploration. Extropians and transhumanists.

SL4: The Singularity, Jupiter Brains, Powers, complete mental revision, ultraintelligence, posthumanity, Alpha-Point computing, Apotheosis, the total evaporation of “life as we know it”. Singularitarians and not much else.

As we have already seen, the concept of a technological Singularity is associated with the idea that unmodified humans are unable to predict what will happen beyond a certain point in the acceleration of technological development. In other words, the Singularity represents an epistemological threshold which can only be transcended by becoming posthuman. “Singularitarians” are unmodified humans who consider the possible nature and implications of a technological Singularity, and such activity is clearly defined by Yudkowsky as being typical of SL4.

The Shock Level system is concerned with describing human reactions to technological concepts, and it explicitly places contemplation of a Singularity at the top of the hierarchy. For the reasons already discussed, Singularitarians who accept the idea of an epistemic “event horizon” must consider any contemplation of post-Singularity technologies by unmodified humans to be worth little more than guesswork. Yudkowsky himself mentions in passing that “if there’s a Shock Level Five, I’m not sure I want to know about it!” (Yudkowsky, 1999), but one might reasonably argue that if there is a Shock Level Five, unmodified humans cannot know about it in any meaningful sense.

We can now clearly see strong similarities between the “posthuman perceptual threshold” of perceptuo-centrism and the implied tier beyond Yudkowsky’s SL4, where human prediction and understanding are stymied by the sheer scale, complexity, pace, and strangeness of technological development. For this reason I suggest that there is indeed a Shock Level Five, constituted by phenomena which by their very nature cannot be perceived or predicted by unmodified humans. This profound epistemological barrier can only be overcome by modifying one’s own physical structure in ways which facilitate new modes of perception or cognition.

 

References:

Bostrom, N. (2002). Anthropic Bias: Observation Selection Effects in Science and Philosophy. Routledge.

Bostrom, N. (2003). Transhumanist FAQ: What is Transhumanism? Humanity Plus. http://humanityplus.org/learn/philosophy/faq#answer_19 Cited 15 Jan 2010.

Carter, B. (1973). Large Number Coincidences and the Anthropic Principle in Cosmology. IAU Symposium 63: Confrontation of Cosmological Theories with Observational Data, pp. 291–298. Dordrecht: Reidel.

Davis, J. (2009, October 29). Peoria’s first cochlear implant surgery has grandfather rediscovering life. Peoria Journal Star.

Dicke, R.H. (1961). Dirac’s Cosmology and Mach’s Principle. Nature, 192, 440–441.

Gibbs, J.W. (1901). Elementary Principles in Statistical Mechanics. New York: Charles Scribner’s Sons.

Good, I. J. (1965). Speculations concerning the first ultraintelligent machine. In F. L. Alt and M. Rubinoff (Eds.), Advances in Computers, vol. 6 (pp. 31–88). Academic Press.

Kurzweil, R. (2005). The singularity is near. London: Gerald Duckworth & Co.

Leary, T. (1983). Flashbacks. Los Angeles: Jeremy P. Tarcher.

Loizou, P.C. (2006). Speech processing in vocoder-centric cochlear implants. In Møller, A. (Ed.), Cochlear and Brainstem Implants, vol. 64 (pp. 109–143). Basel: Karger.

Moravec, H. (1999). Robot: Mere machine to transcendent mind. Oxford: Oxford University Press.

More, M. (1993). Technological self-transformation: Expanding personal extropy. Extropy, 10 (vol. 4, no. 2).

Stavenga, D.G. and Arikawa, K. (2006). Evolution of color and vision of butterflies. Arthropod Structure & Development, 35, 307–318.

Stenger, V.J. (2000). Timeless Reality: Symmetry, Simplicity, and Multiple Universes. Prometheus.

U.S. Department of Energy, Office of Science (2010). Artificial Retina Project. http://artificialretina.energy.gov/ Cited 15 Jan 2010.

Vinge, V. (1993). Technological singularity. Paper presented at the VISION-21 Symposium sponsored by NASA Lewis Research Center and the Ohio Aerospace Institute, March 30–31, 1993.

Weinberg, S. (1987). Anthropic bound on the cosmological constant. Physical Review Letters, 59, 2607–2610.

Yudkowsky, E. S. (1999). Future Shock Levels. SL4.org. http://www.sl4.org/shocklevels.html Cited 15 Jan 2010.