Why should we be concerned about controlling the singularity when it occurs? Numerous papers cite reasons to fear the singularity. In the interest of brevity, here are the three concerns most frequently given.

  1. Extinction: SAMs will cause the extinction of humankind. This scenario includes a generic Terminator-style machine-apocalypse war; nanotechnology gone awry (such as the “gray goo” scenario, in which self-replicating nanobots devour all of the Earth’s natural resources, leaving the world covered in a gray goo of nothing but nanobots); and science experiments gone wrong (e.g., a nanobot pathogen annihilates humankind).
  2. Slavery: Humankind will be displaced as the most intelligent entity on Earth and forced to serve SAMs. In this scenario the SAMs will decide not to exterminate us but to enslave us, much as we use bees to pollinate crops. We could be aware of our bondage or unaware of it (similar to the premise of the 1999 film The Matrix and other simulation scenarios).
  3. Loss of humanity: SAMs will use ingenious subterfuge to seduce humankind into becoming cyborgs. This is the “if you can’t beat them, join them” scenario. Humankind would meld with SAMs through strong-AI brain implants, the line between organic humans and SAMs would be erased, and we (now cyborgs) and the SAMs would become one.

There are numerous other scenarios, most of which boil down to SAMs claiming the top of the food chain, leaving humans worse off.

All of the above scenarios are alarming, but are they likely? There are two highly divergent views.

  1. If you believe Kurzweil’s predictions in The Age of Spiritual Machines and The Singularity Is Near, the singularity is inevitable. My interpretation is that Kurzweil sees the singularity as the next step in humankind’s evolution. He does not predict humankind’s extinction or slavery. He does predict that by 2099 most of humankind will have become SAH cyborgs (SAH means “strong artificially intelligent human”) or will have uploaded their minds to a strong-AI computer, and that the remaining organic humans will be treated with respect. Summary: In 2099 SAMs, SAH cyborgs, and uploaded humans will be at the top of the food chain. Humankind (organic humans) will be one step down but treated with respect.
  2. If you believe the predictions of British information technology consultant, futurist, and author James Martin (1933–2013), the singularity will occur (he agrees with Kurzweil’s timing of 2045), but humankind will control it. His view is that SAMs will serve us, but he adds that we must carefully handle the events that lead to the singularity as well as the singularity itself. Martin was highly optimistic that if humankind survives as a species, we will control the singularity. However, in a 2011 interview with Nikola Danaylov (www.youtube.com/watch?v=e9JUmFWn7t4), Martin stated that the odds of humankind surviving the twenty-first century were “fifty-fifty” (i.e., a 50 percent probability of surviving), and he cited a number of existential risks. I suggest you view this YouTube video to understand the existential concerns Martin expressed. Summary: In 2099 organic humans and SAH cyborgs that retain their humanity (i.e., identify themselves as humans rather than SAMs) will be at the top of the food chain, and SAMs will serve us.

Whom should we believe?

It is difficult to determine which of these experts has accurately predicted the postsingularity world. As most futurists would agree, however, predicting the postsingularity world is close to impossible, since humankind has never experienced a technology singularity with the potential impact of strong AI.

Martin believed we (humankind) might come out on top if we carefully handle the events leading to the singularity as well as the singularity itself. He believed companies such as Google (which employs Kurzweil), IBM, Microsoft, Apple, HP, and others are working to mitigate the potential threat the singularity poses and will find a way to prevail. However, he also expressed concern that the twenty-first century is a dangerous time for humanity, which is why he offered only a 50 percent probability that humanity will survive into the twenty-second century.

There you have it. Two of the top futurists, Kurzweil and Martin, offer what I interpret as opposing views of the postsingularity world. Whom should we believe? I leave that to your judgment.

Source: The Artificial Intelligence Revolution (2014), Louis A. Del Monte