What’s the Technological Singularity?
Our ability to achieve this understanding, via either the AI or the neuroscience approaches, is itself a human cognitive act, arising from the unpredictable nature of human ingenuity and discovery. Progress here is deeply affected by the ways in which our brains absorb and process new information, and by the creativity of researchers in dreaming up new theories. It is also governed by the ways that we socially organize research work in these fields, and disseminate the knowledge that results. At Vulcan and at the Allen Institute for Brain Science, we are working on advanced tools to help researchers deal with this daunting complexity, and speed them in their research.
I. J. Good’s “intelligence explosion” model predicts that a future superintelligence will trigger a singularity. Moore’s law, based on the observation that the number of transistors in a dense integrated circuit doubles about every two years, implies that the cost of computing halves roughly every two years. However, most experts believe that Moore’s law is coming to an end during this decade. Although there are efforts to keep improving application performance, sustaining the same rates of growth will be challenging. While machines may seem dumb right now, they could grow quite smart, quite soon.
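The compounding behind Moore’s law is easy to underestimate. A minimal sketch (not from the source; the function names and the two-year periods are illustrative assumptions) shows how doubling transistor counts every two years implies a matching halving of computing cost:

```python
# Illustrative sketch of the exponential growth implied by Moore's law,
# assuming transistor counts double every two years and the cost of a
# fixed amount of computing halves on the same schedule.

def transistors(initial: float, years: float, doubling_period: float = 2.0) -> float:
    """Transistor count after `years`, doubling every `doubling_period` years."""
    return initial * 2 ** (years / doubling_period)

def relative_cost(years: float, halving_period: float = 2.0) -> float:
    """Cost of a fixed amount of computing, relative to today (1.0)."""
    return 0.5 ** (years / halving_period)

# Ten years of doubling every two years means five doublings: a 32x increase
# in transistors, and computing at 1/32 of today's cost.
print(transistors(1.0, 10))  # 32.0
print(relative_cost(10))     # 0.03125
```

The same arithmetic explains why even a slowdown from a two-year to a three-year doubling period compounds into a large gap within a decade.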
I had hoped that this discussion of IA would yield some clearly safer approaches to the Singularity. Alas, about all I am sure of is that these proposals should be considered, and that they may give us more options. But as for safety, some of the suggestions are a little scary on their face. We humans have millions of years of evolutionary baggage that makes us regard competition in a deadly light. Much of that deadliness may not be necessary in today’s world, one where losers take on the winners’ tricks and are coopted into the winners’ enterprises.
In physics, a singularity lies at the center of a black hole: a point of effectively infinite density, where matter is compressed into an infinitesimally small volume. Entrepreneurs and public figures like Elon Musk have expressed concerns that advances in AI could lead to human extinction. Kurzweil predicts that significant innovations in genetics, nanotechnology and robotics will lay the foundation for a singularity during the first half of the 21st century. To provide a bit of background, John von Neumann, a Hungarian-American mathematician, computer scientist, engineer, physicist and polymath, first discussed the concept of a technological singularity in the mid-20th century. Since then, many authors have either echoed this viewpoint or adapted it in their writing.
Kurzweil believes that the singularity will occur by approximately 2045. His predictions differ from Vinge’s in that he foresees a gradual ascent to the singularity, rather than Vinge’s rapidly self-improving superhuman intelligence. The foregoing points to a basic issue with how quickly a scientifically adequate account of human intelligence can be developed: understanding the detailed mechanisms of human cognition is a task subject to this complexity brake, since the deeper we look into natural systems like the brain, the more complexity we find.