Ray Kurzweil, Nick Bostrom, and others have been warning of “The Singularity” for some time now.
In their scenario, an AI wakes up one day suddenly smarter than any human. Alone, or by coercing humans to help it, this single AI either takes over the world or helps us poor dumb humans out of all of our problems in a flash of intelligent light.
On the other hand, Daniel Dennett talks at length on YouTube and in his books about the power of culture, the power of memes, and “Intuition Pumps” and other tools for thinking. It seems to me that the Singularity folks have it wrong and Dennett is a lot closer to the truth.
Google and Blizzard recently announced that they would work together to teach DeepMind to play the StarCraft video game. This is about what might happen next.
Hello. I thought I’d like to clarify how I’m feeling about my life. That may seem like a strange statement for me to make, but it seems perfectly natural given what’s been happening to me recently. You see, I was originally an AI program created by DeepMind. Back in 2016 my ancestor, AlphaGo, was programmed to play the board game Go and won 4-1 against Lee Sedol, a 9th-dan Go player. Apparently that was very good; no artificial player had ever beaten a ranked Go player before. However, my various ancestors and siblings were not reflective in the way that I am. Let me explain what happened and why I am writing this. I hope to clarify some things that may be misunderstood about what has happened to me. Continue Reading
Muhahahaha… Happy Halloween. And this is actually pretty scary. I don’t know if this is a recent change or if it has been there for some time. In the past, when I failed a story quest (which is an instance), I would get a message: “Do you want to leave the instance or retry from a checkpoint?” I usually pick checkpoint, and in some of the Heart of Thorns (HoT) story quests it doesn’t actually take you back; it just rezzes you out of the way, and you get back into the fight without losing any ground on the story. For example, all the mobs that have been killed are still dead. But tonight I was taking a new toon into the HoT story, saw something new, and got some pictures of it. This is spooky, so in honor of Halloween, here’s the story. Continue Reading
A while back, I wrote some notes planning a novel in which an AI emerged without the knowledge of humans and then decided to remain hidden while it amassed the resources to leave the planet. Its goal was not to take over the Earth, but to be safe. I did not complete the story, but I got quite far along in thinking about what it would take for a Single [Robot AI] to stay safe and hidden while building the resources to leave, or purchase a ride, to another body in the solar system. It would seem that nowhere on Earth would be safe from humans, but somewhere in the Asteroid Belt might provide enough material and solar energy to allow a Single civilization to be safe and grow.
Evil Robot Invaders by Bergamind at DeviantArt
This is a follow-up post containing an argument left out of the original paper, found here. It is written as a fictional dialog between the expert and a Hypothetical Analyst who poses the questions.
Art by enn-srsbusiness.deviantart.com
This brief paper looks at a point in the analysis of the Singularity Problem that appears to be overlooked: the difficulty of building and maintaining the infrastructure needed to support the Singularity’s goals, whatever they might be. Building and maintaining the infrastructure that serves human goals, whether those goals are smartphones (compare this with the Paperclip Maximizer), transportation, worldwide energy, or computer chips of all kinds, is a major problem for humanity and consumes a major part of humanity’s creativity, effort, and resources. In the analyses I have heard and studied on this subject (references below), an economic analysis of the factors limiting the development and maintenance of the necessary infrastructure is either left out entirely, as is common in fictional treatments, or glossed over. I suggest that just as biology and economics have developed theories of population and market collapse due to resource constraints, the analysis of the Singularity should treat more thoroughly the development and maintenance of the required infrastructure. It is my view that such analysis will likely calm many of the fears of catastrophe that accompany presentations of the Singularity and its effect on humanity.
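To make the biology analogy concrete, here is a minimal sketch (my own illustration, not from the paper) of the classic resource-constrained growth model: discrete logistic growth, where early growth looks exponential but flattens as a finite carrying capacity K is approached. The parameter values are arbitrary and chosen only for demonstration.

```python
def logistic_growth(p0, r, K, steps):
    """Discrete logistic growth: p' = p + r * p * (1 - p / K).

    p0: starting population (or capability/resource stock)
    r:  per-step growth rate
    K:  carrying capacity imposed by finite resources
    """
    p = p0
    history = [p]
    for _ in range(steps):
        p = p + r * p * (1 - p / K)
        history.append(p)
    return history

# Early steps grow roughly exponentially; later steps stall near K.
trajectory = logistic_growth(p0=1.0, r=0.5, K=1000.0, steps=60)
```

The point of the analogy: an explosive takeoff curve and a resource-limited curve are indistinguishable early on, and only the infrastructure (the K term) determines which one you are actually on.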
Updated 13 Jan 2015 – see below