What if there's a game where all the AI people actually lived?



Imagine a fictitious scenario in a role-playing game (RPG) where the non-player characters (NPCs) are conscious of their own surroundings and consider the developer to be their god. The people inside can also live a normal life: they can create new life, eat, die, and perform other functions humans experience.

Will this qualify them to be an AI?

Will the children of the NPCs qualify as an AI?

Might this be realized in the future?

Daniel Mana

Posted 2018-07-17T13:34:55.370

Reputation: 151

Recommending the Quantum Thief trilogy by mathematical physicist Hannu Rajaniemi. He goes into this type of simulation in detail, from a mathematician's perspective, in a manner accessible to the general public. Stross also touches on it in Accelerando, where individuals can live many lives simultaneously in such sims and subsequently re-integrate the branched personas. It may be possible at some point, but at present it sits squarely in the realm of mythology of the future. :)

– DukeZhou – 2018-07-17T17:10:16.383



Yes, a baby can be considered an AI. Will this be our future? I don't know. That is exactly what some people are aiming for: to create an AI that can live. We have several AIs that surprise us more each time, but none of them questions its own existence, and none of them seeks out or demands attention from its god (the developer) as a result of its own reflections. I believe that what we create today can be compared to a mosquito in real life: a brainless "being" that performs simple actions.

The cool thing about believing in such a future for artificial intelligence is that it does not have to repeat our mistakes. An AI does not need to develop gradually with age; it can be born with superintelligence, knowing everything, and learn only the new things that were not foreseen.

Some humans (myself included) have a passion for the human, for every detail and imperfection, and dream of re-creating something as mysterious and complex through an AI. It seems impossible, and some believe it must be, because whoever created us would have to be a perfect being (a god). But others (myself included) do not stop at this issue, and continue the quest to create an AI with some human characteristics, such as feelings and the ability to reflect on its own acts.

But all of this, as far as I know, is still unreal, a projection of our minds. It is interesting nonetheless, because it creates new challenges for humanity.

Guilherme IA

Posted 2018-07-17T13:34:55.370

Reputation: 691


I think it is a very good idea to populate a whole universe with AI characters, because this helps to bring about the singularity faster. In the literature the concept is called swarm or crowd simulation, because many AI individuals work together in the same environment. The concept should not be confused with a multi-agent system, which is a blackboard-like architecture active within a single individual; the appropriate term here is swarm simulation.

Whether the individuals are able to recognize the developer of the environment as their god depends on their consciousness level. This is a scale for measuring intelligence: it starts at zero, rises through animal-level AI and human-level AI, and at its maximum reaches a super-human universal AI that understands everything. The scale is called ConsScale and determines the level of machine consciousness. Level 7 means that the AI can recognize itself as an AI.
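The idea of a graded ladder of machine consciousness can be sketched in code. Note this is a toy illustration only: the level names and the choice of level 7 as the self-recognition threshold below are taken from the paragraph above, not from the official ConsScale definitions.

```python
# Toy sketch of a ConsScale-style ladder of machine consciousness.
# The level names are illustrative placeholders; the real ConsScale
# defines its own levels and behavioural criteria.
from enum import IntEnum

class ConsciousnessLevel(IntEnum):
    NONE = 0            # no cognition at all
    REACTIVE = 1        # fixed stimulus-response rules
    ADAPTIVE = 2        # learns from experience
    ATTENTIONAL = 3     # selects what to process
    SELF_MODELING = 7   # maintains a model of itself

def can_recognise_creator(level: ConsciousnessLevel) -> bool:
    """An agent needs a self-model before it can reason about who made it."""
    return level >= ConsciousnessLevel.SELF_MODELING
```

Under this sketch, an NPC at or above the self-modeling level could in principle entertain the question of who built its world, while a merely reactive or adaptive NPC could not.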


Manuel Rodriguez

Posted 2018-07-17T13:34:55.370

Reputation: 1


This is a standard trope in science fiction, although more common is the idea of transferring minds from real people into a virtual environment.

Right now, such an idea is still firmly science fiction. There are a few things we don't know:

  • What consciousness is in detail. There are various educated guesses and philosophical stances, but none have been seriously verified.

  • What the minimal environment is that can support consciousness. This is important because simulating more complex environments is more computationally expensive. There is a huge gulf between capabilities of current computers and something that could simulate reality to the level of granularity that we can experience and manipulate it (let alone to the depth that we can scientifically investigate).

Will this qualify them to be an AI?

Assuming you can solve the caveats above, then yes, perhaps even a "Strong AI" in the domain of the game, but not a human-equivalent AGI (depending on the complexity of the environment).

If you mixed a current RPG with some agents driven by our current best game-playing bots, and improved them so that they are able to solve problems within the game - e.g. in World of Warcraft, or perhaps in Minecraft - then you would have something as advanced as, or better than, any current "narrow AI" that solves point-scoring or adversarial games. You would be justified in calling it "AI", if OpenAI and DeepMind can call their achievements "AI". However, it seems unlikely such agents would be much like your imagined simulated people, with consciousness etc.
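What such a "narrow AI" NPC amounts to can be shown with a minimal perceive-decide-act loop. Everything here (the world and NPC dictionaries, the hand-coded utilities) is a toy illustration of the pattern, not any real game's API:

```python
# Minimal perceive-decide-act loop for a game NPC, in the spirit of
# current narrow-AI game bots: no consciousness, just utility scores.

def perceive(world, npc):
    """Extract the features this NPC can observe."""
    return {"hunger": npc["hunger"], "food_nearby": world["food"] > 0}

def decide(obs):
    """Pick the action with the highest hand-coded utility."""
    actions = {
        "eat": obs["hunger"] if obs["food_nearby"] else -1.0,
        "forage": 0.5 if not obs["food_nearby"] else 0.1,
        "idle": 0.0,
    }
    return max(actions, key=actions.get)

def act(world, npc, action):
    """Apply the chosen action's effect on the world and the NPC."""
    if action == "eat":
        world["food"] -= 1
        npc["hunger"] = 0.0
    elif action == "forage":
        world["food"] += 1
    return world, npc

world = {"food": 0}
npc = {"hunger": 0.9}
for _ in range(2):  # two ticks: the NPC forages, then eats
    obs = perceive(world, npc)
    world, npc = act(world, npc, decide(obs))
```

An agent like this can look surprisingly purposeful inside the game, which is exactly why it earns the label "AI" without getting anywhere near the conscious NPCs of the question.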

Might this be realized in the future?

Depends on the constraints faced by researchers. To get to an imagined point where this is possible, we have to extrapolate current trends on computing power and advances in understanding goal-driven AI and self-awareness. There is a range of opinions on how likely this is, how much power is required etc.

It is a compelling idea, and unlike say faster-than-light travel, there are no widely-accepted theoretical limits that prevent it.

Neil Slater

Posted 2018-07-17T13:34:55.370

Reputation: 14 632


The question can be extended further: What if the NPCs were smart enough to design digital systems capable of housing a game much like the one they are in? This would not be recursion in the sense of ancestors and descendants within a species, but actual recursion in the domain of speciation.

Working backward, how can we know for sure that we aren't God's RPG, and, more importantly, what difference would there be between that scenario and the ancient middle eastern conceptualization of the monotheistic creation?

The bidirectionality of this scenario was examined on the screen by writer-director David Cronenberg in his 1999 film eXistenZ, starring Jude Law, Jennifer Jason Leigh, and Ian Holm (recommended).

Regarding the sub-question, "Will this qualify them to be an AI?" that depends on two things.

  • How one defines intelligence, a somewhat slippery word to define [1]
  • What facilities and propensities are imbued into the hardware and software that bring the NPCs to artificial life.

Regarding whether the children of the NPCs qualify as AI, definitively yes if the parents qualify and the children are adequate replicas or artificially genetic improvements.

Might this be realized in the future? That depends on a number of things.

  • Will the earth avert global extinction events?
  • Will humanity accidentally exterminate itself? [2]

Some call all of this science fiction or Futurism. I do not. Such things are gradually developing and have been for at least the duration of my life. I've been an interested spectator in this very real drama.


[1] The philosophic question of whether characterizing human intelligence requires a super-human intelligence that humans cannot possess is well represented in cybernetic literature, reaching back at least to Charles Babbage and perhaps further, to the early biochemical interests of Abu Mūsā Jābir ibn Hayyān. Defining what intelligence means is more than looking at functional MRIs or running tests in Python. If we define intelligence as the ability to collaborate, ants and even bacteria may be more intelligent than humans. If we define intelligence as the ability to efficiently construct artifices of benefit to the species, bees exceed humans. Humans can determine mathematically that hexagonal rooms are 1.7 times more efficient than square rooms in terms of construction materials, but is it intelligence that, having determined that ratio centuries ago, humans continue to waste materials on square rooms? We don't see bacteria, ants, or bees shooting narcotics either. In some respects, by some definitions of intelligence, humans would qualify as the dominant morons of all terrestrial life forms.

[2] "Darwin's dice have rolled badly for Earth. The human species is, in a word, an environmental abnormality. Perhaps a law of evolution is that intelligence usually extinguishes itself." — Edward O. Wilson, Harvard professor and Pulitzer Prize winner for General Nonfiction, "Is Humanity Suicidal?", The New York Times Magazine, May 30, 1993

Douglas Daseeco

Posted 2018-07-17T13:34:55.370

Reputation: 7 174