There are various aspects of consciousness generally deemed necessary
for a machine to be artificially conscious. The following list is not
exhaustive. A generally accepted criterion for sentience and consciousness is self-awareness:
one dictionary defines conscious to mean "having an awareness of
one's environment and one's own existence, sensations, and thoughts"
(dictionary.com). The 1913 Webster's Dictionary defines conscious as "possessing
knowledge, whether by internal, conscious experience or by external observation;
cognizant; aware; sensible". An AC system should be capable of achieving
various aspects (or by a more strict view, all verifiable, known, objective,
and observable aspects) of consciousness. While self-awareness is very
important, it may be subjective and is generally difficult to test.
The ability to predict (or anticipate) foreseeable events is considered
important for AC by Igor Aleksander. In Artificial Neuroconsciousness:
An Update, he writes: "Prediction is one of the key functions of consciousness.
An organism that cannot predict would have a seriously hampered consciousness."
The emergentist multiple drafts principle proposed by Daniel Dennett in
Consciousness Explained may be useful for prediction: it involves the
evaluation and selection of the most appropriate "draft" to
fit the current environment.
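The selection step can be illustrated with a minimal sketch. This is not Dennett's actual model; the rival "drafts" (here, hypothetical temperature-trend interpretations), their numeric values, and the error-based selection rule are all assumptions made for illustration.

```python
# Illustrative sketch only: several candidate "drafts" compete, and the
# one that best fits the current observation is selected to drive the
# next prediction. The drafts and values below are invented examples.

def select_draft(drafts, observation):
    """Return the draft whose prediction best matches the observation."""
    return min(drafts, key=lambda d: abs(d(observation["t"]) - observation["value"]))

# Hypothetical drafts: rival interpretations of a temperature trend.
drafts = [
    lambda t: 20.0,            # "temperature is steady"
    lambda t: 18.0 + 0.6 * t,  # "temperature is rising"
    lambda t: 22.0 - 0.6 * t,  # "temperature is falling"
]

observation = {"t": 4, "value": 20.5}
best = select_draft(drafts, observation)   # the "rising" draft fits best
print(best(5))                             # its prediction for t=5: 21.0
```

The point of the sketch is only that several interpretations are evaluated in parallel and one is promoted; nothing here bears on whether such a mechanism amounts to consciousness.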
Awareness could be another required aspect. However, again, there are
some problems with the exact definition of awareness. To illustrate this
point, philosopher David Chalmers controversially put forward the panpsychist
argument that a thermostat could be considered conscious (Chalmers 1996,
pp. 283-299): it has states corresponding to too hot, too cold, or the
correct temperature. Neuroimaging experiments on monkeys suggest that
neurons are activated by a process, not by a state or an object. Such a
reaction requires a model of the process, built from information received
through the senses; creating models in this way demands considerable
flexibility, and is also useful for making predictions.
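Chalmers' thermostat example can be made concrete in a few lines. The setpoint and tolerance values below are arbitrary assumptions; whether holding such states amounts to any form of awareness is precisely the point under dispute.

```python
# A minimal sketch of Chalmers' thermostat: a system with exactly three
# internal states. The setpoint and tolerance are illustrative values.

def thermostat_state(temperature, setpoint=20.0, tolerance=1.0):
    """Map a reading to one of three states: 'too_cold', 'ok', 'too_hot'."""
    if temperature < setpoint - tolerance:
        return "too_cold"
    if temperature > setpoint + tolerance:
        return "too_hot"
    return "ok"

print(thermostat_state(17.5))  # too_cold
print(thermostat_state(20.2))  # ok
```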
Personality is another characteristic generally considered vital
for a machine to appear conscious. In behavioral psychology,
there is a somewhat popular theory that personality is an illusion created
by the brain in order to interact with other people. It is argued that
without other people to interact with, humans (and possibly other animals)
would have no need of personalities, and human personality would never
have evolved. An artificially conscious machine may need a personality
capable of expression, such that human observers can interact with it in
a meaningful way. However, this is often questioned by computer scientists;
the Turing test, which measures a machine's personality, is no longer
generally considered useful.
Learning is also considered necessary for AC. According to "Engineering
consciousness", a summary by Ron Chrisley of the University of Sussex,
consciousness is or involves the self, transparency, learning (of dynamics),
planning, heterophenomenology, the splitting of the attentional signal,
action selection, attention, and timing management.
In his article "Consciousness in Human and Robot Minds", Daniel Dennett
wrote: "It might be vastly easier to make an initially unconscious or
nonconscious 'infant' robot and let it 'grow up' into consciousness,
more or less the way we all do." He explained that the robot Cog,
described there, "will not be an adult at first, in spite of its
described there, "Will not be an adult at first, in spite of its
adult size. It is being designed to pass through an extended period of
artificial infancy, during which it will have to learn from experience,
experience it will gain in the rough-and-tumble environment of the real
world." And "Nobody doubts that any agent capable of interacting
intelligently with a human being on human terms must have access to literally
millions if not billions of logically independent items of world knowledge.
Either these must be hand-coded individually by human programmers--a tactic
being pursued, notoriously, by Douglas Lenat and his CYC team in Dallas--or
some way must be found for the artificial agent to learn its world knowledge
from (real) interactions with the (real) world." An interesting article
about learning is Implicit learning and consciousness by Axel Cleeremans,
University of Brussels, and Luis Jiménez, University of Santiago,
where learning is defined as “a set of phylogenetically advanced
adaptation processes that critically depend on an evolved sensitivity
to subjective experience so as to enable agents to afford flexible control
over their actions in complex, unpredictable environments”.
Anticipation is the final characteristic that could possibly be used
to make a machine appear conscious. An artificially conscious machine
should be able to anticipate events correctly in order to be ready to
respond to them when they occur. The implication here is that the machine
needs real-time components, making it possible to demonstrate that it
possesses artificial consciousness in the present and not just in the
past. In order to do this, the machine being tested must operate coherently
in an unpredictable environment, to simulate the real world.
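The requirement can be sketched as a simple loop in which the machine forms a prediction before the next event arrives. The linear extrapolation, the signal values, and the readiness threshold are all assumptions chosen for illustration, not a proposed test of consciousness.

```python
# Hedged sketch: an agent anticipates the next value of a signal and has
# a response prepared before the event occurs. The extrapolation rule,
# signal, and threshold below are illustrative assumptions.

def anticipate(history):
    """Predict the next value by linear extrapolation of the last two."""
    if len(history) < 2:
        return history[-1] if history else 0.0
    return history[-1] + (history[-1] - history[-2])

signal = [1.0, 2.0, 3.0]
predicted = anticipate(signal)            # 4.0
actual = 4.1                              # the environment's next value
prepared = abs(predicted - actual) < 0.5  # response was ready in advance
print(predicted, prepared)
```

A real-time system would run this loop continuously against an unpredictable environment; the sketch only shows the predict-before-observe ordering the paragraph describes.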