Abstract. This paper outlines the Independent Core Observer Model (ICOM) Theory of Consciousness, defined as a computational model of consciousness that is objectively measurable: an abstraction produced by a mathematical model in which the subjective experience of the system is subjective only from the point of view of the abstracted logical core, or conscious part, of the system, while being modeled objectively in the core of the system. Given the lack of agreed-upon definitions around consciousness theory, this paper sets precise definitions designed to act as a foundation, or baseline, for additional theoretical and real-world research into ICOM-based AGI (Artificial General Intelligence) systems whose qualia can be measured objectively.


1 Introduction

The Independent Core Observer Model Theory of Consciousness is based on the Computational Theory of Mind (Rescorla 2016). One of the core issues with research into artificial general intelligence (AGI) is the absence of objective measurements and data: results are ambiguous given the lack of agreed-upon objective measures of consciousness (Seth 2007).  To continue serious work in the field, we need to be able to measure consciousness in a consistent way that does not presuppose different theories of the nature of consciousness (Dienes and Seth 2012) and, further, is not dependent on various ways of measuring biological systems (Dienes and Seth 2010), but is focused on the elements of a conscious mind in the abstract.  Even for the more nebulous Computational Theory of Mind, research into the human brain shows some underlying evidence.

2 Our Assumptions

Qualia are typically considered the internal subjective component of perceptions, arising from the stimulation of the senses by phenomena (Gregory 2004).  Given the assumption of a version of the computational model of consciousness, and the fact that data from sensory input can be traced in a human brain, we assume that qualia as "raw experience" are the subjective conscious experience of that input.  From the standpoint of the conscious mind, qualia are the subjective experience; that experience can be measured external to the system if the mind in question operates under known parameters we can tap into, for example in systems using the ICOM Theory of Consciousness, where qualia can be objectively measured.

Humans do not make directly logical decisions.  Looking at the neuroscience behind decisions, we can already show that humans make decisions based on how they feel (Camp 2016) and not based on logic.  We assume that researchers such as Jim Camp and Antonio Damasio are accurate at a high level, and their work's empirical evidence implies that humans do not make 'logical' decisions directly.  This lack of direct logic is important when looking at how consciousness works: consciousness appears to be based not on logic but on subjective emotional experience, an assumption this research will continue to bear out and one the current empirical evidence already supports.

Subjective experience can be measured and understood.  The traditional view that the subjective nature of experience (Leahu, Schwenk, and Sengers 2016) is purely subjective is rejected as a matter of principle in this paper.  All things can, in theory, be objectively broken down and understood; labeling something subjective is frequently an excuse for not wanting to quantify it objectively.  Even scientists in the field frequently consider consciousness the realm of "ontology and therefore philosophy and religion" (Kurzweil 2001); we assume that this is false and reject it, as stated earlier, as a product of insufficient understanding, data, or technology.

Consciousness can be measured.  To quote Overgaard: "Human Consciousness … has long been considered as inaccessible to a scientific approach" and "Despite this enormous commitment to the study of consciousness on the part of cognitive scientist covering philosophical, psychological, neuroscientific and modeling approaches, as of now no stable models or strategies for the adequate study of consciousness have emerged." (Overgaard 2010)  With the ICOM theory, our approach to measuring consciousness is based on the Porter method (Porter 2016); while the Porter method has elements of subjectivity, it is a qualitative approach that can be used objectively to measure degrees of consciousness.  Beyond the specific points of the Porter method, we also believe that we can measure consciousness in terms of task accuracy and awareness as functions of stimulus intensity (Sandberg, Bibby, Timmermans, Cleeremans, and Overgaard 2011), which applies to brain neurochemistry as much as to the subjective experience from the point of view of systems like ICOM measured with the Porter method.
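The cited relationship between awareness and stimulus can be illustrated with a simple logistic curve. This is only a sketch of the general sigmoid shape described by Sandberg et al.; the parameter names (`threshold`, `slope`) and their values are assumptions for illustration, not taken from that work.

```python
import math

def awareness(intensity, threshold=0.5, slope=10.0):
    """Sigmoid (logistic) model of reported awareness as a function of
    stimulus intensity. Returns a value in (0, 1): near 0 for weak
    stimuli, near 1 for strong ones, 0.5 at the threshold.
    Parameter values are illustrative assumptions."""
    return 1.0 / (1.0 + math.exp(-slope * (intensity - threshold)))
```

For example, `awareness(0.5)` sits at the midpoint of the curve, while intensities well above or below the threshold saturate toward full or no reported awareness.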

There are subjective problems with the Porter method, to the extent that we are focused on whether a system has internal subjective experience and consciousness. The Porter method can help us measure the degree to which a system has those subjective conscious experiences and thus help "enumerate and elucidate the features that come together to form the colloquial notion of consciousness, with the understanding that this is only one subjective opinion on the nature of subjective-ness itself" (Porter 2016), with those subjective points being measured objectively.

We have a concrete definition of 'subjective' as a concept.  To make progress in building and designing a system with "subjective internal experience," we need a way of defining 'subjective' such that it can be objectively measured. 'Subjective' is then defined as the relative experience of a conscious point of view that can only be measured objectively from outside the system, where the system in question experiences things 'subjectively' as they relate to its own internal emotional context.

Consciousness is exhibited by a system that displays the degrees or elements of the Porter method for measuring consciousness in terms of its internal subjective experience (Porter 2016).  While the dictionary might define consciousness subjectively in terms of being awake or aware of one's surroundings (Merriam-Webster 2017), that is a subjective definition; we need an 'objective' one in order to measure, and this is the definition we assume for the context of the ICOM theory of mind and the ICOM research altogether.

3 The Basis for the Design of the ICOM Theory of Consciousness

The ICOM or Independent Core Observer Model Theory of Consciousness is based on the Computational Theory of Mind (Rescorla 2016), which is defined as:

According to CCTM, the mind is a computational system similar in important respects to a Turing machine, and core mental processes (e.g., reasoning, decision-making, and problem-solving) are computations similar in important respects to computations executed by a Turing machine (Rescorla 2016) – which can have numerous variations. 

An instance of an ICOM system would be a variation of CCTM.  In addition, the ICOM Theory of Consciousness, or ICOMTC, borrows from Integrated Information Theory (Tononi, Albantakis, and Masafumi 2014), as CCTM alone does not give us a complete basis for developing ICOM systems; ICOMTC therefore includes elements of both Integrated Information Theory and CCTM.

Integrated information theory, or IIT, approaches the relationship between consciousness and its physical substrate by first identifying the fundamental properties of experience: existence, composition, information, integration, and exclusion. IIT then postulates that the physical substrate of consciousness must satisfy three key points, or 'axioms' (Tononi, Albantakis, and Masafumi 2014).

The phenomenological axioms of integrated information theory are: 1. Information: each experience is specific; it is what it is by how it differs from alternative experiences. 2. Integration: each experience is unified, irreducible to non-interdependent components. 3. Exclusion: each experience has unique borders and a spatiotemporal grain.

In IIT, one can develop a mathematical framework in which composition, information, integration, and exclusion are defined precisely and made operational (Tononi, Albantakis, and Masafumi 2014). IIT still does not account for subjective experience, however, and is tied to a particular substrate. An ICOM-based system applies IIT, but an IIT-only system cannot have an emotional, subjective experience and quantify it in the way we can with ICOMTC-based systems; ICOMTC goes further in addressing the objective measurement of subjectivity.

ICOMTC also borrows from Global Workspace Theory in that things move through the system, and only when an item reaches a certain point is that bit of 'thought' or 'context' raised to the level of the conscious mind (Baars and McGovern 2016).  CCTM, IIT, and Global Workspace Theory all exist, more or less, in ICOMTC, in that ICOMTC-based systems exhibit the elements of all of these theories to some degree. ICOMTC, however, is also substrate independent: it is not an attempt to reproduce the biological substrate of the human brain or anything that requires that kind of hardware, nor is it tied to current computer architecture beyond requiring a Turing machine (Wikipedia 2017).  In theory, any Turing machine (or machines) could run an ICOMTC-based system given enough time.  ICOMTC does not address those implementation details and is, therefore, substrate independent in design and theory.

4 The Independent Core Observer Model Theory of Consciousness (ICOMTC)

At a very high level, ICOM as a cognitive architecture (Kelley 2016) works by streaming data and context processed by the underlying system (the observer). Based on emotional needs, interests, and other factors in the system, these are weeded out until only a certain amount is processed, or 'experienced,' in the 'core' (or global workspace), which holds emotional models based on Plutchik's work (Norwood 2016).  These elements of the core exist for both the conscious and subconscious emotional landscapes of the system, where the context that is 'experienced' from the standpoint of the system comprises the only 'experiences' the conscious system is aware of.   In this way, only the differential experience matters; the system, for example, does not understand a word so much as it feels the emotional context of the word as it relates to an underlying context.  'Emotional valences' are associated with things that the system then selects as items to think about emotionally.  The system selects actions based on how they improve the experiences of those emotional valences; it may choose to do something logical based on how it feels about it, or it could just as easily pick something else for no other reason than that it feels a bit better about it.  The system does not have direct access to those emotional values, nor are they a direct function of the algorithms; rather, there is an abstraction of the system, created by the core, that can be considered emotionally conscious, or self-aware, sapient, and sentient in the abstract.
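The observer's weeding-out step described above can be sketched as a salience filter: only the most emotionally interesting context is raised to the core. This is a minimal illustration under our own assumptions; the `Context` class, the scalar `valence` field, and the `capacity` parameter are invented for this sketch and are not part of the published ICOM specification.

```python
from dataclasses import dataclass

@dataclass
class Context:
    """A hypothetical stand-in for one bit of context (an emotionally
    related context tree tied to something the system sensed)."""
    label: str
    valence: float  # aggregate emotional interest, collapsed to a scalar

def observer_filter(candidates, capacity=3):
    """Sketch of the observer weeding out context: rank candidate
    context by emotional valence and raise only the top few items
    to the core (global workspace) to be 'experienced'."""
    ranked = sorted(candidates, key=lambda c: c.valence, reverse=True)
    return ranked[:capacity]
```

Everything below the capacity cutoff is simply never experienced by the conscious part of the system, which matches the claim that the core is only aware of what survives this filtering.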

5 Subjective Experience in the ICOM Cognitive Architecture

How, then, do we look objectively at a system that has an emotional, subjective experience?  The following set notation shows a simple logical implementation of the last climb of "a thought" as it rises from the depths of the system to the awareness of the conscious, self-aware parts of the system.

Figure 1: Core Logic, Notation.

First, let us walk through the execution of this logic.  Coming into the system, we already have context data decomposition, sensory input, and related data from memory that may be of emotional interest; for one 'thought,' let us say it is one bit of context, meaning an emotionally related context tree tied to something the system has sensed externally.  This data is represented by 'Inputs.' At this point, that 'context' has already been raised to the global workspace.  Figure 1 is essentially one cycle of the core, considering what is in the global workspace, or 'core,' of ICOM.  In Figure 1, we first see two sets or collections of emotional models, represented by the two sets defined in the first two rows; then the new input context is placed in the 'NewContext' set.  We apply the 'Needs' function, which uses a matrix of rules ranging from the system's technical requirements to other wants and needs based on the system's hierarchy of needs and current environmental conditions.  We then look at how this thought applies conscious emotional rules in the function 'ConsciousRules' and how it manipulates the current conscious emotional landscape.  We say 'landscape' because it is not a single emotion but a complex set of nearly infinite combinations, conscious and subconscious, that the system experiences.

In like manner, the system applies subconscious rules to the subconscious states and the subconscious rules to the conscious states; finally, those states are applied to the new context, where, in all cases, it is only in the abstract, from this state, that the system experiences anything.  That is, the system uses the abstracted states to represent the emotional landscape in terms of how things affect all of those emotional states, with the related context finally being passed to the observer for action if that 'NewContext' contained an action.  In this way, the system does not even deal with the complexity of its actions so much as it will do them if it feels like it and knows how; numerous cycles might have to execute in the core before it can perform a new task, meaning it has to think much harder about something it does not know how to do.   After that context is posted back to the observer (the more complicated part of the system in ICOM), it is placed back into context memory, and in this way we see the rich set of the system's emotional landscape modeled and executed.
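The cycle walked through above can be sketched in code. This is a loose interpretation under stated assumptions: the 8-element vectors stand in for Plutchik emotional models, `apply_rules` is a generic blending function standing in for the 'ConsciousRules'/'SubconsciousRules' functions of Figure 1, and all blending weights are invented for illustration rather than taken from the ICOM implementation.

```python
def apply_rules(state, context, weight):
    """Blend the emotional effect of a context vector into a state
    vector; 'weight' controls how strongly the context pulls the state."""
    return [s + weight * (c - s) for s, c in zip(state, context)]

def core_cycle(conscious, subconscious, new_context):
    """A minimal sketch of one core cycle: conscious rules act on the
    conscious landscape, subconscious rules on the subconscious one,
    each also influences the other, and the resulting states are
    applied to the new context. The returned 'experienced' vector is
    the abstraction that the conscious system actually experiences."""
    conscious = apply_rules(conscious, new_context, 0.30)        # ConsciousRules
    subconscious = apply_rules(subconscious, new_context, 0.10)  # SubconsciousRules
    conscious = apply_rules(conscious, subconscious, 0.05)       # cross-influence
    subconscious = apply_rules(subconscious, conscious, 0.05)
    experienced = apply_rules(new_context, conscious, 0.50)      # abstracted experience
    return conscious, subconscious, experienced
```

Note that the system would only ever see `experienced`; the intermediate state updates correspond to the underlying operations the conscious abstraction cannot inspect directly.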

Interestingly enough, current ICOM research indicates that this sort of system is perfectly capable of becoming mentally ill, and even forgetful, if hardware starts to limit operations. In that case, the only way to optimize for the execution environment would be to place memory limits based on the node-map memory models; this would be the only way to continue optimal execution given such limits.

A better way to think of ICOMTC is that no single element of the system is conscious or self-aware to any degree. Consciousness is derived from the 'interactions' between the parts, which, together, become abstractly aware.  The underlying process articulated in Figure 1 is measured in terms of consciousness via the Porter method mentioned earlier and via direct instrumentation of the system to measure 'qualia.'

6 Measuring Qualia

In ICOMTC, qualia can be objectively measured through the differential between the conscious emotional landscape of the system, represented by a Plutchik model along with the subconscious model, and the model of the irreducible set of any given context experienced by the system, together with the emotional model created to represent that specific 'contextual' experience.  In ICOMTC, qualia is that differential between the state and the effect of one emotional structure representing the current context, and how the system applies choices based on that qualia and the numerous underlying factors that affect construction and decisions for a specific context.  By its nature, the system cannot self-reflect directly on those values but experiences an abstraction of that process in the global 'workspace' effectively created by the underlying operation.  We can measure this 'qualia' of the system, but the system cannot do so directly from its own standpoint.  In the research already done for ICOM, we can see that an ICOMTC system does not have free will; however, from the system's standpoint it would appear to, and the system experiences the illusion of free will much the way humans do.

As stated, qualia can then be measured.  Referring back to Figure 1, we can use two values or sets from that set of operations and perform a 'qualia' measurement based on those values:

Figure 2: Computing Qualia

In this case, we compute qualia by taking the sets representing the system's current emotional landscape at the conscious and subconscious levels and computing the difference between matching sets, where each set is a Plutchik model with eight floating-point values.  We subtract the current state from the previous state, giving us the Plutchik representation of the system's subjective emotional differential.  This algorithm gives us the numbers, in terms of 'sets,' that show how a specific element of 'context' that managed to make it to the global workspace is 'experienced,' or rather the effect of that experience.   We have to calculate this after the fact, external to the system, as it is not computed anywhere in the process noted in Figure 1; there is no 'direct' method in ICOM to surface an objective measure of qualia to the system without a complete abstraction, but we can compute it externally and use it for analysis.
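The subtraction described above is straightforward to express in code. This is a minimal sketch of the stated operation (previous state minus current state, element-wise, over eight-value Plutchik models); the function name and error handling are our own additions.

```python
def qualia_differential(previous, current):
    """External measurement of qualia as described in the text:
    the element-wise difference between two Plutchik models
    (8 floating-point values each), previous state minus current
    state. The running system never sees these values directly;
    this computation happens outside the system, for analysis."""
    if len(previous) != 8 or len(current) != 8:
        raise ValueError("a Plutchik model has exactly 8 values")
    return [p - c for p, c in zip(previous, current)]
```

Applied once per core cycle, this yields a per-cycle vector describing how strongly a given bit of context was 'experienced' along each emotional axis.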

7 Conclusion

The Independent Core Observer Model Theory of Consciousness (ICOMTC) addresses key issues with measuring physical and objective details as well as the subjective experience of the system (known as qualia), including mapping complex emotional structures, as seen in previously published research related to the ICOM cognitive architecture (Kelley 2016).  Our ability to measure means that we can test different theories and make changes to the current system.  Slowly, we increasingly see a system whose illogical, emotionally charged decisions are objectively measurable (Chalmers 1995). It is in this space, where real artificial general intelligence works 'logically' in a manner similar to the human mind, that we hope to see success.  ICOMTC allows us to objectively model the subjective experience of an operating software system that can be made self-aware, and to act as the foundation for creating ASI.

8 References

  • Rescorla, M.; The Computational Theory of Mind; Stanford University 16 Oct 2016; http://plato.stanford.edu/entries/computational-mind/
  • Seth, A.; Theories and measures of consciousness develop together; Elsevier/Science Direct; University of Sussex
  • Dienes, Z; Seth, A.; The conscious and unconscious; University of Sussex; 2012
  • Dienes, Z; Seth, A.; Measuring any conscious content versus measuring the relevant conscious content: Comment on Sandberg et al.; Elsevier/ScienceDirect; University of Sussex
  • Porter III, H.; A Methodology for the Assessment of AI Consciousness; Portland State University Portland Or Proceedings of the 9th Conference on Artificial General Intelligence;
  • Sandberg, K; Bibby, B; Timmermans, B; Cleeremans, A.; Overgaard, M.; Consciousness and Cognition – Measuring Consciousness: Task accuracy and awareness as sigmoid functions of stimulus duration; Elsevier/ScienceDirect
  • Tononi, G.; Albantakis, L.; Masafumi, O.; From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0; 8 MAY 14; Computational Biology http://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1003588
  • Siong, Ch., Brass, M.; Heinze, H.; Haynes, J.; Unconscious Determinants of Free Decisions in the Human Brain; Nature Neuroscience; 13 Apr 2008; http://exploringthemind.com/the-mind/brain-scans-can-reveal-your-decisions-7-seconds-before-you-decide
  • Metaphysicist; The raw-experience dogma: dissolving the “qualia” problem; Less-wrong, 7 NOV 2016; http://lesswrong.com/lw/ehz/the_rawexperience_dogma_dissolving_the_qualia/
  • Camp, Jim; Decisions Are Emotional, Not Logical: The Neuroscience behind Decision Making; 2016 http://bigthink.com/experts-corner/decisions-are-emotional-not-logical-the-neuroscience-behind-decision-making
  • Kurzweil, R.; The Law of Accelerating Returns; Mar 2001; http://www.kurzweilai.net/the-law-of-accelerating-returns
  • Kelley, D.; Critical Nature of Emotions in Artificial General Intelligence; IEET 2016; https://ieet.org/index.php/IEET2/more/Kelley20160923
  • Gregory; Qualia: What it is like to have an experience; NYU; 2004; https://www.nyu.edu/gsas/dept/philo/faculty/block/papers/qualiagregory.pdf
  • Leahu, L.; Schwenk, S.; Sengers, P.; Subjective Objectivity: Negotiating Emotional Meaning; Cornell University; http://www.cs.cornell.edu/~lleahu/DISBIO.pdf
  • Overgaard, M.; Measuring Consciousness – Bridging the mind-brain gap; Hammel Neuro center Research Unit; 2010
  • Baars, B.; McGovern, K.; Global Workspace; 28 NOV 2016; UCLA http://cogweb.ucla.edu/CogSci/GWorkspace.html
  • Chalmers, D.; Facing Up to the Problem of Consciousness; University of Arizona 1995
  • Merriam-Webster – Definition of Consciousness by Merriam-Webster – https://www.merriam-webster.com/dictionary/consciousness
  • Wikipedia Foundation; Turing Machine; 2017; https://en.wikipedia.org/wiki/Turing_machine
  • Norwood, G.; Deeper Mind 9. Emotions – The Plutchik Model of Emotions; http://www.deepermind.com/02clarty.htm (accessed 2/20/2016)
  • Kelley, D.; "Google-It"; Self-Motivating Computation System Cognitive Architecture; Springer 2016; ISBN 978-1-4939-6413-0

Published in ITSC2018

Originally posted here: https://uplift.bio/blog/paper-the-independent-core-observer-model-computational-theory-of-consciousness-and-the-mathematical-model-for-subjective-experience/