6-Philosophy-Mind-Theories

emergence of mind

Perhaps, new properties can arise {emergence, mind} that system units and relations cannot predict.

types

Perhaps, higher principles can describe complex systems too complex to allow predictions {benign emergence}. Perhaps, complex systems can create entirely new objects, events, structures, or functions {radical emergence}.

mind

Minds can be new things with new properties, derived from brain-part relations and combinations. Brains have components, and mind is the whole, with laws and phenomena that are not explainable just from brain parts and properties.

Like music from instruments, mind comes from brain but is not like brain. If instruments break, they can make no music, just as minds depend on functioning brains. In the analogy, music resonates in instruments but does not affect music production, just as mind would resonate in brain without affecting brain function. However, on emergentist views mind does affect brain function, so the analogy is only partial.

consciousness

Perhaps, consciousness is an emergent, self-regulatory, goal-directed brain-state or brain-process property, rather than brain faculties or structures.

causation

Complex systems have new causation types {emergent causation}. Higher existences or processes form from lower existences or processes [Beckermann et al., 1992].

epiphenomena

Perhaps, conscious experience associates with, is supervenient upon, or is a property of physical objects and events, but mind does not affect body or brain {epiphenomena} {epiphenomenalism}. Body and brain can act upon, control, and result in mind, consciousness, and conscious experience, or mind can be a byproduct. Perhaps, conscious experiences have effects in the mental world.

object and sense

People can report on their consciousness, and sense qualities do not correspond to physical objects or events. Senses have different logic for representing physical properties, such as for sound and light wavelengths [Ramachandran, 2004].

no causation

Mental and conscious events have no physical or mental effects {methodological epiphenomenalism}, because the physical world can have no outside causes. Mental events that seem to cause have physical causes.

evolutionary adaptiveness

Perhaps, human abilities evolved to meet hominin needs {evolutionary adaptiveness}.

new realism

Perhaps, reality is neither mind nor matter {new realism}. Mental and physical events have different causal laws. Mind and matter differences are only different arrangements or organizations of same fundamental constituents.

operationalism and mind

Perhaps, what consciously happened is whatever people remember to have happened {operationalism, mind theory}. Operationalism requires belief or memory. In conscious experience, the "for me" {für mich} and the "in itself" {an sich} are same thing.

quantum mechanics and mind

Perhaps, classical physics has no role for consciousness. Quantum mechanics requires mind to set variables to observe {quantum mechanics, mind}.

Gödel Incompleteness

Perhaps, halting-problem and Gödel-incompleteness arguments show that mind does not use algorithms. Mathematicians can understand non-computable-function truths, but computer programs cannot. Perhaps, quantum processes, such as objective wavefunction reduction, can be non-algorithmic and non-recursive.

gravity

Wavefunctions collapse at large scales by a non-local gravitational process {objective reduction}. Such gravitational effects happen in tenths of seconds and are not algorithmic.

microtubules

Perhaps, quantum mechanics affects nerve microtubules. However, time before quantum state decoherence is too short, 0.1 milliseconds or less [Grush and Churchland, 1995] [Hameroff and Penrose, 1996] [Hameroff et al., 1996] [Hameroff et al., 1998] [Lockwood, 1991] [Penrose, 1989] [Penrose, 1994].

Copenhagen interpretation

In the Copenhagen quantum-mechanics interpretation, quantum-mechanical laws specify what knowledge/information people can have about systems. Observers gain knowledge through actions that gather information about relations among observations. Physical laws are not about reality, particles, or energy.

Classical systems use real numbers, whose operations are commutative, to specify particle and energy properties. Quantum-mechanical laws use complex numbers, with non-commutative operations, to specify dynamical-system changes and state/observation probabilities. Quantum-mechanical mathematical descriptions are about wave events rather than numbers.
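
The non-commutativity claim can be checked directly. The sketch below (not from the source) multiplies two standard quantum observables, the Pauli matrices sigma_x and sigma_y, represented as 2x2 complex matrices; the helper function name is an illustrative assumption.

```python
# Two quantum observables (Pauli matrices) as 2x2 complex matrices.
sigma_x = [[0 + 0j, 1 + 0j], [1 + 0j, 0 + 0j]]
sigma_y = [[0 + 0j, 0 - 1j], [0 + 1j, 0 + 0j]]

def matmul(a, b):
    """2x2 complex matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

xy = matmul(sigma_x, sigma_y)   # [[1j, 0], [0, -1j]]
yx = matmul(sigma_y, sigma_x)   # [[-1j, 0], [0, 1j]]
# xy != yx: the order of operations matters, unlike commutative
# real-number arithmetic in classical systems.
```

Because the two products differ, the order in which such observables are applied changes the result, which is the sense in which quantum-mechanical operations are non-commutative.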

events: Process 1

Observation causes wavefunction collapse and makes one of the possible states appear. Observations are conscious and/or psychological events. Mind must choose question to answer, observable to measure, and location and time to measure. Observation requires mind, which chooses what to observe by choosing experiment and observes directly or by instrument. Observations have experimental conditions and measurement variables, described the same as in classical physics, that instruments can communicate to people.

Observing systems, including measuring instruments, can be quantum mechanical or classical. Measuring instruments typically are classical, while atomic systems are quantum mechanical {Heisenberg cut}. Quantum-mechanical descriptions approximate continuous classical states with probabilities. Observable instrument states must be countable, and states have probabilities.

events: Process 2

System physical processes proceed according to mathematical laws, until another observation. Observed systems are quantum mechanical. Physical processes do not cause choices. Mathematical laws do not require choices.

Von Neumann

Brain, measuring apparatus, and physical system to measure are in one physical system. Brain chooses what to observe, the variable. In Process 1 {Heisenberg Choice}, observers choose variables to observe using consciousness. Variables are measurable and have specific discrete states. In Process 2, system evolves quantum mechanically, deterministically, and locally. Lengths and times become more uncertain. In Process 3 {Dirac Choice}, quantum jump puts variable in state and mind in knowing state.

quantum Zeno effect

Quantum effects typically persist for only about 10^-13 seconds. However, in some physical conditions, making same observation process repeatedly at high-enough rate causes observations to repeat {quantum Zeno effect}. Experiment timing affects observed-state probabilities. Perhaps, in mind, attention is rapid probing and holds mental states for prolonged periods.
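
The effect can be illustrated numerically. The sketch below is a toy model of a two-level system (not from the source; the function name and parameters are illustrative assumptions): each projective measurement finds the initial state with high probability and resets the system, so frequent measurement holds the state in place.

```python
import math

def survival_probability(omega_t: float, n_measurements: int) -> float:
    """Toy two-level system rotating through total angle omega_t.

    Each of n_measurements equally spaced projective measurements
    finds the initial state with probability cos^2(omega_t / n),
    and each measurement resets the state, so probabilities multiply."""
    return math.cos(omega_t / n_measurements) ** (2 * n_measurements)

# One measurement at the end: the state has rotated fully away.
p_slow = survival_probability(math.pi / 2, 1)      # ~0.0
# One hundred rapid measurements: the state is held near its start.
p_fast = survival_probability(math.pi / 2, 100)    # ~0.98
```

As the measurement rate increases, the survival probability approaches 1, paralleling the suggestion that rapid attentional probing could hold a mental state for a prolonged period.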

representational theory

Conscious mental states represent in a specific way {representational theory}. Conscious mental states do not require brain representation.

higher-order-monitoring theory

Conscious mental states have a specific brain representation {higher-order-monitoring theory}. Conscious mental states do not necessarily represent.

self-representational theory

Perhaps, consciousness requires self-reference {self-referentialist theory} {self-reference, mind} {self-representational theory} {self-representation} [Hofstadter, 1979] [Hofstadter, 2007]. Conscious mental states represent in a specific way and have a specific brain representation.

Besides having sensations, conscious mental states can refer to themselves. Consciousness indirectly includes some self-consciousness. Perhaps, subjects' conscious mental states also represent those conscious mental states. Perhaps, subjects' conscious mental states include unconscious thoughts about the mental states. Perhaps, by extrinsic higher-order theory, subjects that have conscious mental states must have unconscious mental states that represent the conscious mental states.

Besides having sensations, conscious mental states can refer to conscious subject/person/self/soul, which can have no or some self-sensations.

Besides having sensations, conscious mental states can have associated unconscious thoughts about the sensations or self.

supervenience

Perhaps, mental changes or states have changes or states at lower, physical levels, but physical changes and states do not necessarily always subserve mental changes or states {supervenience} {realization, mental}. The physical determines the mental in general ways. Conscious processes are supplementary effects in complex causal neural networks. Because mental events supervene on physical events, mental events are reducible to physical causes. Physical reduction is possible for functions. Intentional states have functions, can be behavior causes, and are reducible to physical explanations. Phenomenal states do not have to have functions or affect behavior and so are epiphenomenal. However, similarities and differences among experiences affect behavior and so have functions [Kim, 2005].

6-Philosophy-Mind-Theories-Functionalism

functionalism and mind

Perhaps, mental states are brain functions {functionalism}. Consciousness is inputs, processing, and outputs about stimuli, behaviors, beliefs, goals, and algorithms. Functionalism uses input-output relations to explain physical and biological processes. If mental states are conscious, they have special functions [Armstrong and Malcolm, 1984] [Armstrong, 1962] [Armstrong, 1968] [Armstrong, 1980] [Churchland, 1986] [Churchland, 1988] [Churchland, 1995] [Churchland, 2002].

The same functional process can have different physical representations. The same physical state can represent different functions.

mental states

Mental states do not necessarily correspond to anatomy or physiology, but are like software and algorithms. Mental states are internal, with no public behavior. Mental states are objective, with no need for subjective feelings. Mental states are perception, memory, emotion, and will effects. Mental states cause motions.

phenomenal functions

Phenomena can cause behavior by translating stimuli into goals, energies, or actions. Different physical states can have same phenomena.

types

Perhaps, having conscious experience is mental functioning, and having particular experience is neurophysiological {physicalist-functionalism}. Perhaps, mental properties are identical to functional properties {psychofunctionalism}. Perhaps, conscious system must have functions, selected for in the past {reductive teleofunctionalism}. Perhaps, both conscious and unconscious mental capacities are for adaptation {teleological functionalism}. Perhaps, functional brain parts can explain mind {decompositional functionalism}. Perhaps, mind can be computer programs {computation-representation functionalism}. Perhaps, mental states can be functional states {metaphysical functionalism}, based on input, output, and causal relations.

types: interactionism

Interactionism can combine with functionalism by positing non-physical reality {mind-stuff} to provide mental states. However, functionalism is typically materialist, involving hardware, such as brain {wetware}.

causal theory of reference

Perhaps, mental states represent ideas and cause linguistic responses. Mental states, which can be conscious or unconscious, are about similarities or relations, and relations determine linguistic-response patterns, which are conscious. Language reports mental states using signs. Because mental states vary widely, natural occurrences have incompatible linguistic explanations. People react to natural occurrences to establish conscious linguistic responses {causal theory of reference} [Putnam, 1975] [Putnam, 1981] [Putnam, 1988] [Putnam, 1992].

cognitive pandemonium

Perhaps, brain agents compete for expression and control {cognitive pandemonium}. Local and global winners emerge. Global winner becomes conscious {cerebral celebrity} [Dennett, 1991].

computational functionalism

Perhaps, non-conscious information processing can perform all processes needed for survival and all processes performed by consciousness {computational functionalism} {conscious inessentialism} {computational theory} {computational hypothesis}.

symbols

Symbol manipulation causes thoughts. Symbols represent high-level concepts and directly relate to knowledge structures. Symbols are either present or absent. Symbols in combination make propositions. Computational manipulations follow language syntax. Syntax and symbol meaning can give overall meaning.

computers

Computers are general symbol manipulators. If symbol manipulation can cause thoughts, computers can think like people.

experience

However, symbols cannot represent images, tastes, sounds, touch, and smell. Symbols are either present or absent and do not have magnitude or certainty. Symbols have no partial effects or gradations {brittleness, function symbol}. Symbols do not have meaningful parts or units. They do not have formation or development process. Symbols do not receive more certainty by repetition or conjunction. Statistical processes do not affect symbol meaning or relations. Small symbol changes typically greatly change meaning or accuracy.

Symbols can be complex wholes, whose meanings depend on pattern parts. Sense qualities combine fundamental features, and similar sense qualities have similar combinations.

executive system

Perhaps, consciousness is an executive system {executive system} that focuses attention, issues reports, and guides actions.

first-order representation

Perhaps, mental outputs become conscious when they are available for concepts/thoughts {first-order representational theory}. However, all brain system outputs are similar in physiology and can travel indirectly to all brain regions.

global workspace

Perhaps, consciousness and subjective experience are viewpoint-specific functions in thalamocortical complex {global workspace} [Baars, 1988] [Baars, 1997] [Baars, 2002] [Changeux, 1983] [Dehaene and Naccache, 2001] [Dehaene, 2001] [Dehaene et al., 2003]. Consciousness is shared workspace, representation system, or working memory that communicates with brain modules/agents that perform unconscious functions. Global workspace allows information exchange and coordination.

modules

Brain algorithms get information from global workspace, broadcast their information there, compete and cooperate to place information there, and interact in global workspace to resolve uncertainties in interpretation and action. Unconscious processing is parallel processing and uses large memory.

output

Eventually, global workspace reaches consensus, makes output, and stores representation or will in long-term memory.

consciousness

Attention systems make global workspace contents known to consciousness, so global-workspace information is consciousness contents. Consciousness involves information exchange. Conscious processing integrates unconscious processing.

levels

There can be more or less consciousness, as shown by comparing conscious and unconscious brain processing {contrastive analysis}. Fugue, multiple personality, and depersonalization have amnesia and changed sense of self. Brains have beliefs, goals, and consciousness {self-concept}. Self-concept is consciousness contents. Bodies are agents and perceivers {self-system}. Self-systems have sense qualities, which are fundamental context {deep context} in the context hierarchy. However, sense-quality salience or intensity does not relate to high-level processing. People can have more than one consciousness, rather than one context hierarchy. Even early mammals have senses and brains that can allow consciousness.

higher-order sense theory

Perhaps, conscious states are higher-level perceptions about lower-level perceptions {higher-order sense theory} {HOS theory} {inner-sense theory}. Brain has a faculty that works on sense perceptions to make perception about perception. Perceptions do not have intentions/concepts and are analogs. Perceptions can be non-conscious, and no perceptions are necessarily conscious. However, no evidence for brain inner-sense exists. Higher-order sense is a representational theory. First-order theories say that consciousness happens when outputs are available for concepts.

higher-order thought theory

Perhaps, conscious states are higher-level thoughts about lower-level states {higher-order thought theory, functionalism} {HOT theory} {higher order monitoring theory}. Perhaps, conscious states are mental states about which people have higher-level beliefs that they are in those mental states. Higher-order thought is a representational theory.

process

Perceptions do not have intentions, but thoughts have intentions. Consciousness can link current perceptions in occipital and other lobes to concepts, emotions, plans, memories and values in frontal, temporal, and parietal lobes. Only mental states can be conscious. People can be, but are not typically, conscious of beliefs. Perceptions can be non-conscious.

types

When perceiving or emoting, people can have thoughts that they are perceiving or emoting, and thoughts bring experience {actualist higher-order thought theory}. Thoughts can happen at same time as perceptions or can be about memories. People have higher-order thoughts, and some perceptions and emotions are available for use {dispositionalist higher-order thought theory}. Percepts can be both first-order and higher-order {dual-content theory}. Higher-order thought system can use information, and such uses determine experience {inferential-role semantics} {consumer semantics}. Semantics can be only about input information and symbol grounding {informational semantics} {input-side semantics}.

problems

However, conscious states can occur with no accompanying thought [Rosenthal, 1991].

holonomic theory

Perhaps, visual sensory information goes to many brain places, where dendrites detect spectral and time information about perceptions. Brains can later extract and transform stored information to give conscious awareness {holographic brain theory} {holonomic theory}. Holograms can change {holonomy}. People cannot know both spectral and time values exactly. Neurons minimize information loss by reorganizing their structures to have minimum entropy and maximum information. Consciousness is experiencing stored spectral-information transformation. No one or thing views holographic images [Pribram, 1971] [Pribram, 1974] [Pribram, 1991].

image

Perhaps, brains can make holograms without using reference signals. They can record scene wavefronts and later restore wavefronts by reversing calculation.

information integration

Perhaps, consciousness is information integration {information integration theory}. More integration makes more consciousness. Integrating different neuron types and modules makes more consciousness. Different integration types make different consciousness types.

brain

Thalamocortical region integrates information from various and many neurons and modules, whereas other brain regions have smaller integration.

time

Integration takes 0.1 to 3 milliseconds.

information

Conscious scenes are selections from many possible scenes and so have high information. Integration measures are effective information passed from system part to system part. Effective information from one part to a second part is the second part's entropy when the first part's output is replaced by noise, and vice versa. The sum of the two directions is the integration amount.
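
A minimal numeric sketch of this entropy bookkeeping (a toy model of one direction only, not Tononi's full measure; the function names and the two example connection matrices are illustrative assumptions):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def effective_information(cond):
    """Toy one-directional effective information: entropy of the
    receiving part's states when the sending part is replaced by
    maximum-entropy (uniform) noise. cond[a][b] is the probability
    that the receiver is in state b given sender state a."""
    n_send, n_recv = len(cond), len(cond[0])
    noise_driven = [sum(cond[a][b] for a in range(n_send)) / n_send
                    for b in range(n_recv)]
    return entropy(noise_driven)

# Sender perturbations fully determine the receiver: 1 bit transferred.
copy_link = [[1.0, 0.0], [0.0, 1.0]]
# Receiver ignores the sender: noise has no effect, 0 bits transferred.
dead_link = [[1.0, 0.0], [1.0, 0.0]]
```

Summing this quantity over both directions between two parts gives the integration amount in the source's sense; a system whose parts are all dead links integrates nothing, however much each part does on its own.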

information: system

Systems have parts. Part pairs are whole-system subsystems. Complexity depends on pair and integration amounts. The bipartition with the lowest information integration {minimum information bipartition} limits whole-system integration. Parts can make subsystems. Whole brain has maximum entropy and integration. Systems that integrate enough information are conscious.

instructionism and mind

Perhaps, brains are computers with fixed code, registers, and programs {instructionism, mind theory}. Coded brain input, from environment and body, makes coded brain output.

Intelligent Distribution

Programs {Intelligent Distribution Agent} {intelligent distribution} based on global-workspace architecture can assign jobs to sailors [Franklin et al., 1998].

representationalism

Perhaps, phenomenal properties are representational properties {representationalism}.

causes: stimulation

Stimuli make sense-data. Perception sense-data, ideas, and impressions are mental internal representations. Representations are mental states and are like phenomena.

causes: intention

Alternatively, people need no stimuli, only intentional statements. Intentions and representations are about external things or possible external things. Intentions can make representations but are not mental states. Representations are not like phenomena but are coded information.

representation: similarity

Something can represent something else by being similar to it. Similarity is reciprocal. However, real representations have only one direction. Similarity can be more or less. Similarity relations need similarity-level information.

representation: covariance

Something can represent something else by being caused by the second thing to co-vary with it. Covariance is reciprocal. However, real representations have only one direction. Covariance has strength. Covariance relations need causation-strength information.

representation: function

Something can represent something else using representational functions. Such representation requires indicating function and strength. Systems have basic representational functions {systemic representation} that can change to create new representations {acquired representation}. Natural representations evolve.

representation: function and evolution

Something can represent something else, because evolution shaped it to do so. Such representation requires evolutionary benefits and selection strengths.

phenomena

Perhaps, representations completely specify conscious phenomena {exhaustion thesis}. Perhaps, representations need other mental attributes.

phenomena: external or internal

Conscious phenomena appear in environment {externalism, phenomena}. Conscious phenomena are in mind {internalism, phenomena}. If consciousness is a mental state, representations can project {projectivism, phenomena} onto external surfaces {literal projectivism} or seem to do so {figurative projectivism}.

phenomena: higher order

Perhaps, representational mental states can be "perceived" by higher-level mental abilities {representational theory, representationalism} {higher-order perception}. Consciousness links perceptions, in occipital lobe, to concepts, emotions, plans, memories, and values, in frontal, temporal, and parietal lobes.

phenomena: consciousness

Perhaps, consciousness is natural representations. However, some conscious states have no perception [Dretske, 1988] [Dretske, 1995].

symbolicism

Perhaps, machines can mimic mental functions in logic and language, using symbols and rules {symbolicism} {Good Old-Fashioned Artificial Intelligence} (GOFAI) {rule-and-symbol AI} [Barr and Feigenbaum, 1981].

symbolism

Perhaps, matter and energy predate mind and consciousness. Brain evolved to create symbols {symbolism, mind theory} to make representations used for action. Mind is distinct from matter, because complex organization brought forth new properties.

Mind forms matter and energy representations from matter and energy. Representations use matter and energy structures, just as music is physical-energy patterns, electrochemical-signal patterns, and mental experience. Because mental states are complex matter-and-energy patterns, they can act on matter at all levels. People cannot be conscious of symbol creation, use, or representation processes.

6-Philosophy-Mind-Theories-Functionalism-Computation

strong AI

Perhaps, computers with complex enough programs have minds {strong AI}.

weak AI

Perhaps, computers with complex enough programs simulate mental functions {weak AI}.

6-Philosophy-Mind-Theories-Connectionism

connectionism

Simple unit interconnections can receive input and make output {connectionism, mind} {connectionism theory} {parallel distributed processing} {neural net}. Connectionist systems have no symbols, concepts, or representations [Anderson, 1964] [Arbib, 1972] [Arbib, 1995] [Bechtel and Abrahamsen, 1991] [Clark, 1989] [Clark, 1993] [Fahlman, 1979] [Feldman and Waltz, 1988] [Hillis, 1985] [Hinton and Anderson, 1981] [Hinton, 1992] [Hopfield and Tank, 1986] [Kableshkov, 1983] [McCulloch and Pitts, 1943] [McCulloch, 1947] [Pao and Ernst, 1982] [Pattee, 1973] [Pattee, 1995] [Pitts and McCulloch, 1947] [Rumelhart and McClelland, 1986].

input

Input can be nodes or node sets, with different weights.

process

Connectionism can dynamically use constraint satisfaction, energy minimization, or pattern recognition. Intermediate nodes process representations in parallel. Network nodes can have multiple functions and contribute to many representations or processes. Connections and/or node patterns can contain information. Representations are vectors in space. Distributed information allows parallel processing, increasing learning, and continuous variables. Connectionist networks have little recursion, much inhibition, artificial learning algorithms, and simple transfer functions.

process: layers

Software models use three layers of neuron-like units for pattern-matching. First layer receives input pattern. Units in second and third layers typically receive input from all units in previous layer. Third layer outputs to display or file. Units can be On or Off. If total input to unit is above threshold, unit is On. Inputs can have adjustable weights. Experimenters set weights, or programs adjust weights based on matching between "training" input patterns and their output patterns.

Neural nets do not have programs or operations. Neural-net architecture provides information. Controllers go from layer to layer, processing all units simultaneously, by parallel processing. Distributed information tolerates degradation. Neural nets can still detect patterns if some units fail and so are more robust than algorithms.
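
The three-layer threshold scheme above can be sketched directly. In the toy below (the weights, thresholds, and patterns are invented for illustration, not from the source), two hidden detectors each respond to half of a four-unit input pattern, and the output unit fires only when both detectors fire.

```python
def unit_on(inputs, weights, threshold):
    """A unit is On (1) when its total weighted input exceeds threshold."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) > threshold else 0

def layer(inputs, weight_rows, threshold):
    """Each unit in a layer receives input from all units in the previous layer."""
    return [unit_on(inputs, row, threshold) for row in weight_rows]

# Hypothetical weights: two hidden detectors ("left pair on", "right pair on")
# feed one output unit that fires only when both detectors fire.
hidden_weights = [[1, 1, 0, 0], [0, 0, 1, 1]]
output_weights = [[1, 1]]

def matches(pattern):
    hidden = layer(pattern, hidden_weights, threshold=1.5)
    return layer(hidden, output_weights, threshold=1.5)[0]
```

Here matches([1, 1, 1, 1]) fires while patterns that excite only one detector do not; in a trained net, programs would adjust these weights against training patterns instead of an experimenter setting them.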

output

Outputs are vectors, possibly with many dimensions. Outputs statistically derive from inputs. All outputs have equal weight. Similar outputs have similar coordinates. Output regions define category examples. Average or optimum examples define categories. Region boundaries change with new examples.

Neural nets can distinguish more than one pattern, using the same weights. Units can code for several representations, and many units code each representation {distributed representation}. Neural nets can recognize similar patterns and in this way appear to generalize.

activation function

A function {activation function} maps each unit's total input to its output.

backpropagation

Systems can start with random weights, input training pattern, compare output to expected output, slightly reduce weights on units that are too high and slightly increase weights on units that are too low, and repeat {backpropagation, connectionism} {backward error propagation}. For example, after neural networks have processed input and sent output, teacher circuits signal node differences from expected values and correct weighting. System performs process again. As process repeats, total error decreases.
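
The cycle above can be sketched end to end. The toy network below, a 2-2-1 sigmoid net trained on the AND pattern (architecture, learning rate, and epoch count are illustrative choices, not from the source), starts with random weights, compares each output to its expected value, nudges weights up or down accordingly, and repeats until total error falls.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# 2 hidden units and 1 output unit, each with two weights and a bias,
# all starting at random values as the text describes.
w_hidden = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(3)]

training = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]  # AND

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hidden]
    return h, sigmoid(w_out[0] * h[0] + w_out[1] * h[1] + w_out[2])

def total_error():
    return sum((target - forward(x)[1]) ** 2 for x, target in training)

def train_step(rate=0.5):
    for x, target in training:
        h, out = forward(x)
        d_out = (target - out) * out * (1 - out)     # teacher signal at output
        d_hid = [d_out * w_out[i] * h[i] * (1 - h[i]) for i in range(2)]
        for i in range(2):                           # adjust output weights
            w_out[i] += rate * d_out * h[i]
        w_out[2] += rate * d_out
        for i in range(2):                           # propagate error backward
            for j in range(2):
                w_hidden[i][j] += rate * d_hid[i] * x[j]
            w_hidden[i][2] += rate * d_hid[i]

error_before = total_error()
for _ in range(2000):
    train_step()
error_after = total_error()   # total error decreases as the cycle repeats
```

Each pass plays the role of the teacher circuit: the output delta is the signed difference from the expected value scaled by the unit's slope, and the hidden deltas carry that error signal backward through the output weights.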

wake-sleep algorithm

In unsupervised neural networks {Helmholtz machine} {wake-sleep algorithm} with recurrent connections, information first flows from inputs to outputs and adjusts the recurrent strengths. Then information flows from outputs to inputs {output generation} and adjusts the original strengths.

6-Philosophy-Mind-Theories-Connectionism-Output

distributed output

Outputs can distribute among nodes {distributed output}.

localist output

Outputs can be nodes {localist output}.

6-Philosophy-Mind-Theories-Monism

monism and mind

Perhaps, reality has only one substance: matter, mind, or God {monism}. Mind and brain are the same. However, monism is untrue, because no mechanism can describe purely mental and purely physiological functions [Delbruck, 1986] [Feigl and Scriven, 1958] [Feigl, 1958] [Fischbach, 1992] [Honderich, 1988] [Honderich, 1999] [Ryle, 1949] [Stich, 1991].

neutral monism

Perhaps, reality is neither mind nor matter {neutral monism}. Mental and physical events have different causal laws. Mind and matter differences are only different organizations of same fundamental constituents. Physical, non-physical, or other substance can include both brain and mind. However, matter and brain units do not correspond to mind, consciousness, or sense-quality units.

6-Philosophy-Mind-Theories-Monism-Mind

anomalous monism

Perhaps, mental properties and events are not explicable by physical properties and events {anomalous monism}. Mental states are token-identical to physical states.

cognitivism

Perhaps, only mind exists, and matter does not exist {cognitivism}.

homunculus fallacy

Perhaps, internal brain agents {little man} {homunculus} explain psychological properties {homunculus fallacy} [Attneave, 1961] [Rosenblith, 1961].

immanentism

Perhaps, consciousness is only about sense qualities and concepts {immanentism} and gives no physical-object knowledge.

mentalism

Perhaps, only mind exists, and matter does not exist {mentalism}.

non-naturalism

Naturalistic terms cannot explain consciousness {non-naturalism}.

panpsychism

Perhaps, all physical things have mental or subjective parts, aspects, or properties, typically in different degrees, or are consciousness parts {panpsychism}. It is not clear how such combinations/interactions make high-level consciousness or stay unified. However, all things then have relations between physical and mental inside them. Perhaps, electrons, quarks, and virtual particles have consciousness [Nagel, 1988].

phenomenalism

Perhaps, physical objects are "permanent possibilities of sensation" {phenomenalism}. Mental phenomena statements are equivalent to empirical statements or mathematical laws. However, mental-phenomena statements depend on physical environment and perceiver state.

phenomenology

Perhaps, mind has conscious processes and states, which people can study {phenomenology} without necessarily considering body or world [Heidegger, 1996] [Husserl, 1905] [Husserl, 1907] [Husserl, 1913] [Merleau-Ponty, 1945] [Richardson and Velmans, 1997] [Stevens, 1997] [Stevens, 2000]. People can train themselves to try to suspend all judgments and hypotheses while they attend to subjective experiences.

phenomena

Mind cannot know things in themselves but can experience appearances or representations, as sense qualities or thoughts {phenomena, phenomenology}. Phenomena are perspectives on objects. Perspectives hint at object essences. All conscious perspectives, working together, are indirectly object essence.

consciousness

If essences are conscious acts, objects exist. In particular, consciousness becomes itself from all perspectives on all objects. Subject and object of consciousness become the same, because no object is without consciousness, and no subject is without objects and relations. Consciousness is a circular, self-referencing concept: it is a phenomenon and makes phenomena.

psychical monism

Perhaps, only mind exists, and matter does not exist {psychical monism}.

spiritualism and mind

Perhaps, only mind exists, and matter does not exist {spiritualism, mind theory}.

transcendentalism

Perhaps, accessing perceptions renders them conscious, people have this ability, and consciousness is real but is not object and is not in space {transcendentalism, mind theory}. Consciousness is an act or process that makes phenomena [Rowlands, 2001].

6-Philosophy-Mind-Theories-Monism-Identity

mind-brain identity

Perhaps, mind and brain are identical {psychophysical identity} {mind-brain identity theory}. The same property can be both mental and physical. They are like two names for same thing. In the possibility argument, philosophical zombies cannot exist, because they must have the mental state if they have the brain state. However, brain-state and mental-state identity has no plausible mechanism or meaningful connection [McGinn] [Nagel].

language

Mind and brain only seem different, because different language is used for objective and subjective descriptions. Mind and brain can unify by relating both descriptions.

substance

Perhaps, brain and mind share third substance or property, to provide underlying unity. For example, signals entering, or inside, brain can be sense data that can combine into physical objects or into mental objects. Alternatively, physical objects can have mental essences.

existence

People can imagine that no physical world exists, and the physical world is only sense qualities in the mental world. People can imagine that no mental world exists, and the mental world is only disposition to perform certain behaviors in certain circumstances.

mental state

The mental world can be a physical mind state, making mind physical.

mental unity

Objects can have minds. Objects can be in one mind.

central-state identity

Perhaps, mental states correspond to neural states {central-state identity}.

mind-brain correspondence

Perhaps, mental states are factually identical with brain states but do not have to be logically identical {mind-brain correspondence}.

physicalism and mind

Perhaps, sense qualities are objective non-relational physical-object properties or are the same as brain electrochemical, biophysical, and relational events {physicalism, mind} [Baker, 1987].

token-identity theory

Perhaps, particular mental states, such as pains, are identical to particular brain states, such as nerve firings, but they are not necessarily identical in general {token-identity theory, monism} {token-identity thesis} {token physicalism}. Because mental events can have different neural pathways, they are identical as instances, not as types. Mental events have corresponding physical events. Mental states include beliefs and pains.

type-identity theory

Perhaps, neural states are state types that only brains can have {type-identity theory, monism} {type-identity thesis} {identity theory} {type physicalism}. Mental states, such as pain in general, and brain states, such as nerve fiber firing, are identical in type but are not necessarily identical in particular instances. Mental variables have corresponding physical variables.

6-Philosophy-Mind-Theories-Monism-Materialism

materialism

Perhaps, mind is only material {materialism}. Materialistic explanations are simple. They have always worked before, are consistent with science, do not have to explain how physical and non-physical interact, fit with evolutionary theory, explain all mental phenomena, explain complex systems, and match all evidence. Consciousness requires only physical explanations.

types

All existing substance is material or physical. Psychological properties are identical to physical-property conjunctions. Psychological properties depend on physical properties but are not material {non-reductive materialism, monism}. There are no phenomena, just ideas, beliefs, or feelings.

action consciousness

Perhaps, mind is interaction among brain processing, body, and environment {action consciousness} {behavior-based robotics} {enactive consciousness} {enactive cognition} {embodied cognition} {radical embodiment} {sensorimotor consciousness} {situated cognition} {situated robotics}. Consciousness depends on action. Simple rules can result in complex behaviors [Clark, 1980] [Clark, 1993] [Clark, 1997] [Varela et al., 1991].

biological materialism

Perhaps, only organisms can be conscious, because consciousness depends on complex biological structures and movements {biological materialism}.

Cartesian materialism

Perhaps, brain locations manifest consciousness by code type or other property {Cartesian materialism} [Dennett, 1991].

centralism

Perhaps, mental processes are identical with physical central-nervous-system processes {centralism}.

central-state materialism

Perhaps, mental processes are brain states and interact causally with body {central-state materialism}.

chauvinism in sensation

If brain states can be physical or physiological properties, other animals can have different sense qualities than people {chauvinism, sensation}, because their structures and physiologies are different.

dynamical systems theory

Perhaps, physical forces act on molecules over time under physical laws and cause thoughts {dynamical systems theory} {dynamical hypothesis}. Dynamics does not involve computation or representation. All events are deterministic and coupled. Such systems are described by systems of equations whose variables change over time.

eliminative materialism

Perhaps, there are no psychological concepts {eliminative materialism}, and intentions and mental states do not correspond to physical brain states.

functional materialism

Perhaps, mental states are both experiences and brain states. For example, temperature is also average random kinetic energy. However, you can measure temperature, in degrees, without measuring average random kinetic energy, in joules. You can use temperature values in many ways separate from their energy values. If mental states are physical states, they can have physical effects without violating physical law. Brain states can be physical or physiological properties. Brain states can be structural properties, like software, caused by something physical and causing something physical {functional materialism}. Machines can simulate human intelligence, so objective language and behavior can be similar. However, machine parts and motions seemingly affect perception, behavior, and consciousness.

naive realism

Perhaps, external physical world exists, and people perceive it as it truly is {naive realism}.

naturalism and mind

Perhaps, mental events exist and have effects, but science cannot study those effects {naturalism, mind theory}. Naturalistic terms can explain consciousness, but concepts like consciousness, qualia, and subjectivity are unhelpful {eliminativist naturalism}. Naturalistic terms can explain consciousness, and concepts like consciousness, qualia, and subjectivity are helpful {constructive naturalism}. Naturalistic terms can explain consciousness, but people can never find the explanation {anticonstructive naturalism} [Dretske, 1988] [Dretske, 1995].

network thesis

Perhaps, sense qualities correspond to cerebral processes and change brain {network thesis}. Identical sense qualities cannot recur, because brain changes with the first sense qualities.

neuronal group selection

Perhaps, in neuron sets, neurons directly or indirectly interact with all other neurons and themselves. Neuronal groups vary, compete, and undergo selection {neuronal group selection} {neural Darwinism} {somatic evolution} {selectionism, neuron} {theory of neuronal group selection} (TNGS).

neuronal groups

Neuron groups make stimuli into responses and so have input and output. They are functional groups. During development, brain makes various neuron groups by protein regulation, cell division, cell migration, cell connection, myelination, and synapse changes, in response to developmental signals and environment. Brain has many neuron groups for each input-output task {degeneracy, brain}. Neuron groups vary in processing. Neuron groups have regulatory mechanisms and can adapt.

In response to input, brain compares results and prunes neuron groups by making cells die, disconnecting synapses, and reducing synapse strength. Feedback, feedforward, reward, punishment, regulation, and integration make optimum neuronal-group configurations.

selection

Selection strengthens connections that aid survival. Brain uses selection, not logic. During brain development, synapse pruning based on experience reduces overgrowth {developmental selection}. Later, experience strengthens or weakens synapses {experiential selection}.

reentry

Reciprocal neuron connections use signal reentry feedback to coordinate neural events over space and time. Error-correcting control systems are in neuronal groups. Interaction times are typically hundreds of milliseconds. Interactions involve all neurons.

factors

Input-output results depend on body morphology, hormones, emotions, memory, and existing brain structures.

consciousness

A functional group {dynamic core} underlies consciousness and is dynamic, unified, private, and complex.

not computers

Brains are not computers, because they receive ambiguous input, have variable structures, have reciprocal connections {reentry}, and have complex output that integrates sense modalities [Edelman, 1989] [Edelman and Tononi, 2000] [Tononi and Edelman, 1998] [Edelman and Mountcastle, 1978] [Edelman, 1987] [Edelman, 1992] [Edelman, 2003] [Edelman, 2004].

neuroscientific realization theory

Perhaps, conscious and unconscious mental event types have representations in nervous system {neuroscientific realization theory}.

objectivism

Perhaps, external physical world exists, and people perceive it as it truly is {objectivism}. Alternatively, physical world has properties or events that directly cause experience. For example, surfaces can have properties that always cause red sense qualities.

peripheralist behaviorism

Perhaps, mind is complex behaviors exhibited in matter structures {peripheralist behaviorism}.

reductionism about mind

Perhaps, particle positions and momenta completely define physical systems {reductionism, mind theory}. Knowing particle times and energies is equivalent to knowing positions and momenta.

Position and momentum information can predict all future positions and momenta.

questions

Does everything that happens in the physical universe result only from elementary-particle interactions? Are all events and objects determined by current particle and wave positions and momenta, or times and energies? Can higher-level cause affect particle and wave positions and momenta? Can there be something fundamental that is not particles and waves, positions and momenta, times and energies? Do sense qualities have extra information, more than brain anatomical, physiological, psychophysical, and biochemical information?

brain

Under reductionism, brain-particle and environment-object positions and momenta completely define future brain states.

non-physical

Perhaps, physical information can specify non-physical things, properties, or relations. Sentences about the non-physical can derive from physical descriptions. Mental processes are explainable by physical brain structures and functions. Facts about people and oneself can use more-elementary terms, without persons or first person. For example, people can be animals with physical and chemical processes.

silicon chip replacement

Pylyshyn [1980] imagined that chips can replace neurons one by one {silicon chip replacement}. Is there any difference in mental events? If not, causal relations determine mentality, and functionalism is correct.

twin Earth

Putnam imagined worlds {twin Earth} in which people and things were identical except that water had a different chemical composition. Any difference in thoughts then depends only on environment. However, if thoughts differ, the twins themselves differ.

6-Philosophy-Mind-Theories-Dualism

dualism

Perhaps, minds and brains are separate substances or properties {dualism}.

property

Perhaps, physical objects have non-physical or mental properties, like essence or sense qualities. Perhaps, objects and events have this property in different amounts, levels, or qualities. Perhaps, minds or brains are primary and the other secondary. Perhaps, brains are special organs for mind or soul knowledge. Perhaps, brains have reached complex forms that can generate mental states. Perhaps, mind influences brain [Descartes, 1641] [Eccles, 1965] [Eccles, 1977] [Eccles, 1986] [Eccles, 1989] [Eccles, 1994] [Libet, 1993] [Popper and Eccles, 1977].

problems

Dualism has no method to show how mental and physiological substances affect each other deterministically, which all observations require. Dualism does not state why substances have two different property types, or only two property types.

bundle dualism

Perhaps, individual mental processes succeed each other and are non-physical, but physical world exists {bundle dualism}.

Cartesian dualism

Perhaps, bodies are extended material substance, and minds are unextended spiritual substance {Cartesian dualism} [Descartes, 1641].

epistemological dualism

Perhaps, mental ideas and images are copies of physical sense data or objects {epistemological dualism}.

explanatory gap

Objective, physical objects and events cannot explain subjective, non-physical states and events {explanatory gap}. Perhaps, subjective, non-physical qualities are irreducible. Concepts used for one cannot be concepts used for the other [Levine, 1983] [Levine, 2001].

explanatory-gap analysis

Perhaps, some physical qualities are subjective and irreducible {explanatory gap analysis}. Perhaps, more knowledge will allow physical connections. Perhaps, more knowledge allows physical connections, but people cannot know them. Perhaps, no connection exists, but the reason involves only phenomenal concepts. For example, phenomenal concepts are only indexes or are special in another way. However, both physical objects and events and non-physical states and events involve states and events, so objective and subjective certainly overlap.

substance dualism

Perhaps, mind and brain are two separate and distinct substances {substance dualism}.

6-Philosophy-Mind-Theories-Dualism-Mental Property

non-reductive materialism

Perhaps, psychological properties depend on physical properties but are not material {non-reductive materialism, dualism}.

property dualism

Perhaps, mind and body are two aspects of one basic reality, and neither is derivable from the other {double aspect} {property dualism}. Conscious properties are pains, emotions, and sense qualities. Consciousness is not a different substance.

adverbial theory

Experiences have perceivable properties or events {experience events} {adverbial theory} {adverbial analysis}. There are no mental objects. Experience only happens in special ways, such as bluely. Appearances present real objects to mind, but they have no qualities.

attribute theory

Brain processes have physical and non-physical properties {attribute theory} {dual-attribute theory}. The non-physical properties make mental processes.

6-Philosophy-Mind-Theories-Dualism-Interaction

interactionism

Perhaps, mind and brain are two separate substances, or properties expressed at different levels, which can affect each other, directly or indirectly {interactionism}.

effects

Effects can be one-way or two-way.

levels

Levels have different laws. Organization levels have cause types, which act at that level and control lower-level component motions.

interaction

Components influence whole, or whole influences components. Mind can move brain matter and cause and control neural and chemical events by high-level patterns and processes but not interact with matter at lower levels, just as organisms control atoms by overall movements, not direct interactions.

problems

Interactionism is untrue, because it has no method for deterministically describing mental functions in terms of physiological functions, or physical functions in terms of mental functions, because only physical things can affect physical things.

logical equivalence

Perhaps, neural objects and events and psychophysical objects and events do not have same structures and functions but are necessary and sufficient to each other {logical equivalence, mind theory}.

parallelism in mind

Perhaps, mind and brain are separate and do not interact but synchronize and work in parallel, because they closely coordinate {parallelism, mind theory}. Laws of God or nature keep them parallel. However, what keeps them parallel can be a third substance.

pluralism and mind

Perhaps, mind and brain interact through some third object, substance, or function {pluralism, mind theory}, such as God.

6-Philosophy-Mind-Theories-Dualism-Matter Into Mind

combination problem

How do physical combinations and interactions make unified high-level consciousness {combination problem}? [Seager, 1999].

no sign problem

No units of reality have been detected to have mentality or consciousness {no sign problem} [Seager, 1999].

not-mental problem

Perhaps, mental and consciousness properties are new physical property types, rather than non-physical properties {not-mental problem} [Seager, 1999].

unconscious mentality

How do unconscious mental units make consciousness {unconscious mentality problem}, unless units are conscious? [Seager, 1999].

6-Philosophy-Mind-Theories-Dualism-Causality

causal completeness

Mental and conscious events have no physical or mental effects, because the physical world can have no outside causes {causal closure} {causal completeness}. Mental events that seem to cause have physical causes.

causal impotence

If mental states are not just physical states and can have physical effects, physical changes happen without physical laws. However, physical laws account for all observable physical changes [Seager, 1999]. Therefore, non-physical mental states have no physical effects {causal impotence}. In the pre-established harmony (Leibniz), mind and matter do not affect each other but always synchronize, localize to same place, and correlate in intensity, through God. In epiphenomenalism, matter causes mind {mental smoke}, but mind cannot affect matter. In philosophical zombies, all behavior about conscious experience can happen without consciousness.

completeness problem

The physical world seems to have causal closure, with no cause or effect left for mental or conscious forces or events {completeness problem} {causal completeness problem} {causally complete}. Brain physiology seems able to account for all brain functions and all behavior, so mental states, causes, and effects are unnecessary. Human brain examinations never show evidence of mental forces or states. Mental forces or states never have causes or effects.

configurational force

Newtonian gravity has action at a distance. Perhaps, complex human-brain structures and functions can make new forces {configurational force} (Broad). However, all physical forces involve contact through exchanged particles, and only properties inherent in matter can cause forces. Mental forces cannot be the right type to influence matter. Quantum-mechanical action-at-a-distance phenomena are not like mental forces or states.

epiphobia

Structural properties can only cause physiological properties {epiphobia} that actually cause physical behavior.

6-Philosophy-Mind-Theories-Mystery

anti-reductionism

Perhaps, mind is just natural phenomenon or has no explanation {anti-reductionism}.

mysterianism

Perhaps, consciousness has no explanation or understanding {mysterianism}. People have no valid concepts about consciousness. Perhaps, people can never understand it, just as monkeys can never understand calculus. Perhaps, people can learn new concepts or evolve to be able to understand [Flanagan, 1992] [Flanagan, 2002].

principled agnosticism

Perhaps, people cannot understand consciousness and brain relations in naturalistic terms {principled agnosticism}.

6-Philosophy-Mind-Theories-Psychology

Hormic psychology

Mind can be understood through motives and purposes {Hormic psychology} (William McDougall).

organismic psychology

Psychology {organismic psychology} (Kurt Goldstein and J. R. Kantor) can study mind.

personality science

Science of personality {personality science} (Gardner Murphy and Gordon W. Allport) can study mind.

self-psychology

Psychology based on self {self-psychology} (Stern) can study mind.

Related Topics in Table of Contents

6-Philosophy-Mind


Date Modified: 2022.0225