Sentience

Sentience is the capacity to feel, perceive, or experience subjectively.[1] Eighteenth-century philosophers used the concept to distinguish the ability to think (reason) from the ability to feel (sentience). In modern Western philosophy, sentience is the ability to experience sensations (known in philosophy of mind as "qualia"). In Eastern philosophy, sentience is a metaphysical quality of all things that requires respect and care. The concept is central to the philosophy of animal rights, because sentience is necessary for the ability to suffer, and thus is held to confer certain rights.

Philosophy and sentience

In the philosophy of consciousness, sentience can refer to the ability of any entity to have subjective perceptual experiences, or as some philosophers refer to them, "qualia".[2] This is distinct from other aspects of the mind and consciousness, such as creativity, intelligence, sapience, self-awareness, and intentionality (the ability to have thoughts about something). Sentience is a minimalistic way of defining consciousness, which is otherwise commonly used to collectively describe sentience plus other characteristics of the mind.

Some philosophers, notably Colin McGinn, believe that sentience will never be understood, a position known as "new mysterianism". They do not deny that most other aspects of consciousness are subject to scientific investigation, but they argue that subjective experience will never be explained; that is, sentience is the only aspect of consciousness that cannot be explained. Other philosophers (such as Daniel Dennett, who also argues that animals are not sentient) disagree, arguing that all aspects of consciousness will eventually be explained by science.[3]

Ideasthesia

According to the theory of ideasthesia, a sentient system must be able to categorize and to create concepts. Empirical evidence suggests that sentience about stimuli is closely related to the process of extracting their meaning: how we understand the stimuli determines how we experience them.

Indian religions

Eastern religions including Hinduism, Buddhism, Sikhism, and Jainism recognize non-humans as sentient beings. In Jainism and Hinduism, this is closely related to the concept of ahimsa, nonviolence toward other beings. In Jainism, all matter is endowed with sentience, and five degrees of sentience are distinguished.[citation needed] Water, for example, is a sentient being of the first order, as it is considered to possess only one sense, that of touch. Man is considered a sentient being of the fifth order. According to Buddhism, sentient beings made of pure consciousness are possible. In Mahayana Buddhism, which includes Zen and Tibetan Buddhism, the concept is related to the Bodhisattva, an enlightened being devoted to the liberation of others. The first vow of a Bodhisattva states: "Sentient beings are numberless; I vow to free them."

Sentience in Buddhism is the state of having senses (sat + ta in Pali, or sat + tva in Sanskrit). In Buddhism, the senses are six in number, the sixth being the subjective experience of the mind. Sentience is simply awareness prior to the arising of Skandha. Thus, an animal qualifies as a sentient being.

Animal welfare, rights, and sentience

In the philosophies of animal welfare and rights, sentience implies the ability to experience pleasure and pain. Additionally, it has been argued, as in the documentary Earthlings:

Granted, these animals do not have all the desires we humans have; granted, they do not comprehend everything we humans comprehend; nevertheless, we and they do have some of the same desires and do comprehend some of the same things. The desires for food and water, shelter and companionship, freedom of movement and avoidance of pain.[4]

Animal-welfare advocates typically argue that any sentient being is entitled, at a minimum, to protection from unnecessary suffering, though animal-rights advocates may differ on what rights (e.g., the right to life) may be entailed by simple sentience. Sentiocentrism describes the theory that sentient individuals are the center of moral concern.

The 18th-century philosopher Jeremy Bentham compiled Enlightenment beliefs in An Introduction to the Principles of Morals and Legislation, adding his own reasoning in a comparison between slavery and sadism toward animals:

The French have already discovered that the blackness of the skin is no reason why a human being should be abandoned without redress to the caprice of a tormentor [see Louis XIV's Code Noir]... What else is it that should trace the insuperable line? Is it the faculty of reason, or, perhaps, the faculty of discourse? But a full-grown horse or dog is beyond comparison a more rational, as well as a more conversable animal, than an infant of a day, or a week, or even a month, old. But suppose the case were otherwise, what would it avail? The question is not Can they reason? nor, Can they talk? but, Can they suffer?[5]

In the 20th century, Princeton University professor Peter Singer argued that Bentham's conclusion is often dismissed by appealing to some distinction that condemns human suffering but allows non-human suffering; such appeals, he contends, are typically logical fallacies (unless the distinction is factual, in which case the appeal is just one logical fallacy, petitio principii, or begging the question). Because many of the suggested distinguishing features of humanity, such as extreme intelligence or highly complex language, are absent in marginal cases such as young or mentally disabled humans, the only remaining distinction appears to be a prejudice based on species alone, which animal-rights supporters call speciesism: differentiating humans from other animals purely on the grounds that they are human. His opponents accuse him of the same petitio principii.

Gary Francione also bases his abolitionist theory of animal rights, which differs significantly from Singer's, on sentience. He asserts that, "All sentient beings, humans or nonhuman, have one right: the basic right not to be treated as the property of others."[6]

Andrew Linzey, founder of the Oxford Centre for Animal Ethics in England, is known as a foremost international advocate for recognizing animals as sentient beings in biblically based faith traditions. The Interfaith Association of Animal Chaplains encourages animal ministry groups to adopt a policy of recognizing and valuing sentient beings.

In 1997 the concept of animal sentience was written into the basic law of the European Union. The legally binding protocol annexed to the Treaty of Amsterdam recognizes that animals are "sentient beings", and requires the EU and its member states to "pay full regard to the welfare requirements of animals".

The laws of several states include certain invertebrates such as cephalopods (octopuses, squids) and decapod crustaceans (lobsters, crabs) in the scope of animal protection laws, implying that these animals are also judged capable of experiencing pain and suffering.[7]

David Pearce is a British philosopher of the negative utilitarian school of ethics. He is best known for advocating the idea that there is a strong ethical imperative for humans to work towards the abolition of suffering in all sentient beings.[8]

Artificial intelligence

Although the term "sentience" is usually avoided by major artificial intelligence textbooks and researchers,[9] it is sometimes used in popular accounts of AI to describe "human level or higher intelligence" (artificial general intelligence). Many popular accounts confuse sentience with sapience, or simply conflate the two concepts. Such use of the term is common in science fiction.

Science fiction

In science fiction, an alien, android, robot, hologram, or computer described as "sentient" is usually treated as a fully human character, with rights, qualities, and capabilities similar to those of any other character. Foremost among these properties is human-level intelligence (i.e. "sapience"), but sentient characters also typically display desire, will, consciousness, ethics, personality, insight, humor, ambition, and many other human qualities. Sentience is used in this context to describe an essential human property that brings all these other qualities with it. The words "sapience", "self-awareness", and "consciousness" are used in similar ways in science fiction.

This reinforces a usage that is incorrect outside science fiction. For example, a character describes his cat as "not sentient" in one episode of Star Trek: The Next Generation, even though the term was originally used (by philosopher Jeremy Bentham and others) to emphasize the sentience of animals, certainly including cats.

Science fiction has explored several forms of consciousness besides that of the individual human mind, and how such forms might perceive and function. These include Group Sentience, where a single mind is composed of multiple non-sentient members (sometimes capable of reintegration, so that members can be gained or lost, resulting in gradually shifting mentalities); Hive Sentience, the extreme form seen in insect hives, with a single sentience extended over huge numbers of non-sentient bodies; and Transient Sentience, where a lifeform is sentient only temporarily and may be aware of that transience.

Sentience quotient

The sentience quotient concept was introduced by Robert A. Freitas Jr. in the late 1970s.[10] It defines sentience as the relationship between the information processing rate of each individual processing unit (neuron), the weight/size of a single unit, and the total number of processing units (expressed as mass). It was proposed as a measure for the sentience of all living beings and computers from a single neuron up to a hypothetical being at the theoretical computational limit of the entire universe. On a logarithmic scale it runs from −70 up to +50.
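Stated compactly, the quotient is usually given by the formula below, where I is the total information processing rate in bits per second and M is the mass of the processing system in kilograms; this formulation, and the worked numbers that follow, are offered as an illustrative sketch rather than a quotation of Freitas's paper.

    SQ = \log_{10}\!\left(\frac{I}{M}\right)

On this reading, a hypothetical 1.5 kg brain processing 10^15 bits per second would score SQ = log10(10^15 / 1.5) ≈ +14.8, while a hypothetical 1 kg system processing a single bit per second would score SQ = 0.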

References

  1. "Sentience". Merriam-Webster Dictionary. http://www.merriam-webster.com/dictionary/sentience
  2. Cole 1983.
  3. [citation not recovered]
  4. Monson, S. (2005). Earthlings (documentary).
  5. Bentham, Jeremy (1823). An Introduction to the Principles of Morals and Legislation (second edition), Chapter 17, footnote.
  6. Francione, Gary. Official blog.
  7. "Science, policy and cultural implications of animal sentience". Compassion in World Farming.
  8. [citation not recovered]
  9. See the four most popular AI textbooks (or Wikipedia's survey of their contents), none of which mention "sentience" at all. [textbook citations not recovered]
  10. [citation not recovered]

Further reading