"Every particle of factual evidence supports the contention that the higher mammalian vertebrates experience pain sensations at least as acute as our own. To say that they feel pain less because they are lower animals is an absurdity; it can easily be shown that many of their senses are far more acute than ours - visual acuity in certain birds, hearing in most wild animals, and touch in others; these animals depend more than we do today on the sharpest possible awareness of a hostile environment. Apart from the complexity of the cerebral cortex (which does not directly perceive pain) their nervous systems are almost identical to ours and their reaction to pain remarkably similar, though lacking (so far as we know) the philosophical and moral overtones. The emotional element is all too evident, mainly in the form of fear and anger."
Pain is, if anything, even more basic in a phylogenetic sense than fear or anxiety. The opioids which (at least) functionally mediate analgesia have been found in earthworms. It's disturbing to realise that the most "primitive" experience one can undergo, at least if one's own life is anything to go by, is also the most intense. Abstract, more-or-less serial thought, by contrast, tends to be at best faint, elusive and ethereal in its phenomenal properties.
This contrast needs stressing. Consciousness is sometimes claimed to be the prerogative of the higher vertebrates, or even of humans alone, in view of our comparatively superior cognitive prowess. Yet - quite incongruously from such an anthropocentric perspective - our most abstruse and distinctive cognitive skills are usually those least penetrable to introspective access; while some of our most primitive feelings are also the most intrusive and subjectively important.
DeGrazia carefully distinguishes between our concepts of phenomenological pain and physical nociception. He notes the problems faced by people with a congenital inability to feel pain, or with leprosy. In a fascinating speculation, he argues that (p 111)
"Pain seems to be a development of consciousness in creatures endowed with a highly developed response system known as nociception. Consciousness may have developed as a free-rider on certain inherited gene groups that included relatively complex information processing; or it may have evolved as a way of focusing an organism's attention to those areas of information processing that are most valuable at a given time. Either way, pain was apparently the new conscious companion of responses to potentially harmful situations (in these creatures, nociception) in the animals in which consciousness emerged."
Just as relevantly, DeGrazia sets out how insects lack the extensive CNS processing-mechanisms implicated in pain-perception among vertebrates. The locust, for instance, keeps on eating while being devoured by a mantis. It's hard to imagine a vertebrate animal retaining any semblance of equanimity while meeting such a fate. DeGrazia suggests that whereas a startle-reflex would confer a survival advantage similar to that of acute pain, insects with short life-spans and modest learning needs would derive negligible additional advantage from pain itself. There would be little or no selection-pressure favouring a neural capacity for any such experience.
This issue is actually more problematic than it sounds. The difference between 'little' and 'no' selection-pressure is huge from an evolutionary perspective. Even a 1% reproductive advantage conferred by a capacity to experience phenomenological pain - if it were functionally significant and causally efficacious qua phenomenological pain - would allow natural selection to get to work over millions of generations. Nonetheless, in qualified support of DeGrazia, it seems unlikely that organisms without a single central nervous system possess a unitary experiential manifold - let alone a unitary sense of self to which moral status could readily be attached. Even if the multiple ganglia of a locust each feel an extremely rudimentary kind of aversive experience - as in my view is quite likely - the mantis-devoured locust's feeding head doesn't participate in it - whereas a "toothache", for instance, seems to penetrate to the very heart of our whole existence. [The encephalisation of pain and emotion is tremendously adaptive; and computationally hard to match by our silicon robots. From an information-theoretic perspective, the saturation of our (neo-cortical) cognitive processes and organic virtual worlds by (mainly limbic) feelings in complex experiential manifolds may offer computational advantages over a "classical" computational architecture]
Fascinating as they are, DeGrazia's speculations on the origins of consciousness face serious difficulties. His account of conscious pain doesn't offer a solution to the "zombie problem" famously highlighted by David Chalmers. If zombie-nociception would do the same functional job that's allegedly performed by phenomenological pain, then it's hard to understand why selection-pressure didn't favour mere nociception. Phenomenological pain, unlike zombie nociception, doesn't logically "supervene" on an [apparently] exhaustive specification of the microphysical facts. For it to do so, those putative microphysical facts would have to be heroically reconstrued; and a primitive "what-it's-likeness" posited as the stuff which the quantum mechanical formalism describes instead: naturalistic panpsychism. Moreover, even if phenomenological pain really were functionally advantageous to genetic vehicles in virtue of its horrific subjective texture, the evolutionary story wouldn't have explained, in any deep sense, why and how that uniquely awful texture of nastiness occurs. The story would simply explain why it was differentially selected over alternative phenomenological states. In sum, we still don't understand why the laws of physics didn't generate a world the constituents, configuration and behaviour of which - and the neurophysiology of its organisms - were type-identical to our own, but where consciousness was absent. [One proposed solution is found in Cosmic Consciousness For Tough Minds]
Such philosophical argument over the (non-)existence of consciousness wouldn't matter ethically if it weren't for an insidious muddle over two radically different senses of "objective". The confusion allows the third-person ontology favoured by orthodox natural science and its lumpen-academic cheerleaders to be privileged over the first-person perspective. For in reality, what it's subjectively like to be a desperately distressed veal calf, for example, is an objective fact about the world. That such horror may also be notionally captured by a set of observer-independent equations is morally irrelevant. The facts about subjective mind-dependent states are objectively true. Physics gives us a formal mathematical description of the world. It says nothing about the insentience or otherwise of what "breathes fire in the equations and makes there a world for us to describe."
Speculative metaphysics aside, DeGrazia's profound conclusion is that:
"...affective beings (who have feelings), conative beings (who have desires) and cognitive beings seem to be co-extensive on our planet with the vertebrates, give or take a few species..."
such as the cephalopods. All vertebrates are endowed with the limbic and autonomic systems which contain the basic biological substrates of pain, anxiety and fear.
It should be stressed that this conclusion doesn't, as it stands, mean that morally speaking we can do anything we like to invertebrates. If DeGrazia is broadly correct in his dichotomy, then a (quasi-)Kantian indirect duty view - the idea that the only reason we should avoid cruelty to animals is that such practices corrupt the character of agents and make them more likely to behave badly toward humans - might still be adapted and enlarged so that "we" is taken more broadly than it is now. Working within this sort of framework, the frivolous killing of invertebrates, such as stamping on a fly for the sake of it or through mere irritation, might still be discouraged. It should be deplored on the grounds that the attitude of mind underlying such actions promotes cruelty to morally important vertebrates too. Yet the conclusion that - simplistically - vertebrates are special enables us non-arbitrarily to avoid treating a fly or a worm with the same consideration we should accord a fellow vertebrate. It's a dreadfully crude division; but it's a very useful start.
DeGrazia's account is still problematic in other ways. Even granted his vastly more generous conception of mentality than that of hominid chauvinists, Taking Animals Seriously is too ready, I think, to link - without further argument - moral status to intelligence and complexity. It would be better if such attributes were instead treated as markers for the property that generates and defines mattering itself: the capacity to suffer, or rather the capacity to undergo experience imbued with significance and located on a broadly-defined pleasure-pain axis.
Research into the biological basis of mattering is beset with complications of its own. Even utilitarians, who stress the moral primacy of the pleasure-pain axis, are liable to assume that degrees of sentience are somehow inevitably bound up with intelligence and the ability to process information. Our lack of introspective access to the workings of the distinctively human language modules ought to alert us to the pitfalls of intellectualism here. No substantive argument is presented in DeGrazia or elsewhere for believing that our unusual adaptation of an extraordinarily hypertrophied intellect has been accompanied by a matching hypertrophied capacity for suffering relative to other, less cognitively sophisticated vertebrate species. The reason is simple: no supporting evidence for such a notion exists.
Non-humans demonstrably possess greater acuity in many of the "special senses", notably olfaction, hearing and vision. What grounds have we for supposing that a comparably heightened sensitivity to pain isn't found elsewhere in the animal kingdom? One must hope that it isn't; pain is vile enough for "one of us" as it is. We simply don't know enough about the pain-centres of a whale or an elephant, for instance, to establish whether approximate equality of biological propensity to anguish really obtains. Greater encephalisation of emotion most likely does extend the nominal range and nuances of things one can be unhappy 'about'; though in the case of vertebrates with acute special senses, it may well be humans who are comparatively obtuse in our lack of discriminative power, perhaps fortunately so. Yet it's not clear that encephalisation by itself can intensify aversive experience in the absence of limbic structures to mediate any such additional nastiness. Too many accounts of candidates for moral status simply presuppose this link with intelligence.
In any event, if suffering really is the selfish-DNA-driven, out-of-control evolutionary nightmare that the evidence suggests, with no higher purpose to dignify it, then there's no indication of any mechanism by which it could ever be checked simply because it felt unspeakably bad. Perhaps the most that can be hoped is that the substrates of a pain so all-consumingly bad that it sapped the capacity for thought and (genetically) adaptive behavioral responses would - other things being equal - get selected against. Less optimistically, it is generally assumed - ignoring the philosophers' zombie problem - that pain's adaptive motivating force is in some degree proportionate to its intensity. The worse the pain, the greater the incentive to escape it. This perspective has grimmer and more sinister implications altogether.
So just how bad can pain be? In view of the great weight here being placed on the parallel between small children and non-human animals, it's worth asking whether children suffer as adults do, and to the same degree. At least once cortical myelination is complete, then (once again, given certain assumptions) young children may well suffer as intensely as adults. Indeed, it's not perverse to raise the possibility that youngsters sometimes suffer more. This might sound implausible. Yet on the crudest level, children literally have more (irreplaceable) brain cells of the kind that mediate emotional experience, albeit with a different dendritic arborisation etc. Over the years, 'neural Darwinism' (an admittedly somewhat misleading term) also acts to winnow out many dysfunctional and non-functional inter-neuronal connections. This may enhance a person's intellectual performance but diminish the raw amount and intensity of consciousness. It all depends on what gets winnowed where. Moreover, efficient brains use less energy and do things more "automatically" - and less consciously. Further, as one ages, the mind/brain progressively loses nerve cells - even though their loss may elicit a compensatory sprouting to repair any functional deficits, and even though physical cellular shrinkage rather than cell-death may account for much well-attested cerebral weight-loss. Certainly, many older adults report that they feel things less intensely than they did in their callow but emotionally tempestuous youth.
Evidence of a direct causal connection between intellectual prowess and intensity of feeling, then, is still to be found; and perhaps never will be. Furthermore, as pain gets worse, one's capacity for abstract thought, and for exhibiting one's vaunted intelligence however it's defined, diminishes. The suffering one undergoes doesn't thereby matter less. On the contrary, it can become all that matters. I do not know what it is normally like to be a whale or a pig. But I suspect that in extremis it is very similar to what it is like for me to be in terror or extreme pain: simply horrific. The same type of post-synaptic metabolic cascades get triggered. And contra Wittgenstein, if a lion could talk, we might understand it rather well: for we have in common a core biological repertoire of raw appetites and emotions, not to mention genes, metabolic pathways and brain structures to match.
DeGrazia probably wouldn't go this far. In his discussion of beliefs, desires and language, he concentrates once again on grading their relative sophistication, connectivity and systematicity rather than the felt texture of the sentiments/limbic processes that infuse their individual episodes of cortical activation. Yet why should creatures whose adults are more intelligent inherently matter more? If intelligence could be used as a marker for the intensity of emotion and the biochemical creation of significance, then IQ might at least serve as a useful yardstick for something that inherently mattered. If it can't be so used, then one might as well argue that a Pentium Pro is morally superior to an Intel 386. The right answer is surely that processing power and moral status are simply incommensurable categories. For the most phylogenetically primitive sorts of consciousness - most intrusively pain - appear to be the most intense; whereas the most recent kinds of consciousness in evolutionary terms, notably those implicated in linguistic processing, have a subtle and introspectively opaque texture so elusive that certain philosophers have even doubted its existence.
Possibly, even more rarefied modes of consciousness are feasible. If so, it is unclear why hyper-intelligent transhuman beings who might undergo such novel cognitive processes should matter more than we do - unless, that is, post-humans come to feel things more deeply for other neurological reasons altogether; for they may have designed hyper-emotional psychochemical states for themselves too.
Even here, we must be careful with our terminology. The term "intelligence" itself is too riddled with covert value-judgements about what does and doesn't rank as even cognitively important to be very useful. Its shifting usage reflects shifting power-relationships, not the carving of Nature at the conceptual joints. Yet if some value-neutral sense of intelligence is salvaged, and if the argument that relative IQ is morally relevant is taken seriously, then we would also have to accept that ultra-smart Mensa masterminds matter more in ethical terms than less intellectually agile members of our own species. It's not clear why this should really be the case. Perhaps high-powered intellects might still potentially matter more, in a merely instrumental sense, if they were more creative of socially useful inventions - though such comparisons are usually invidious and probably best avoided. Moreover, to add another complication, acknowledged genius does seem to have some kind of limited positive correlation with a tendency to manic-depression. This tendency might be morally relevant because people with "bipolar disorder" do tend to feel things more intensely, and its soft-bipolar forms are linked to unusually high creativity. So if one is trying to press the issue, then I suppose one could even make some sort of case that manic-depressives do inherently matter more because things matter more to them - for only in the naïve third-person ontology of scientism do things that matter have to be observer-independent, any more than tickles do. Yet the argument gets pretty tortuous.
There is a complex morass of issues here, but it isn't worth getting bogged down in them. For, as will be seen in section five, the imminence of the post-Darwinian Transition will ensure that traditional ethical dilemmas get swept away into evolutionary history. Traditional casuistry, and moral league tables in the Great Chain of Being, are likely to become obsolete. For as has been remarked, an angel in heaven is no one in particular.