Dual process theory within moral psychology is an influential theory of human moral judgement that posits that human beings possess two distinct cognitive subsystems that compete in moral reasoning processes: one fast, intuitive and emotionally driven, the other slow, requiring conscious deliberation and a higher cognitive load. Initially proposed by Joshua Greene along with Brian Sommerville, Leigh Nystrom, John Darley, Jonathan David Cohen and others,[1][2][3] the theory can be seen as a domain-specific example of more general dual process accounts in psychology, such as Daniel Kahneman's "system 1"/"system 2" distinction popularised in his book Thinking, Fast and Slow. Greene has often emphasized the normative implications of the theory,[4][5][6] which has started an extensive debate in ethics.[7][8][9][10]
The dual-process theory has had significant influence on research in moral psychology. The original fMRI investigation[1] proposing the dual process account has been cited in more than 2,000 scholarly articles, generating both extensive use of similar methodology and criticism.
Figure 1: Schematic representation of Greene's dual-process model of moral judgement. This figure describes the processes underlying individuals' judgement about (a) the Trolley dilemma and (b) the Footbridge dilemma.
^ a b Greene JD, Sommerville RB, Nystrom LE, Darley JM, Cohen JD (September 2001). "An fMRI investigation of emotional engagement in moral judgment". Science. 293 (5537): 2105–8. Bibcode:2001Sci...293.2105G. doi:10.1126/science.1062872. PMID 11557895. S2CID 1437941.
^ Greene JD, Nystrom LE, Engell AD, Darley JM, Cohen JD (October 2004). "The neural bases of cognitive conflict and control in moral judgment". Neuron. 44 (2): 389–400. doi:10.1016/j.neuron.2004.09.027. hdl:10983/15961. PMID 15473975. S2CID 9061712.
^ Greene JD (October 2017). "The rat-a-gorical imperative: Moral intuition and the limits of affective learning". Cognition. 167: 66–77. doi:10.1016/j.cognition.2017.03.004. PMID 28343626. S2CID 13948078.
^ Greene J (October 2003). "From neural 'is' to moral 'ought': what are the moral implications of neuroscientific moral psychology?". Nature Reviews. Neuroscience. 4 (10): 846–9. doi:10.1038/nrn1224. PMID 14523384. S2CID 14438498.
^ Greene JD (2008). Sinnott-Armstrong W (ed.). "The Secret Joke of Kant's Soul". Moral Psychology: The Neuroscience of Morality. Cambridge, MA: MIT Press: 35–79.
^ Greene JD (2014-07-01). "Beyond Point-and-Shoot Morality: Why Cognitive (Neuro)Science Matters for Ethics". Ethics. 124 (4): 695–726. doi:10.1086/675875. ISSN 0014-1704. S2CID 9063016.
^ Railton P (July 2014). "The Affective Dog and Its Rational Tale: Intuition and Attunement". Ethics. 124 (4): 813–859. doi:10.1086/675876. ISSN 0014-1704. S2CID 143579026.
^ Singer P (2005). "Ethics and Intuitions". The Journal of Ethics. 9 (3–4): 331–352.
^ Berker S (September 2009). "The Normative Insignificance of Neuroscience". Philosophy & Public Affairs. 37 (4): 293–329. doi:10.1111/j.1088-4963.2009.01164.x. ISSN 0048-3915. S2CID 5952062.
^ Bruni T, Mameli M, Rini RA (2013-08-25). "The Science of Morality and its Normative Implications" (PDF). Neuroethics. 7 (2): 159–172. doi:10.1007/s12152-013-9191-y. S2CID 55999301.